MCP Server for Vertex AI Search

This is an MCP server to search documents using Vertex AI.

Architecture

This solution uses Gemini with Vertex AI grounding to search documents using your private data. Grounding improves the quality of search results by anchoring Gemini's responses in data stored in Vertex AI data stores. We can integrate one or multiple Vertex AI data stores with the MCP server. For more details on grounding, refer to the Vertex AI Grounding documentation.


How to use

There are two ways to use this MCP server. If you want to run it with Docker, the first approach is a good fit, since a Dockerfile is provided in the project.

1. Clone the repository

# Clone the repository
git clone git@github.com:ubie-oss/mcp-vertexai-search.git

# Create a virtual environment
uv venv
# Install the dependencies
uv sync --all-extras

# Check the command
uv run mcp-vertexai-search
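
If you go the Docker route, a build-and-run sketch along the lines below should work with the bundled Dockerfile; the image tag, the container config path, the published port, and the command arguments are assumptions rather than values taken from the repository.

# Build the image (tag is arbitrary)
docker build -t mcp-vertexai-search .

# Run the server over SSE, mounting a local config file into the container
# (container path, port, and command arguments are assumptions)
docker run --rm \
    -v $(pwd)/config.yml:/app/config.yml \
    -p 8080:8080 \
    mcp-vertexai-search serve --config /app/config.yml --transport sse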

2. Install the Python package

The package isn't published to PyPI yet, but we can install it from the repository. To run the MCP server, we need a config file derived from config.yml.template, because the Python package doesn't include the config template. Please refer to Appendix A: Config file for the details of the config file.

# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git

# Check the command
mcp-vertexai-search --help
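
Since the template isn't shipped with the package, one option is to copy it directly from the repository and fill it in; the raw URL below assumes the default branch is named main.

# Download the config template from the repository (branch name is an assumption)
curl -fsSL https://raw.githubusercontent.com/ubie-oss/mcp-vertexai-search/main/config.yml.template -o config.yml

# Edit config.yml for your project, model, and data stores, then start the server
mcp-vertexai-search serve --config config.yml --transport stdio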

Development

Prerequisites

Set up Local Environment

# Optional: Install uv
python -m pip install -r requirements.setup.txt

# Create a virtual environment
uv venv
uv sync --all-extras

Run the MCP server

The server supports two transports: SSE (Server-Sent Events) and stdio (standard input/output). We can select the transport with the --transport flag.

We can configure the MCP server with a YAML file. config.yml.template is a template for the config file. Please modify the config file to fit your needs.

uv run mcp-vertexai-search serve \
    --config config.yml \
    --transport <stdio|sse>
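
With the stdio transport, the server is usually spawned by an MCP client rather than run by hand. As a hedged illustration only, an entry in an MCP client configuration (for example, the mcpServers section used by Claude Desktop) might look like the following; the server key "vertex-ai-search" and the config path are placeholders, and the command assumes the package is installed on your PATH.

{
  "mcpServers": {
    "vertex-ai-search": {
      "command": "mcp-vertexai-search",
      "args": ["serve", "--config", "/path/to/config.yml", "--transport", "stdio"]
    }
  }
}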

Test the Vertex AI Search

We can test Vertex AI Search directly, without starting the MCP server, by using the mcp-vertexai-search search command.

uv run mcp-vertexai-search search \
    --config config.yml \
    --query <your-query>

Appendix A: Config file

config.yml.template is a template for the config file. The fields are described below, followed by an illustrative example.

  • server
    • server.name: The name of the MCP server
  • model
    • model.model_name: The name of the Vertex AI model
    • model.project_id: The project ID of the Vertex AI model
    • model.location: The location of the model (e.g. us-central1)
    • model.impersonate_service_account: The service account to impersonate
    • model.generate_content_config: The configuration for the generate content API
  • data_stores: The list of Vertex AI data stores
    • data_stores.project_id: The project ID of the Vertex AI data store
    • data_stores.location: The location of the Vertex AI data store (e.g. us)
    • data_stores.datastore_id: The ID of the Vertex AI data store
    • data_stores.tool_name: The name of the tool
    • data_stores.description: The description of the Vertex AI data store
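
The following sketch puts the fields above into a concrete config.yml. Every value is a placeholder, and the keys under generate_content_config are assumptions rather than values taken from the template; adjust everything to your own project before use.

server:
  name: vertex-ai-search            # placeholder server name
model:
  model_name: gemini-1.5-flash      # placeholder; use a Gemini model you have access to
  project_id: my-gcp-project        # placeholder GCP project
  location: us-central1
  impersonate_service_account: search-sa@my-gcp-project.iam.gserviceaccount.com  # placeholder
  generate_content_config:          # keys below are assumptions, not from the template
    temperature: 0.0
data_stores:
  - project_id: my-gcp-project      # placeholder GCP project
    location: us
    datastore_id: my-datastore-id   # placeholder data store ID
    tool_name: search_internal_docs # placeholder tool name
    description: Searches the internal documentation data store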
