2025-04-14

MCP server for integrating ChromaDB with MCP-compatible AI models in Cursor


Chroma MCP Server

A Model Context Protocol (MCP) server integration for Chroma, the open-source embedding database.

Motivation: AI Development Working Memory

In AI-assisted development workflows, particularly when using tools like Cursor or GitHub Copilot across multiple sessions, maintaining context from previous interactions is crucial but often a manual process. Developers frequently resort to creating temporary markdown files or other artifacts simply to capture and reload context into a new chat session.

The Chroma MCP Server aims to streamline this process by providing a persistent, searchable "working memory":

  • Automated Context Recall: Instead of manual context loading, AI assistants (guided by specific rules or instructions) can query this MCP server to retrieve relevant information from past sessions based on the current development task.
  • Developer-Managed Persistence: Developers can actively summarize key decisions, code snippets, or insights from the current session and store them in ChromaDB via the MCP interface. This allows building a rich, task-relevant knowledge base over time.
  • Separation of Concerns: This "working memory" is distinct from final user-facing documentation or committed code artifacts, focusing specifically on capturing the transient but valuable context of the development process itself.

By integrating ChromaDB through MCP, this server facilitates more seamless and context-aware AI-assisted development, reducing manual overhead and improving the continuity of complex tasks across multiple sessions.

Overview

The Chroma MCP Server allows you to connect AI applications with Chroma through the Model Context Protocol. This enables AI models to:

  • Store and retrieve embeddings
  • Perform semantic search on vector data
  • Manage collections of embeddings
  • Support RAG (Retrieval Augmented Generation) workflows

See the API Reference for a detailed list of available tools and their parameters.

Installation

Choose your preferred installation method:

Standard Installation

# Using pip
pip install chroma-mcp-server

# Using uv (recommended for Cursor)
uv pip install chroma-mcp-server

Full Installation (with embedding models)

# Using pip (quote the extras so the brackets are not expanded by the shell)
pip install "chroma-mcp-server[full]"

# Using uv
uv pip install "chroma-mcp-server[full]"

Usage

Starting the server

# Using the command-line executable
chroma-mcp-server

# Or using the Python module
python -m chroma_mcp.server

Checking the Version

chroma-mcp-server --version

Configuration

The server can be configured with command-line options or environment variables:

Command-line Options

chroma-mcp-server --client-type persistent --data-dir ./my_data --log-dir ./logs --embedding-function accurate

Environment Variables

export CHROMA_CLIENT_TYPE=persistent
export CHROMA_DATA_DIR=./my_data
export CHROMA_LOG_DIR=./logs
export CHROMA_EMBEDDING_FUNCTION=accurate
chroma-mcp-server

Available Configuration Options

  • --client-type: Type of Chroma client (ephemeral, persistent, http, cloud)
  • --data-dir: Path to data directory for persistent client
  • --log-dir: Path to log directory
  • --host: Host address for HTTP client
  • --port: Port for HTTP client
  • --ssl: Whether to use SSL for HTTP client
  • --tenant: Tenant ID for Cloud client
  • --database: Database name for Cloud client
  • --api-key: API key for Cloud client
  • --cpu-execution-provider: Force CPU execution provider for local embedding functions (auto, true, false)
  • --embedding-function: Name of the embedding function to use. API-based functions require the corresponding API key set as an environment variable (e.g., OPENAI_API_KEY). Choices:
      ◦ default / fast: local CPU, balanced speed and accuracy
      ◦ accurate: local CPU/GPU via sentence-transformers, higher accuracy
      ◦ openai: API, general purpose
      ◦ cohere: API, retrieval/multilingual focus
      ◦ huggingface: API, flexible model choice
      ◦ jina: API, long-context focus
      ◦ voyageai: API, retrieval focus
      ◦ gemini: API, general purpose
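The HTTP and Cloud client types combine these options as sketched below. Host, port, tenant, and database are placeholder values, and the exact flag-value syntax (e.g., for --ssl) is assumed from the option list above.

```shell
# Connect to a self-hosted Chroma server over HTTP (placeholder host/port)
chroma-mcp-server --client-type http --host localhost --port 8000 --ssl false

# Connect to Chroma Cloud (placeholder tenant/database; key read from the environment)
chroma-mcp-server --client-type cloud --tenant my-tenant --database my-db --api-key "$CHROMA_API_KEY"
```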

See Getting Started for more setup details.

Cursor Integration

To use with Cursor, add the following to your .cursor/mcp.json:

{
  "mcpServers": {
    "chroma": {
      "command": "uvx",
      "args": [
        "chroma-mcp-server",
        "--embedding-function=default" // Example: Choose your desired embedding function
      ],
      "env": {
        "CHROMA_CLIENT_TYPE": "persistent",
        "CHROMA_DATA_DIR": "/path/to/data/dir", // Replace with your actual path
        "CHROMA_LOG_DIR": "/path/to/logs/dir",   // Replace with your actual path
        "LOG_LEVEL": "INFO",
        "MCP_LOG_LEVEL": "INFO",
        // Add API keys here if using API-based embedding functions
        // "OPENAI_API_KEY": "your_openai_key",
        // "GOOGLE_API_KEY": "your_google_key"
      }
    }
  }
}

See Cursor Integration for more details.

Development

For instructions on how to set up the development environment, run tests, build the package, and contribute, please see the Developer Guide.

Working Memory and Thinking Tools

This server includes specialized tools for creating a persistent, searchable "working memory" to aid AI development workflows. Learn more about how these tools leverage embeddings to manage context across sessions in the Embeddings and Thinking Tools Guide.

Testing the Tools

A simulated workflow using the MCP tools is available in the MCP Test Flow document.

License

MIT License (see LICENSE)
