Contextual MCP Server

A Model Context Protocol (MCP) server that provides RAG (Retrieval-Augmented Generation) capabilities using Contextual AI. This server integrates with a variety of MCP clients. In this readme, we will show integration with both Cursor IDE and Claude Desktop.

Overview

This MCP server acts as a bridge between AI interfaces (Cursor IDE or Claude Desktop) and a specialized Contextual AI agent. It enables:

  1. Query Processing: Direct your domain-specific questions to a dedicated Contextual AI agent
  2. Intelligent Retrieval: Searches through comprehensive information in your knowledge base
  3. Context-Aware Responses: Generates answers that:
  • Are grounded in source documentation
  • Include citations and attributions
  • Maintain conversation context

Integration Flow

Cursor/Claude Desktop → MCP Server → Contextual AI RAG Agent
        ↑                  ↓             ↓                         
        └──────────────────┴─────────────┴─────────────── Response with citations

Prerequisites

  • Python 3.10 or higher
  • Cursor IDE and/or Claude Desktop
  • Contextual AI API key
  • MCP-compatible environment

Installation

  1. Clone the repository:
git clone https://github.com/ContextualAI/contextual-mcp-server.git
cd contextual-mcp-server
  2. Create and activate a virtual environment:
python -m venv .venv
source .venv/bin/activate  # On Windows, use `.venv\Scripts\activate`
  3. Install dependencies:
pip install -e .

Configuration

Configure MCP Server

The server requires a few settings and some light customization before use. For example, the single-use server should be customized with an appropriate docstring for your RAG agent.

The docstring for your query tool is critical as it helps the MCP client understand when to route questions to your RAG agent. Make it specific to your knowledge domain. Here are two examples:

A research tool focused on financial data on the largest US firms

or

A research tool focused on technical documents for Omaha semiconductors
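
For orientation, here is a rough sketch of where that docstring lives, assuming the query tool is registered with the MCP Python SDK's FastMCP decorator (as in the Development section below); the tool and server names are illustrative:

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ContextualAI-RAG")  # illustrative server name

@mcp.tool()
def query(prompt: str) -> str:
    """A research tool focused on financial data on the largest US firms"""
    ...  # route the prompt to your Contextual AI agent (see Usage below)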

The server also requires the following settings from your RAG Agent:

  • API_KEY: Your Contextual AI API key
  • AGENT_ID: Your Contextual AI agent ID

If you'd like to store these values in a .env file, you can specify them like so:

cat > .env << EOF
API_KEY=key...
AGENT_ID=...
EOF
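
As a minimal sketch, the server can then read these values at startup; this assumes the python-dotenv package, and the actual loading code in server.py may differ:

import os

from dotenv import load_dotenv

load_dotenv()  # pulls API_KEY and AGENT_ID from .env into the environment

API_KEY = os.environ["API_KEY"]
AGENT_ID = os.environ["AGENT_ID"]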

AI Interface Integration

This MCP server can be integrated with a variety of clients. To use with either Cursor IDE or Claude Desktop create or modify the MCP configuration file in the appropriate location:

  1. First, find the path to your uv installation:
UV_PATH=$(which uv)
echo $UV_PATH
# Example output: /Users/username/miniconda3/bin/uv
  2. Create the configuration file using the full path from step 1 (note that JSON does not allow comments, so keep the file free of them):
cat > mcp.json << EOF
{
 "mcpServers": {
   "ContextualAI-TechDocs": {
     "command": "$UV_PATH",
     "args": [
       "--directory",
       "\${workspaceFolder}",
       "run",
       "multi-agent/server.py"
     ]
   }
 }
}
EOF
Here, "command" must be the full uv path from step 1, and ${workspaceFolder} will be replaced with your project path.
  3. Move the configuration file to the correct location (see options below):
mkdir -p .cursor/
mv mcp.json .cursor/

Configuration locations:

  • For Cursor:
    • Project-specific: .cursor/mcp.json in your project directory
    • Global: ~/.cursor/mcp.json for system-wide access
  • For Claude Desktop:
    • Use the same configuration file format in the appropriate Claude Desktop configuration directory

Environment Setup

This project uses uv for dependency management, which provides faster and more reliable Python package installation.

Usage

The server provides Contextual AI RAG capabilities using the Python SDK and can expose a variety of commands to MCP clients such as Cursor IDE and Claude Desktop. The current server focuses on the query command from the Contextual AI Python SDK; however, you could extend it to support other features such as listing all agents, updating retrieval settings, updating prompts, extracting retrievals, or downloading metrics.
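
For illustration, here is a hedged sketch of what such a query tool might look like in server.py. It assumes the contextual-client Python SDK exposes agents.query.create and returns a response with message.content; check the Contextual AI SDK documentation for the exact method names and response shape:

import os

from contextual import ContextualAI
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ContextualAI-RAG")
client = ContextualAI(api_key=os.environ["API_KEY"])

@mcp.tool()
def query(prompt: str) -> str:
    """A research tool focused on technical documents for Omaha semiconductors"""
    # Forward the question to the Contextual AI RAG agent
    result = client.agents.query.create(
        agent_id=os.environ["AGENT_ID"],
        messages=[{"role": "user", "content": prompt}],
    )
    # Return the grounded, citation-backed answer to the MCP client
    return result.message.content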

Example Usage

# In Cursor, you might ask:
"Show me the code for initiating the RF345 microchip?"

# The MCP client will:
1. Determine if this should be routed to the MCP Server

# Then the MCP server will:
1. Route the query to the Contextual AI agent
2. Retrieve relevant documentation
3. Generate a response with specific citations
4. Return the formatted answer to Cursor

Key Benefits

  1. Accurate Responses: All answers are grounded in your documentation
  2. Source Attribution: Every response includes references to source documents
  3. Context Awareness: The system maintains conversation context for follow-up questions
  4. Real-time Updates: Responses reflect the latest documentation in your datastore

Development

Modifying the Server

To add new capabilities:

  1. Add new tools by creating additional functions decorated with @mcp.tool()
  2. Define the tool's parameters using Python type hints
  3. Provide a clear docstring describing the tool's functionality

Example:

@mcp.tool()
def new_tool(param: str) -> str:
    """Description of what the tool does"""
    result = f"Processed: {param}"  # replace with your implementation
    return result

Limitations

  • The server runs locally and may not work in remote development environments
  • Tool responses are subject to Contextual AI API limits and quotas
  • Currently only supports stdio transport mode
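
For reference, a minimal sketch of a stdio entry point, assuming the server is built on FastMCP from the MCP Python SDK (whose run() method accepts a transport argument):

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ContextualAI-RAG")  # tools registered as shown above

if __name__ == "__main__":
    # stdio is currently the only supported transport for this server
    mcp.run(transport="stdio")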

For all the capabilities of Contextual AI, please check the official documentation.
