
MCP-Doc
Model Context Protocol documentation server for LangGraph and MCP.
MCP Documentation Server
A customized version of the MCP documentation server that enables integration between LLM applications (like Cursor, Claude Desktop, Windsurf) and documentation sources via the Model Context Protocol.
Overview
This server provides MCP host applications with:
- Access to specific documentation files (langgraph.txt and mcp.txt)
- Tools to fetch documentation from URLs within those files
Supported Documentation
Currently set up for:
- LangGraph Documentation (from https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt)
- MCP Documentation (from https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt)
Quick Start
Setup and Run
# Clone the repository
git clone https://github.com/esakrissa/mcp-doc.git
cd mcp-doc
# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install the package in development mode
pip install -e .
Running the Server
You can run the server using the installed command:
# Run the server with the config file
mcpdoc \
--json config.json \
--transport sse \
--port 8082 \
--host localhost
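The contents of config.json are not shown in this README; based on the upstream mcpdoc project, the file is expected to be a JSON list of name/llms_txt entries. A minimal sketch for the two documentation files listed above (assumed format, verify against your mcpdoc version):
[
  {
    "name": "LangGraph",
    "llms_txt": "https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt"
  },
  {
    "name": "ModelContextProtocol",
    "llms_txt": "https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt"
  }
]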
Or if you prefer using UV:
# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
# Run the server with UV
uvx --from mcpdoc mcpdoc \
--json config.json \
--transport sse \
--port 8082 \
--host localhost
IDE Integration
Cursor
Add to ~/.cursor/mcp.json
{
  "mcpServers": {
    "mcp-doc": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt",
        "ModelContextProtocol:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt",
        "--allowed-domains",
        "*",
        "--transport",
        "stdio"
      ]
    }
  }
}
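The overview also lists Claude Desktop and Windsurf as MCP hosts. This README only covers Cursor, but the same server entry should carry over; for Claude Desktop, a sketch of the equivalent entry in claude_desktop_config.json (file name and mcpServers layout assumed from Claude Desktop's standard MCP configuration, not taken from this project):
{
  "mcpServers": {
    "mcp-doc": {
      "command": "uvx",
      "args": ["--from", "mcpdoc", "mcpdoc", "--urls", "LangGraph:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt", "ModelContextProtocol:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt", "--allowed-domains", "*", "--transport", "stdio"]
    }
  }
}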
Then add these instructions to Cursor's Custom Instructions:
for ANY question about LangGraph and Model Context Protocol (MCP), use the mcp-doc server to help answer --
+ call list_doc_sources tool to get the available documentation files
+ call fetch_docs tool to read the langgraph.txt or mcp.txt file
+ reflect on the urls in langgraph.txt or mcp.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
+ use this to answer the question
To test if the integration is working, ask Cursor a question about LangGraph or MCP, and check if it uses the documentation server tools to fetch information.
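You can also exercise the server outside an IDE. One option (not part of the original README) is the MCP Inspector, which launches the stdio server and lets you call list_doc_sources and fetch_docs interactively; the invocation below is a sketch and assumes Node.js is installed:
# Launch the server under the MCP Inspector (assumed invocation)
npx @modelcontextprotocol/inspector uvx --from mcpdoc mcpdoc \
--urls "LangGraph:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/langgraph.txt" "ModelContextProtocol:https://raw.githubusercontent.com/esakrissa/mcp-doc/main/docs/mcp.txt" \
--transport stdio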
Security Note
For security reasons, strict domain access controls are implemented:
- Remote documentation files: Only the specific domain is automatically allowed
- Local documentation files: No domains are automatically allowed
- Use --allowed-domains to explicitly add extra domains, or --allowed-domains '*' to allow all domains (use with caution)
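For example, a sketch of running the SSE server while allowing fetch_docs to follow links on one explicitly listed extra domain (docs.example.com is a placeholder, not a real documentation host):
# Allow fetching from one additional, explicitly listed domain
mcpdoc \
--json config.json \
--allowed-domains docs.example.com \
--transport sse \
--port 8082 \
--host localhost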
References
This project is based on the original mcpdoc by LangChain AI, modified to provide focused documentation access for LangGraph and MCP.