🚀 Memgraph MCP Server
Memgraph MCP Server is a lightweight server implementation of the Model Context Protocol (MCP) designed to connect Memgraph with LLMs.
⚡ Quick start
1. Run Memgraph MCP Server
- Install `uv` and create a virtual environment with `uv venv`. Activate the virtual environment with `.venv\Scripts\activate`.
- Install dependencies: `uv add "mcp[cli]" httpx`
- Run the Memgraph MCP server: `uv run server.py`
2. Run MCP Client
- Install Claude for Desktop.
- Add the Memgraph server to the Claude config:

  MacOS/Linux: `code ~/Library/Application\ Support/Claude/claude_desktop_config.json`

  Windows: `code $env:AppData\Claude\claude_desktop_config.json`

Example config:
```json
{
  "mcpServers": {
    "mpc-memgraph": {
      "command": "/Users/katelatte/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/katelatte/projects/mcp-memgraph",
        "run",
        "server.py"
      ]
    }
  }
}
```
> [!NOTE]
> You may need to put the full path to the `uv` executable in the `command` field. You can get this by running `which uv` on MacOS/Linux or `where uv` on Windows. Make sure you pass in the absolute path to your server.
3. Chat with the database
- Run Memgraph MAGE:
  `docker run -p 7687:7687 memgraph/memgraph-mage --schema-info-enabled=True`
  The `--schema-info-enabled` configuration setting is set to `True` to allow the LLM to run the `SHOW SCHEMA INFO` query (a quick verification sketch follows this list).
- Open Claude Desktop and see the Memgraph tools and resources listed. Try it out! (You can load dummy data from Memgraph Lab Datasets.)
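Before chatting through Claude, it can help to confirm that Memgraph is reachable and that schema introspection is actually enabled. The sketch below does that check; it assumes the `neo4j` Python driver is installed (`uv add neo4j`), that Memgraph listens on the default `bolt://localhost:7687`, and that no authentication is configured. The file name `quick_check.py` is purely illustrative.

```python
# quick_check.py - sanity-check the local Memgraph instance (illustrative sketch,
# not part of the MCP server). Assumes the neo4j driver, the default Bolt port,
# and no authentication.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"  # default Memgraph Bolt endpoint


def main() -> None:
    driver = GraphDatabase.driver(URI, auth=("", ""))
    with driver.session() as session:
        # Verify basic connectivity with a trivial query.
        print(session.run("RETURN 1 AS ok").single()["ok"])
        # SHOW SCHEMA INFO only succeeds when Memgraph was started with
        # --schema-info-enabled=True, which is what the MCP resource relies on.
        for record in session.run("SHOW SCHEMA INFO"):
            print(record)
    driver.close()


if __name__ == "__main__":
    main()
```

If `SHOW SCHEMA INFO` raises an error, restart Memgraph MAGE with `--schema-info-enabled=True` as shown above.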
🔧 Tools
`run_query()`
Run a Cypher query against Memgraph.
🗃️ Resources
`get_schema()`
Get Memgraph schema information (prerequisite: `--schema-info-enabled=True`).
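For orientation, here is a minimal sketch of how a tool and a resource with this shape can be exposed using the MCP Python SDK's `FastMCP` helper together with a Bolt driver. It is an illustrative approximation, not the repository's actual `server.py`; the resource URI `memgraph://schema`, the connection settings, and the use of the `neo4j` driver are assumptions.

```python
# Illustrative MCP server sketch exposing a query tool and a schema resource.
# Not the repository's server.py; connection details and structure are assumptions.
from mcp.server.fastmcp import FastMCP
from neo4j import GraphDatabase

mcp = FastMCP("memgraph")
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("", ""))


@mcp.tool()
def run_query(query: str) -> list[dict]:
    """Run a Cypher query against Memgraph and return the result rows."""
    with driver.session() as session:
        return [record.data() for record in session.run(query)]


@mcp.resource("memgraph://schema")
def get_schema() -> str:
    """Return Memgraph schema information (requires --schema-info-enabled=True)."""
    with driver.session() as session:
        records = session.run("SHOW SCHEMA INFO")
        return "\n".join(str(record.data()) for record in records)


if __name__ == "__main__":
    mcp.run(transport="stdio")  # Claude Desktop communicates with the server over stdio
```

Registering the tool and resource through decorators is what lets Claude Desktop discover them automatically once the server is listed in `claude_desktop_config.json`.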
🗺️ Roadmap
The Memgraph MCP Server is still in its early days. We're actively working on expanding its capabilities and making it even easier to integrate Memgraph into modern AI workflows. In the near future, we'll be releasing a TypeScript version of the server to better support JavaScript-based environments. Additionally, we plan to migrate this project into our central AI Toolkit repository, where it will live alongside other tools and integrations for LangChain, LlamaIndex, and MCP. Our goal is to provide a unified, open-source toolkit that makes it seamless to build graph-powered applications and intelligent agents with Memgraph at the core.