MCP Memory Server with Qdrant Persistence
This MCP server provides a knowledge graph implementation with semantic search capabilities powered by the Qdrant vector database.
Features
- Graph-based knowledge representation with entities and relations
- File-based persistence (memory.json)
- Semantic search using Qdrant vector database
- OpenAI embeddings for semantic similarity
- HTTPS support with reverse proxy compatibility
- Docker support for easy deployment
Environment Variables
The following environment variables are required:
# OpenAI API key for generating embeddings
OPENAI_API_KEY=your-openai-api-key
# Qdrant server URL (supports both HTTP and HTTPS)
QDRANT_URL=https://your-qdrant-server
# Qdrant API key (if authentication is enabled)
QDRANT_API_KEY=your-qdrant-api-key
# Name of the Qdrant collection to use
QDRANT_COLLECTION_NAME=your-collection-name
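For reference, here is a minimal sketch of how these variables might be loaded and validated at startup, written in TypeScript; the module layout and error messages are illustrative, not taken from this repository's source:
// config.ts (illustrative): fail fast if a required variable is missing
const required = ["OPENAI_API_KEY", "QDRANT_URL", "QDRANT_COLLECTION_NAME"] as const;
for (const name of required) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
}

export const config = {
  openaiApiKey: process.env.OPENAI_API_KEY!,
  qdrantUrl: process.env.QDRANT_URL!,
  collectionName: process.env.QDRANT_COLLECTION_NAME!,
  qdrantApiKey: process.env.QDRANT_API_KEY, // only needed when Qdrant auth is enabled
};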
Setup
Local Setup
- Install dependencies:
npm install
- Build the server:
npm run build
Docker Setup
- Build the Docker image:
docker build -t mcp-qdrant-memory .
- Run the Docker container with required environment variables:
docker run -d \
-e OPENAI_API_KEY=your-openai-api-key \
-e QDRANT_URL=http://your-qdrant-server:6333 \
-e QDRANT_COLLECTION_NAME=your-collection-name \
-e QDRANT_API_KEY=your-qdrant-api-key \
--name mcp-qdrant-memory \
mcp-qdrant-memory
Add to MCP settings:
{
  "mcpServers": {
    "memory": {
      "command": "/bin/zsh",
      "args": ["-c", "cd /path/to/server && node dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "QDRANT_API_KEY": "your-qdrant-api-key",
        "QDRANT_URL": "http://your-qdrant-server:6333",
        "QDRANT_COLLECTION_NAME": "your-collection-name"
      },
      "alwaysAllow": [
        "create_entities",
        "create_relations",
        "add_observations",
        "delete_entities",
        "delete_observations",
        "delete_relations",
        "read_graph",
        "search_similar"
      ]
    }
  }
}
Tools
Entity Management
- create_entities: Create multiple new entities
- create_relations: Create relations between entities
- add_observations: Add observations to entities
- delete_entities: Delete entities and their relations
- delete_observations: Delete specific observations
- delete_relations: Delete specific relations
- read_graph: Get the full knowledge graph
Semantic Search
- search_similar: Search for semantically similar entities and relations
interface SearchParams {
  query: string;   // Search query text
  limit?: number;  // Max results (default: 10)
}
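The authoritative argument schemas are defined by the server's tool declarations; the shapes below are inferred from the example usage later in this README and from comparable MCP memory servers, so treat them as an approximation:
// Approximate argument shapes (inferred, not authoritative)
interface Entity {
  name: string;           // Unique identifier for the entity
  entityType: string;     // Free-form type label, e.g. "Task"
  observations: string[]; // Facts recorded about the entity
}

interface Relation {
  from: string;           // Source entity name
  to: string;             // Target entity name
  relationType: string;   // e.g. "depends_on"
}

// create_entities expects { entities: Entity[] }
// create_relations expects { relations: Relation[] }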
Implementation Details
The server maintains two forms of persistence:
- File-based (memory.json):
  - Complete knowledge graph structure
  - Fast access to full graph
  - Used for graph operations
- Qdrant Vector DB:
  - Semantic embeddings of entities and relations
  - Enables similarity search
  - Automatically synchronized with file storage
Synchronization
When entities or relations are modified:
- Changes are written to memory.json
- Embeddings are generated using OpenAI
- Vectors are stored in Qdrant
- Both storage systems remain consistent
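As a rough illustration of that write path, the sketch below uses the official openai and @qdrant/js-client-rest packages; the embedding model, point-ID scheme, and payload layout are assumptions rather than a description of this repository's actual code:
import { promises as fs } from "fs";
import { randomUUID } from "crypto";
import OpenAI from "openai";
import { QdrantClient } from "@qdrant/js-client-rest";

// Minimal shapes, matching the sketch under "Tools"
type Entity = { name: string; entityType: string; observations: string[] };
type KnowledgeGraph = { entities: Entity[]; relations: unknown[] };

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const qdrant = new QdrantClient({ url: process.env.QDRANT_URL, apiKey: process.env.QDRANT_API_KEY });

async function syncEntity(graph: KnowledgeGraph, entity: Entity): Promise<void> {
  // 1. Persist the full graph to memory.json
  await fs.writeFile("memory.json", JSON.stringify(graph, null, 2));

  // 2. Generate an embedding for the entity's text (model choice is an assumption)
  const text = `${entity.name} (${entity.entityType}): ${entity.observations.join("; ")}`;
  const embedding = await openai.embeddings.create({ model: "text-embedding-3-small", input: text });

  // 3. Upsert the vector into the configured Qdrant collection
  await qdrant.upsert(process.env.QDRANT_COLLECTION_NAME!, {
    wait: true,
    points: [{
      id: randomUUID(),
      vector: embedding.data[0].embedding,
      payload: { type: "entity", name: entity.name },
    }],
  });
}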
Search Process
When searching:
- Query text is converted to embedding
- Qdrant performs similarity search
- Results include both entities and relations
- Results are ranked by semantic similarity
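And a matching sketch of the read path, under the same assumptions about packages, embedding model, and payload layout:
import OpenAI from "openai";
import { QdrantClient } from "@qdrant/js-client-rest";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const qdrant = new QdrantClient({ url: process.env.QDRANT_URL, apiKey: process.env.QDRANT_API_KEY });

async function searchSimilar(query: string, limit = 10) {
  // 1. Convert the query text to an embedding (model choice is an assumption)
  const embedding = await openai.embeddings.create({ model: "text-embedding-3-small", input: query });

  // 2. Let Qdrant rank stored entity/relation vectors by similarity
  const hits = await qdrant.search(process.env.QDRANT_COLLECTION_NAME!, {
    vector: embedding.data[0].embedding,
    limit,
  });

  // 3. Return results with their similarity scores; payloads identify the matched items
  return hits.map((hit) => ({ score: hit.score, ...hit.payload }));
}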
Example Usage
// Create entities
await client.callTool("create_entities", {
  entities: [{
    name: "Project",
    entityType: "Task",
    observations: ["A new development project"]
  }]
});
// Search similar concepts
const results = await client.callTool("search_similar", {
  query: "development tasks",
  limit: 5
});
HTTPS and Reverse Proxy Configuration
The server supports connecting to Qdrant through HTTPS and reverse proxies. This is particularly useful when:
- Running Qdrant behind a reverse proxy like Nginx or Apache
- Using self-signed certificates
- Requiring custom SSL/TLS configurations
Setting up with a Reverse Proxy
- Configure your reverse proxy (example using Nginx):
server {
    listen 443 ssl;
    server_name qdrant.yourdomain.com;
    ssl_certificate /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;
    location / {
        proxy_pass http://localhost:6333;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
- Update your environment variables:
QDRANT_URL=https://qdrant.yourdomain.com
Security Considerations
The server implements robust HTTPS handling with:
- Custom SSL/TLS configuration
- Proper certificate verification options
- Connection pooling and keepalive
- Automatic retry with exponential backoff
- Configurable timeouts
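Those details live inside the server's HTTP layer and are not spelled out in this README; as a rough illustration of what keepalive plus retry with exponential backoff can look like, here is a hedged sketch (all names and defaults are illustrative):
import https from "https";

// Pooled, keepalive connections toward the Qdrant endpoint (illustrative defaults)
export const httpsAgent = new https.Agent({
  keepAlive: true,
  maxSockets: 10,
  timeout: 30_000, // socket timeout in milliseconds
  // rejectUnauthorized: false, // only if you knowingly accept a self-signed certificate
});

// Generic retry helper with exponential backoff (sketch, not the server's actual code)
export async function withRetry<T>(fn: () => Promise<T>, attempts = 3, baseDelayMs = 500): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      const delay = baseDelayMs * 2 ** i; // 500 ms, 1 s, 2 s, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}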
Troubleshooting HTTPS Connections
If you experience connection issues:
- Verify your certificates:
openssl s_client -connect qdrant.yourdomain.com:443
- Test direct connectivity:
curl -v https://qdrant.yourdomain.com/collections
- Check for any proxy settings:
env | grep -i proxy
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
License
MIT