qdrant_mcpserver
A simple MCP server for accessing Qdrant
A FastAPI server and a FastMCP server that expose Qdrant access as a service.
The file main.py is the entry point; a command-line argument selects which server to run.
main.py
# Entry point; run with `python -m qdrant_mcpserver.main`.
import argparse

import uvicorn

# Relative imports so the modules resolve when run as part of the package.
from .config import settings
from .fastapi_server import app as fastapi_app
from .fastmcp_server import app as fastmcp_app


def run_fastapi():
    """Run the FastAPI server."""
    print(f"Starting FastAPI server on port {settings.port}")
    uvicorn.run(
        fastapi_app,
        host="0.0.0.0",
        port=settings.port,
        log_level="info",
    )


def run_fastmcp():
    """Run the FastMCP server."""
    print(f"Starting FastMCP server on port {settings.mcp_port}")
    fastmcp_app.run(port=settings.mcp_port)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Run Qdrant MCP Server")
    parser.add_argument(
        "--server-type",
        choices=["fastapi", "fastmcp"],
        default="fastmcp",
        help="Type of server to run (default: fastmcp)",
    )
    args = parser.parse_args()

    if args.server_type == "fastapi":
        run_fastapi()
    else:
        run_fastmcp()
Qdrant MCP Server
A dual-protocol server for Qdrant knowledge graph operations, exposing the same functionality over a FastAPI REST interface and a FastMCP (Model Context Protocol) interface.
Project Structure
src/qdrant_mcpserver/
├── __init__.py
├── config.py # Configuration settings
├── qdrant_client.py # Qdrant operations
├── fastapi_server.py # FastAPI implementation
├── fastmcp_server.py # FastMCP implementation
└── main.py # CLI entry point
File Descriptions
config.py
- Loads environment variables
- Contains settings for:
- Qdrant connection (URL, API key)
- OpenAI API key
- Collection names
- Server ports
- Uses pydantic for validation
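As a concrete illustration of the points above, config.py might look roughly like the following. This is a minimal sketch assuming the pydantic-settings package; the field names simply mirror the environment variables documented later in this README and are not necessarily the repository's actual code.
# config.py -- minimal sketch, assuming pydantic-settings; field names
# mirror the documented environment variables (QDRANT_URL, PORT, ...).
from typing import Optional

from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    qdrant_url: str                        # QDRANT_URL (required)
    qdrant_api_key: Optional[str] = None   # QDRANT_API_KEY
    openai_api_key: str                    # OPENAI_API_KEY (required)
    collection_name: str = "knowledge_graph"
    port: int = 8000                       # FastAPI port
    mcp_port: int = 8080                   # FastMCP port
    mcp_secret: Optional[str] = None       # optional MCP auth secret


settings = Settings()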
qdrant_client.py
- Implements core Qdrant operations:
- Collection management
- Node upsert/delete
- Vector search
- Handles embedding generation via OpenAI
- Provides service layer for both server types
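A sketch of that service layer is shown below. The helper names and the embedding model are assumptions rather than the project's actual API, and it presumes the qdrant-client and openai (>= 1.0) packages.
# qdrant_client.py -- illustrative sketch; helper names and the embedding
# model are assumptions, not the repository's actual API.
from openai import OpenAI
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointIdsList, PointStruct, VectorParams

from .config import settings

openai_client = OpenAI(api_key=settings.openai_api_key)
qdrant = QdrantClient(url=settings.qdrant_url, api_key=settings.qdrant_api_key)


def embed(text: str) -> list[float]:
    """Generate an embedding for a node's text via OpenAI."""
    response = openai_client.embeddings.create(
        model="text-embedding-3-small",  # assumed model (1536 dimensions)
        input=text,
    )
    return response.data[0].embedding


def ensure_collection(dim: int = 1536) -> None:
    """Create the collection if it does not already exist."""
    if not qdrant.collection_exists(settings.collection_name):
        qdrant.create_collection(
            collection_name=settings.collection_name,
            vectors_config=VectorParams(size=dim, distance=Distance.COSINE),
        )


def upsert_node(node_id, text: str, payload: dict) -> None:
    """Embed a node's text and upsert it as a point.

    Qdrant point IDs must be unsigned integers or UUID strings.
    """
    qdrant.upsert(
        collection_name=settings.collection_name,
        points=[PointStruct(id=node_id, vector=embed(text), payload=payload)],
    )


def search_nodes(query: str, limit: int = 5):
    """Semantic search: embed the query and return the closest points."""
    return qdrant.search(
        collection_name=settings.collection_name,
        query_vector=embed(query),
        limit=limit,
    )


def delete_nodes(ids: list) -> None:
    """Delete points by ID."""
    qdrant.delete(
        collection_name=settings.collection_name,
        points_selector=PointIdsList(points=ids),
    )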
fastapi_server.py
- FastAPI implementation with:
- RESTful endpoints
- CORS middleware
- OpenAPI documentation
- Endpoints:
- POST /nodes/upsert
- POST /nodes/search
- DELETE /nodes
- GET /health
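For orientation, a FastAPI implementation of those endpoints could be sketched as follows. Only the routes come from this README; the request and response shapes are assumptions built on the hypothetical service layer above.
# fastapi_server.py -- illustrative sketch; request/response shapes are
# assumptions, only the routes come from the README.
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from pydantic import BaseModel

from . import qdrant_client as kb

app = FastAPI(title="Qdrant MCP Server (REST)")
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"],
)


class Node(BaseModel):
    id: str        # Qdrant expects an unsigned int or UUID string
    text: str
    payload: dict = {}


class SearchRequest(BaseModel):
    query: str
    limit: int = 5


@app.post("/nodes/upsert")
def upsert_nodes(nodes: list[Node]):
    for node in nodes:
        kb.upsert_node(node.id, node.text, node.payload)
    return {"upserted": len(nodes)}


@app.post("/nodes/search")
def search_nodes(request: SearchRequest):
    hits = kb.search_nodes(request.query, request.limit)
    return {"results": [{"id": h.id, "score": h.score, "payload": h.payload} for h in hits]}


@app.delete("/nodes")
def delete_nodes(ids: list[str]):
    # IDs arrive as a JSON array in the request body.
    kb.delete_nodes(ids)
    return {"deleted": len(ids)}


@app.get("/health")
def health():
    return {"status": "ok"}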
fastmcp_server.py
- FastMCP implementation with:
- MCP protocol compliance
- Authentication support
- Standardized response formats
- Same endpoints as FastAPI, but with an MCP envelope
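To picture the MCP side, here is a minimal sketch assuming the fastmcp package; the tool names and signatures are assumptions, and the MCP_SECRET-based authentication support is omitted.
# fastmcp_server.py -- illustrative sketch; tool names and signatures are
# assumptions, and the MCP_SECRET-based authentication is omitted.
from fastmcp import FastMCP

from . import qdrant_client as kb

app = FastMCP("qdrant-mcpserver")


@app.tool()
def upsert_node(node_id: str, text: str, payload: dict | None = None) -> dict:
    """Upsert a knowledge graph node into Qdrant."""
    kb.upsert_node(node_id, text, payload or {})
    return {"status": "ok", "id": node_id}


@app.tool()
def search_nodes(query: str, limit: int = 5) -> list[dict]:
    """Semantic search across stored nodes."""
    hits = kb.search_nodes(query, limit)
    return [{"id": h.id, "score": h.score, "payload": h.payload} for h in hits]


@app.tool()
def delete_nodes(ids: list[str]) -> dict:
    """Delete nodes by ID."""
    kb.delete_nodes(ids)
    return {"deleted": len(ids)}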
main.py
- CLI entry point with:
- Server type selection (--server-type)
- Unified logging
- Port configuration
- Runs either FastAPI or FastMCP server
Installation
- Install Poetry (if not installed):
curl -sSL https://install.python-poetry.org | python3 -
- Clone repository:
git clone https://github.com/your-repo/qdrant-mcpserver.git
cd qdrant-mcpserver
- Install dependencies:
poetry install
- Configure environment:
cp .env.example .env
# Edit .env with your actual values
Usage
Running the Server
# Run FastMCP server (default)
poetry run python -m qdrant_mcpserver.main
# Run FastAPI server
poetry run python -m qdrant_mcpserver.main --server-type fastapi
Environment Variables
Variable | Required | Description |
---|---|---|
QDRANT_URL | Yes | Qdrant server URL |
QDRANT_API_KEY | No | Qdrant API key |
OPENAI_API_KEY | Yes | OpenAI API key |
COLLECTION_NAME | No | Default: "knowledge_graph" |
PORT | No | FastAPI port (default: 8000) |
MCP_PORT | No | FastMCP port (default: 8080) |
MCP_SECRET | No | Authentication secret |
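Putting the table together, a .env file might look like the following; every value is a placeholder for illustration only.
# .env -- placeholder values only
QDRANT_URL=http://localhost:6333
QDRANT_API_KEY=
OPENAI_API_KEY=sk-your-key-here
COLLECTION_NAME=knowledge_graph
PORT=8000
MCP_PORT=8080
MCP_SECRET=change-me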
API Endpoints
Both servers provide the same endpoints:
- POST /nodes/upsert: Upsert knowledge graph nodes
- POST /nodes/search: Semantic search across nodes
- DELETE /nodes: Delete nodes by IDs
- GET /health: Health check
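As a quick smoke test of the REST variant, a client script along these lines would exercise three of the endpoints. The request shapes follow the hypothetical FastAPI sketch above rather than a documented schema.
# rest_smoke_test.py -- hypothetical example; request shapes match the
# FastAPI sketch above, not a documented schema.
import requests

BASE = "http://localhost:8000"

# Upsert one node (Qdrant point IDs must be unsigned ints or UUID strings)
requests.post(
    f"{BASE}/nodes/upsert",
    json=[{
        "id": "5f0c7f2a-9f4e-4f7b-9a2d-1c2e3d4f5a6b",
        "text": "Qdrant is a vector database",
        "payload": {"type": "fact"},
    }],
).raise_for_status()

# Semantic search across stored nodes
hits = requests.post(
    f"{BASE}/nodes/search",
    json={"query": "vector databases", "limit": 3},
)
print(hits.json())

# Health check
print(requests.get(f"{BASE}/health").json())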
Development
Code Formatting
These commands ensure consistent code style:
# Format code according to Black's style guide (PEP 8 compliant)
poetry run black .
# Organize imports (standard library, third-party, then local)
poetry run isort .
Testing
Tests use pytest; test files should mirror the main code structure.
# Install test dependencies (one time)
poetry install --with test
# Run all tests
poetry run pytest -v
# Run with a coverage report
poetry run pytest --cov=qdrant_mcpserver --cov-report=term-missing
# Generate an HTML coverage report
poetry run pytest --cov --cov-report=html
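A minimal test in that spirit, assuming a tests/ directory that mirrors the package, could look like this; the file name and expected response body are illustrative.
# tests/test_health.py -- hypothetical example; assumes the required
# environment variables are available when the app module is imported.
from fastapi.testclient import TestClient

from qdrant_mcpserver.fastapi_server import app

client = TestClient(app)


def test_health_returns_ok():
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}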
Type checking:
poetry run mypy .
Deployment
Build production package:
poetry build
Install system-wide:
pip install dist/*.whl
Run as service:
python -m qdrant_mcpserver.main --server-type fastmcp
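One way to run it as a service is a systemd unit along these lines; the paths, user, and environment file below are placeholders to adapt to your host.
# /etc/systemd/system/qdrant-mcpserver.service -- illustrative only
[Unit]
Description=Qdrant MCP Server (FastMCP)
After=network.target

[Service]
User=qdrant
WorkingDirectory=/opt/qdrant-mcpserver
EnvironmentFile=/opt/qdrant-mcpserver/.env
ExecStart=/usr/bin/python3 -m qdrant_mcpserver.main --server-type fastmcp
Restart=on-failure

[Install]
WantedBy=multi-user.target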
Key Features:
- Flexible Server Selection:
  - CLI argument chooses between FastAPI and FastMCP
  - Shared configuration and Qdrant client
  - Consistent endpoints across both
- Comprehensive Documentation:
  - Clear file structure explanation
  - Installation and usage instructions
  - Environment variable reference
  - Development workflow
- Production-Ready:
  - Poetry for dependency management
  - Configuration via environment variables
  - Build and deployment instructions
- Maintainable Structure:
  - Separation of concerns
  - Shared core functionality
  - Clear development practices
The implementation allows you to switch between server protocols while maintaining the same underlying Qdrant operations.