TxtAI Assistant MCP
A Model Context Protocol (MCP) server implementation for semantic search and memory management using txtai. This server provides a robust API for storing, retrieving, and managing text-based memories with semantic search capabilities.
About txtai
This project is built on top of txtai, an excellent open-source AI-powered search engine created by NeuML. txtai provides:
- 🔍 All-in-one semantic search solution
- 🧠 Neural search with transformers
- 💡 Zero-shot text classification
- 🔄 Text extraction and embeddings
- 🌐 Multi-language support
- 🚀 High performance and scalability
We extend txtai's capabilities by integrating it with the Model Context Protocol (MCP), enabling AI assistants like Claude and Cline to leverage its powerful semantic search capabilities. Special thanks to the txtai team for creating such a powerful and flexible tool.
Features
- 🔍 Semantic search across stored memories
- 💾 Persistent storage with file-based backend
- 🏷️ Tag-based memory organization and retrieval
- 📊 Memory statistics and health monitoring
- 🔄 Automatic data persistence
- 📝 Comprehensive logging
- 🔒 Configurable CORS settings
- 🤖 Integration with Claude and Cline AI
Prerequisites
- Python 3.8 or higher
- pip (Python package installer)
- virtualenv (recommended)
Installation
- Clone this repository:
git clone https://github.com/yourusername/txtai-assistant-mcp.git
cd txtai-assistant-mcp
- Run the start script:
./scripts/start.sh
The script will:
- Create a virtual environment
- Install required dependencies
- Set up necessary directories
- Create a configuration file from template
- Start the server
Configuration
The server can be configured using environment variables in the .env file. A template is provided at .env.template:
# Server Configuration
HOST=0.0.0.0
PORT=8000
# CORS Configuration
CORS_ORIGINS=*
# Logging Configuration
LOG_LEVEL=DEBUG
# Memory Configuration
MAX_MEMORIES=0
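A server reading this file would typically fall back to the defaults above when a variable is unset. A minimal sketch of such a loader (illustrative only, not the server's actual code):

```python
import os

def load_config():
    """Read server settings from the environment, falling back to the
    defaults shown in .env.template (illustrative sketch)."""
    return {
        "host": os.getenv("HOST", "0.0.0.0"),
        "port": int(os.getenv("PORT", "8000")),
        "cors_origins": os.getenv("CORS_ORIGINS", "*").split(","),
        "log_level": os.getenv("LOG_LEVEL", "DEBUG"),
        # In this sketch, 0 is read as "no limit on stored memories"
        "max_memories": int(os.getenv("MAX_MEMORIES", "0")),
    }

config = load_config()
```

Note that PORT and MAX_MEMORIES must be parsed to integers; everything in the environment arrives as a string.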
Integration with Claude and Cline AI
This TxtAI Assistant can be used as an MCP server with Claude and Cline AI to enhance their capabilities with semantic memory and search functionality.
Configuration for Claude
To use this server with Claude, add it to Claude's MCP configuration file (typically located at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
{
"mcpServers": {
"txtai-assistant": {
"command": "path/to/txtai-assistant-mcp/scripts/start.sh",
"env": {}
}
}
}
Configuration for Cline
To use with Cline, add the server configuration to Cline's MCP settings file (typically located at ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):
{
"mcpServers": {
"txtai-assistant": {
"command": "path/to/txtai-assistant-mcp/scripts/start.sh",
"env": {}
}
}
}
Available MCP Tools
Once configured, the following tools become available to Claude and Cline:
- store_memory: Store new memory content with metadata and tags
{
"content": "Memory content to store",
"metadata": {
"source": "conversation",
"timestamp": "2023-01-01T00:00:00Z"
},
"tags": ["important", "context"],
"type": "conversation"
}
- retrieve_memory: Retrieve memories based on semantic search
{
"query": "search query",
"n_results": 5
}
- search_by_tag: Search memories by tags
{
"tags": ["important", "context"]
}
- delete_memory: Delete a specific memory by content hash
{
"content_hash": "hash_value"
}
- get_stats: Get database statistics
{}
- check_health: Check database and embedding model health
{}
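The tool schemas above boil down to a name-to-required-arguments mapping. A hypothetical client-side validator built from those schemas (the server defines the authoritative schemas itself):

```python
# Required argument keys per MCP tool, taken from the schemas above.
TOOL_SCHEMAS = {
    "store_memory": {"content"},      # metadata, tags, and type are optional
    "retrieve_memory": {"query"},     # n_results is optional
    "search_by_tag": {"tags"},
    "delete_memory": {"content_hash"},
    "get_stats": set(),
    "check_health": set(),
}

def validate_call(tool_name: str, arguments: dict) -> bool:
    """Return True if the call names a known tool and supplies its
    required arguments (hypothetical helper, not part of the server)."""
    required = TOOL_SCHEMAS.get(tool_name)
    if required is None:
        return False
    return required.issubset(arguments)

print(validate_call("retrieve_memory", {"query": "important info", "n_results": 5}))  # True
```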
Usage Examples
In Claude or Cline, you can use these tools through the MCP protocol:
# Store a memory
<use_mcp_tool>
<server_name>txtai-assistant</server_name>
<tool_name>store_memory</tool_name>
<arguments>
{
"content": "Important information to remember",
"tags": ["important"]
}
</arguments>
</use_mcp_tool>
# Retrieve memories
<use_mcp_tool>
<server_name>txtai-assistant</server_name>
<tool_name>retrieve_memory</tool_name>
<arguments>
{
"query": "what was the important information?",
"n_results": 5
}
</arguments>
</use_mcp_tool>
The AI will automatically use these tools to maintain context and retrieve relevant information during conversations.
API Endpoints
Store Memory
POST /store
Store a new memory with optional metadata and tags.
Request Body:
{
"content": "Memory content to store",
"metadata": {
"source": "example",
"timestamp": "2023-01-01T00:00:00Z"
},
"tags": ["example", "memory"],
"type": "general"
}
Search Memories
POST /search
Search memories using semantic search.
Request Body:
{
"query": "search query",
"n_results": 5,
"similarity_threshold": 0.7
}
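The similarity_threshold presumably discards hits scoring below it, with n_results capping what remains. Its effect can be sketched as follows (assumed semantics, with scores in [0, 1]; check the server source for the exact behavior):

```python
def filter_results(results, similarity_threshold=0.7, n_results=5):
    """Keep at most n_results hits whose similarity score meets the
    threshold (sketch of the assumed /search semantics)."""
    kept = [r for r in results if r["score"] >= similarity_threshold]
    kept.sort(key=lambda r: r["score"], reverse=True)
    return kept[:n_results]

hits = [
    {"content": "deploy notes", "score": 0.91},
    {"content": "lunch order", "score": 0.42},
    {"content": "api keys doc", "score": 0.77},
]
print(filter_results(hits))  # only the 0.91 and 0.77 hits survive
```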
Search by Tags
POST /search_tags
Search memories by tags.
Request Body:
{
"tags": ["example", "memory"]
}
Delete Memory
DELETE /memory/{content_hash}
Delete a specific memory by its content hash.
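This README does not specify how the content hash is computed. A common scheme is a SHA-256 digest of the memory text, shown here as an assumption for illustration:

```python
import hashlib

def content_hash(content: str) -> str:
    """SHA-256 hex digest of the memory text (assumed hashing scheme;
    check the server source for the actual algorithm)."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

h = content_hash("Important information to remember")
# Under this assumption, DELETE /memory/{h} would target that memory
```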
Get Statistics
GET /stats
Get system statistics including memory counts and tag distribution.
Health Check
GET /health
Check the health status of the server.
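Any HTTP client can drive these endpoints. A minimal stdlib sketch that builds (but does not send) a POST /store request; the base URL assumes the default HOST/PORT from .env.template, and the payload values are illustrative:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumes the default PORT from .env.template

def build_store_request(content, tags=None, metadata=None, mem_type="general"):
    """Construct (but do not send) a POST /store request."""
    body = json.dumps({
        "content": content,
        "metadata": metadata or {},
        "tags": tags or [],
        "type": mem_type,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/store",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_store_request("Memory content to store", tags=["example"])
# urllib.request.urlopen(req) would send it to a running server
```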
Directory Structure
txtai-assistant-mcp/
├── server/
│ ├── main.py # Main server implementation
│ └── requirements.txt # Python dependencies
├── scripts/
│ └── start.sh # Server startup script
├── data/ # Data storage directory
├── logs/ # Log files directory
├── .env.template # Environment configuration template
└── README.md # This file
Data Storage
Memories and tags are stored in JSON files in the data directory:
- memories.json: Contains all stored memories
- tags.json: Contains the tag index
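The exact on-disk layout is not documented here. Assuming memories.json maps content hashes to memory records, a tags.json-style index could be rebuilt like this (hypothetical structure, shown for illustration):

```python
import json
from collections import defaultdict

# Hypothetical memories.json structure: content hash -> memory record
memories = {
    "ab12": {"content": "deploy notes", "tags": ["ops", "important"]},
    "cd34": {"content": "api keys doc", "tags": ["important"]},
}

def build_tag_index(memories: dict) -> dict:
    """Rebuild a tag index (tag -> list of content hashes) from the
    assumed memories structure above."""
    index = defaultdict(list)
    for content_hash, record in memories.items():
        for tag in record.get("tags", []):
            index[tag].append(content_hash)
    return dict(index)

print(json.dumps(build_tag_index(memories), indent=2))
```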
Logging
Logs are stored in the logs directory. The default log file is server.log.
Development
To contribute to this project:
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
Error Handling
The server implements comprehensive error handling:
- Invalid requests return appropriate HTTP status codes
- Errors are logged with stack traces
- User-friendly error messages are returned in responses
Security Considerations
- CORS settings are configurable via environment variables
- File paths are sanitized to prevent directory traversal
- Input validation is performed on all endpoints
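The traversal check mentioned above can be sketched with pathlib; this is an illustrative approach, not the server's exact code:

```python
from pathlib import Path

DATA_DIR = Path("data").resolve()

def safe_path(filename: str) -> Path:
    """Resolve filename inside DATA_DIR, rejecting directory traversal
    (illustrative version of the sanitization described above)."""
    candidate = (DATA_DIR / filename).resolve()
    if DATA_DIR not in candidate.parents and candidate != DATA_DIR:
        raise ValueError(f"path escapes data directory: {filename}")
    return candidate

safe_path("memories.json")        # resolves inside data/
# safe_path("../../etc/passwd")   # raises ValueError
```

Resolving the candidate path before comparing it to the base directory is what defeats `..` segments and symlink-style tricks in this sketch.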
License
This project is licensed under the MIT License - see the LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Support
If you encounter any issues or have questions, please file an issue on the GitHub repository.