Aleph-10: Vector Memory MCP Server
Aleph-10 is a Model Context Protocol (MCP) server that combines weather data services with vector-based memory storage. This project provides tools for retrieving weather information and managing semantic memory through vector embeddings.
Features
- Weather Information: Get weather alerts and forecasts using the National Weather Service API
- Vector Memory: Store and retrieve information using semantic search
- Multiple Embedding Options: Support for both cloud-based (Google Gemini) and local (Ollama) embedding providers; see the provider sketch after this list
- Metadata Support: Add and filter by metadata for efficient memory management
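The embedding provider is the piece that turns text into vectors for semantic search. As a rough sketch of how the two providers listed above could sit behind a common interface, using the publicly documented Gemini and Ollama embedding endpoints (the class names, model choices, and overall shape here are illustrative assumptions, not the project's actual code):

```typescript
// Hypothetical provider abstraction; names and model choices are assumptions,
// not taken from the aleph-10 source.
interface EmbeddingProvider {
  embed(text: string): Promise<number[]>;
}

// Google Gemini via its REST embedding endpoint (shape per Google's public docs).
class GeminiEmbeddingProvider implements EmbeddingProvider {
  constructor(private apiKey: string) {}

  async embed(text: string): Promise<number[]> {
    const res = await fetch(
      `https://generativelanguage.googleapis.com/v1beta/models/text-embedding-004:embedContent?key=${this.apiKey}`,
      {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ content: { parts: [{ text }] } }),
      },
    );
    const data = await res.json();
    return data.embedding.values;
  }
}

// Ollama's local embeddings endpoint.
class OllamaEmbeddingProvider implements EmbeddingProvider {
  constructor(private baseUrl = "http://localhost:11434") {}

  async embed(text: string): Promise<number[]> {
    const res = await fetch(`${this.baseUrl}/api/embeddings`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
    });
    const data = await res.json();
    return data.embedding;
  }
}
```

Keeping both providers behind one interface is what lets the `EMBEDDING_PROVIDER` setting (see Configuration below) switch between cloud and local embeddings without touching the memory code.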
Getting Started
Prerequisites
- Node.js 18.x or higher
- pnpm package manager
Installation
- Clone the repository

```bash
git clone https://github.com/yourusername/aleph-10.git
cd aleph-10
```

- Install dependencies

```bash
pnpm install
```

- Configure environment variables (create a `.env` file in the project root)

```
EMBEDDING_PROVIDER=gemini
GEMINI_API_KEY=your_gemini_api_key
VECTOR_DB_PATH=./data/vector_db
LOG_LEVEL=info
```

- Build the project

```bash
pnpm build
```

- Run the server

```bash
node build/index.js
```
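Once the server is running over stdio, any MCP client can connect to it and exercise the tools described in the Usage section. A minimal smoke test, assuming the standard `@modelcontextprotocol/sdk` client API (the client script itself is not part of this repository):

```typescript
// Minimal client-side smoke test; spawns the built server over stdio and calls a tool.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["build/index.js"],
});

const client = new Client({ name: "aleph-10-smoke-test", version: "0.1.0" });
await client.connect(transport);

// List the tools the server advertises, then call one of them.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const forecast = await client.callTool({
  name: "get-forecast",
  arguments: { latitude: 40.71, longitude: -74.01 },
});
console.log(JSON.stringify(forecast, null, 2));

await client.close();
```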
Usage
The server implements the Model Context Protocol and provides the following tools:
Weather Tools
- `get-alerts`: Get weather alerts for a specific US state
  - Parameters: `state` (two-letter state code)
- `get-forecast`: Get weather forecast for a location
  - Parameters: `latitude` and `longitude`
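For orientation, here is a sketch of how a tool like `get-forecast` might be registered with the MCP TypeScript SDK and backed by the National Weather Service API; the real handler lives in `src/weather/` and may be structured differently:

```typescript
// Illustrative registration of get-forecast; not the project's actual handler.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "aleph-10", version: "0.1.0" });

server.tool(
  "get-forecast",
  "Get weather forecast for a location",
  { latitude: z.number(), longitude: z.number() },
  async ({ latitude, longitude }) => {
    // The NWS API asks callers to identify themselves via User-Agent.
    const headers = { "User-Agent": "aleph-10-example (contact@example.com)" };

    // Resolve the lat/lon point to its gridded forecast URL, then fetch the forecast.
    const point = await fetch(
      `https://api.weather.gov/points/${latitude},${longitude}`,
      { headers },
    ).then((r) => r.json());
    const forecast = await fetch(point.properties.forecast, { headers }).then((r) =>
      r.json(),
    );

    const summary = forecast.properties.periods
      .slice(0, 3)
      .map((p: any) => `${p.name}: ${p.detailedForecast}`)
      .join("\n");

    return { content: [{ type: "text", text: summary }] };
  },
);

await server.connect(new StdioServerTransport());
```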
Memory Tools
- `memory-store`: Store information in the vector database
  - Parameters: `text` (content to store), `metadata` (optional associated data)
- `memory-retrieve`: Find semantically similar information
  - Parameters: `query` (search text), `limit` (max results), `filters` (metadata filters)
- `memory-update`: Update existing memory entries
  - Parameters: `id` (memory ID), `text` (new content), `metadata` (updated metadata)
- `memory-delete`: Remove entries from the database
  - Parameters: `id` (memory ID to delete)
- `memory-stats`: Get statistics about the memory store
  - Parameters: none
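Conceptually, `memory-retrieve` embeds the query, applies the metadata filters, and ranks stored entries by vector similarity. A self-contained sketch of that ranking step (the types and function names are illustrative, not the project's actual internals):

```typescript
// Hypothetical retrieval step: filter by metadata, rank by cosine similarity.
interface MemoryEntry {
  id: string;
  text: string;
  metadata: Record<string, string>;
  embedding: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

function retrieve(
  entries: MemoryEntry[],
  queryEmbedding: number[],
  limit: number,
  filters: Record<string, string> = {},
): MemoryEntry[] {
  return entries
    // Keep only entries whose metadata matches every requested filter.
    .filter((e) => Object.entries(filters).every(([k, v]) => e.metadata[k] === v))
    // Score the remaining entries against the query embedding.
    .map((e) => ({ entry: e, score: cosineSimilarity(queryEmbedding, e.embedding) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((r) => r.entry);
}
```

Cosine similarity is the usual choice for this kind of ranking because it compares the direction of two embeddings rather than their magnitude.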
Configuration
The following environment variables can be configured:
| Variable | Description | Default |
|---|---|---|
| `EMBEDDING_PROVIDER` | Provider for vector embeddings (`gemini` or `ollama`) | `gemini` |
| `GEMINI_API_KEY` | API key for Google Gemini | - |
| `OLLAMA_BASE_URL` | Base URL for the Ollama API | `http://localhost:11434` |
| `VECTOR_DB_PATH` | Storage location for the vector database | `./data/vector_db` |
| `LOG_LEVEL` | Logging verbosity | `info` |
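A small loader that reads these variables and applies the defaults from the table might look like this (the `Config` shape and `loadConfig` name are illustrative, not taken from the source):

```typescript
// Illustrative environment loader; variable names match the table above.
interface Config {
  embeddingProvider: "gemini" | "ollama";
  geminiApiKey?: string;
  ollamaBaseUrl: string;
  vectorDbPath: string;
  logLevel: string;
}

export function loadConfig(env = process.env): Config {
  const embeddingProvider = (env.EMBEDDING_PROVIDER ?? "gemini") as "gemini" | "ollama";

  // Gemini requires an API key; Ollama only needs a reachable base URL.
  if (embeddingProvider === "gemini" && !env.GEMINI_API_KEY) {
    throw new Error("GEMINI_API_KEY is required when EMBEDDING_PROVIDER=gemini");
  }

  return {
    embeddingProvider,
    geminiApiKey: env.GEMINI_API_KEY,
    ollamaBaseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434",
    vectorDbPath: env.VECTOR_DB_PATH ?? "./data/vector_db",
    logLevel: env.LOG_LEVEL ?? "info",
  };
}
```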
Development
Project Structure
The project follows a modular structure:
```
aleph-10/
├── src/                 # Source code
│   ├── index.ts         # Main application entry point
│   ├── weather/         # Weather service module
│   ├── memory/          # Memory management module
│   ├── utils/           # Shared utilities
│   └── types/           # TypeScript type definitions
├── tests/               # Test files
└── vitest.config.ts     # Vitest configuration
```
Running Tests
The project uses Vitest for testing. Run tests with:
```bash
# Run tests once
pnpm test

# Run tests in watch mode during development
pnpm test:watch

# Run tests with UI (optional)
pnpm test:ui
```
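As an illustration of the Vitest style used here, a unit test for the hypothetical `cosineSimilarity` helper sketched in the memory section could look like this (the import path is assumed, not a confirmed module of this repository):

```typescript
// Example Vitest unit test; the tested helper and its path are hypothetical.
import { describe, expect, it } from "vitest";
import { cosineSimilarity } from "../src/memory/similarity.js";

describe("cosineSimilarity", () => {
  it("returns 1 for identical vectors", () => {
    expect(cosineSimilarity([1, 2, 3], [1, 2, 3])).toBeCloseTo(1);
  });

  it("returns 0 for orthogonal vectors", () => {
    expect(cosineSimilarity([1, 0], [0, 1])).toBeCloseTo(0);
  });
});
```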
Building
```bash
pnpm build
```
License
This project is licensed under the ISC License.
Acknowledgments
- Model Context Protocol
- National Weather Service API
- Vitest - Next generation testing framework