Pinecone Assistant MCP Server
An MCP server implementation for retrieving information from Pinecone Assistant.
Features
- Retrieves information from Pinecone Assistant
- Supports retrieving multiple results, with a configurable number of results (see the hypothetical tool call below)
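The retrieval capability is presumably exposed as an MCP tool. The request below is a hypothetical illustration only: the tool name and argument names used here (assistant_context, query, top_k) are placeholders and may differ from what this server actually registers.
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "assistant_context",
    "arguments": {
      "query": "What do my documents say about quarterly revenue?",
      "top_k": 5
    }
  }
}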
Prerequisites
- Docker installed on your system
- Pinecone API key: obtain one from the Pinecone Console
- Pinecone Assistant API host: after creating an Assistant (e.g. in the Pinecone Console), you can find its host on the Assistant details page
Building with Docker
To build the Docker image:
docker build -t pinecone/assistant-mcp .
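If the build succeeds, the tagged image should appear in your local image list (a quick sanity check, assuming a standard Docker installation):
docker images pinecone/assistant-mcp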
Running with Docker
Run the server with your Pinecone API key:
docker run -i --rm \
-e PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE> \
-e PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE> \
pinecone/assistant-mcp
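As an alternative sketch, assuming you prefer to keep credentials out of your shell history, the same variables can be supplied from a local env file via Docker's --env-file flag:
# .env (keep this file out of version control)
PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE>
PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE>

docker run -i --rm --env-file .env pinecone/assistant-mcp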
Environment Variables
- PINECONE_API_KEY (required): Your Pinecone API key
- PINECONE_ASSISTANT_HOST (optional): Pinecone Assistant API host (default: https://prod-1-data.ke.pinecone.io)
- LOG_LEVEL (optional): Logging level (default: info)
Usage with Claude Desktop
Add this to your claude_desktop_config.json:
{
  "mcpServers": {
    "pinecone-assistant": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "PINECONE_API_KEY",
        "-e",
        "PINECONE_ASSISTANT_HOST",
        "pinecone/assistant-mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY_HERE>",
        "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST_HERE>"
      }
    }
  }
}
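The location of claude_desktop_config.json varies by platform; the paths below are the commonly used defaults and may differ for your installation:
# macOS
~/Library/Application Support/Claude/claude_desktop_config.json
# Windows
%APPDATA%\Claude\claude_desktop_config.json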
Building from Source
If you prefer to build from source without Docker:
- Make sure you have Rust installed (https://rustup.rs/)
- Clone this repository
- Run cargo build --release
- The binary will be available at target/release/assistant-mcp (see the example run below)
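A minimal sketch for running the locally built binary, assuming it reads the same environment variables as the Docker image:
export PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE>
export PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE>
./target/release/assistant-mcp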
Testing with the inspector
export PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE>
export PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE>
# Run the inspector alone
npx @modelcontextprotocol/inspector cargo run
# Or run with Docker directly through the inspector
npx @modelcontextprotocol/inspector -- docker run -i --rm -e PINECONE_API_KEY -e PINECONE_ASSISTANT_HOST pinecone/assistant-mcp
License
This project is licensed under the terms specified in the LICENSE file.