Pinecone Assistant MCP Server

An MCP server implementation for retrieving information from Pinecone Assistant.

Features

  • Retrieves information from Pinecone Assistant
  • Supports retrieving multiple results, with a configurable number of results returned per query

Prerequisites

  • Docker installed on your system
  • Pinecone API key - obtain one from the Pinecone Console
  • Pinecone Assistant API host - after creating an Assistant (for example, in the Pinecone Console), you can find its host on the Assistant details page

Building with Docker

To build the Docker image:

docker build -t pinecone/assistant-mcp .

Running with Docker

Run the server with your Pinecone API key and Assistant host:

docker run -i --rm \
  -e PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE> \
  -e PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE> \
  pinecone/assistant-mcp
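
The -i flag keeps stdin open, which is required because the server communicates with MCP clients over stdin/stdout; --rm removes the container when the session ends.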

Environment Variables

  • PINECONE_API_KEY (required): Your Pinecone API key
  • PINECONE_ASSISTANT_HOST (optional): Pinecone Assistant API host (default: https://prod-1-data.ke.pinecone.io)
  • LOG_LEVEL (optional): Logging level (default: info)
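
For example, to enable more verbose logging during development, pass LOG_LEVEL alongside the required variables (a minimal sketch; the debug value assumes the usual Rust-style log levels such as trace, debug, info, warn, and error):

docker run -i --rm \
  -e PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE> \
  -e PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE> \
  -e LOG_LEVEL=debug \
  pinecone/assistant-mcp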

Usage with Claude Desktop

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "pinecone-assistant": {
      "command": "docker",
      "args": [
        "run", 
        "-i", 
        "--rm", 
        "-e", 
        "PINECONE_API_KEY", 
        "-e", 
        "PINECONE_ASSISTANT_HOST", 
        "pinecone/assistant-mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<YOUR_PINECONE_API_KEY_HERE>",
        "PINECONE_ASSISTANT_HOST": "<YOUR_PINECONE_ASSISTANT_HOST_HERE>"
      }
    }
  }
}
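
Claude Desktop reads this file from its per-user configuration directory (typically ~/Library/Application Support/Claude/claude_desktop_config.json on macOS and %APPDATA%\Claude\claude_desktop_config.json on Windows). Restart Claude Desktop after editing the file so the new server is picked up.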

Building from Source

If you prefer to build from source without Docker (a command sketch follows these steps):

  1. Make sure you have Rust installed (https://rustup.rs/)
  2. Clone this repository
  3. Run cargo build --release
  4. The binary will be available at target/release/assistant-mcp
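
The steps above roughly correspond to the following command sequence (the repository URL placeholder and the checkout directory name are assumptions; adjust them to match your setup):

# 1-2. Install Rust via https://rustup.rs/, then clone the repository
git clone <REPOSITORY_URL>
cd assistant-mcp

# 3. Build an optimized release binary
cargo build --release

# 4. Run the binary with the same environment variables used for Docker
PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE> \
PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE> \
./target/release/assistant-mcp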

Testing with the Inspector

export PINECONE_API_KEY=<YOUR_PINECONE_API_KEY_HERE>
export PINECONE_ASSISTANT_HOST=<YOUR_PINECONE_ASSISTANT_HOST_HERE>
# Run the inspector against the server built from source (via cargo run)
npx @modelcontextprotocol/inspector cargo run
# Or run with Docker directly through the inspector
npx @modelcontextprotocol/inspector -- docker run -i --rm -e PINECONE_API_KEY -e PINECONE_ASSISTANT_HOST pinecone/assistant-mcp
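
The inspector opens a local web UI where you can list the server's tools and send test requests over stdio, which is a quick way to confirm that the API key and Assistant host are configured correctly.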

License

This project is licensed under the terms specified in the LICENSE file.
