Pinecone-MCP
Connect your Pinecone projects to Cursor, Claude, and other AI assistants
Pinecone Developer MCP Server
The Model Context Protocol (MCP) is a standard that allows coding assistants and other AI tools to interact with platforms like Pinecone. The Pinecone Developer MCP Server allows you to connect these tools with Pinecone projects and documentation.
Once connected, AI tools can:
- Search Pinecone documentation to answer questions accurately.
- Help you configure indexes based on your application's needs.
- Generate code informed by your index configuration and data, as well as Pinecone documentation and examples.
- Upsert and search for data in indexes, allowing you to test queries and evaluate results within your dev environment.
This MCP server is focused on improving the experience of developers working with Pinecone as part of their technology stack. It is intended for use with coding assistants. Pinecone also offers the Assistant MCP, which is designed to provide AI assistants with relevant context sourced from your knowledge base.
Setup
To configure the MCP server to access your Pinecone project, you will need to generate an API key using the console. Without an API key, your AI tool will still be able to search documentation. However, it will not be able to manage or query your indexes.
The MCP server requires Node.js. Ensure that `node` and `npx` are available in your `PATH`.
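As a quick sanity check, you can verify the prerequisites from a terminal before wiring up an assistant. This is a minimal sketch assuming a POSIX shell; the `npx` invocation simply mirrors the command and arguments used in the configurations below, and since the assistant launches the server as a local process it will sit waiting for a client when run by hand.

```bash
# Confirm Node.js and npx are available on the PATH
node --version
npx --version

# Optionally launch the MCP server manually to confirm the package resolves
# and the API key is picked up (Ctrl+C to stop; your AI tool normally
# starts this process for you).
PINECONE_API_KEY="<your pinecone api key>" npx -y @pinecone-database/mcp
```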
Next, you will need to configure your AI assistant to use the MCP server.
Configure Cursor
To add the Pinecone MCP server to a project, create a `.cursor/mcp.json` file in the project root (if it doesn't already exist) and add the following configuration:
```json
{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": ["-y", "@pinecone-database/mcp"],
      "env": {
        "PINECONE_API_KEY": "<your pinecone api key>"
      }
    }
  }
}
```
You can check the status of the server in Cursor Settings > MCP.
To enable the server globally, add the configuration to the `.cursor/mcp.json` file in your home directory instead.
Configure Claude Desktop
Use Claude Desktop to locate the `claude_desktop_config.json` file by navigating to Settings > Developer > Edit Config, then add the following configuration:
```json
{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": ["-y", "@pinecone-database/mcp"],
      "env": {
        "PINECONE_API_KEY": "<your pinecone api key>"
      }
    }
  }
}
```
Restart Claude Desktop. On the new chat screen, you should see a hammer (MCP) icon appear, indicating that the new MCP tools are available.
Usage
Once configured, your AI tool will automatically make use of the MCP to interact with Pinecone. You may be prompted for permission before a tool can be used. Try asking your AI assistant to set up an example index, upload sample data, or search for you!
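For example, with the server enabled you might try prompts along these lines (the exact wording is up to you; the assistant decides which tools to call):

- "Search the Pinecone docs and explain what integrated inference is."
- "Create an example index for semantic search over support tickets."
- "Upsert a few sample records into my index, then search for ones about billing and show the results."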
Tools
Pinecone Developer MCP Server provides the following tools for AI assistants to use:
- `search-docs`: Search the official Pinecone documentation.
- `list-indexes`: Lists all Pinecone indexes.
- `describe-index`: Describes the configuration of an index.
- `describe-index-stats`: Provides statistics about the data in the index, including the number of records and available namespaces.
- `create-index-for-model`: Creates a new index that uses an integrated inference model to embed text as vectors.
- `upsert-records`: Inserts or updates records in an index with integrated inference.
- `search-records`: Searches for records in an index based on a text query, using integrated inference for embedding. Has options for metadata filtering and reranking (an illustrative call is sketched after this list).
- `cascading-search`: Searches for records across multiple indexes, deduplicating and reranking the results.
- `rerank-documents`: Reranks a collection of records or text documents using a specialized reranking model.
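To give a sense of how an assistant drives these tools, below is an illustrative `search-records` call as it might appear in an MCP tool invocation. The argument names shown (`name`, `namespace`, `query`, `filter`, `rerank`) are hypothetical placeholders for this sketch; the server advertises its real input schema to the assistant at runtime, so you never have to hand-write this JSON yourself.

```json
{
  "tool": "search-records",
  "arguments": {
    "name": "example-index",
    "namespace": "example-namespace",
    "query": {
      "inputs": { "text": "wireless noise-cancelling headphones" },
      "topK": 5
    },
    "filter": { "category": "audio" },
    "rerank": true
  }
}
```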
Limitations
Only indexes with integrated inference are supported. Assistants, indexes without integrated inference, standalone embeddings, and vector search are not supported.
Contributing
We welcome your collaboration in improving the developer MCP experience. Please submit issues in the GitHub issue tracker. Information about contributing can be found in CONTRIBUTING.md.