
# MCP REST API and CLI Client

A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers.

## Key Features
1. **MCP-Compatible Servers**
   - Supports any MCP-compatible server.
   - Pre-configured default servers:
     - SQLite (a `test.db` is provided with sample product data)
     - Brave Search
   - Additional MCP servers can be added in the `mcp-server-config.json` file.
2. **Integrated with LangChain**
   - Leverages LangChain to execute LLM prompts.
   - Enables multiple MCP servers to collaborate on a single query.
3. **LLM Provider Support**
   - Compatible with any LLM provider that exposes a function-calling API.
   - Examples:
     - OpenAI
     - Claude
     - Gemini
     - AWS Nova
     - Groq
     - Ollama
   - Essentially any LLM provider is supported, as long as it offers a function-calling API. Refer to the LangChain documentation for more details.
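
The features above reference `mcp-server-config.json` for registering servers and choosing a provider/model. As a rough sketch only — the key names below are assumptions based on common MCP client configuration layouts, not taken from this repository — the file might look something like:

```json
{
  "llm": {
    "provider": "ollama",
    "model": "llama3.2:3b",
    "api_key": "your-openai-api-key"
  },
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": { "BRAVE_API_KEY": "your-brave-api-key" }
    }
  }
}
```

Check the actual `mcp-server-config.json` shipped with the repository for the authoritative schema.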
## Setup
1. Clone the repository:

   ```bash
   git clone https://github.com/rakesh-eltropy/mcp-client.git
   ```
2. Navigate to the project directory:

   ```bash
   cd mcp-client
   ```
3. Set the `OPENAI_API_KEY` environment variable:

   ```bash
   export OPENAI_API_KEY=your-openai-api-key
   ```

   You can also set the `OPENAI_API_KEY` in the `mcp-server-config.json` file. The `provider` and `model` can likewise be set there, e.g. `provider` can be `ollama` and `model` can be `llama3.2:3b`.
4. Set the `BRAVE_API_KEY` environment variable:

   ```bash
   export BRAVE_API_KEY=your-brave-api-key
   ```

   You can also set the `BRAVE_API_KEY` in the `mcp-server-config.json` file. You can get a free `BRAVE_API_KEY` from the Brave Search API.
5. Run from the CLI:

   ```bash
   uv run cli.py
   ```

   To explore the available commands, use the `help` option. You can chat with the LLM using the `chat` command. Sample prompts:

   - What is the capital city of India?
   - Search for the most expensive product in the database and find more details about it from Amazon.
6. Run from the REST API:

   ```bash
   uvicorn app:app --reload
   ```

   You can use the following `curl` command to chat with the LLM:

   ```bash
   curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat
   ```

   You can use the following `curl` command to chat with the LLM with streaming:

   ```bash
   curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
   ```
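
If you prefer Python over `curl`, the same `/chat` endpoint can be called with the standard library. This is a minimal sketch that builds the request exactly as the `curl` examples above send it; it assumes the server is running locally on port 8000 as started in step 6:

```python
import json
import urllib.request

# Default uvicorn address from the step above.
CHAT_URL = "http://localhost:8000/chat"

def build_chat_request(message: str, streaming: bool = False) -> urllib.request.Request:
    """Build a POST request mirroring the curl examples above."""
    payload = {"message": message}
    if streaming:
        payload["streaming"] = True
    return urllib.request.Request(
        CHAT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_chat_request("list all the products from my local database?")
    print(req.full_url, req.data)
    # With the server from step 6 running, send it like this:
    # with urllib.request.urlopen(req) as resp:
    #     print(resp.read().decode("utf-8"))
```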
## Contributing
Feel free to submit issues and pull requests for improvements or bug fixes.