MCP REST API and CLI Client

A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers.

Key Features

1. MCP-Compatible Servers

  • Supports any MCP-compatible server.
  • Pre-configured default servers:
    • SQLite (a test.db with sample product data is provided)
    • Brave Search
  • Additional MCP servers can be added via the mcp-server-config.json file (see the sketch below).
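
  The exact schema is defined by the bundled mcp-server-config.json. As a rough, hypothetical illustration (the key names below are assumptions, not this project's documented schema), a server entry typically looks something like:

    {
      "mcpServers": {
        "sqlite": {
          "command": "uvx",
          "args": ["mcp-server-sqlite", "--db-path", "test.db"]
        },
        "brave-search": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-brave-search"],
          "env": { "BRAVE_API_KEY": "your-brave-api-key" }
        }
      }
    }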

2. Integrated with LangChain

  • Leverages LangChain to execute LLM prompts.
  • Enables multiple MCP servers to work together to answer a single query.

3. LLM Provider Support

  • Compatible with any LLM provider that offers a function-calling API.
  • Examples:
    • OpenAI
    • Claude
    • Gemini
    • AWS Nova
    • Groq
    • Ollama
    • Essentially any LLM provider is supported, as long as it exposes a function-calling API; see the LangChain documentation for details and the sketch after this list.
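
A minimal, hedged sketch (using standard LangChain APIs rather than this project's internal code) of why providers are interchangeable: swapping providers only changes which chat model is constructed, while tool binding stays the same. The search_products tool below is a hypothetical stand-in for a tool exposed by an MCP server:

    from langchain_core.tools import tool
    from langchain_openai import ChatOpenAI  # pip install langchain-openai

    @tool
    def search_products(query: str) -> str:
        """Hypothetical stand-in for a tool exposed by an MCP server."""
        return f"results for {query}"

    # Any function-calling chat model can be dropped in here,
    # e.g. ChatOllama(model="llama3.2:3b") from langchain-ollama.
    llm = ChatOpenAI(model="gpt-4o-mini")
    llm_with_tools = llm.bind_tools([search_products])
    response = llm_with_tools.invoke("What is the most expensive product?")
    print(response.tool_calls)  # the model may ask to call search_products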

Setup

  1. Clone the repository:

    git clone https://github.com/rakesh-eltropy/mcp-client.git
    
  2. Navigate to the project directory:

    cd mcp-client
    
  3. Set the OPENAI_API_KEY environment variable:

    export OPENAI_API_KEY=your-openai-api-key
    

    The OPENAI_API_KEY, as well as the provider and model (e.g. provider ollama, model llama3.2:3b), can also be set in the mcp-server-config.json file.

  4. Set the BRAVE_API_KEY environment variable:

    export BRAVE_API_KEY=your-brave-api-key

    You can also set the BRAVE_API_KEY in the mcp-server-config.json file. You can get a free BRAVE_API_KEY from the Brave Search API.

  5. Running from the CLI:

    uv run cli.py
    

    To explore the available commands, use the help option. You can chat with the LLM using the chat command. Sample prompts:

      What is the capital city of India?
    
      Search for the most expensive product in the database and find more details about it on Amazon.
    
  6. Running from the REST API:

    uvicorn app:app --reload
    

    You can use the following curl command to chat with the LLM:

    curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat
    

    You can use the following curl command to chat with the LLM with streaming enabled:

    curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
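
    If you prefer Python over curl, here is a minimal sketch for consuming the streaming response. It assumes the endpoint streams plain text chunks over a chunked HTTP response; adjust the parsing if the server actually emits SSE or JSON lines:

    import requests  # pip install requests

    resp = requests.post(
        "http://localhost:8000/chat",
        json={"message": "list all the products from my local database?", "streaming": True},
        stream=True,
    )
    resp.raise_for_status()
    for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
        print(chunk, end="", flush=True)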
    

Contributing

Feel free to submit issues and pull requests for improvements or bug fixes.
