MCP REST API and CLI Client

A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers.

Key Features

1. MCP-Compatible Servers

  • Supports any MCP-compatible server.
  • Pre-configured default servers:
    • SQLite (a test.db with sample product data is provided)
    • Brave Search
  • Additional MCP servers can be added in the mcp-server-config.json file (see the sketch after this list).
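
  A minimal, illustrative sketch of an mcp-server-config.json server entry. The key names below follow the common mcpServers layout used by MCP clients and are assumptions; check the sample config in the repository for the exact schema.

    {
      "mcpServers": {
        "sqlite": {
          "command": "uvx",
          "args": ["mcp-server-sqlite", "--db-path", "test.db"]
        },
        "brave-search": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-brave-search"],
          "env": { "BRAVE_API_KEY": "your-brave-api-key" }
        }
      }
    }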

2. Integrated with LangChain

  • Leverages LangChain to execute LLM prompts.
  • Enables multiple MCP servers to collaborate and respond to a specific query simultaneously.

3. LLM Provider Support

  • Compatible with any LLM provider whose API supports function calling.
  • Examples:
    • OpenAI
    • Claude
    • Gemini
    • AWS Nova
    • Groq
    • Ollama
    • Essentially any LLM provider is supported as long as it exposes a function-calling API. Refer to the LangChain documentation for more details.

Setup

  1. Clone the repository:

    git clone https://github.com/rakesh-eltropy/mcp-client.git
    
  2. After cloning the repository, navigate to the project directory:

    cd mcp-client
    
  3. Set the OPENAI_API_KEY environment variable:

    export OPENAI_API_KEY=your-openai-api-key
    

    You can also set the OPENAI_API_KEY in the mcp-server-config.json file.

    You can also set the provider and model in the mcp-server-config.json file, e.g. provider ollama with model llama3.2:3b (see the configuration sketch after these steps).

  4. Set the BRAVE_API_KEY environment variable:

    export BRAVE_API_KEY=your-brave-api-key

    You can also set the BRAVE_API_KEY in the mcp-server-config.json file. You can get a free BRAVE_API_KEY from the Brave Search API.
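
    A minimal sketch of keeping the LLM provider and model in mcp-server-config.json instead of environment variables, assuming a top-level llm block. The key names are illustrative, so check the repository's sample config for the exact layout:

    {
      "llm": {
        "provider": "ollama",
        "model": "llama3.2:3b"
      }
    }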

Usage

  1. Running from the CLI:

    uv run cli.py
    

    To explore the available commands, use the help option. You can chat with the LLM using the chat command. Sample prompts:

      What is the capital city of India?
    
      Search the most expensive product from database and find more details about it from amazon?
    
  2. Running from the REST API:

    uvicorn app:app --reload
    

    You can use the following curl command to chat with the LLM:

    curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat
    

    You can use the following curl command to chat with the LLM with streaming enabled:

    curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
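
    If you prefer Python over curl, here is a small sketch that consumes the streaming endpoint with the requests library. It assumes the /chat endpoint emits plain-text chunks when "streaming" is true; adjust the parsing if the server returns a different streaming format.

    import requests

    def stream_chat(message: str, url: str = "http://localhost:8000/chat") -> None:
        # POST to the /chat endpoint and print chunks as they arrive.
        payload = {"message": message, "streaming": True}
        with requests.post(url, json=payload, stream=True) as response:
            response.raise_for_status()
            for chunk in response.iter_content(chunk_size=None, decode_unicode=True):
                if chunk:
                    print(chunk, end="", flush=True)
        print()

    if __name__ == "__main__":
        stream_chat("list all the products from my local database?")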
    

Contributing

Feel free to submit issues and pull requests for improvements or bug fixes.
