mcp-chat
Open Source Generic MCP Client for testing & evaluating mcp servers and agents
Quickstart
Make sure that you have ANTHROPIC_API_KEY exported in your environment or set in a .env file in the root of the project. You can get an API key by signing up at the Anthropic Console keys page.
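For example, either export the key in your shell or add it to a .env file (the key value below is a placeholder):
# Export for the current shell session
export ANTHROPIC_API_KEY="sk-ant-..."
# Or persist it in a .env file at the project root
echo 'ANTHROPIC_API_KEY=sk-ant-...' > .env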
A simple use case that spawns an interactive chat prompt with the filesystem MCP server from the CLI:
npx mcp-chat --server "npx -y @modelcontextprotocol/server-filesystem /Users/$USER/Desktop"
This will open up a chat prompt that you can use to interact with the servers and chat with an LLM.
Config
You can also just specify your claude_desktop_config.json (Mac):
npx mcp-chat --config "~/Library/Application Support/Claude/claude_desktop_config.json"
Or (Windows):
npx mcp-chat --config "%APPDATA%\Claude\claude_desktop_config.json"
Web mode
https://github.com/user-attachments/assets/b7e8a648-8084-4955-8cdf-fc6eb141572e
You can also run mcp-chat in web mode by specifying the --web flag (make sure to have ANTHROPIC_API_KEY exported in your environment):
npx mcp-chat --web
In web mode, you can start new chats, send messages to the model, and dynamically configure the MCP servers via the UI - no need to specify them on the command line. In addition, chats created via the Web UI are saved to ~/.mcpchat/chats just like chats created via the CLI.
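If you prefer not to export the key globally, it can also be supplied inline for a single run (standard shell syntax; the key value is a placeholder):
ANTHROPIC_API_KEY="sk-ant-..." npx mcp-chat --web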
Features
- Run via CLI in interactive mode or directly pass prompts with -p
- Web mode to chat with models via a web interface with --web
- Connect to any MCP server (JS, Python, Docker) in production or during development
- Choose between models with -m
- Customize system prompt with --system
- Saves chat history with settings in ~/.mcpchat/chats, including web chats
- Save and restore commands in ~/.mcpchat/history
- View tool call output and arguments directly in chat to help debug mcp servers
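For example, several of these flags can be combined in a single invocation (the server command and prompts are illustrative, and this assumes the flags compose as documented):
npx mcp-chat --server "npx mcp-server-kubernetes" -m "claude-3.5" --system "Answer concisely." -p "List the pods in the default namespace"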
CLI Usage
Run prompts via CLI with the -p flag:
npx mcp-chat --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"
This runs the prompt with the kubernetes mcp-server and exits after the response is printed on stdout.
Choose a model to chat with via CLI with the -m flag:
npx mcp-chat --server "npx mcp-server-kubernetes" -m "claude-3.5"
This uses the claude-3.5 model for the chat. Note that currently only Anthropic models are supported.
Custom system prompt: use the --system flag to specify a system prompt:
npx mcp-chat --system "Explain the output to the user in pirate speak." --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"
For developers of mcp-servers
You can pass in a local build of a Python or Node mcp-server to test it out with mcp-chat:
Node JS:
# Directly executing built script
npx mcp-chat --server "/path/to/mcp-server-kubernetes/dist/index.js"
# Using node / bun
npx mcp-chat --server "node /path/to/mcp-server-kubernetes/dist/index.js"
Python:
# Python: Using uv
npx mcp-chat --server "uv --directory /path/to/mcp-server-weather/ run weather.py"
# Using python / python3 - make sure to run in venv or install deps globally
npx mcp-chat --server "/path/to/mcp-server-weather/weather.py"
Development
Install dependencies & run the CLI:
git clone https://github.com/Flux159/mcp-chat
bun install
bun run dev
To develop mcp-chat while connecting to an mcp-server, make a build & run the CLI with the server flag:
npm run build && node dist/index.js --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"
Testing:
bun run test
Building:
bun run build
Publishing:
bun run publish
Publishing Docker:
bun run dockerbuild
Project Structure
├── src/
│   ├── index.ts         # Main client implementation & CLI params
│   ├── constants.ts     # Default constants
│   ├── interactive.ts   # Interactive chat prompt handling & logic
├── test/                # Test files
│   ├── cli.test.ts      # Test CLI params
│   ├── config.test.ts   # Test config file parsing
Publishing new release
Go to the releases page, click "Draft New Release", click "Choose a tag", and create a new tag by typing a new version number in "v{major}.{minor}.{patch}" semver format. Then write a release title "Release v{major}.{minor}.{patch}", add a description / changelog if necessary, and click "Publish Release".
This will create a new tag which will trigger a new release build via the cd.yml workflow. Once successful, the new release will be published to npm. Note that there is no need to update the package.json version manually, as the workflow will automatically update the version number in the package.json file & push a commit to main.
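If you use the GitHub CLI instead of the web UI, an equivalent release can be created from the command line (a sketch, assuming gh is installed and authenticated; the version number is an example):
# Creates the tag and the release in one step, which triggers the cd.yml workflow
gh release create v1.2.3 --title "Release v1.2.3" --notes "Changelog goes here"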
License