# mcp-chat

Open Source Generic MCP Client for testing & evaluating MCP servers and agents
## Quickstart

Make sure that you have `ANTHROPIC_API_KEY` exported in your environment or in a `.env` file in the root of the project. You can get an API key by signing up at the Anthropic Console keys page.
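Either form works; for example (the key value shown is a placeholder, not a real key):

```shell
# Option 1: export directly in your shell (placeholder key shown)
export ANTHROPIC_API_KEY="sk-ant-your-key-here"

# Option 2: store it in a .env file at the project root
echo 'ANTHROPIC_API_KEY=sk-ant-your-key-here' > .env
```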
A simple use case that spawns an interactive chat prompt with the filesystem MCP server from the CLI:

```bash
npx mcp-chat --server "npx -y @modelcontextprotocol/server-filesystem /Users/$USER/Desktop"
```

This will open up a chat prompt that you can use to interact with the servers and chat with an LLM.
## Config

You can also just specify your `claude_desktop_config.json` (Mac):

```bash
npx mcp-chat --config "~/Library/Application Support/Claude/claude_desktop_config.json"
```

Or (Windows):

```bash
npx mcp-chat --config "%APPDATA%\Claude\claude_desktop_config.json"
```
## Web mode

https://github.com/user-attachments/assets/b7e8a648-8084-4955-8cdf-fc6eb141572e

You can also run mcp-chat in web mode by specifying the `--web` flag (make sure to have `ANTHROPIC_API_KEY` exported in your environment):

```bash
npx mcp-chat --web
```

In web mode, you can start new chats, send messages to the model, and dynamically configure the MCP servers via the UI, so there is no need to specify them on the command line. In addition, chats created via the web UI are saved to `~/.mcpchat/chats`, just like chats created via the CLI.
## Features

- Run via CLI in interactive mode, or directly pass prompts with `-p`
- Web mode to chat with models via a web interface with `--web`
- Connect to any MCP server (JS, Python, Docker) in production or during development
- Choose between models with `-m`
- Customize the system prompt with `--system`
- Saves chat history with settings in `~/.mcpchat/chats`, including web chats
- Saves and restores commands in `~/.mcpchat/history`
- View tool call output and arguments directly in chat to help debug MCP servers
## CLI Usage

Run prompts via the CLI with the `-p` flag:

```bash
npx mcp-chat --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"
```

This runs the prompt with the kubernetes mcp-server and exits after the response is received on stdout.

Choose a model to chat with via the CLI with the `-m` flag:

```bash
npx mcp-chat --server "npx mcp-server-kubernetes" -m "claude-3.5"
```

This uses the model `claude-3.5` to chat with. Note that currently only Anthropic models are supported.

Custom system prompt: the `--system` flag can be used to specify a system prompt:

```bash
npx mcp-chat --system "Explain the output to the user in pirate speak." --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"
```
## For developers of mcp-servers

You can pass in a local build of a Python or Node mcp-server to test it out with mcp-chat:

Node.js:

```bash
# Directly executing the built script
npx mcp-chat --server "/path/to/mcp-server-kubernetes/dist/index.js"

# Using node / bun
npx mcp-chat --server "node /path/to/mcp-server-kubernetes/dist/index.js"
```

Python:

```bash
# Using uv
npx mcp-chat --server "uv --directory /path/to/mcp-server-weather/ run weather.py"

# Using python / python3 - make sure to run in a venv or install deps globally
npx mcp-chat --server "/path/to/mcp-server-weather/weather.py"
```
## Development

Install dependencies & run the CLI:

```bash
git clone https://github.com/Flux159/mcp-chat
bun install
bun run dev
```

To develop mcp-chat while connecting to an mcp-server, make a build and run the CLI with the server flag:

```bash
npm run build && node dist/index.js --server "npx mcp-server-kubernetes" -p "List the pods in the default namespace"
```

Testing:

```bash
bun run test
```

Building:

```bash
bun run build
```

Publishing:

```bash
bun run publish
```

Publishing Docker:

```bash
bun run dockerbuild
```
## Project Structure

```
├── src/
│   ├── index.ts        # Main client implementation & CLI params
│   ├── constants.ts    # Default constants
│   ├── interactive.ts  # Interactive chat prompt handling & logic
├── test/               # Test files
│   ├── cli.test.ts     # Test CLI params
│   ├── config.test.ts  # Test config file parsing
```
## Publishing new release

Go to the releases page, click "Draft New Release", click "Choose a tag", and create a new tag by typing a new version number in the "v{major}.{minor}.{patch}" semver format. Then write a release title, "Release v{major}.{minor}.{patch}", add a description / changelog if necessary, and click "Publish Release".

This will create a new tag, which will trigger a release build via the cd.yml workflow. Once the build succeeds, the new release is published to npm. Note that there is no need to update the package.json version manually; the workflow automatically updates the version number in package.json and pushes a commit to main.
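Since the workflow keys off the tag name, it can help to sanity-check a candidate tag against the expected pattern before drafting the release. A quick sketch (the version number shown is hypothetical):

```shell
TAG="v1.2.3"  # hypothetical version number

# Verify the tag matches the v{major}.{minor}.{patch} semver format
if printf '%s' "$TAG" | grep -Eq '^v[0-9]+\.[0-9]+\.[0-9]+$'; then
  echo "ok: $TAG matches v{major}.{minor}.{patch}"
else
  echo "error: $TAG is not a valid semver tag" >&2
fi
```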
## License