MCP Terminal

A terminal-based interactive client for Model Context Protocol (MCP) servers.

Installation

npm install -g mcp-terminal

Features

  • Connect to multiple MCP servers simultaneously
  • Interactive terminal for sending messages to models
  • Easy configuration management
  • Support for both stdio and SSE transports
  • Switch between connected servers
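
Under the hood, a client like this one speaks MCP over one of those transports. As a rough illustration (not this project's actual source), connecting to a stdio server with the official @modelcontextprotocol/sdk looks something like the following sketch; the server command shown is just a placeholder:

// Illustrative sketch using the official MCP TypeScript SDK;
// mcp-terminal's own internals may differ.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server process and talk to it over stdin/stdout.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["@anthropic-ai/mcp-server@latest", "--stdio"],
});

const client = new Client({ name: "example-client", version: "0.1.0" });
await client.connect(transport);

// Ask the server which tools it exposes, then disconnect.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
await client.close();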

Configuration

Before using MCP Terminal, you need to configure at least one server:

mcp-terminal configure

This will open your default editor with a configuration file where you can define MCP servers.

Example configuration:

{
  "mcpServers": {
    "local-sse": {
      "command": "npx @anthropic-ai/mcp-server@latest",
      "args": [],
      "url": "http://localhost:8765/sse"
    },
    "local-stdio": {
      "command": "npx @anthropic-ai/mcp-server@latest",
      "args": ["--stdio"]
    },
    "shopify": {
      "command": "npx",
      "args": [
        "shopify-mcp",
        "--accessToken",
        "your-shopify-access-token",
        "--domain",
        "your-store.myshopify.com"
      ]
    }
  }
}

Notice that servers can be configured with:

  • Both command and url for servers that need to be started locally but use SSE transport
  • Just command for servers that use stdio transport
  • Just url for connecting to remote servers
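
For instance, based on the third case above, an entry for a purely remote server could contain nothing but a url (the server name and address here are hypothetical):

{
  "mcpServers": {
    "remote-sse": {
      "url": "https://example.com/sse"
    }
  }
}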

Usage

Configure MCP servers

mcp-terminal configure

This will open your default editor to configure MCP servers.

Start MCP server

mcp-terminal start

This will start the configured MCP server. You can have multiple servers configured.

Interactive Chat with AI using MCP tools

mcp-terminal chat

This starts an interactive chat session with an AI model that can use MCP tools from your configured server. The LLM can interact with the MCP server tools to help answer your questions and perform actions.

You can specify which server to use:

mcp-terminal chat -s local-stdio

Server Types

The chat command supports two types of server configurations:

  1. URL-based servers - Servers with a URL configured will connect via HTTP/SSE
  2. Command-based servers - Servers with only a command will be started automatically and use stdio transport
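
How a server is reached can be decided from its config entry alone. The sketch below expresses that rule using the transports from the official @modelcontextprotocol/sdk; it mirrors the two cases above but is illustrative, not mcp-terminal's actual implementation:

// Pick a transport from a server entry: url => HTTP/SSE, command only => stdio.
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

interface ServerConfig {
  command?: string;
  args?: string[];
  url?: string;
}

function makeTransport(cfg: ServerConfig) {
  if (cfg.url) {
    // URL-based servers connect via HTTP/SSE.
    return new SSEClientTransport(new URL(cfg.url));
  }
  if (cfg.command) {
    // Command-based servers are spawned and spoken to over stdin/stdout.
    return new StdioClientTransport({ command: cfg.command, args: cfg.args ?? [] });
  }
  throw new Error("A server entry needs a url or a command");
}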

Requirements

To use the chat feature, you need:

  1. An OpenAI API key (set as OPENAI_API_KEY environment variable or in a .env file)
  2. A configured MCP server (configure using mcp-terminal configure)
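
Behind the chat command, the usual pattern is to advertise the MCP server's tools to the model and forward any tool calls back to the server. The sketch below shows a single turn of that loop with the openai and @modelcontextprotocol/sdk packages; the model name and server command are placeholders, and the code is an assumption about the general pattern rather than mcp-terminal's own source:

import OpenAI from "openai";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Connect to a stdio MCP server (placeholder command).
const client = new Client({ name: "chat-example", version: "0.1.0" });
await client.connect(
  new StdioClientTransport({
    command: "npx",
    args: ["@anthropic-ai/mcp-server@latest", "--stdio"],
  })
);

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Expose the server's tools to the model in OpenAI's tool-calling format.
const { tools } = await client.listTools();
const openaiTools = tools.map((t) => ({
  type: "function" as const,
  function: { name: t.name, description: t.description, parameters: t.inputSchema },
}));

const response = await openai.chat.completions.create({
  model: "gpt-4o", // placeholder model
  messages: [{ role: "user", content: "What's the weather in New York today?" }],
  tools: openaiTools,
});

// If the model requested a tool, run it against the MCP server.
const call = response.choices[0].message.tool_calls?.[0];
if (call) {
  const result = await client.callTool({
    name: call.function.name,
    arguments: JSON.parse(call.function.arguments),
  });
  console.log(result.content);
}

await client.close();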

Example usage

$ mcp-terminal chat
Starting chat session with LLM...
Type 'exit' or 'quit' to end the session.

Using MCP server: local-stdio
Starting MCP server: local-stdio
Server is running...
Connected to MCP server via stdio transport

You: What's the weather in New York today?
AI is thinking...
AI: I'd like to check the weather in New York for you, but I need to use a tool to get that information.

I attempted to use a weather tool, but we're currently connected via stdio transport, which doesn't allow me to directly access external tools. In a full implementation with the appropriate tools configured, I would be able to fetch real-time weather data for New York.

To get the actual weather in New York today, you could:
1. Use a different MCP server configured with HTTP/SSE transport and weather tools
2. Check a weather website or app directly
3. Ask me a different question I can answer without external tools

Can I help you with something else?

You: What is MCP?
AI is thinking...
AI: MCP stands for Model Context Protocol. It's an open standard protocol designed to connect AI language models (LLMs) like me with external tools, data sources, and APIs.

Here's what makes MCP important:

1. It allows AI models to extend their capabilities beyond their training data by accessing external tools and real-time information.

2. It provides a standardized way for developers to create tools that AI models can interact with, making integration simpler.

3. It enables AI assistants to perform actions in the real world - things like searching the web, accessing databases, running code, or interacting with services like the weather example you asked about earlier.

4. It can work through different transport methods, such as HTTP/SSE (Server-Sent Events) or stdio (standard input/output), depending on the implementation.

The MCP-terminal tool you're using right now is a client that helps manage MCP servers and facilitates communication between users, AI models, and the tools provided by those servers.

You: exit

License

MIT
