
MCP Terminal

A terminal-based interactive client for Model Context Protocol (MCP) servers.

Installation

npm install -g mcp-terminal

Features

  • Connect to multiple MCP servers simultaneously
  • Interactive terminal for sending messages to models
  • Easy configuration management
  • Support for both stdio and SSE transports
  • Switch between connected servers

Configuration

Before using MCP Terminal, you need to configure at least one server:

mcp-terminal configure

This will open your default editor with a configuration file where you can define MCP servers.

Example configuration:

{
  "mcpServers": {
    "local-sse": {
      "command": "npx @anthropic-ai/mcp-server@latest",
      "args": [],
      "url": "http://localhost:8765/sse"
    },
    "local-stdio": {
      "command": "npx @anthropic-ai/mcp-server@latest",
      "args": ["--stdio"]
    },
    "shopify": {
      "command": "npx",
      "args": [
        "shopify-mcp",
        "--accessToken",
        "your-shopify-access-token",
        "--domain",
        "your-store.myshopify.com"
      ]
    }
  }
}

Notice that servers can be configured with:

  • Both command and url for servers that need to be started locally but use SSE transport
  • Just command for servers that use stdio transport
  • Just url for connecting to remote servers (see the sketch below)
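
For instance, a purely remote entry could omit command altogether. The sketch below is hypothetical; the server name and URL are placeholders, using the same keys as the example configuration above:

{
  "mcpServers": {
    "remote-sse": {
      "url": "https://example.com/sse"
    }
  }
}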

Usage

Configure MCP servers

mcp-terminal configure

This will open your default editor to configure MCP servers.

Start MCP server

mcp-terminal start

This will start the configured MCP server. You can have multiple servers configured.

Interactive Chat with AI using MCP tools

mcp-terminal chat

This starts an interactive chat session with an AI model that can use MCP tools from your configured server. The LLM can interact with the MCP server tools to help answer your questions and perform actions.

You can specify which server to use:

mcp-terminal chat -s local-stdio
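
For example, to use the SSE-based server defined in the example configuration above instead:

mcp-terminal chat -s local-sse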

Server Types

The chat command supports two types of server configurations:

  1. URL-based servers - Servers with a URL configured will connect via HTTP/SSE
  2. Command-based servers - Servers with only a command will be started automatically and use stdio transport (see the transport sketch below)
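
To illustrate how a client can choose between these two cases, here is a minimal sketch using the TypeScript MCP SDK (@modelcontextprotocol/sdk). It is not mcp-terminal's actual implementation; the ServerEntry shape simply mirrors the keys in the example configuration above, and the client name and version are placeholders.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Shape of a single entry under "mcpServers" in the example configuration above.
interface ServerEntry {
  command?: string;
  args?: string[];
  url?: string;
}

async function connectToServer(name: string, entry: ServerEntry): Promise<Client> {
  const client = new Client(
    { name: "mcp-terminal-sketch", version: "0.0.1" },
    { capabilities: {} }
  );

  // URL-based entries connect over HTTP/SSE; command-only entries are
  // spawned locally and communicate over stdio. Note that the stdio
  // transport takes the executable and its arguments separately.
  const transport = entry.url
    ? new SSEClientTransport(new URL(entry.url))
    : new StdioClientTransport({ command: entry.command!, args: entry.args ?? [] });

  await client.connect(transport);
  console.log(`Connected to ${name}; available tools:`, await client.listTools());
  return client;
}

// Example (hypothetical call):
// connectToServer("shopify", { command: "npx", args: ["shopify-mcp", "--accessToken", "...", "--domain", "..."] });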

Requirements

To use the chat feature, you need:

  1. An OpenAI API key (set as OPENAI_API_KEY environment variable or in a .env file; see the example below)
  2. A configured MCP server (configure using mcp-terminal configure)
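
For example, to set the key for the current shell session (the value is a placeholder):

export OPENAI_API_KEY=your-openai-api-key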

Example usage

$ mcp-terminal chat
Starting chat session with LLM...
Type 'exit' or 'quit' to end the session.

Using MCP server: local-stdio
Starting MCP server: local-stdio
Server is running...
Connected to MCP server via stdio transport

You: What's the weather in New York today?
AI is thinking...
AI: I'd like to check the weather in New York for you, but I need to use a tool to get that information.

I attempted to use a weather tool, but we're currently connected via stdio transport, which doesn't allow me to directly access external tools. In a full implementation with the appropriate tools configured, I would be able to fetch real-time weather data for New York.

To get the actual weather in New York today, you could:
1. Use a different MCP server configured with HTTP/SSE transport and weather tools
2. Check a weather website or app directly
3. Ask me a different question I can answer without external tools

Can I help you with something else?

You: What is MCP?
AI is thinking...
AI: MCP stands for Model Context Protocol. It's an open standard protocol designed to connect AI language models (LLMs) like me with external tools, data sources, and APIs.

Here's what makes MCP important:

1. It allows AI models to extend their capabilities beyond their training data by accessing external tools and real-time information.

2. It provides a standardized way for developers to create tools that AI models can interact with, making integration simpler.

3. It enables AI assistants to perform actions in the real world - things like searching the web, accessing databases, running code, or interacting with services like the weather example you asked about earlier.

4. It can work through different transport methods, such as HTTP/SSE (Server-Sent Events) or stdio (standard input/output), depending on the implementation.

The MCP-terminal tool you're using right now is a client that helps manage MCP servers and facilitates communication between users, AI models, and the tools provided by those servers.

You: exit

License

MIT
