Terminal-based Chat Client with MCP Server Integration

This project demonstrates how to build a terminal-based chat client interface that connects to an MCP server and integrates with OpenAI's API. It includes a simple weather service as an example of MCP functionality.

Prerequisites

  • Python 3.8 or higher
  • UV package manager (a fast, reliable Python package installer and resolver)

Installation

1. Install UV

UV is a modern Python package manager that offers significant performance improvements over traditional tools like pip. It's written in Rust and provides:

  • Faster package installation
  • Reliable dependency resolution
  • Built-in virtual environment management
  • Compatibility with existing Python tooling

To install UV, run:

curl -LsSf https://astral.sh/uv/install.sh | sh
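
Once the script finishes (and assuming the installer added uv to your PATH; you may need to restart your shell first), you can verify the installation:

uv --version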

2. Project Setup

  1. Initialize a new project:

     uv init

  2. Create and activate a virtual environment:

     uv venv
     source .venv/bin/activate   # On Unix/macOS
     # or
     .venv\Scripts\activate      # On Windows

  3. Install the required packages (quote mcp[cli] so the brackets are not expanded by your shell):

     uv pip install httpx "mcp[cli]" openai python-dotenv

Project Structure and Implementation Guide

The project consists of two main components: a chat client (client.py) and a weather service (weather.py). Let's walk through how each component was built and what each part does.

Building the Chat Client (client.py)

The chat client is built as an asynchronous Python application that connects to both an MCP server and OpenAI's API. Here's how it was constructed:

  1. Imports and Setup

    import asyncio
    import os
    import sys
    from typing import Optional
    from contextlib import AsyncExitStack
    from dotenv import load_dotenv
    import openai
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client
    
    • asyncio: For asynchronous programming
    • AsyncExitStack: Manages cleanup of async resources
    • dotenv: Loads environment variables from .env file
    • mcp: Core MCP functionality for server communication
  2. MCPClient Class

    The main client class handles:

    • Connection to the MCP server
    • OpenAI API integration
    • Message processing
    • Tool execution

    Key methods (a condensed sketch of the class follows this list):

    • connect_to_server(): Establishes connection to the MCP server
    • process_query(): Handles user queries and tool execution
    • chat_loop(): Manages the interactive chat session
    • cleanup(): Ensures proper resource cleanup
  3. Main Function

    async def main():
        client = MCPClient()
        try:
            await client.connect_to_server(sys.argv[1])
            await client.chat_loop()
        finally:
            await client.cleanup()

    # Entry point: run the async main coroutine
    if __name__ == "__main__":
        asyncio.run(main())
    
    • Entry point that initializes the client
    • Connects to the specified server
    • Runs the chat loop
    • Ensures proper cleanup
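
The README describes the class but does not reproduce its body, so here is a condensed sketch of how connect_to_server() and process_query() could be wired together with the MCP Python SDK and OpenAI tool calling. It reuses the imports shown earlier plus json; the gpt-4o model name, the exit_stack attribute, and the single-tool-call handling are illustrative assumptions rather than details taken from the original code, and chat_loop() (a simple input/print loop around process_query()) is omitted for brevity:

    import json

    class MCPClient:
        def __init__(self):
            self.session: Optional[ClientSession] = None
            self.exit_stack = AsyncExitStack()      # owns all async resources for cleanup()
            self.openai = openai.AsyncOpenAI()      # reads OPENAI_API_KEY from the environment

        async def connect_to_server(self, server_script: str):
            # Launch the server script as a subprocess and talk to it over stdio
            params = StdioServerParameters(command="python", args=[server_script])
            read, write = await self.exit_stack.enter_async_context(stdio_client(params))
            self.session = await self.exit_stack.enter_async_context(ClientSession(read, write))
            await self.session.initialize()

        async def process_query(self, query: str) -> str:
            # Advertise the server's tools to the model as OpenAI function definitions
            tools = (await self.session.list_tools()).tools
            functions = [{"type": "function",
                          "function": {"name": t.name,
                                       "description": t.description,
                                       "parameters": t.inputSchema}} for t in tools]

            messages = [{"role": "user", "content": query}]
            reply = await self.openai.chat.completions.create(
                model="gpt-4o", messages=messages, tools=functions)
            message = reply.choices[0].message

            # If the model requested a tool, execute it on the MCP server and ask again
            if message.tool_calls:
                call = message.tool_calls[0]
                result = await self.session.call_tool(
                    call.function.name, json.loads(call.function.arguments))
                messages.append(message)
                messages.append({"role": "tool", "tool_call_id": call.id,
                                 "content": result.content[0].text})
                reply = await self.openai.chat.completions.create(
                    model="gpt-4o", messages=messages)
                message = reply.choices[0].message
            return message.content or ""

        async def cleanup(self):
            await self.exit_stack.aclose()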

Building the Weather Service (weather.py)

The weather service is built as an MCP server that provides weather information through the National Weather Service API:

  1. Service Initialization

    from mcp.server.fastmcp import FastMCP
    mcp = FastMCP("weather")
    
    • Creates an MCP server instance named "weather"
    • Sets up the server infrastructure
  2. API Integration

    NWS_API_BASE = "https://api.weather.gov"
    USER_AGENT = "weather-app/1.0"
    
    • Defines constants for the National Weather Service API
    • Sets up proper user agent for API requests
  3. Helper Functions

    • make_nws_request(): Handles API requests with proper error handling
    • format_alert(): Formats weather alerts into readable text
  4. MCP Tools

    Two main tools are implemented (a sketch of the full service follows this list):

    a. get_alerts(state):

    • Fetches active weather alerts for a US state
    • Returns formatted alert information

    b. get_forecast(latitude, longitude):

    • Retrieves weather forecast for a location
    • Returns detailed forecast information
  5. Server Execution

    if __name__ == "__main__":
        mcp.run(transport="stdio")
    
    • Runs the MCP server using stdio transport
    • Enables communication with the chat client
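
Putting these pieces together, a minimal weather.py along the lines described above might look like the sketch below. The api.weather.gov routes and field names are the standard NWS ones, but the error messages, the number of forecast periods shown, and the output formatting are illustrative assumptions:

    from typing import Optional

    import httpx
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("weather")

    NWS_API_BASE = "https://api.weather.gov"
    USER_AGENT = "weather-app/1.0"

    async def make_nws_request(url: str) -> Optional[dict]:
        """Fetch JSON from the NWS API, returning None on any error."""
        headers = {"User-Agent": USER_AGENT, "Accept": "application/geo+json"}
        async with httpx.AsyncClient() as client:
            try:
                resp = await client.get(url, headers=headers, timeout=30.0)
                resp.raise_for_status()
                return resp.json()
            except Exception:
                return None

    def format_alert(feature: dict) -> str:
        """Format a single alert feature into readable text."""
        props = feature["properties"]
        return (f"Event: {props.get('event', 'Unknown')}\n"
                f"Area: {props.get('areaDesc', 'Unknown')}\n"
                f"Severity: {props.get('severity', 'Unknown')}\n"
                f"Description: {props.get('description', 'No description available')}")

    @mcp.tool()
    async def get_alerts(state: str) -> str:
        """Get active weather alerts for a two-letter US state code, e.g. CA."""
        data = await make_nws_request(f"{NWS_API_BASE}/alerts/active/area/{state}")
        if not data or not data.get("features"):
            return "No active alerts (or the alerts service is unavailable)."
        return "\n---\n".join(format_alert(f) for f in data["features"])

    @mcp.tool()
    async def get_forecast(latitude: float, longitude: float) -> str:
        """Get the short-term forecast for a latitude/longitude pair."""
        points = await make_nws_request(f"{NWS_API_BASE}/points/{latitude},{longitude}")
        if not points:
            return "Unable to fetch forecast data for this location."
        forecast = await make_nws_request(points["properties"]["forecast"])
        if not forecast:
            return "Unable to fetch the detailed forecast."
        periods = forecast["properties"]["periods"][:3]
        return "\n---\n".join(
            f"{p['name']}: {p['temperature']}°{p['temperatureUnit']} - {p['detailedForecast']}"
            for p in periods)

    if __name__ == "__main__":
        mcp.run(transport="stdio")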

Usage

  1. Create a .env file with your OpenAI API key:

     OPENAI_API_KEY=your_api_key_here

  2. Run the chat client, passing it the path to the weather server script (you do not need to start weather.py separately; the client launches it as a subprocess over stdio):

     python client.py weather.py

  3. Interact with the chat interface:
     • Ask general questions to chat with the AI
     • Use weather-related queries to get weather information
     • Example: "What's the weather in California?" or "Are there any alerts in New York?"

Using with Cursor's Agent Mode

This MCP server can also be integrated directly with Cursor's Agent mode (note that MCP tools are available only in Agent mode, not in Cursor's Ask feature). Here's how to set it up:

Adding the MCP Server to Cursor

  1. Open Cursor Settings
  2. Navigate to Features > MCP
  3. Click + Add New MCP Server
  4. Fill out the form:
    • Type: Select stdio
    • Name: "Weather Service" (or any name you prefer)
    • Command: Enter the full command that runs the weather server, using the absolute path to the script:
      python /full/path/to/your/weather.py

Alternative: Project-Specific Configuration

You can also configure the MCP server for your project by creating a .cursor/mcp.json file:

  1. Create the .cursor directory in your project root:

     mkdir .cursor

  2. Create mcp.json with the following content:

     {
       "mcpServers": {
         "weather": {
           "command": "python",
           "args": [
             "/full/path/to/your/weather.py"
           ]
         }
       }
     }

Using the Weather Tools

  1. Open Cursor's Composer (Agent mode)
  2. The Agent will automatically detect when weather information is needed
  3. Example queries:
    • "What's the current weather in San Francisco?"
    • "Are there any weather alerts in California?"
    • "Get me the forecast for New York City"

Important Notes

  • Tools are only available in Cursor's Agent mode (Composer), not in Ask mode
  • By default, Cursor will ask for approval before using MCP tools
  • You may need to click the refresh button in the MCP settings to see newly added tools
  • The server must be running on your local machine (remote servers require SSE transport; see the sketch below)
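
For reference, FastMCP can also serve over SSE instead of stdio when the server needs to be reachable over the network. A minimal variant of the weather server's entry point is sketched below; the host and port come from FastMCP's settings, so treat this as a starting point rather than a drop-in change for the Cursor setup above:

    if __name__ == "__main__":
        # Assumption: expose the same tools over SSE for remote clients
        mcp.run(transport="sse")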

Features

  • Real-time chat interface with OpenAI integration
  • MCP server integration for extensible functionality
  • Weather service with alerts and forecasts
  • Asynchronous operation for better performance
  • Proper error handling and resource cleanup
  • Environment variable configuration for API keys

Contributing

Feel free to submit issues and enhancement requests!
