mcp-server

MCP server for experimenting with LLM tools

This has been created to get an understanding of MCP servers, the protocol, and usage within LLMs. It is not intended for reuse!

Dependencies

  • Install uv
  • Run uv sync to install the project dependencies

Unit tests

  • uv run pytest
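A unit test for one of the local tools might look like the following. This is a hypothetical sketch: it assumes the tools can be exercised as plain Python functions, and the names are illustrative stand-ins rather than the repository's actual test code.

```python
# test_tools.py -- illustrative pytest-style test; `add` here is a
# stand-in for the server's add tool, not the repository's real code.

def add(a: int, b: int) -> int:
    """Stand-in for the server's simple addition tool."""
    return a + b

def test_add():
    assert add(2, 3) == 5
    assert add(-1, 1) == 0
```

Running `uv run pytest` discovers and executes any `test_*` functions like this one.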

Launch the server

uv run mcp dev server.py

(.venv) ➜  mcp-server git:(main) ✗ uv run mcp dev server.py
Starting MCP inspector...
Proxy server listening on port 3000

🔍 MCP Inspector is up and running at http://localhost:5173 🚀

View the tools

Image of MCP Inspector

Available Tools

| Tool | Description | Backend Service | Required Configuration |
| --- | --- | --- | --- |
| add | Simple addition tool | Local computation | None |
| calculator | Evaluates mathematical expressions | Local computation | None |
| calculate_bmi | Calculates Body Mass Index | Local computation | None |
| echo | Returns input text unchanged | Local computation | None |
| long_task | Processes files with progress tracking | Local file system | None |
| duckduckgo_search | Web search using DuckDuckGo | DuckDuckGo HTML endpoint | None |
| wikipedia_search | Searches Wikipedia articles | Wikipedia API | None |
| fetch_weather | Gets current weather by location | OpenWeatherMap API | OPENWEATHER_API_KEY |
| openmeteo_forecast | Gets detailed weather forecasts | Open-Meteo API | None |
| news_search | Searches for recent news articles | NewsAPI | NEWSAPI_KEY |
| tavily_search | AI-powered web search | Tavily API | TAVILY_API_KEY |
| arxiv_search | Searches academic papers | arXiv API | None |
| github_get_file | Retrieves file contents from GitHub | GitHub API | GITHUB_TOKEN |
| github_list_issues | Lists issues in a repository | GitHub API | GITHUB_TOKEN |
| github_create_issue | Creates a new issue in a repository | GitHub API | GITHUB_TOKEN |
| github_list_pull_requests | Lists PRs in a repository | GitHub API | GITHUB_TOKEN |
| github_search_code | Searches code on GitHub | GitHub API | GITHUB_TOKEN |
| github_user_activity | Gets a user's GitHub activity summary | GitHub API | GITHUB_TOKEN |
| create_thumbnail | Creates image thumbnails | Local image processing | None |

Environment Variable Configuration

To use tools that require API keys, add the following to your environment:

# Weather services
export OPENWEATHER_API_KEY="your_openweather_api_key"

# News services
export NEWSAPI_KEY="your_newsapi_key"

# Search services
export TAVILY_API_KEY="your_tavily_api_key"

# GitHub tools
export GITHUB_TOKEN="your_github_personal_access_token"
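A server can fail fast with a clear message when a required key is absent, rather than failing later inside a tool call. A minimal sketch of that pattern (the helper name is illustrative, not part of this repository):

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; tools that depend on it will not work"
        )
    return value

# Example: only needed if the GitHub tools are actually used.
# token = require_env("GITHUB_TOKEN")
```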

Sample Chat Application

The repository includes a sample chat application that demonstrates how to use MCP tools with the Ollama LLM service.

Prerequisites

  • Install Ollama from https://ollama.ai/
  • Pull the granite model: ollama pull granite3.2:latest (or use any other model)
  • Install additional dependencies: uv pip install litellm colorama python-dotenv httpx

Configuration

Create a .env file in the project root with your configuration:

# Ollama configuration
OLLAMA_SERVER=http://localhost:11434
OLLAMA_MODEL=granite3.2:latest  # Change to any model you have pulled

# MCP server endpoint (default is localhost:3000)
MCP_ENDPOINT=localhost:3000

# Logging configuration
LOG_FILE=chat_interactions.log

# API keys for various services
OPENWEATHER_API_KEY=your_api_key_here
NEWSAPI_KEY=your_api_key_here
TAVILY_API_KEY=your_api_key_here
GITHUB_TOKEN=your_token_here
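python-dotenv reads this file at startup and copies its entries into the process environment. Conceptually it does little more than the following simplified stdlib sketch (which ignores quoting, escaping, and multiline values that the real library handles):

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Minimal .env loader: KEY=VALUE lines, '#' comments, no quoting rules."""
    with open(path) as fh:
        for line in fh:
            line = line.split("#", 1)[0].strip()  # drop inline comments
            if "=" in line:
                key, _, value = line.partition("=")
                # setdefault: real environment variables win over .env entries
                os.environ.setdefault(key.strip(), value.strip())
```

Because of the `setdefault`, values already exported in the shell take precedence over the `.env` file, which matches python-dotenv's default behaviour.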

Launch the Chat Application

First, start the MCP server in one terminal:

uv run mcp dev server.py

Then, run the chat application in another terminal:

python run_chat.py

Interact with the LLM, which now has access to all the tools provided by the MCP server.

Features

  • The chat application automatically uses the MCP tools when appropriate
  • All interactions are logged to the file specified in LOG_FILE
  • Tools will be called when the LLM decides they're needed to answer a question
  • Tool parameters are automatically populated based on the LLM's understanding of the query
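The dispatch step behind these features can be sketched as a small lookup table: the LLM emits a tool call as JSON, and the client resolves the name and invokes the matching function with the supplied arguments. The real client routes calls through litellm and the MCP protocol, so the registry below is purely illustrative:

```python
import json

# Illustrative registry mapping tool names to local callables.
TOOLS = {
    "add": lambda a, b: a + b,
    "echo": lambda text: text,
}

def dispatch(tool_call_json: str):
    """Execute a tool call shaped like {"name": ..., "arguments": {...}}."""
    call = json.loads(tool_call_json)
    tool = TOOLS.get(call["name"])
    if tool is None:
        raise KeyError(f"unknown tool: {call['name']}")
    return tool(**call["arguments"])
```

For example, `dispatch('{"name": "add", "arguments": {"a": 2, "b": 3}}')` returns 5; the result would then be fed back to the LLM as the tool's output.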

Caveats

  • It does not yet work with the default model; this is a work in progress.

Repository: https://github.com/planetf1/mcp-server