2025-04-02


mcp-server

MCP server for experimenting with LLM tools

This has been created to get an understanding of MCP servers, the protocol, and usage within LLMs. It is not intended for reuse!

Dependencies

  • Install uv
  • Run uv sync to install dependencies

Unit tests

  • uv run pytest

Launch the server

uv run mcp dev server.py

(.venv) ➜  mcp-server git:(main) ✗ uv run mcp dev server.py
Starting MCP inspector...
Proxy server listening on port 3000

🔍 MCP Inspector is up and running at http://localhost:5173 🚀

View the tools

Image of MCP Inspector

Available Tools

| Tool | Description | Backend Service | Required Configuration |
| --- | --- | --- | --- |
| add | Simple addition tool | Local computation | None |
| calculator | Evaluates mathematical expressions | Local computation | None |
| calculate_bmi | Calculates Body Mass Index | Local computation | None |
| echo | Returns input text unchanged | Local computation | None |
| long_task | Processes files with progress tracking | Local file system | None |
| duckduckgo_search | Web search using DuckDuckGo | DuckDuckGo HTML endpoint | None |
| wikipedia_search | Searches Wikipedia articles | Wikipedia API | None |
| fetch_weather | Gets current weather by location | OpenWeatherMap API | OPENWEATHER_API_KEY |
| openmeteo_forecast | Gets detailed weather forecasts | Open-Meteo API | None |
| news_search | Searches for recent news articles | NewsAPI | NEWSAPI_KEY |
| tavily_search | AI-powered web search | Tavily API | TAVILY_API_KEY |
| arxiv_search | Searches academic papers | arXiv API | None |
| github_get_file | Retrieves file contents from GitHub | GitHub API | GITHUB_TOKEN |
| github_list_issues | Lists issues in a repository | GitHub API | GITHUB_TOKEN |
| github_create_issue | Creates a new issue in a repository | GitHub API | GITHUB_TOKEN |
| github_list_pull_requests | Lists PRs in a repository | GitHub API | GITHUB_TOKEN |
| github_search_code | Searches code on GitHub | GitHub API | GITHUB_TOKEN |
| github_user_activity | Gets a user's GitHub activity summary | GitHub API | GITHUB_TOKEN |
| create_thumbnail | Creates image thumbnails | Local image processing | None |
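The local-computation tools are plain Python functions registered with the server. Below is a minimal sketch of how tools like add and calculate_bmi might be defined, assuming the official Python MCP SDK's FastMCP class; the parameter names are illustrative, not necessarily the repo's actual signatures. If the SDK is unavailable, the functions still run standalone.

```python
try:
    # Register tools with an MCP server (assumes the official Python SDK).
    from mcp.server.fastmcp import FastMCP
    mcp = FastMCP("mcp-server")
    tool = mcp.tool()
except ImportError:
    tool = lambda f: f  # no-op fallback so the functions work standalone

@tool
def add(a: float, b: float) -> float:
    """Simple addition tool."""
    return a + b

@tool
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Calculate Body Mass Index from weight (kg) and height (m)."""
    return weight_kg / (height_m ** 2)
```

Once registered, these appear in the MCP Inspector's tool list with their docstrings as descriptions.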

Environment Variable Configuration

To use tools that require API keys, add the following to your environment:

# Weather services
export OPENWEATHER_API_KEY="your_openweather_api_key"

# News services
export NEWSAPI_KEY="your_newsapi_key"

# Search services
export TAVILY_API_KEY="your_tavily_api_key"

# GitHub tools
export GITHUB_TOKEN="your_github_personal_access_token"
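A tool that depends on one of these keys should fail fast with a clear message when the variable is missing. This is an illustrative sketch, not the repo's actual code:

```python
import os

def require_key(name: str) -> str:
    """Return the named environment variable, or raise a clear error.

    Lets API-backed tools fail fast with an actionable message instead
    of a cryptic HTTP 401 later.
    """
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before using this tool")
    return value
```

For example, fetch_weather would call require_key("OPENWEATHER_API_KEY") before issuing its request.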

Sample Chat Application

The repository includes a sample chat application that demonstrates how to use MCP tools with the Ollama LLM service.

Prerequisites

  • Install Ollama from https://ollama.ai/
  • Pull the granite model: ollama pull granite3.2:latest (or use any other model)
  • Install additional dependencies: uv pip install litellm colorama python-dotenv httpx

Configuration

Create a .env file in the project root with your configuration:

# Ollama configuration
OLLAMA_SERVER=http://localhost:11434
OLLAMA_MODEL=granite3.2:latest  # Change to any model you have pulled

# MCP server endpoint (default is localhost:3000)
MCP_ENDPOINT=localhost:3000

# Logging configuration
LOG_FILE=chat_interactions.log

# API keys for various services
OPENWEATHER_API_KEY=your_api_key_here
NEWSAPI_KEY=your_api_key_here
TAVILY_API_KEY=your_api_key_here
GITHUB_TOKEN=your_token_here
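The chat application presumably loads this file via python-dotenv's load_dotenv() (it appears in the dependency list above); the format itself is just KEY=VALUE lines. A hand-rolled parser, shown only to illustrate the parsing rules:

```python
def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines, skipping blank lines and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env
```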

Launch the Chat Application

First, start the MCP server in one terminal:

uv run mcp dev server.py

Then, run the chat application in another terminal:

python run_chat.py

Interact with the LLM, which now has access to all the tools provided by the MCP server.

Features

  • The chat application automatically uses the MCP tools when appropriate
  • All interactions are logged to the file specified in LOG_FILE
  • Tools will be called when the LLM decides they're needed to answer a question
  • Tool parameters are automatically populated based on the LLM's understanding of the query
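The flow above can be sketched as a dispatch table: the LLM emits a structured tool call, the app routes it to the matching MCP tool, and the result is fed back into the conversation. The names and dict shape below are assumptions for illustration, not the repo's actual code:

```python
# Stand-ins for the real MCP tools; the chat app would call the
# server over the MCP protocol instead of a local dict.
TOOLS = {
    "add": lambda a, b: a + b,
    "echo": lambda text: text,
}

def dispatch(tool_call: dict):
    """Route an LLM-requested tool call to the matching tool."""
    func = TOOLS.get(tool_call["name"])
    if func is None:
        return f"unknown tool: {tool_call['name']}"
    return func(**tool_call["arguments"])
```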

Caveats

  • It does not yet work with the default model; this is a work in progress.
