# mcp-server

MCP server for experimenting with LLM tools
This project was created to build an understanding of MCP servers, the protocol, and their usage with LLMs. It is not intended for reuse!
## Dependencies

- Install `uv`
- Run:

```
uv sync
```
## Unit tests

```
uv run pytest
```
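As a flavour of the pytest style the suite uses, a test for the simple `add` tool might look like the following. The inlined `add` is a stand-in, not the server's actual code:

```python
# Hypothetical test in pytest style; the inlined add()
# stands in for the real tool defined in server.py.
def add(a: int, b: int) -> int:
    """Simple addition tool (stand-in)."""
    return a + b

def test_add() -> None:
    assert add(2, 3) == 5
    assert add(-1, 1) == 0
```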
## Launch the server

```
uv run mcp dev server.py
```

Example output:

```
(.venv) ➜ mcp-server git:(main) ✗ uv run mcp dev server.py
Starting MCP inspector...
Proxy server listening on port 3000
🔍 MCP Inspector is up and running at http://localhost:5173 🚀
```
Open the MCP Inspector at the URL shown above to view the tools.

## Available Tools
| Tool | Description | Backend Service | Required Configuration |
|---|---|---|---|
| add | Simple addition tool | Local computation | None |
| calculator | Evaluates mathematical expressions | Local computation | None |
| calculate_bmi | Calculates Body Mass Index | Local computation | None |
| echo | Returns input text unchanged | Local computation | None |
| long_task | Processes files with progress tracking | Local file system | None |
| duckduckgo_search | Web search using DuckDuckGo | DuckDuckGo HTML endpoint | None |
| wikipedia_search | Searches Wikipedia articles | Wikipedia API | None |
| fetch_weather | Gets current weather by location | OpenWeatherMap API | OPENWEATHER_API_KEY |
| openmeteo_forecast | Gets detailed weather forecasts | Open-Meteo API | None |
| news_search | Searches for recent news articles | NewsAPI | NEWSAPI_KEY |
| tavily_search | AI-powered web search | Tavily API | TAVILY_API_KEY |
| arxiv_search | Searches academic papers | arXiv API | None |
| github_get_file | Retrieves file contents from GitHub | GitHub API | GITHUB_TOKEN |
| github_list_issues | Lists issues in a repository | GitHub API | GITHUB_TOKEN |
| github_create_issue | Creates a new issue in a repository | GitHub API | GITHUB_TOKEN |
| github_list_pull_requests | Lists PRs in a repository | GitHub API | GITHUB_TOKEN |
| github_search_code | Searches code on GitHub | GitHub API | GITHUB_TOKEN |
| github_user_activity | Gets a user's GitHub activity summary | GitHub API | GITHUB_TOKEN |
| create_thumbnail | Creates image thumbnails | Local image processing | None |
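To give a sense of the simpler local-computation tools, here is a minimal sketch of what `add` and `calculate_bmi` might compute. The signatures and units are assumptions, not the server's actual code:

```python
def add(a: float, b: float) -> float:
    """Simple addition tool."""
    return a + b

def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in metres squared."""
    return round(weight_kg / height_m ** 2, 1)
```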
## Environment Variable Configuration

To use tools that require API keys, add the following to your environment:

```shell
# Weather services
export OPENWEATHER_API_KEY="your_openweather_api_key"

# News services
export NEWSAPI_KEY="your_newsapi_key"

# Search services
export TAVILY_API_KEY="your_tavily_api_key"

# GitHub tools
export GITHUB_TOKEN="your_github_personal_access_token"
```
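A tool that needs one of these keys can read it from the environment at call time and fail with a clear message when it is unset. A minimal sketch (the helper name is hypothetical):

```python
import os

def require_key(name: str) -> str:
    """Return the named API key, or raise a clear error if it is unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before using this tool")
    return value
```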
## Sample Chat Application

The repository includes a sample chat application that demonstrates how to use MCP tools with the Ollama LLM service.

### Prerequisites

- Install Ollama from https://ollama.ai/
- Pull the granite model: `ollama pull granite3.2:latest` (or use any other model)
- Install additional dependencies: `uv pip install litellm colorama python-dotenv httpx`
### Configuration

Create a `.env` file in the project root with your configuration:

```
# Ollama configuration
OLLAMA_SERVER=http://localhost:11434
OLLAMA_MODEL=granite3.2:latest  # Change to any model you have pulled

# MCP server endpoint (default is localhost:3000)
MCP_ENDPOINT=localhost:3000

# Logging configuration
LOG_FILE=chat_interactions.log

# API keys for various services
OPENWEATHER_API_KEY=your_api_key_here
NEWSAPI_KEY=your_api_key_here
TAVILY_API_KEY=your_api_key_here
GITHUB_TOKEN=your_token_here
```
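The chat application reads this file with python-dotenv; its effect is roughly the following simplified sketch (not the library's actual implementation):

```python
def parse_env(text: str) -> dict[str, str]:
    """Very small .env-style parser: KEY=VALUE lines, '#' starts a comment."""
    env: dict[str, str] = {}
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()
        if "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env
```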
### Launch the Chat Application

First, start the MCP server in one terminal:

```
uv run mcp dev server.py
```

Then, run the chat application in another terminal:

```
python run_chat.py
```
Interact with the LLM, which now has access to all the tools provided by the MCP server.
## Features
- The chat application automatically uses the MCP tools when appropriate
- All interactions are logged to the file specified in LOG_FILE
- Tools will be called when the LLM decides they're needed to answer a question
- Tool parameters are automatically populated based on the LLM's understanding of the query
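The tool-calling flow described above amounts to a dispatch from the LLM's chosen tool name and JSON arguments to a registered callable. A sketch under assumed names (the registry and function names are illustrative, not the application's actual code):

```python
import json
from typing import Any, Callable

# Illustrative registry; the real server exposes many more tools.
TOOLS: dict[str, Callable[..., Any]] = {
    "add": lambda a, b: a + b,
    "echo": lambda text: text,
}

def dispatch(name: str, arguments: str) -> Any:
    """Call the named tool with the JSON-encoded keyword arguments the LLM chose."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**json.loads(arguments))
```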
## Caveats

- It does not yet work with the default model; this is a work in progress!