mcp-server

MCP server for experimenting with LLM tools

This project was created to build an understanding of MCP servers, the protocol, and how they are used with LLMs. It is not intended for reuse!

Dependencies

  • Install uv
  • Run uv sync

Unit tests

  • uv run pytest

Launch the server

uv run mcp dev server.py

(.venv) ➜  mcp-server git:(main) ✗ uv run mcp dev server.py
Starting MCP inspector...
Proxy server listening on port 3000

🔍 MCP Inspector is up and running at http://localhost:5173 🚀
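
The dev command loads the server object defined in server.py. As a rough illustration (not the actual contents of server.py), a minimal server built with the MCP Python SDK's FastMCP helper looks something like this:

```python
from mcp.server.fastmcp import FastMCP

# `mcp dev server.py` looks for a FastMCP instance in the target file.
mcp = FastMCP("mcp-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Simple addition tool."""
    return a + b

if __name__ == "__main__":
    # Also allows running the server directly: `uv run python server.py`
    mcp.run()
```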

View the tools

Image of MCP Inspector

Available Tools

| Tool | Description | Backend Service | Required Configuration |
| --- | --- | --- | --- |
| add | Simple addition tool | Local computation | None |
| calculator | Evaluates mathematical expressions | Local computation | None |
| calculate_bmi | Calculates Body Mass Index | Local computation | None |
| echo | Returns input text unchanged | Local computation | None |
| long_task | Processes files with progress tracking | Local file system | None |
| duckduckgo_search | Web search using DuckDuckGo | DuckDuckGo HTML endpoint | None |
| wikipedia_search | Searches Wikipedia articles | Wikipedia API | None |
| fetch_weather | Gets current weather by location | OpenWeatherMap API | OPENWEATHER_API_KEY |
| openmeteo_forecast | Gets detailed weather forecasts | Open-Meteo API | None |
| news_search | Searches for recent news articles | NewsAPI | NEWSAPI_KEY |
| tavily_search | AI-powered web search | Tavily API | TAVILY_API_KEY |
| arxiv_search | Searches academic papers | arXiv API | None |
| github_get_file | Retrieves file contents from GitHub | GitHub API | GITHUB_TOKEN |
| github_list_issues | Lists issues in a repository | GitHub API | GITHUB_TOKEN |
| github_create_issue | Creates a new issue in a repository | GitHub API | GITHUB_TOKEN |
| github_list_pull_requests | Lists PRs in a repository | GitHub API | GITHUB_TOKEN |
| github_search_code | Searches code on GitHub | GitHub API | GITHUB_TOKEN |
| github_user_activity | Gets a user's GitHub activity summary | GitHub API | GITHUB_TOKEN |
| create_thumbnail | Creates image thumbnails | Local image processing | None |
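
The remote-service tools presumably follow a similar pattern: a decorated async function that calls the backing service with httpx and returns a text summary. The snippet below is a hypothetical sketch of how a no-configuration tool such as wikipedia_search could be written against the standard Wikipedia search API; the real implementation in server.py may differ.

```python
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-server")

@mcp.tool()
async def wikipedia_search(query: str, limit: int = 5) -> str:
    """Search Wikipedia articles and return titles with snippets."""
    params = {
        "action": "query",
        "list": "search",
        "srsearch": query,
        "srlimit": limit,
        "format": "json",
    }
    async with httpx.AsyncClient() as client:
        resp = await client.get("https://en.wikipedia.org/w/api.php", params=params)
        resp.raise_for_status()
    hits = resp.json()["query"]["search"]
    return "\n".join(f"{hit['title']}: {hit['snippet']}" for hit in hits)
```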

Environment Variable Configuration

To use tools that require API keys, add the following to your environment:

# Weather services
export OPENWEATHER_API_KEY="your_openweather_api_key"

# News services
export NEWSAPI_KEY="your_newsapi_key"

# Search services
export TAVILY_API_KEY="your_tavily_api_key"

# GitHub tools
export GITHUB_TOKEN="your_github_personal_access_token"
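
A key-protected tool can read these variables at call time and fail gracefully when they are missing. Here is a hedged sketch of how fetch_weather might use OPENWEATHER_API_KEY (assuming the same FastMCP instance `mcp` as in the earlier sketch and the standard OpenWeatherMap current-weather endpoint; the repository's implementation may differ):

```python
import os
import httpx

@mcp.tool()
async def fetch_weather(city: str) -> str:
    """Get the current weather for a city from OpenWeatherMap."""
    api_key = os.environ.get("OPENWEATHER_API_KEY")
    if not api_key:
        return "OPENWEATHER_API_KEY is not set; cannot fetch weather."
    params = {"q": city, "appid": api_key, "units": "metric"}
    async with httpx.AsyncClient() as client:
        resp = await client.get(
            "https://api.openweathermap.org/data/2.5/weather", params=params
        )
        resp.raise_for_status()
    data = resp.json()
    return f"{city}: {data['weather'][0]['description']}, {data['main']['temp']} C"
```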

Sample Chat Application

The repository includes a sample chat application that demonstrates how to use MCP tools with the Ollama LLM service.

Prerequisites

  • Install Ollama from https://ollama.ai/
  • Pull the granite model: ollama pull granite3.2:latest (or use any other model)
  • Install additional dependencies: uv pip install litellm colorama python-dotenv httpx

Configuration

Create a .env file in the project root with your configuration:

# Ollama configuration
OLLAMA_SERVER=http://localhost:11434
OLLAMA_MODEL=granite3.2:latest  # Change to any model you have pulled

# MCP server endpoint (default is localhost:3000)
MCP_ENDPOINT=localhost:3000

# Logging configuration
LOG_FILE=chat_interactions.log

# API keys for various services
OPENWEATHER_API_KEY=your_api_key_here
NEWSAPI_KEY=your_api_key_here
TAVILY_API_KEY=your_api_key_here
GITHUB_TOKEN=your_token_here
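
The chat application reads this file at startup (python-dotenv is among the extra dependencies) and routes requests to Ollama through litellm. A minimal sketch of that wiring, assuming litellm's standard ollama/ model prefix; run_chat.py itself may structure this differently:

```python
import os
from dotenv import load_dotenv
from litellm import completion

load_dotenv()  # pulls OLLAMA_SERVER, OLLAMA_MODEL, API keys, etc. into the environment

response = completion(
    model=f"ollama/{os.getenv('OLLAMA_MODEL', 'granite3.2:latest')}",
    api_base=os.getenv("OLLAMA_SERVER", "http://localhost:11434"),
    messages=[{"role": "user", "content": "What is the weather in London today?"}],
)
print(response.choices[0].message.content)
```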

Launch the Chat Application

First, start the MCP server in one terminal:

uv run mcp dev server.py

Then, run the chat application in another terminal:

python run_chat.py

Interact with the LLM, which now has access to all the tools provided by the MCP server.
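
Under the hood, the chat application has to discover the server's tools and forward any tool calls the LLM requests. Below is a sketch of that round trip using the MCP Python SDK's client session; the SSE URL is an assumption based on the MCP_ENDPOINT setting, and run_chat.py may use a different transport or path:

```python
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def demo() -> None:
    # Assumed endpoint; adjust host and path to match MCP_ENDPOINT.
    async with sse_client("http://localhost:3000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])
            # Forward a tool call the LLM might have requested.
            result = await session.call_tool("add", {"a": 2, "b": 3})
            print(result.content)

asyncio.run(demo())
```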

Features

  • The chat application automatically uses the MCP tools when appropriate
  • All interactions are logged to the file specified in LOG_FILE
  • Tools will be called when the LLM decides they're needed to answer a question
  • Tool parameters are automatically populated based on the LLM's understanding of the query

Caveats

  • It doesn't yet work with the default model; this is a work in progress!
