Formula One MCP Server

A Model Context Protocol (MCP) server that provides Formula One racing data. This package exposes various tools for querying F1 data including event schedules, driver information, telemetry data, and race results.

Features

  • Event Schedule: Access the complete F1 race calendar for any season
  • Event Information: Detailed data about specific Grand Prix events
  • Session Results: Comprehensive results from races, qualifying sessions, sprints, and practice sessions
  • Driver Information: Access driver details for specific sessions
  • Performance Analysis: Analyze a driver's performance with lap time statistics
  • Driver Comparison: Compare multiple drivers' performances in the same session
  • Telemetry Data: Access detailed telemetry for specific laps
  • Championship Standings: View driver and constructor standings for any season

Installation

In a uv-managed Python project, add the package to your dependencies with:

uv add f1-mcp-server

Alternatively, for projects that manage dependencies with pip:

pip install f1-mcp-server

To run the server inside your project:

uv run f1-mcp-server

Or to run it globally in an isolated environment:

uvx f1-mcp-server

To install directly from the source:

git clone https://github.com/Machine-To-Machine/f1-mcp-server.git
cd f1-mcp-server
pip install -e .
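
After installation, you can confirm that the console script is on your PATH. The CLI is built with click, so the help output should list the available options (such as --transport and --port):

f1-mcp-server --help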

Usage

Command Line

The server can be run in two modes:

Standard I/O mode (default):

uvx f1-mcp-server

SSE transport mode (for web applications):

uvx f1-mcp-server --transport sse --port 8000
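
In SSE mode the server is reachable over HTTP. As a minimal sketch of connecting with the official mcp Python SDK, assuming the SSE endpoint is served at /sse on the port given above (adjust the URL if your setup differs):

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def check_connection():
    # Assumes the server was started with: uvx f1-mcp-server --transport sse --port 8000
    async with sse_client("http://localhost:8000/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # List the tools the server advertises.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(check_connection())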

Python API

from f1_mcp_server import main

# Run the server with default settings
main()

# Or with SSE transport settings
main(port=9000, transport="sse")

API Documentation

The server exposes the following tools via MCP:

  • get_event_schedule: Get the Formula One race calendar for a specific season
  • get_event_info: Get detailed information about a specific Formula One Grand Prix
  • get_session_results: Get results for a specific Formula One session
  • get_driver_info: Get information about a specific Formula One driver
  • analyze_driver_performance: Analyze a driver's performance in a Formula One session
  • compare_drivers: Compare performance between multiple Formula One drivers
  • get_telemetry: Get telemetry data for a specific Formula One lap
  • get_championship_standings: Get Formula One championship standings

See the FastF1 documentation for detailed information about the underlying data.
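
These tools can be invoked from any MCP client. Below is a minimal sketch using the official mcp Python SDK over stdio; the tool name comes from the list above, but the argument names (here, a hypothetical "year" parameter) are assumptions, so check each tool's input schema via list_tools first:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def fetch_schedule():
    # Launch the server as a subprocess over stdio, the same way an MCP host would.
    server = StdioServerParameters(command="uvx", args=["f1-mcp-server"])
    async with stdio_client(server) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # Inspect the declared input schemas before calling a tool.
            tools = await session.list_tools()
            print({t.name: t.inputSchema for t in tools.tools})
            # Hypothetical arguments -- replace with the schema's actual field names.
            result = await session.call_tool("get_event_schedule", {"year": 2024})
            print(result.content)

asyncio.run(fetch_schedule())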

Dependencies

  • anyio (>=4.9.0)
  • click (>=8.1.8)
  • fastf1 (>=3.5.3)
  • mcp (>=1.6.0)
  • numpy (>=2.2.4)
  • pandas (>=2.2.3)
  • uvicorn (>=0.34.0)

Development

Setup Development Environment

git clone https://github.com/Machine-To-Machine/f1-mcp-server.git
cd f1-mcp-server
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
uv pip install -e ".[dev]"

Code Quality

# Run linting
uv run ruff check .

# Run formatting check
uv run ruff format --check .

# Run security checks
uv run bandit -r src/

Contribution Guidelines

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature-name
  3. Commit your changes: git commit -am 'Add some feature'
  4. Push to the branch: git push origin feature-name
  5. Submit a pull request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Authors

  • Machine To Machine

Acknowledgements

This project leverages FastF1, an excellent Python package for accessing Formula 1 data. We are grateful to its maintainers and contributors.

This project was inspired by rakeshgangwar/f1-mcp-server, which was written in TypeScript. The f1_data.py module was largely adapted from their source code.
