Formula One MCP Server


A Model Context Protocol (MCP) server that provides Formula One racing data. This package exposes various tools for querying F1 data including event schedules, driver information, telemetry data, and race results.

Features

  • Event Schedule: Access the complete F1 race calendar for any season
  • Event Information: Detailed data about specific Grand Prix events
  • Session Results: Comprehensive results from races, qualifying sessions, sprints, and practice sessions
  • Driver Information: Access driver details for specific sessions
  • Performance Analysis: Analyze a driver's performance with lap time statistics
  • Driver Comparison: Compare multiple drivers' performances in the same session
  • Telemetry Data: Access detailed telemetry for specific laps
  • Championship Standings: View driver and constructor standings for any season

Installation

In a uv-managed Python project, add it to your dependencies with:

uv add f1-mcp-server

Alternatively, for projects that manage dependencies with pip:

pip install f1-mcp-server

To run the server inside your project:

uv run f1-mcp-server

Or, to run it globally in an isolated environment:

uvx f1-mcp-server

To install directly from the source:

git clone https://github.com/Machine-To-Machine/f1-mcp-server.git
cd f1-mcp-server
pip install -e .

Usage

Command Line

The server can be run in two modes:

Standard I/O mode (default):

uvx f1-mcp-server

SSE transport mode (for web applications):

uvx f1-mcp-server --transport sse --port 8000

Python API

from f1_mcp_server import main

# Run the server with default settings
main()

# Or with SSE transport settings
main(port=9000, transport="sse")
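When the server runs with SSE transport, a client can connect to it over HTTP. The snippet below is a minimal sketch using the official mcp Python SDK's SSE client; it assumes the server was started with --transport sse --port 8000 and serves the SDK's default /sse endpoint:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def list_f1_tools():
    # Assumes the server is already running: uvx f1-mcp-server --transport sse --port 8000
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(list_f1_tools())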

API Documentation

The server exposes the following tools via MCP:

  • get_event_schedule: Get the Formula One race calendar for a specific season
  • get_event_info: Get detailed information about a specific Formula One Grand Prix
  • get_session_results: Get results for a specific Formula One session
  • get_driver_info: Get information about a specific Formula One driver
  • analyze_driver_performance: Analyze a driver's performance in a Formula One session
  • compare_drivers: Compare performance between multiple Formula One drivers
  • get_telemetry: Get telemetry data for a specific Formula One lap
  • get_championship_standings: Get Formula One championship standings

See the FastF1 documentation for detailed information about the underlying data.
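As an example, a stdio client can invoke one of these tools directly. This is a minimal sketch using the official mcp Python SDK; the "year" argument to get_event_schedule is a hypothetical parameter name, so inspect the tool's input schema via list_tools() to confirm the exact fields:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server as a stdio subprocess via uvx.
server = StdioServerParameters(command="uvx", args=["f1-mcp-server"])

async def fetch_schedule():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "year" is a hypothetical argument name -- check the tool's
            # input schema with list_tools() before relying on it.
            result = await session.call_tool("get_event_schedule", {"year": 2024})
            for item in result.content:
                print(item)

asyncio.run(fetch_schedule())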

Dependencies

  • anyio (>=4.9.0)
  • click (>=8.1.8)
  • fastf1 (>=3.5.3)
  • mcp (>=1.6.0)
  • numpy (>=2.2.4)
  • pandas (>=2.2.3)
  • uvicorn (>=0.34.0)

Development

Setup Development Environment

git clone https://github.com/Machine-To-Machine/f1-mcp-server.git
cd f1-mcp-server
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
uv pip install -e ".[dev]"

Code Quality

# Run linting
uv run ruff check .

# Run formatting check
uv run ruff format --check .

# Run security checks
uv run bandit -r src/

Contribution Guidelines

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature-name
  3. Commit your changes: git commit -am 'Add some feature'
  4. Push to the branch: git push origin feature-name
  5. Submit a pull request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Authors

  • Machine To Machine

Acknowledgements

This project leverages FastF1, an excellent Python package for accessing Formula 1 data. We are grateful to its maintainers and contributors.

This project was inspired by rakeshgangwar/f1-mcp-server, which is written in TypeScript. The f1_data.py module was largely adapted from that project's source code.
