OpenAPI to Model Context Protocol (MCP)


The OpenAPI-MCP proxy translates OpenAPI specs into MCP tools, enabling AI agents to access external APIs without custom wrappers!

OpenAPI-MCP

Bridge the gap between AI agents and external APIs

The OpenAPI to Model Context Protocol (MCP) proxy server bridges the gap between AI agents and external APIs by dynamically translating OpenAPI specifications into standardized MCP tools, resources, and prompts. This simplifies integration by eliminating the need for custom API wrappers.


If you find it useful, please give it a ⭐ on GitHub!


Key Features

  • FastMCP Transport: Optimized for stdio, working out-of-the-box with popular LLM orchestrators.
  • OpenAPI Integration: Parses and registers OpenAPI operations as callable tools.
  • Resource Registration: Automatically converts OpenAPI component schemas into resource objects with defined URIs.
  • Prompt Generation: Generates contextual prompts based on API operations to guide LLMs in using the API.
  • OAuth2 Support: Handles machine authentication via Client Credentials flow.
  • JSON-RPC 2.0 Support: Fully compliant request/response structure.
  • Auto Metadata: Derives tool names, summaries, and schemas from the OpenAPI specification.
  • Sanitized Tool Names: Ensures compatibility with MCP name constraints.
  • Flexible Parameter Parsing: Supports query strings (with a leading "?") and multiple JSON variations (including keys with dots and numeric values).
  • Enhanced Parameter Handling: Automatically converts parameters to their declared data types (see the sketch after this list).
  • Extended Tool Metadata: Includes detailed parameter information and response schemas.
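
The sketch below illustrates the kind of parameter handling described in the last three bullets. The function names (`normalize_arguments`, `coerce_types`) are hypothetical and only demonstrate the idea; they are not the proxy's actual API.

```python
# Illustrative sketch, not the proxy's actual implementation.
import json
from urllib.parse import parse_qsl

def normalize_arguments(raw: str) -> dict:
    """Accept either a query string ("?status=available&limit=10") or a JSON object string."""
    if raw.startswith("?"):
        return dict(parse_qsl(raw[1:]))
    return json.loads(raw)

def coerce_types(params: dict, schema: dict) -> dict:
    """Convert string values to the types declared in an OpenAPI parameter schema."""
    casts = {
        "integer": int,
        "number": float,
        "boolean": lambda v: str(v).lower() == "true",
    }
    return {
        name: casts.get(schema.get(name, {}).get("type", "string"), str)(value)
        for name, value in params.items()
    }

args = normalize_arguments("?status=available&limit=10")
print(coerce_types(args, {"limit": {"type": "integer"}}))
# -> {'status': 'available', 'limit': 10}
```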

Quick Start

Installation

```bash
git clone https://github.com/gujord/OpenAPI-MCP.git
cd OpenAPI-MCP
python3 -m venv venv            # the orchestrator config below points at venv/bin/python
source venv/bin/activate
pip install -r requirements.txt
```

LLM Orchestrator Configuration

For Claude Desktop, Cursor, and Windsurf, use the snippet below and adapt the paths accordingly:

```json
{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}
```

Apply this configuration to the following files:

  • Cursor: ~/.cursor/mcp.json
  • Windsurf: ~/.codeium/windsurf/mcp_config.json
  • Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json

Replace full_path_to_openapi_mcp with your actual installation path.

Environment Configuration

| Variable | Description | Required | Default |
|----------|-------------|----------|---------|
| `OPENAPI_URL` | URL to the OpenAPI specification | Yes | - |
| `SERVER_NAME` | MCP server name | No | `openapi_proxy_server` |
| `OAUTH_CLIENT_ID` | OAuth client ID | No | - |
| `OAUTH_CLIENT_SECRET` | OAuth client secret | No | - |
| `OAUTH_TOKEN_URL` | OAuth token endpoint URL | No | - |
| `OAUTH_SCOPE` | OAuth scope | No | `api` |
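
When the OAuth variables are set, the proxy authenticates machine-to-machine with the standard Client Credentials grant. The snippet below is a minimal sketch of that flow using httpx (already referenced by the project); it is illustrative only, and the proxy's own implementation may differ.

```python
# Minimal sketch of the OAuth2 Client Credentials flow (illustrative, not the proxy's code).
import os
import httpx

def fetch_access_token() -> str:
    """Exchange client credentials for a bearer token, using the OAUTH_* variables above."""
    response = httpx.post(
        os.environ["OAUTH_TOKEN_URL"],
        data={
            "grant_type": "client_credentials",
            "client_id": os.environ["OAUTH_CLIENT_ID"],
            "client_secret": os.environ["OAUTH_CLIENT_SECRET"],
            "scope": os.environ.get("OAUTH_SCOPE", "api"),
        },
    )
    response.raise_for_status()
    return response.json()["access_token"]
```

The resulting token is then attached to outgoing API requests, typically as an `Authorization: Bearer` header.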

How It Works

  1. Parses OpenAPI Spec: Loads the OpenAPI specification over HTTP with httpx, parsing it with PyYAML when it is served as YAML rather than JSON.
  2. Registers Operations: Extracts API operations and generates MCP-compatible tools with proper input and response schemas.
  3. Resource Registration: Automatically converts OpenAPI component schemas into resource objects with assigned URIs (e.g., /resource/{name}).
  4. Prompt Generation: Creates contextual prompts based on API operations to assist LLMs in understanding API usage.
  5. Authentication: Supports OAuth2 authentication via the Client Credentials flow.
  6. Parameter Handling: Converts parameters to required data types and supports flexible query string and JSON formats.
  7. JSON-RPC 2.0 Compliance: Ensures standard communication protocols for tool interactions.

```mermaid
sequenceDiagram
    participant LLM as LLM (Claude/GPT)
    participant MCP as OpenAPI-MCP Proxy
    participant API as External API

    Note over LLM, API: Communication Process

    LLM->>MCP: 1. Initialize (initialize)
    MCP-->>LLM: Metadata, tools, resources, and prompts

    LLM->>MCP: 2. Request tools (tools_list)
    MCP-->>LLM: Detailed list of tools, resources, and prompts

    LLM->>MCP: 3. Call tool (tools_call)

    alt With OAuth2
        MCP->>API: Request OAuth2 token
        API-->>MCP: Access Token
    end

    MCP->>API: 4. Execute API call with proper formatting
    API-->>MCP: 5. API response (JSON)

    alt Type Conversion
        MCP->>MCP: 6. Convert parameters to correct data types
    end

    MCP-->>LLM: 7. Formatted response from API

    alt Dry Run Mode
        LLM->>MCP: Call with dry_run=true
        MCP-->>LLM: Display request information without executing call
    end
```
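
To make steps 1 and 2 concrete, here is a condensed, illustrative sketch of loading a spec and deriving sanitized tool names from its operations. The helper names are hypothetical, and the actual server code may be organized differently.

```python
# Condensed sketch of spec loading and tool-name derivation (hypothetical helpers).
import json
import re

import httpx
import yaml  # PyYAML, used when the spec is served as YAML

def load_spec(url: str) -> dict:
    """Fetch the OpenAPI document and parse it as JSON or YAML."""
    text = httpx.get(url).text
    try:
        return json.loads(text)
    except ValueError:
        return yaml.safe_load(text)

def sanitize_name(raw: str) -> str:
    """Reduce an operationId (or method + path) to MCP-safe characters."""
    return re.sub(r"[^a-zA-Z0-9_]", "_", raw).strip("_").lower()

def iter_operations(spec: dict):
    """Yield (tool_name, method, path, operation) for every HTTP operation in the spec."""
    for path, methods in spec.get("paths", {}).items():
        for method, operation in methods.items():
            if method.lower() not in {"get", "post", "put", "patch", "delete"}:
                continue
            raw_name = operation.get("operationId") or f"{method}_{path}"
            yield sanitize_name(raw_name), method.upper(), path, operation

if __name__ == "__main__":
    spec = load_spec("https://petstore3.swagger.io/api/v3/openapi.json")
    for tool_name, method, path, _ in iter_operations(spec):
        print(f"{tool_name}: {method} {path}")
```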

Resources & Prompts

In addition to tools, the proxy server now automatically registers:

  • Resources: Derived from OpenAPI component schemas, resource objects are registered with defined URIs (e.g., /resource/{name}) for structured data handling.
  • Prompts: Contextual prompts are generated based on API operations to provide usage guidance to LLMs, enhancing their understanding of available endpoints.

This extended metadata improves integration by providing comprehensive API context.
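
As a rough illustration of these two mechanisms, the sketch below maps component schemas to /resource/{name} URIs and derives a usage prompt from an operation summary. Helper names are hypothetical; the actual registration logic lives in the server.

```python
# Illustrative sketch of resource and prompt derivation (hypothetical helpers).
def register_resources(spec: dict) -> dict:
    """Map each OpenAPI component schema to a /resource/{name} URI."""
    schemas = spec.get("components", {}).get("schemas", {})
    return {f"/resource/{name}": schema for name, schema in schemas.items()}

def build_prompt(tool_name: str, operation: dict) -> str:
    """Produce a short usage hint for an operation, based on its summary."""
    summary = operation.get("summary", "No summary provided.")
    return f"Use the '{tool_name}' tool to: {summary}"

# Example with a tiny inline spec fragment:
fragment = {"components": {"schemas": {"Pet": {"type": "object"}}}}
print(register_resources(fragment))  # {'/resource/Pet': {'type': 'object'}}
print(build_prompt("get_pet_by_id", {"summary": "Find pet by ID"}))
```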


Contributing

  • Fork this repository.
  • Create a new branch.
  • Submit a pull request with a clear description of your changes.

License

MIT License

If you find it useful, please give it a ⭐ on GitHub!
