OpenAPI to Model Context Protocol (MCP)
The OpenAPI-MCP proxy translates OpenAPI specs into MCP tools, enabling AI agents to access external APIs without custom wrappers!
Bridge the gap between AI agents and external APIs
The OpenAPI to Model Context Protocol (MCP) proxy server bridges the gap between AI agents and external APIs by dynamically translating OpenAPI specifications into standardized MCP tools, resources, and prompts. This simplifies integration by eliminating the need for custom API wrappers.
- Repository: https://github.com/gujord/OpenAPI-MCP
If you find it useful, please give it a ⭐ on GitHub!
Key Features
- FastMCP Transport: Optimized for stdio, working out-of-the-box with popular LLM orchestrators.
- OpenAPI Integration: Parses and registers OpenAPI operations as callable tools.
- Resource Registration: Automatically converts OpenAPI component schemas into resource objects with defined URIs.
- Prompt Generation: Generates contextual prompts based on API operations to guide LLMs in using the API.
- OAuth2 Support: Handles machine authentication via Client Credentials flow.
- JSON-RPC 2.0 Support: Fully compliant request/response structure.
- Auto Metadata: Derives tool names, summaries, and schemas from the OpenAPI specification.
- Sanitized Tool Names: Ensures compatibility with MCP name constraints.
- Flexible Parameter Parsing: Supports query strings (with a leading "?") and multiple JSON variations (including keys with dots and numeric values); see the example after this list.
- Enhanced Parameter Handling: Automatically converts parameters to the correct data types.
- Extended Tool Metadata: Includes detailed parameter information and response schemas.
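As an illustration of the flexible parameter parsing, the two argument forms below express the same Petstore findPetsByStatus query (the operation and its status parameter come from the Swagger Petstore spec; the exact argument envelope the proxy expects around a raw query string is a version-dependent detail not shown here):

"?status=available"

{ "status": "available" }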
Quick Start
Installation
git clone https://github.com/gujord/OpenAPI-MCP.git
cd OpenAPI-MCP
pip install -r requirements.txt
LLM Orchestrator Configuration
For Claude Desktop, Cursor, and Windsurf, use the snippet below and adapt the paths accordingly:
{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}
Apply this configuration to the following files:
- Cursor: ~/.cursor/mcp.json
- Windsurf: ~/.codeium/windsurf/mcp_config.json
- Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json
Replace full_path_to_openapi_mcp with your actual installation path.
Environment Configuration
Variable | Description | Required | Default
---|---|---|---
OPENAPI_URL | URL to the OpenAPI specification | Yes | -
SERVER_NAME | MCP server name | No | openapi_proxy_server
OAUTH_CLIENT_ID | OAuth client ID | No | -
OAUTH_CLIENT_SECRET | OAuth client secret | No | -
OAUTH_TOKEN_URL | OAuth token endpoint URL | No | -
OAUTH_SCOPE | OAuth scope | No | api
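As a sketch of how these variables are supplied in practice, the env block of the orchestrator configuration above can be extended with the optional OAuth2 settings; the client ID, client secret, and token URL below are placeholders:

"env": {
  "SERVER_NAME": "petstore3",
  "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json",
  "OAUTH_CLIENT_ID": "your-client-id",
  "OAUTH_CLIENT_SECRET": "your-client-secret",
  "OAUTH_TOKEN_URL": "https://auth.example.com/oauth/token",
  "OAUTH_SCOPE": "api"
}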
How It Works
- Parses OpenAPI Spec: Loads the OpenAPI specification using httpx and PyYAML if needed.
- Registers Operations: Extracts API operations and generates MCP-compatible tools with proper input and response schemas.
- Resource Registration: Automatically converts OpenAPI component schemas into resource objects with assigned URIs (e.g., /resource/{name}).
- Prompt Generation: Creates contextual prompts based on API operations to assist LLMs in understanding API usage.
- Authentication: Supports OAuth2 authentication via the Client Credentials flow.
- Parameter Handling: Converts parameters to required data types and supports flexible query string and JSON formats.
- JSON-RPC 2.0 Compliance: Ensures standard communication protocols for tool interactions (see the example request after this list).
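As a sketch of the JSON-RPC 2.0 structure, a tools_call request for a Petstore operation might look like the following. The tools_call method name is taken from the flow described here; the params layout (name plus arguments) and the findPetsByStatus tool name are assumptions based on the Petstore spec and common MCP conventions:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools_call",
  "params": {
    "name": "findPetsByStatus",
    "arguments": { "status": "available" }
  }
}

The proxy answers with a JSON-RPC response whose result field carries the formatted API response.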
sequenceDiagram
    participant LLM as LLM (Claude/GPT)
    participant MCP as OpenAPI-MCP Proxy
    participant API as External API
    Note over LLM, API: Communication Process
    LLM->>MCP: 1. Initialize (initialize)
    MCP-->>LLM: Metadata, tools, resources, and prompts
    LLM->>MCP: 2. Request tools (tools_list)
    MCP-->>LLM: Detailed list of tools, resources, and prompts
    LLM->>MCP: 3. Call tool (tools_call)
    alt With OAuth2
        MCP->>API: Request OAuth2 token
        API-->>MCP: Access Token
    end
    MCP->>API: 4. Execute API call with proper formatting
    API-->>MCP: 5. API response (JSON)
    alt Type Conversion
        MCP->>MCP: 6. Convert parameters to correct data types
    end
    MCP-->>LLM: 7. Formatted response from API
    alt Dry Run Mode
        LLM->>MCP: Call with dry_run=true
        MCP-->>LLM: Display request information without executing call
    end
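The dry-run branch in the diagram can be exercised by adding a dry_run flag to a tool call; where exactly the flag lives in the payload is an assumption here, shown inside arguments for illustration:

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools_call",
  "params": {
    "name": "findPetsByStatus",
    "arguments": { "status": "available", "dry_run": true }
  }
}

Instead of executing the external API call, the proxy then returns the request information it would have sent.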
Resources & Prompts
In addition to tools, the proxy server now automatically registers:
- Resources: Derived from OpenAPI component schemas, resource objects are registered with defined URIs (e.g., /resource/{name}) for structured data handling.
- Prompts: Contextual prompts are generated based on API operations to provide usage guidance to LLMs, enhancing their understanding of available endpoints.
This extended metadata improves integration by providing comprehensive API context.
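For the Petstore example used throughout, component schemas such as Pet, Order, and User would, assuming schema names map directly into the /resource/{name} pattern, surface as resources like:

/resource/Pet
/resource/Order
/resource/User

The exact set of registered resources depends on the loaded specification.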
Contributing
- Fork this repository.
- Create a new branch.
- Submit a pull request with a clear description of your changes.
License
If you find it useful, please give it a ⭐ on GitHub!