
MCP LLM Bridge
A bridge connecting Model Context Protocol (MCP) servers to OpenAI-compatible LLMs. Primary support for OpenAI API, with additional compatibility for local endpoints that implement the OpenAI API specification.
The implementation provides a bidirectional protocol translation layer between MCP and OpenAI's function-calling interface. It converts MCP tool specifications into OpenAI function schemas and handles the mapping of function invocations back to MCP tool executions. This enables any OpenAI-compatible language model to leverage MCP-compliant tools through a standardized interface, whether using cloud-based models or local implementations like Ollama.
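To make the translation concrete, here is a minimal sketch (illustrative only, not the bridge's actual implementation) of the MCP-to-OpenAI direction: an MCP tool advertises a name, a description, and a JSON Schema for its input, which maps almost directly onto an OpenAI function definition.

```python
# Minimal sketch of the MCP -> OpenAI direction (illustrative only,
# not the bridge's actual code).

def mcp_tool_to_openai(tool: dict) -> dict:
    """Wrap an MCP tool spec as an entry for OpenAI's `tools` parameter."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP publishes the input contract as JSON Schema, which is
            # exactly what OpenAI function calling expects in `parameters`.
            "parameters": tool.get("inputSchema", {"type": "object", "properties": {}}),
        },
    }

# Example: a SQLite server's query tool (the exact shape is illustrative).
sqlite_query_tool = {
    "name": "query",
    "description": "Run a read-only SQL query against the database",
    "inputSchema": {
        "type": "object",
        "properties": {"sql": {"type": "string"}},
        "required": ["sql"],
    },
}
print(mcp_tool_to_openai(sqlite_query_tool))
```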
Read more about MCP by Anthropic here: https://www.anthropic.com/news/model-context-protocol
Quick Start
```bash
# Install
curl -LsSf https://astral.sh/uv/install.sh | sh
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate
uv pip install -e .

# Create test database
python -m mcp_llm_bridge.create_test_db
```
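To confirm the test database was created, a quick check with Python's built-in sqlite3 module works (this assumes the script wrote test.db to the working directory):

```python
# Optional sanity check: list the tables created by create_test_db.
# Assumes test.db was written to the current working directory.
import sqlite3

with sqlite3.connect("test.db") as conn:
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
print("tables:", [name for (name,) in tables])
```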
Configuration
OpenAI (Primary)
Create `.env`:
```
OPENAI_API_KEY=your_key
OPENAI_MODEL=gpt-4o  # or any other OpenAI model that supports tools
```
Note: reactivate the environment if needed to use the keys in `.env`: `source .venv/bin/activate`
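If you drive the bridge from your own script rather than the shell, the same variables can be loaded with python-dotenv (a common choice, not a dependency this README declares):

```python
# Hypothetical helper: load .env from your own script. python-dotenv is
# not listed as a requirement here; install it separately if you use this.
import os
from dotenv import load_dotenv

load_dotenv()  # reads OPENAI_API_KEY and OPENAI_MODEL from .env
print(os.getenv("OPENAI_MODEL", "gpt-4o"))
```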
Then configure the bridge in `src/mcp_llm_bridge/main.py`:
```python
# Spawn the SQLite MCP server as a stdio subprocess and point the bridge
# at the OpenAI API (base_url=None means the default OpenAI endpoint).
config = BridgeConfig(
    mcp_server_params=StdioServerParameters(
        command="uvx",
        args=["mcp-server-sqlite", "--db-path", "test.db"],
        env=None
    ),
    llm_config=LLMConfig(
        api_key=os.getenv("OPENAI_API_KEY"),
        model=os.getenv("OPENAI_MODEL", "gpt-4o"),
        base_url=None
    )
)
```
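Only the `mcp_server_params` block ties the bridge to the SQLite server; pointing it at a different stdio MCP server is a drop-in swap. For example (illustrative; the package name below is an assumption, substitute any MCP-compliant server):

```python
# Illustrative variation: any stdio MCP server can be dropped in here.
# "mcp-server-fetch" is an example package name; substitute your own.
mcp_server_params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-fetch"],
    env=None,
)
```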
Additional Endpoint Support
The bridge also works with any endpoint implementing the OpenAI API specification:
Ollama
```python
llm_config=LLMConfig(
    api_key="not-needed",
    model="mistral-nemo:12b-instruct-2407-q8_0",
    base_url="http://localhost:11434/v1"
)
```
Note: After testing various models, including `llama3.2:3b-instruct-fp16`, I found that `mistral-nemo:12b-instruct-2407-q8_0` handles complex queries more effectively.
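A quick way to verify that Ollama's OpenAI-compatible endpoint is up before starting the bridge (assumes Ollama is running on its default port):

```python
# Sanity check: Ollama's OpenAI-compatible API should list local models.
# Assumes Ollama is running on its default port (11434).
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/v1/models") as resp:
    print(json.dumps(json.load(resp), indent=2))
```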
LM Studio
```python
llm_config=LLMConfig(
    api_key="not-needed",
    model="local-model",
    base_url="http://localhost:1234/v1"
)
```
I didn't test this, but it should work.
Usage
```bash
python -m mcp_llm_bridge.main

# Try: "What are the most expensive products in the database?"
# Exit with 'quit' or Ctrl+C
```
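Under the hood, a query like the one above takes two model passes: the model first answers with a tool call, the bridge executes it against the MCP server, and a second pass turns the tool output into the final answer. A conceptual sketch of one such turn (not the repo's implementation; assumes an `openai` client and a connected MCP `ClientSession`):

```python
# Conceptual sketch of one bridge turn; not the repo's implementation.
# Assumes: `client` is an openai.OpenAI instance, `tools` are the
# converted schemas, `session` is a connected mcp.ClientSession.
import json

async def one_turn(client, session, tools, messages, model="gpt-4o"):
    resp = client.chat.completions.create(model=model, messages=messages, tools=tools)
    msg = resp.choices[0].message
    if msg.tool_calls:
        messages.append(msg)
        for call in msg.tool_calls:
            # Map the OpenAI function invocation back onto an MCP tool execution.
            result = await session.call_tool(
                call.function.name, json.loads(call.function.arguments)
            )
            messages.append(
                {"role": "tool", "tool_call_id": call.id, "content": str(result.content)}
            )
        # Second pass: let the model turn the tool output into prose.
        resp = client.chat.completions.create(model=model, messages=messages, tools=tools)
        msg = resp.choices[0].message
    return msg.content
```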
Running Tests
Install the package with test dependencies:
```bash
uv pip install -e ".[test]"
```
Then run the tests:
```bash
python -m pytest -v tests/
```
License
Contributing
PRs welcome.