
# MCP LLM Bridge

A bridge connecting Model Context Protocol (MCP) servers to OpenAI-compatible LLMs such as Ollama. See Anthropic's documentation for more background on MCP.
## Quick Start

```bash
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone the bridge and set up a virtual environment
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate
uv pip install -e .
```

Note: reactivate the environment if needed to use the keys in `.env`: `source .venv/bin/activate`
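The code in `main.py` reads `OPENAI_API_KEY` and `OPENAI_MODEL` from the environment when targeting an OpenAI endpoint, so a minimal `.env` might look like the following (the key value is a placeholder, not a real credential):

```bash
# .env — only needed when using an OpenAI endpoint instead of a local Ollama model
OPENAI_API_KEY=your-api-key-here   # placeholder; replace with a real key
OPENAI_MODEL=gpt-4o                # optional; gpt-4o is the default in main.py
```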
Then configure the bridge in [src/mcp_llm_bridge/main.py](src/mcp_llm_bridge/main.py)
```python
mcp_server_params=StdioServerParameters(
    command="uv",
    # CHANGE THIS: it must be an absolute path! Point it at the MCP fetch
    # server directory (clone from https://github.com/modelcontextprotocol/servers/)
    args=["--directory", "~/llms/mcp/mc-server-fetch/servers/src/fetch", "run", "mcp-server-fetch"],
    env=None
),
# To use an OpenAI endpoint instead, uncomment this block:
# llm_config=LLMConfig(
#     api_key=os.getenv("OPENAI_API_KEY"),
#     model=os.getenv("OPENAI_MODEL", "gpt-4o"),
#     base_url=None
# ),
llm_config=LLMConfig(
    api_key="ollama",  # can be any string for local testing
    model="llama3.2",
    base_url="http://localhost:11434/v1"  # point to your local model's endpoint
),
)
```
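As the comment above warns, the `--directory` argument must be an absolute path; a tilde is not expanded when `uv` is launched as a subprocess. A small sketch of resolving it in Python before building the argument list (the path itself just mirrors the example above and is illustrative):

```python
import os

# Expand "~" and normalize to an absolute path before passing it to uv.
server_dir = os.path.abspath(
    os.path.expanduser("~/llms/mcp/mc-server-fetch/servers/src/fetch")
)

# The resolved path slots into the args list exactly as in the config above.
args = ["--directory", server_dir, "run", "mcp-server-fetch"]
print(os.path.isabs(server_dir))  # True
```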
## Additional Endpoint Support

The bridge also works with any endpoint implementing the OpenAI API specification:

### Ollama

```python
llm_config=LLMConfig(
    api_key="not-needed",
    model="mistral-nemo:12b-instruct-2407-q8_0",
    base_url="http://localhost:11434/v1"
)
```
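"Implementing the OpenAI API specification" means the endpoint accepts a `POST {base_url}/chat/completions` with the standard request body. A sketch of that body, built as plain JSON (the prompt text is invented for illustration; field names follow the OpenAI chat completions spec):

```python
import json

# Request body an OpenAI-compatible endpoint such as Ollama accepts at
# {base_url}/chat/completions. Only "model" and "messages" are required.
payload = {
    "model": "mistral-nemo:12b-instruct-2407-q8_0",
    "messages": [
        {"role": "user", "content": "Summarize https://example.com"},
    ],
}
body = json.dumps(payload)
```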
## License

## Contributing

PRs welcome.