
MCP LLM Bridge
A bridge connecting Model Context Protocol (MCP) servers to OpenAI-compatible LLMs such as Ollama. See Anthropic's Model Context Protocol announcement for background on MCP.
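Conceptually, the bridge exposes MCP tool definitions to the model in the OpenAI function-calling format, then routes the model's tool calls back to the MCP server. The sketch below illustrates that translation only; `mcp_tool_to_openai` and the example `fetch` schema are illustrative, not the bridge's actual internals.

```python
# Conceptual sketch: wrapping an MCP tool definition in an OpenAI-style
# tool spec. Names here are illustrative, not the bridge's real code.

def mcp_tool_to_openai(name: str, description: str, input_schema: dict) -> dict:
    """Wrap an MCP tool's name, description, and JSON Schema input
    into the structure expected by OpenAI-compatible chat APIs."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": input_schema,  # MCP already describes inputs as JSON Schema
        },
    }

# Example: a fetch-style tool, expressed for the LLM side of the bridge.
fetch_tool = mcp_tool_to_openai(
    name="fetch",
    description="Fetch a URL and return its contents",
    input_schema={
        "type": "object",
        "properties": {"url": {"type": "string"}},
        "required": ["url"],
    },
)
print(fetch_tool["function"]["name"])  # -> fetch
```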
Quick Start
```bash
# Install
curl -LsSf https://astral.sh/uv/install.sh | sh
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate
uv pip install -e .
```
Note: if the keys in `.env` are not being picked up, reactivate the environment first: `source .venv/bin/activate`
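For reference, here is a minimal sketch of how keys from `.env` typically reach the configuration, assuming the common `python-dotenv` pattern (the bridge's actual loading mechanism may differ); the variable names match the commented-out OpenAI config shown below.

```python
# Hedged sketch: load .env into the environment so os.getenv() can read the keys.
# Assumes python-dotenv is installed (e.g. `uv pip install python-dotenv`).
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env in the current directory

api_key = os.getenv("OPENAI_API_KEY")        # used when targeting OpenAI
model = os.getenv("OPENAI_MODEL", "gpt-4o")  # falls back to gpt-4o if unset
print(f"model={model}, api key present={api_key is not None}")
```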
Then configure the bridge in [src/mcp_llm_bridge/main.py](src/mcp_llm_bridge/main.py):
```python
config = BridgeConfig(
    mcp_server_params=StdioServerParameters(
        command="uv",
        # CHANGE THIS: it must be an absolute path to the MCP fetch server directory
        # (clone it from https://github.com/modelcontextprotocol/servers/)
        args=["--directory", "~/llms/mcp/mc-server-fetch/servers/src/fetch", "run", "mcp-server-fetch"],
        env=None
    ),
    # llm_config=LLMConfig(
    #     api_key=os.getenv("OPENAI_API_KEY"),
    #     model=os.getenv("OPENAI_MODEL", "gpt-4o"),
    #     base_url=None
    # ),
    llm_config=LLMConfig(
        api_key="ollama",  # Can be any string for local testing
        model="llama3.2",
        base_url="http://localhost:11434/v1"  # Point to your local model's endpoint
    ),
)
```
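Before running the bridge against a local model, it can help to confirm that the endpoint really speaks the OpenAI API. Below is a quick sanity check, assuming Ollama is serving `llama3.2` at the URL above and the official `openai` Python package is installed.

```python
# Sanity check the Ollama endpoint with a plain OpenAI-compatible client.
# Assumes `openai` is installed and the llama3.2 model has been pulled.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Reply with the single word: ready"}],
)
print(response.choices[0].message.content)
```

If this prints a reply, the same `base_url`, `api_key`, and `model` values should work in `LLMConfig`.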
Additional Endpoint Support
The bridge also works with any endpoint implementing the OpenAI API specification:
Ollama
```python
llm_config=LLMConfig(
    api_key="not-needed",
    model="mistral-nemo:12b-instruct-2407-q8_0",
    base_url="http://localhost:11434/v1"
)
```
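If the exact model tag is unclear, the same OpenAI-compatible endpoint can list what Ollama has pulled locally; a small sketch, again assuming the `openai` package:

```python
# List model tags exposed by Ollama's OpenAI-compatible API so the
# `model` value above matches an installed tag exactly.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

for model in client.models.list():
    print(model.id)  # e.g. mistral-nemo:12b-instruct-2407-q8_0
```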
License
Contributing
PRs welcome.