Claude-LMStudio-Bridge
A simple Model Context Protocol (MCP) server that allows Claude to communicate with locally running LLM models via LM Studio.
Overview
This bridge enables Claude to send prompts to locally running models in LM Studio and receive their responses. This can be useful for:
- Comparing Claude's responses with other models
- Accessing specialized local models for specific tasks
- Running queries even when you have limited Claude API quota
- Keeping sensitive queries entirely local
Prerequisites
- Python 3.8+
- Anthropic Claude with MCP capability
- LM Studio running locally
- Local LLM model(s) loaded in LM Studio
Installation
- Clone this repository:
git clone https://github.com/infinitimeless/Claude-LMStudio-Bridge_V2.git
cd Claude-LMStudio-Bridge_V2
- Create a virtual environment:
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
- Install the required packages (choose one method):
Using requirements.txt:
pip install -r requirements.txt
Or directly install dependencies:
pip install requests "mcp[cli]" openai anthropic-mcp
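For orientation, here is a heavily simplified sketch of what a bridge like this can look like, built on the FastMCP helper from the installed mcp package and LM Studio's OpenAI-compatible API. It is illustrative only, not the repository's actual code; the real lmstudio_bridge.py exposes the four tools listed under Usage below.

```python
# Minimal illustrative sketch, NOT the repository's actual code:
# an MCP server exposing one tool that forwards a prompt to LM Studio's
# OpenAI-compatible endpoint.
import requests
from mcp.server.fastmcp import FastMCP

LMSTUDIO_API_BASE = "http://localhost:1234/v1"

mcp = FastMCP("lmstudio-bridge")

@mcp.tool()
def chat_completion(prompt: str) -> str:
    """Send a prompt to the model currently loaded in LM Studio."""
    resp = requests.post(
        f"{LMSTUDIO_API_BASE}/chat/completions",
        json={
            "model": "local-model",  # LM Studio answers with whichever model is loaded
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```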
Usage
- Start LM Studio and load your preferred model.
- Ensure LM Studio's local server is running (on port 1234 by default); a quick reachability check is sketched after this list.
- Run the bridge server:
python lmstudio_bridge.py
- In Claude's interface, enable the MCP server and point it to your locally running bridge.
- You can now use the following MCP tools in your conversation with Claude:
  - health_check: Check if the LM Studio API is accessible
  - list_models: Get a list of available models in LM Studio
  - get_current_model: Check which model is currently loaded
  - chat_completion: Send a prompt to the current model
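To confirm the local server is reachable before wiring up Claude (see the second step above), a quick check along these lines works. It assumes the default port and mirrors what the health_check and list_models tools do:

```python
# Quick reachability check for LM Studio's local server (default port 1234).
import requests

try:
    resp = requests.get("http://localhost:1234/v1/models", timeout=5)
    resp.raise_for_status()
    print("LM Studio is up. Models:", [m["id"] for m in resp.json()["data"]])
except requests.RequestException as exc:
    print("LM Studio API not reachable:", exc)
```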
Example
Once connected, you can ask Claude to use the local model:
Claude, please use the LM Studio bridge to ask the local model: "What's your opinion on quantum computing?"
Claude will use the chat_completion tool to send the query to your local model and display the response.
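Under the hood this is a single chat-completions round trip. If you want to reproduce it without Claude, the openai client installed above can talk to LM Studio directly (illustrative; LM Studio ignores the API key, and the model name here is a placeholder):

```python
# Direct round trip to LM Studio with the openai client, bypassing the bridge.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # dummy key
resp = client.chat.completions.create(
    model="local-model",  # LM Studio serves whichever model is currently loaded
    messages=[{"role": "user", "content": "What's your opinion on quantum computing?"}],
)
print(resp.choices[0].message.content)
```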
Configuration
By default, the bridge connects to LM Studio at http://localhost:1234/v1. If your LM Studio instance is running on a different port, modify the LMSTUDIO_API_BASE variable in lmstudio_bridge.py.
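If you would rather not edit the file, one common pattern is to read the endpoint from an environment variable and fall back to the default (a sketch, not necessarily how lmstudio_bridge.py is written):

```python
# Hypothetical override pattern: take the endpoint from the environment if set.
import os

LMSTUDIO_API_BASE = os.environ.get("LMSTUDIO_API_BASE", "http://localhost:1234/v1")
```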
Troubleshooting
If you encounter issues with dependencies, try installing them directly:
pip install requests "mcp[cli]" openai anthropic-mcp
For detailed installation instructions and troubleshooting, see the Installation Guide.
License
MIT
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.