LMStudio-MCP
A Model Control Protocol (MCP) server that allows Claude to communicate with locally running LLM models via LM Studio.
Overview
LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance. This allows Claude to:
- Check the health of your LM Studio API
- List available models
- Get the currently loaded model
- Generate completions using your local models
This enables you to leverage your own locally running models through Claude's interface, combining Claude's capabilities with your private models.
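To make the bridge's role concrete, here is a minimal sketch of how `health_check()` and `list_models()` could query LM Studio's OpenAI-compatible API. It assumes LM Studio's default server address; the sketch uses only the Python standard library, whereas the actual bridge depends on the `requests` and `mcp` packages.

```python
import json
import urllib.error
import urllib.request

LMSTUDIO_API = "http://localhost:1234/v1"  # LM Studio's default server address

def health_check(base_url: str = LMSTUDIO_API) -> bool:
    """Return True if the LM Studio API answers on /v1/models."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

def list_models(base_url: str = LMSTUDIO_API) -> list:
    """Return the ids of all models LM Studio reports as available."""
    with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
        return [m["id"] for m in json.load(resp).get("data", [])]
```

Claude calls these functions through MCP; under the hood they are ordinary HTTP requests to your local machine, so nothing leaves your network.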
Prerequisites
- Python 3.7+
- LM Studio installed and running locally with a model loaded
- Claude with MCP access
- Required Python packages (see Installation)
Installation
1. Clone this repository:

   ```bash
   git clone https://github.com/infinitimeless/LMStudio-MCP.git
   cd LMStudio-MCP
   ```

2. Install the required packages:

   ```bash
   pip install requests "mcp[cli]" openai
   ```
MCP Configuration
For Claude to connect to this bridge, you need to configure the MCP settings properly. You can either:
1. Use directly from GitHub:

   ```json
   {
     "lmstudio-mcp": {
       "command": "uvx",
       "args": [
         "https://github.com/infinitimeless/LMStudio-MCP"
       ]
     }
   }
   ```

2. Use a local installation:

   ```json
   {
     "lmstudio-mcp": {
       "command": "/bin/bash",
       "args": [
         "-c",
         "cd /path/to/LMStudio-MCP && source venv/bin/activate && python lmstudio_bridge.py"
       ]
     }
   }
   ```
For detailed MCP configuration instructions, see MCP_CONFIGURATION.md.
Usage
1. Start your LM Studio application and ensure it's running on port 1234 (the default).

2. Load a model in LM Studio.

3. If running locally (not using `uvx`), run the LMStudio-MCP server:

   ```bash
   python lmstudio_bridge.py
   ```

4. In Claude, connect to the MCP server when prompted by selecting "lmstudio-mcp".
Available Functions
The bridge provides the following functions:
- `health_check()`: Verify if the LM Studio API is accessible
- `list_models()`: Get a list of all available models in LM Studio
- `get_current_model()`: Identify which model is currently loaded
- `chat_completion(prompt, system_prompt, temperature, max_tokens)`: Generate text from your local model
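Since the bridge talks to LM Studio's OpenAI-compatible endpoints, `chat_completion` maps onto a plain `POST /v1/chat/completions` request. Below is a standard-library sketch of that call; `build_chat_payload` is a hypothetical helper introduced here for illustration, and the real bridge uses the `requests`/`openai` packages instead.

```python
import json
import urllib.request

LMSTUDIO_API = "http://localhost:1234/v1"  # LM Studio's default server address

def build_chat_payload(prompt, system_prompt="You are a helpful assistant.",
                       temperature=0.7, max_tokens=1024):
    """Assemble an OpenAI-style chat.completions request body."""
    return {
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

def chat_completion(prompt, **kwargs):
    """POST the payload to LM Studio and return the generated text."""
    body = json.dumps(build_chat_payload(prompt, **kwargs)).encode()
    req = urllib.request.Request(
        f"{LMSTUDIO_API}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

LM Studio serves whichever model is currently loaded, so no model name needs to be sent; the response follows the standard OpenAI `choices[0].message.content` shape.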
Known Limitations
- Some models (e.g., phi-3.5-mini-instruct_uncensored) may have compatibility issues
- The bridge currently uses only the OpenAI-compatible API endpoints of LM Studio
- Model responses will be limited by the capabilities of your locally loaded model
Troubleshooting
API Connection Issues
If Claude reports 404 errors when trying to connect to LM Studio:
- Ensure LM Studio is running and has a model loaded
- Check that LM Studio's server is running on port 1234
- Verify your firewall isn't blocking the connection
- Try using "127.0.0.1" instead of "localhost" in the API URL if issues persist
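A quick way to narrow down connection problems is to check whether anything is listening on the port at all, under both hostnames. This is a diagnostic sketch assuming LM Studio's default port 1234; it is not part of the bridge itself.

```python
import socket

def probe(host="127.0.0.1", port=1234, timeout=3.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Compare both names: on some systems "localhost" resolves to an IPv6
# address that LM Studio is not bound to, while 127.0.0.1 works.
for host in ("localhost", "127.0.0.1"):
    status = "reachable" if probe(host) else "NOT reachable"
    print(f"{host}:1234 is {status}")
```

If 127.0.0.1 is reachable but localhost is not, update the API URL accordingly as suggested above.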
Model Compatibility
If certain models don't work correctly:
- Some models might not fully support the OpenAI chat completions API format
- Try different parameter values (temperature, max_tokens) for problematic models
- Consider switching to a more compatible model if problems persist
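One way to systematize the parameter experiments suggested above is to retry the same prompt with progressively more conservative settings. In this sketch, `generate` stands in for any completion callable such as the bridge's `chat_completion`; the fallback values are illustrative, not recommendations from the project.

```python
# Candidate settings, from permissive to conservative (illustrative values).
FALLBACK_SETTINGS = [
    {"temperature": 0.7, "max_tokens": 1024},
    {"temperature": 0.2, "max_tokens": 512},
    {"temperature": 0.0, "max_tokens": 256},
]

def try_with_fallbacks(generate, prompt):
    """Call generate(prompt, **settings) until one settings dict succeeds.

    Returns (text, settings) for the first successful call; raises if all fail.
    """
    last_error = None
    for settings in FALLBACK_SETTINGS:
        try:
            return generate(prompt, **settings), settings
        except Exception as exc:  # API errors surface as exceptions
            last_error = exc
    raise RuntimeError(f"all settings failed: {last_error}")
```

If even the most conservative settings fail, the model itself is likely the problem, and switching to a more compatible one is the better fix.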
For more detailed troubleshooting help, see TROUBLESHOOTING.md.
License
MIT
Acknowledgements
This project was originally developed as "Claude-LMStudio-Bridge_V2" and has been renamed and open-sourced as "LMStudio-MCP".