2025-03-22

An MCP bridge server that lets Claude communicate with locally running LLM models via LM Studio


Claude-LMStudio-Bridge

A simple Model Context Protocol (MCP) server that allows Claude to communicate with locally running LLM models via LM Studio.

Overview

This bridge enables Claude to send prompts to locally running models in LM Studio and receive their responses. This can be useful for:

  • Comparing Claude's responses with other models
  • Accessing specialized local models for specific tasks
  • Running queries even when you have limited Claude API quota
  • Keeping sensitive queries entirely local

Prerequisites

  • Python 3 with pip
  • LM Studio installed, with at least one model downloaded

Installation

  1. Clone this repository:

    git clone https://github.com/infinitimeless/Claude-LMStudio-Bridge_V2.git
    cd Claude-LMStudio-Bridge_V2
    
  2. Create a virtual environment:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
    
  3. Install the required packages (choose one method):

    Using requirements.txt:

    pip install -r requirements.txt
    

    Or directly install dependencies:

    pip install requests "mcp[cli]" openai anthropic-mcp
    

Usage

  1. Start LM Studio and load your preferred model.

  2. Ensure LM Studio's local server is running (port 1234 by default).

  3. Run the bridge server:

    python lmstudio_bridge.py
    
  4. In Claude's interface, enable the MCP server and point it to your locally running bridge.

  5. You can now use the following MCP tools in your conversation with Claude:

    • health_check: Check if LM Studio API is accessible
    • list_models: Get a list of available models in LM Studio
    • get_current_model: Check which model is currently loaded
    • chat_completion: Send a prompt to the current model
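
Under the hood, a tool like chat_completion just forwards the prompt to LM Studio's OpenAI-compatible REST API. Below is a minimal, hedged sketch of that call using only the standard library; the actual lmstudio_bridge.py may be structured differently, and the helper names here are illustrative:

```python
# Illustrative sketch of what the bridge's chat_completion tool does:
# POST an OpenAI-style chat request to LM Studio's local server.
# Uses only the standard library; the real bridge registers this logic
# as an MCP tool via the mcp SDK.
import json
import urllib.request

LMSTUDIO_API_BASE = "http://localhost:1234/v1"  # LM Studio's default endpoint

def build_chat_payload(prompt: str, temperature: float = 0.7) -> dict:
    """Assemble an OpenAI-style chat completion request body."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def chat_completion(prompt: str) -> str:
    """Send the prompt to the currently loaded LM Studio model and return its reply."""
    req = urllib.request.Request(
        f"{LMSTUDIO_API_BASE}/chat/completions",
        data=json.dumps(build_chat_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # LM Studio mirrors the OpenAI response shape
    return body["choices"][0]["message"]["content"]
```

Because LM Studio exposes the standard OpenAI chat-completions shape, the same request works regardless of which model is currently loaded.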

Example

Once connected, you can ask Claude to use the local model:

Claude, please use the LM Studio bridge to ask the local model: "What's your opinion on quantum computing?"

Claude will use the chat_completion tool to send the query to your local model and display the response.

Configuration

By default, the bridge connects to LM Studio at http://localhost:1234/v1. If your LM Studio instance is running on a different port, modify the LMSTUDIO_API_BASE variable in lmstudio_bridge.py.
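
If you would rather not edit the source file, one common pattern is to read the base URL from an environment variable with the default as a fallback. This is a hedged sketch (the repo hard-codes LMSTUDIO_API_BASE; the environment-variable override shown here is an assumption, not the project's documented behavior):

```python
# Sketch: resolve the LM Studio base URL from the environment, falling back
# to the default. The env-var override is an illustrative assumption.
import os

def lmstudio_base(env=None) -> str:
    env = os.environ if env is None else env
    return env.get("LMSTUDIO_API_BASE", "http://localhost:1234/v1")

LMSTUDIO_API_BASE = lmstudio_base()
```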

Troubleshooting

If you encounter issues with dependencies, try installing them directly:

pip install requests "mcp[cli]" openai anthropic-mcp

For detailed installation instructions and troubleshooting, see the Installation Guide.

License

MIT

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
