
Claude-LMStudio Bridge

An MCP server that bridges Claude with local LLMs running in LM Studio.

Overview

This tool allows Claude to interact with your local LLMs running in LM Studio, providing:

  • Access to list all available models in LM Studio
  • The ability to generate text using your local LLMs
  • Support for chat completions through your local models
  • A health check tool to verify connectivity with LM Studio (a minimal sketch of such a tool follows this list)
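
The bridge runs on the MCP Python SDK (the mcp package) together with httpx, and LM Studio exposes an OpenAI-compatible local API. The sketch below is illustrative rather than the repository's actual code: the tool name, default address, and error handling are assumptions, but it shows how a connectivity-check tool can be exposed to Claude.

# Illustrative only: tool name, default address, and error handling are assumptions,
# not the repository's actual implementation.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("lmstudio-bridge")

LMSTUDIO_BASE_URL = "http://127.0.0.1:1234/v1"  # LM Studio's default API address

@mcp.tool()
async def health_check() -> str:
    """Report whether the LM Studio API server is reachable."""
    try:
        async with httpx.AsyncClient() as client:
            resp = await client.get(f"{LMSTUDIO_BASE_URL}/models", timeout=5.0)
            resp.raise_for_status()
        return "LM Studio API is reachable."
    except httpx.HTTPError as exc:
        return f"Cannot reach LM Studio API: {exc}"

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for Claude Desktop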

Prerequisites

  • Claude Desktop with MCP support
  • LM Studio installed and running locally with API server enabled
  • Python 3.8+ installed

Quick Start (Recommended)

For macOS/Linux:

  1. Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
  2. Run the setup script
chmod +x setup.sh
./setup.sh
  3. Follow the setup script's instructions to configure Claude Desktop

For Windows:

  1. Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
  2. Run the setup script
setup.bat
  3. Follow the setup script's instructions to configure Claude Desktop

Manual Setup

If you prefer to set things up manually:

  1. Create a virtual environment (optional but recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
  2. Install the required packages
pip install -r requirements.txt
  3. Configure Claude Desktop (an equivalent claude_desktop_config.json entry is sketched after this list):
    • Open Claude Desktop preferences
    • Navigate to the 'MCP Servers' section
    • Add a new MCP server with the following configuration:
      • Name: lmstudio-bridge
      • Command: /bin/bash (on macOS/Linux) or cmd.exe (on Windows)
      • Arguments:
        • macOS/Linux: /path/to/claude-lmstudio-bridge/run_server.sh
        • Windows: /c C:\path\to\claude-lmstudio-bridge\run_server.bat
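
Depending on your Claude Desktop version, the same configuration may instead live in claude_desktop_config.json. A rough equivalent of the settings above (the path is a placeholder; adjust for your system) would be:

{
  "mcpServers": {
    "lmstudio-bridge": {
      "command": "/bin/bash",
      "args": ["/path/to/claude-lmstudio-bridge/run_server.sh"]
    }
  }
}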

Usage with Claude

After setting up the bridge, you can use prompts like the following in Claude:

  1. Check the connection to LM Studio:
Can you check if my LM Studio server is running?
  2. List available models:
List the available models in my local LM Studio
  3. Generate text with a local model:
Generate a short poem about spring using my local LLM
  4. Send a chat completion (the sketch below shows the underlying API request):
Ask my local LLM: "What are the main features of transformers in machine learning?"
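
Under the hood, these prompts are forwarded to LM Studio's OpenAI-compatible endpoints. For reference, a chat completion against a local model looks roughly like this; the model id and parameters are placeholders, and the bridge's own request code may differ:

# Illustrative request to LM Studio's OpenAI-compatible chat endpoint;
# the model id below is a placeholder for whatever model you have loaded.
import httpx

payload = {
    "model": "local-model",  # use an id returned by GET /v1/models
    "messages": [
        {"role": "user", "content": "What are the main features of transformers in machine learning?"}
    ],
    "temperature": 0.7,
}

resp = httpx.post("http://127.0.0.1:1234/v1/chat/completions", json=payload, timeout=60.0)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])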

Troubleshooting

Diagnosing LM Studio Connection Issues

Use the included debugging tool to check your LM Studio connection:

python debug_lmstudio.py

For more detailed tests:

python debug_lmstudio.py --test-chat --verbose

Common Issues

"Cannot connect to LM Studio API"

  • Make sure LM Studio is running
  • Verify the API server is enabled in LM Studio (Settings > API Server)
  • Check that the port (default: 1234) matches what's in your .env file (the quick check below uses this default)
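
If the debug script isn't enough, a quick manual check against the default address (adjust host and port if you changed them in .env) can confirm whether the API is reachable at all:

# Quick manual connectivity check against LM Studio's default API address.
import httpx

resp = httpx.get("http://127.0.0.1:1234/v1/models", timeout=5.0)
print(resp.status_code)  # 200 means the API server is up
print(resp.json())       # the models LM Studio currently exposes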

"No models are loaded"

  • Open LM Studio and load a model
  • Verify the model is running successfully

"MCP package not found"

  • Try reinstalling: pip install "mcp[cli]" httpx python-dotenv
  • Make sure you're using Python 3.8 or later

"Claude can't find the bridge"

  • Check Claude Desktop configuration
  • Make sure the path to run_server.sh or run_server.bat is correct and absolute
  • Verify the server script is executable: chmod +x run_server.sh (on macOS/Linux)

Advanced Configuration

You can customize the bridge behavior by creating a .env file with these settings:

LMSTUDIO_HOST=127.0.0.1
LMSTUDIO_PORT=1234
DEBUG=false

Set DEBUG=true to enable verbose logging for troubleshooting.
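
Because python-dotenv is among the dependencies, these values are read from the environment when the bridge starts. A sketch of the pattern (the variable handling here is illustrative, not the repository's exact code):

# Illustrative settings loading with python-dotenv; keys match the .env file above.
import os
from dotenv import load_dotenv

load_dotenv()  # picks up .env from the working directory, if present

LMSTUDIO_HOST = os.getenv("LMSTUDIO_HOST", "127.0.0.1")
LMSTUDIO_PORT = int(os.getenv("LMSTUDIO_PORT", "1234"))
DEBUG = os.getenv("DEBUG", "false").lower() == "true"

LMSTUDIO_BASE_URL = f"http://{LMSTUDIO_HOST}:{LMSTUDIO_PORT}/v1"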

License

MIT
