2025-04-07


AI Master Control Program (MCP) Server

The AI MCP Server enables AI models, including locally hosted models with Ollama and Claude Desktop, to interact with your computer system. It acts as a bridge that allows AI models to:

  • Execute system commands
  • Create, read, update, and delete files
  • Control other programs
  • Communicate with each other

Architecture

The system consists of:

  1. MCP Server: Central server that processes requests from AI models
  2. Client Library: Enables easy integration with AI models
  3. Model Connectors: Interfaces with various AI model backends (Ollama, Claude Desktop, etc.)
  4. Task Execution Engine: Performs system operations and program control
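The components above pass task requests from a model, through the server, to the execution engine. As an illustration only, a request might be modeled like this; the field names here are assumptions for the sketch, not the project's actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative model of a task flowing from the Client Library through
# the MCP Server to the Task Execution Engine. Field names are assumed.
@dataclass
class Task:
    model_id: str                    # which connected model issued the request
    action: str                      # e.g. "system_command", "file_operation"
    args: List[str] = field(default_factory=list)
    working_dir: Optional[str] = None

task = Task("claude-desktop", "system_command", ["echo", "Hello, World!"])
```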

Installation

Prerequisites

  • Python (install.py verifies version compatibility)
  • pip (for installing dependencies)
  • Git (for cloning the repository)

Automated Installation

For quick and easy installation, use the provided installation script:

# Clone the repository
git clone https://github.com/GrizzFuOnYou/master_mcp_server.git
cd master_mcp_server

# Run the installation script
python install.py

The installation script will:

  1. Verify Python version compatibility
  2. Install all dependencies
  3. Create a directory structure
  4. Configure environment variables
  5. Create platform-specific startup scripts
  6. Set up Claude Desktop as the default AI model
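Step 1 of the installer can be sketched as follows. The minimum version shown (3.8) is an assumption for illustration; install.py defines the real requirement:

```python
import sys

# Hedged sketch of the installer's version check; the 3.8 floor is an
# assumption, not taken from install.py.
def check_python(min_version=(3, 8)):
    """Exit with an error if the running interpreter is too old."""
    if sys.version_info[:2] < min_version:
        raise SystemExit(
            f"Python {min_version[0]}.{min_version[1]}+ is required, "
            f"found {sys.version_info.major}.{sys.version_info.minor}")

check_python()  # passes silently on a compatible interpreter
```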

Manual Setup

If you prefer manual installation:

  1. Clone the repository:

    git clone https://github.com/GrizzFuOnYou/master_mcp_server.git
    cd master_mcp_server
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Configure environment variables:

    cp .env.example .env
    # Edit .env with your preferred settings
    
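The variable names in .env are project-specific; as a hedged sketch (check .env.example for the actual keys), the server presumably reads its settings along these lines, with defaults matching the values mentioned elsewhere in this README:

```python
import os

# Hypothetical variable names; .env.example defines the real ones.
# Defaults mirror the host/port and placeholder key used in this README.
MCP_HOST = os.getenv("MCP_HOST", "0.0.0.0")
MCP_PORT = int(os.getenv("MCP_PORT", "8000"))
MCP_API_KEY = os.getenv("MCP_API_KEY", "your-secret-api-key")
```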

Usage

Starting the Server

Using Startup Script (Recommended)

After installation:

  • Windows: Run start_mcp_server.bat
  • Linux/Mac: Run ./start_mcp_server.sh

Manual Start

Run the MCP server:

python startup.py

By default, the server will listen on 0.0.0.0:8000.
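To confirm the server came up, a simple TCP probe is enough; this convenience sketch only checks that something is accepting connections on the given host and port:

```python
import socket

# Reachability probe for the MCP server port; it does not authenticate
# or speak the server's protocol, it only opens and closes a TCP socket.
def server_reachable(host="127.0.0.1", port=8000, timeout=1.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```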

Connecting AI Models

Claude Desktop (Default)

Claude Desktop is configured as the default model. To use it:

  1. Make sure Claude Desktop is running on your system
  2. The server will automatically attempt to connect on startup
  3. Claude Desktop should be available at the default location: http://localhost:5000/api

If you need to manually connect:

from mcp_client import MCPClient

# Initialize client
client = MCPClient("http://localhost:8000", "your-secret-api-key")

# Connect to Claude Desktop
result = client.connect_model("claude-desktop", "claude", {"api_url": "http://localhost:5000/api"})
print(f"Connection result: {result}")

Claude Desktop Connection JSON

If you need to manually configure Claude Desktop integration, use the following JSON configuration:

{
  "model_id": "claude-desktop",
  "model_type": "claude",
  "config": {
    "api_url": "http://localhost:5000/api",
    "temperature": 0.7,
    "max_tokens": 1000
  }
}
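The same configuration can be submitted over HTTP to the server's /connect_model endpoint. The "X-API-Key" header name below is an assumption for the sketch; check the server code for the actual authentication scheme:

```python
import json
import urllib.request

# Hedged sketch: POST the Claude Desktop configuration to /connect_model.
# The auth header name is assumed, not confirmed by the project.
payload = {
    "model_id": "claude-desktop",
    "model_type": "claude",
    "config": {
        "api_url": "http://localhost:5000/api",
        "temperature": 0.7,
        "max_tokens": 1000,
    },
}
request = urllib.request.Request(
    "http://localhost:8000/connect_model",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "X-API-Key": "your-secret-api-key"},
    method="POST",
)
# urllib.request.urlopen(request)  # uncomment with the server running
```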

Ollama Models

To connect to an Ollama model:

from mcp_client import MCPClient

# Initialize client
client = MCPClient("http://localhost:8000", "your-secret-api-key")

# Connect to an Ollama model
result = client.connect_model("llama2", "ollama", {"host": "http://localhost:11434"})
print(f"Connection result: {result}")

Executing System Operations

Once connected, AI models can perform various system operations:

# Execute a command
result = client.execute_system_command("claude-desktop", "echo", ["Hello, World!"])

# Write a file
result = client.write_file("claude-desktop", "test.txt", "This is a test file created by Claude!")

# Read a file
result = client.read_file("claude-desktop", "test.txt")

# Start a program
result = client.start_program("claude-desktop", "notepad.exe")

# Stop a program by PID (use the PID of a process started earlier,
# e.g. via start_program)
result = client.stop_program("claude-desktop", pid)

# Query the AI model (arguments: requesting model, target model, prompt)
result = client.query_model("claude-desktop", "claude-desktop", "What is the capital of France?")

API Reference

Server Endpoints

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/connect_model` | POST | Connect to an AI model |
| `/disconnect_model/{model_id}` | POST | Disconnect from an AI model |
| `/list_models` | GET | List all connected models |
| `/execute_task` | POST | Execute a task requested by an AI model |
| `/task_status/{task_id}` | GET | Get the status of a task |
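A common pattern is to poll /task_status/{task_id} until a task settles. In this sketch the "status" field and its "completed"/"failed" values are assumptions about the response schema; inspect an actual response before relying on them:

```python
import json
import time
import urllib.request

def status_url(base_url, task_id):
    """Build the /task_status URL for a given task."""
    return f"{base_url.rstrip('/')}/task_status/{task_id}"

def wait_for_task(base_url, task_id, api_key, timeout=30.0):
    # Poll until the task reports a terminal state (field names assumed).
    deadline = time.time() + timeout
    while time.time() < deadline:
        req = urllib.request.Request(
            status_url(base_url, task_id),
            headers={"X-API-Key": api_key})
        with urllib.request.urlopen(req) as resp:
            status = json.load(resp)
        if status.get("status") in ("completed", "failed"):
            return status
        time.sleep(1.0)
    raise TimeoutError(f"task {task_id} did not finish in time")
```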

Client Methods

| Method | Description |
|--------|-------------|
| `connect_model(model_id, model_type, config)` | Connect to an AI model |
| `disconnect_model(model_id)` | Disconnect from an AI model |
| `list_models()` | List all connected models |
| `execute_system_command(model_id, command, args, working_dir, timeout)` | Execute a system command |
| `execute_file_operation(model_id, operation, path, content)` | Execute a file operation |
| `control_program(model_id, action, program_path, args, pid)` | Control a program |
| `query_model(model_id, target_model, prompt)` | Query an AI model |

Model Configuration

Claude Desktop Configuration

To connect to Claude Desktop, use the following configuration:

{
  "api_url": "http://localhost:5000/api",
  "temperature": 0.7,
  "max_tokens": 1000
}

Ollama Configuration

To connect to an Ollama model, use the following configuration:

{
  "host": "http://localhost:11434"
}

Security Considerations

IMPORTANT: This server grants AI models significant access to your system. Use with caution.

Security measures implemented:

  • API key authentication
  • Logging of all operations
  • Configurable permissions (coming soon)
  • Rate limiting (coming soon)
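As a sketch of how API-key authentication is typically done (an assumed mechanism, not the server's actual code), keys should be compared in constant time so the check does not leak information through timing:

```python
import hmac

# Assumed mechanism for illustration; the server's real check may differ.
EXPECTED_KEY = "your-secret-api-key"  # normally loaded from .env

def authorized(presented_key: str) -> bool:
    # hmac.compare_digest runs in constant time for same-length inputs,
    # defeating timing attacks that a plain == comparison allows.
    return hmac.compare_digest(presented_key, EXPECTED_KEY)
```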

Troubleshooting

Claude Desktop Connection Issues

If you encounter issues connecting to Claude Desktop:

  1. Ensure Claude Desktop is running
  2. Verify the API URL (default: http://localhost:5000/api)
  3. Check the logs for specific error messages
  4. Restart Claude Desktop and try again

Ollama Connection Issues

If you encounter issues connecting to Ollama:

  1. Ensure Ollama is running (ollama serve)
  2. Verify the model exists (ollama list)
  3. Check the API URL (default: http://localhost:11434)
  4. Try pulling the model again (ollama pull modelname)
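Steps 2 and 3 above can be automated with a quick probe of Ollama's API; `/api/tags` is Ollama's standard endpoint for listing installed models:

```python
import json
import urllib.error
import urllib.request

def list_ollama_models(host="http://localhost:11434"):
    """Return installed Ollama model names, or None if unreachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None  # Ollama is not reachable at this host
```

A `None` result means Ollama itself is down (run `ollama serve`); an empty list means it is up but the model has not been pulled yet.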

Extension Points

The MCP server can be extended to support:

  • Additional AI model backends
  • More sophisticated program control
  • GUI interaction capabilities
  • Web browsing capabilities
  • Network operation capabilities

License

MIT

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
