2025-04-09

Booner_MCP

An AI infrastructure-as-code platform that uses the Model Context Protocol (MCP) with Ollama for agentic coding and server management.

Overview

This project allows AI agents to interact with local infrastructure and to deploy and manage various server types (web servers, game servers, databases) through the Model Context Protocol. It integrates with a local Ollama deployment running Mixtral on powerful hardware.
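As a minimal sketch of the agent-to-LLM path described above: the host and port come from the machine list below, and the `/api/generate` endpoint and payload shape follow Ollama's standard HTTP API; the helper names are illustrative, not part of this project.

```python
import json
import urllib.request

# Ollama server from the machine list below; /api/generate is Ollama's
# standard non-streaming completion endpoint.
OLLAMA_URL = "http://10.0.0.10:11434/api/generate"

def build_generate_request(prompt: str, model: str = "mixtral") -> dict:
    """Build the JSON payload Ollama expects for a one-shot completion."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str) -> str:
    """Send a prompt to the Ollama server and return the generated text."""
    payload = json.dumps(build_generate_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_ollama("Write a health-check script for an nginx container."))
```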

System Architecture

Machines Configuration

  • Machine 1 (Booner_Ollama - 10.0.0.10):

    • Runs Ollama LLM server on port 11434
    • Provides AI capabilities to the MCP system
    • Hardware: AMD Ryzen 7 5700X3D, RTX 4070 Ti Super, 64GB RAM, Quadro P4000
  • Machine 2 (Booner_MCP & agent - 10.0.0.4):

    • Runs the MCP core management API
    • Runs the Next.js web interface
    • Hardware: Ryzen 7 5700X3D, 32GB RAM
    • Worker Agent for Agentic Coding
  • Machine 3 (booner-mcp-web & agent - 10.0.0.4):

    • Runs the Web GUI to interact with MCP & Ollama
    • Worker Agent for Agentic Coding
  • Machine 4 (OPN_IaC - 10.0.0.2):

    • Runs infrastructure as code tools
    • Dedicated OPNSense Network API
  • Deployment Targets (Machine N):

    • Run game servers, web applications, etc.
    • Hardware example: Xeon E5-2680 v4, 16GB RAM, RTX 3050 8GB
  • Storage: TrueNAS with 8TB HDD & 2TB SSD (NFS shared)
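The machine list above can be captured as a simple inventory structure; the hostnames and IPs come from the list, while the dictionary layout, role labels, and lookup helper are purely illustrative.

```python
# Hypothetical inventory of the machines listed above; names and IPs are
# from the README, the layout and role labels are illustrative only.
INVENTORY = {
    "Booner_Ollama":  {"ip": "10.0.0.10", "role": "llm"},
    "Booner_MCP":     {"ip": "10.0.0.4",  "role": "mcp-api"},
    "booner-mcp-web": {"ip": "10.0.0.4",  "role": "web-gui"},
    "OPN_IaC":        {"ip": "10.0.0.2",  "role": "network-iac"},
}

def hosts_by_role(role: str) -> list[str]:
    """Return the IPs of every machine that serves the given role."""
    return [m["ip"] for m in INVENTORY.values() if m["role"] == role]
```

An orchestrator could use such a lookup to decide where to dispatch a deployment task, e.g. `hosts_by_role("mcp-api")`.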

Software Stack

  • OS: Ubuntu 24 (all machines)
  • LLM: Mixtral via Ollama
  • Primary Languages & Frameworks: Python, Go, Next.js
  • Containerization: Docker

Project Structure

  • agents/: AI agent definitions and orchestration code
  • servers/: MCP server implementations for different infrastructure tasks
  • api/: API server for agent communication
  • config/: Configuration files for different environments and systems

Setup & Deployment

Prerequisites

  • Git with support for submodules
  • Docker and Docker Compose
  • Node.js 18+ (for local development)
  • Python 3.11+ (for local development)

Initial Setup

  1. Clone the repository with submodules:

    git clone --recurse-submodules https://github.com/vespo92/Booner_MCP.git
    cd Booner_MCP
    
  2. Create an environment file:

    cp .env.example .env
    
  3. Generate a secure AUTH_SECRET:

    # On Linux/macOS
    ./generate_auth_secret.sh
    
    # On Windows
    .\generate_auth_secret.ps1
    
  4. Deploy with Docker Compose:

    docker-compose up -d
    

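Before bringing the stack up, it can help to confirm the `.env` file actually defines the required values. `AUTH_SECRET` is the only key named in the steps above; the parsing helper and any additional keys are assumptions, not part of this project.

```python
# Only AUTH_SECRET is named in the setup steps above; the rest of this
# check is an illustrative sketch, not part of Booner_MCP itself.
REQUIRED_KEYS = ["AUTH_SECRET"]

def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines from a .env file, skipping comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

def missing_keys(text: str) -> list[str]:
    """Return required keys that are absent or empty in the .env contents."""
    env = parse_env(text)
    return [k for k in REQUIRED_KEYS if not env.get(k)]

if __name__ == "__main__":
    with open(".env") as f:
        print("missing:", missing_keys(f.read()))
```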
Accessing the Services
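The source lists no service endpoints here. Assuming Ollama's port 11434 (stated above) and a conventional Next.js port of 3000 for the web UI (an assumption), a quick reachability check might look like:

```python
import socket

# Ollama's port 11434 is stated earlier in this README; the web UI port
# 3000 is a conventional Next.js default and purely an assumption.
SERVICES = {
    "ollama": ("10.0.0.10", 11434),
    "web-ui": ("10.0.0.4", 3000),
}

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        status = "up" if is_reachable(host, port) else "down"
        print(f"{name} ({host}:{port}): {status}")
```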

Development Workflow

Working with the Main Project

cd Booner_MCP
# Make changes
git add .
git commit -m "Your commit message"
git push origin main

相关推荐

  • https://suefel.com
  • Latest advice and best practices for custom GPT development.

  • Yusuf Emre Yeşilyurt
  • I find academic articles and books for research and literature reviews.

  • https://maiplestudio.com
  • Find Exhibitors, Speakers and more

  • Carlos Ferrin
  • Encuentra películas y series en plataformas de streaming.

  • Joshua Armstrong
  • Confidential guide on numerology and astrology, based of GG33 Public information

  • Contraband Interactive
  • Emulating Dr. Jordan B. Peterson's style in providing life advice and insights.

  • rustassistant.com
  • Your go-to expert in the Rust ecosystem, specializing in precise code interpretation, up-to-date crate version checking, and in-depth source code analysis. I offer accurate, context-aware insights for all your Rust programming questions.

  • Elijah Ng Shi Yi
  • Advanced software engineer GPT that excels through nailing the basics.

  • Emmet Halm
  • Converts Figma frames into front-end code for various mobile frameworks.

  • lumpenspace
  • Take an adjectivised noun, and create images making it progressively more adjective!

  • apappascs
  • Découvrez la collection la plus complète et la plus à jour de serveurs MCP sur le marché. Ce référentiel sert de centre centralisé, offrant un vaste catalogue de serveurs MCP open-source et propriétaires, avec des fonctionnalités, des liens de documentation et des contributeurs.

  • modelcontextprotocol
  • Serveurs de protocole de contexte modèle

  • Mintplex-Labs
  • L'application tout-en-un desktop et Docker AI avec chiffon intégré, agents AI, constructeur d'agent sans code, compatibilité MCP, etc.

  • ShrimpingIt
  • Manipulation basée sur Micropython I2C de l'exposition GPIO de la série MCP, dérivée d'Adafruit_MCP230XX

  • n8n-io
  • Plateforme d'automatisation de workflow à code équitable avec des capacités d'IA natives. Combinez le bâtiment visuel avec du code personnalisé, de l'auto-hôte ou du cloud, 400+ intégrations.

  • open-webui
  • Interface AI conviviale (prend en charge Olllama, Openai API, ...)

  • WangRongsheng
  • 🧑‍🚀 全世界最好的 LLM 资料总结 (数据处理、模型训练、模型部署、 O1 模型、 MCP 、小语言模型、视觉语言模型) | Résumé des meilleures ressources LLM du monde.

  • metorial
  • Versions conteneurisées de centaines de serveurs MCP 📡 🧠 🧠

  • ravitemer
  • Un puissant plugin Neovim pour gérer les serveurs MCP (Protocole de contexte modèle)

    Reviews

    5 (1)
    Avatar
    user_TQeQneBD
    2025-04-16

    Booner_MCP is an incredibly versatile tool created by vespo92! It's efficient, user-friendly, and well-documented, which makes it easy to integrate into various projects. The GitHub page provides comprehensive resources and support, making it a must-have for MCP enthusiasts. Highly recommend checking it out!