
Booner_MCP
An AI infrastructure-as-code platform using Model Context Protocol (MCP) with Ollama for agentic coding and server management.
Overview
This project allows AI agents to interact with local infrastructure and to deploy and manage various server types (web servers, game servers, databases) through the Model Context Protocol. It integrates with a local Ollama deployment running Mixtral on powerful hardware.
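To make the Ollama integration concrete, the sketch below builds a request against Ollama's standard /api/generate endpoint using only the Python standard library. The `OLLAMA_URL` value matches Machine 1 in the architecture below; the helper names themselves are illustrative, not part of this project's API.

```python
import json
import urllib.request

OLLAMA_URL = "http://10.0.0.10:11434"  # Machine 1 (Booner_Ollama)

def build_generate_request(prompt, model="mixtral"):
    """Construct the JSON payload Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="mixtral"):
    """Send a prompt to the local Ollama server and return its response text."""
    payload = json.dumps(build_generate_request(prompt, model)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `stream` set to `False`, Ollama returns a single JSON object whose `response` field holds the full completion, which keeps the client code simple.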
System Architecture
Machines Configuration
- Machine 1 (Booner_Ollama - 10.0.0.10):
  - Runs the Ollama LLM server on port 11434
  - Provides AI capabilities to the MCP system
  - Hardware: AMD Ryzen 7 5700X3D, RTX 4070 Ti Super, 64GB RAM, Quadro P4000
- Machine 2 (Booner_MCP & agent - 10.0.0.4):
  - Runs the MCP core management API
  - Runs the Next.js web interface
  - Runs a worker agent for agentic coding
  - Hardware: AMD Ryzen 7 5700X3D, 32GB RAM
- Machine 3 (booner-mcp-web & agent - 10.0.0.4):
  - Runs the web GUI for interacting with MCP & Ollama
  - Runs a worker agent for agentic coding
- Machine 4 (OPN_IaC - 10.0.0.2):
  - Runs infrastructure-as-code tools
  - Provides a dedicated OPNsense network API
- Deployment targets (Machine N):
  - Run game servers, web applications, etc.
  - Hardware example: Xeon E5-2680 v4, 16GB RAM, RTX 3050 8GB

Storage: TrueNAS with 8TB HDD & 2TB SSD (shared over NFS)
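As a rough illustration, the machine inventory above can be captured in a small Python mapping. The hostnames, IPs, and roles come from this section; the data structure and helper function are hypothetical, not part of the project's code.

```python
# Hypothetical inventory of the machines described above.
MACHINES = {
    "Booner_Ollama": {"ip": "10.0.0.10", "roles": ["ollama"], "ports": {"ollama": 11434}},
    "Booner_MCP": {"ip": "10.0.0.4", "roles": ["mcp-api", "web", "agent"]},
    "booner-mcp-web": {"ip": "10.0.0.4", "roles": ["web-gui", "agent"]},
    "OPN_IaC": {"ip": "10.0.0.2", "roles": ["iac", "opnsense-api"]},
}

def hosts_with_role(role):
    """Return the names of machines that provide a given role, sorted."""
    return sorted(name for name, m in MACHINES.items() if role in m["roles"])
```

Keeping the inventory in one structure like this makes it easy for agents or deployment scripts to look up which host provides a given capability.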
Software Stack
- OS: Ubuntu 24 (all machines)
- LLM: Mixtral via Ollama
- Primary Languages: Python, Go, Next.js
- Containerization: Docker
Project Structure
- agents/: AI agent definitions and orchestration code
- servers/: MCP server implementations for different infrastructure tasks
- api/: API server for agent communication
- config/: Configuration files for different environments and systems
Setup & Deployment
Prerequisites
- Git with support for submodules
- Docker and Docker Compose
- Node.js 18+ (for local development)
- Python 3.11+ (for local development)
Initial Setup
- Clone the repository with submodules:
  git clone --recurse-submodules https://github.com/vespo92/Booner_MCP.git
  cd Booner_MCP
- Create an environment file:
  cp .env.example .env
- Generate a secure AUTH_SECRET:
  # On Linux/macOS
  ./generate_auth_secret.sh
  # On Windows
  .\generate_auth_secret.ps1
- Deploy with Docker Compose:
  docker-compose up -d
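If the helper scripts are unavailable, an equivalent AUTH_SECRET can be generated with Python's standard library. This is a sketch: the repository's scripts may impose a specific length or format, which is an assumption here.

```python
import secrets

def generate_auth_secret(nbytes=32):
    """Generate a URL-safe random secret, comparable to the helper scripts.

    32 random bytes (~43 URL-safe characters) is a common choice for
    session-signing secrets; the exact requirement here is an assumption.
    """
    return secrets.token_urlsafe(nbytes)
```

The resulting string can be pasted into the AUTH_SECRET entry of the .env file created in the previous step.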
Accessing the Services
- Web UI: http://10.0.0.1:3000
- API: http://10.0.0.1:8000
- Ollama: http://10.0.0.10:11434
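A quick way to verify the deployment is a small health-check script over the endpoints listed above. This is a minimal sketch using only the standard library; treating any HTTP status below 500 as "up" is a simplifying assumption, and the service names are illustrative.

```python
import urllib.error
import urllib.request

# Service endpoints from the list above.
SERVICES = {
    "web": "http://10.0.0.1:3000",
    "api": "http://10.0.0.1:8000",
    "ollama": "http://10.0.0.10:11434",
}

def check(url, timeout=5):
    """Return True if the endpoint answers with an HTTP status below 500."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except urllib.error.HTTPError as e:
        return e.code < 500  # 4xx still means the service is answering
    except (urllib.error.URLError, OSError):
        return False  # refused, unreachable, or timed out

def report():
    """Map each service name to its reachability."""
    return {name: check(url) for name, url in SERVICES.items()}
```

Running `report()` after `docker-compose up -d` gives a quick pass/fail view of all three services.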
Development Workflow
Working with the Main Project
cd Booner_MCP
# Make changes
git add .
git commit -m "Your commit message"
git push origin main