MCP Web UI
MCP Web UI is a web-based user interface that serves as a Host within the Model Context Protocol (MCP) architecture. It provides a powerful and user-friendly interface for interacting with Large Language Models (LLMs) while managing context aggregation and coordination between clients and servers.
🌟 Overview
MCP Web UI is designed to simplify and enhance interactions with AI language models by providing:
- A unified interface for multiple LLM providers
- Real-time, streaming chat experiences
- Flexible configuration and model management
- Robust context handling using the MCP protocol
Demo Video
🚀 Features
- 🤖 Multi-Provider LLM Integration:
  - Anthropic (Claude models)
  - OpenAI (GPT models)
  - Ollama (local models)
  - OpenRouter (multiple providers)
- 💬 Intuitive Chat Interface
- 🔄 Real-time Response Streaming via Server-Sent Events (SSE)
- 🔧 Dynamic Configuration Management
- 📊 Advanced Context Aggregation
- 💾 Persistent Chat History using BoltDB
- 🎯 Flexible Model Selection
📋 Prerequisites
- Go 1.23+
- Docker (optional)
- API keys for desired LLM providers
🛠 Installation
Quick Start
- Clone the repository:
  git clone https://github.com/MegaGrindStone/mcp-web-ui.git
  cd mcp-web-ui
- Configure your environment:
  mkdir -p $HOME/.config/mcpwebui
  cp config.example.yaml $HOME/.config/mcpwebui/config.yaml
- Set up API keys:
  export ANTHROPIC_API_KEY=your_anthropic_key
  export OPENAI_API_KEY=your_openai_key
  export OPENROUTER_API_KEY=your_openrouter_key
Running the Application
Local Development
go mod download
go run ./cmd/server/main.go
Docker Deployment
docker build -t mcp-web-ui .
docker run -p 8080:8080 \
-v $HOME/.config/mcpwebui/config.yaml:/app/config.yaml \
-e ANTHROPIC_API_KEY \
-e OPENAI_API_KEY \
-e OPENROUTER_API_KEY \
mcp-web-ui
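If you prefer Docker Compose, a minimal compose file equivalent to the command above could look like the following sketch (it is not part of the repository, and the service name is an assumption):
services:
  mcp-web-ui:
    image: mcp-web-ui   # image built with `docker build -t mcp-web-ui .`
    ports:
      - "8080:8080"
    volumes:
      - $HOME/.config/mcpwebui/config.yaml:/app/config.yaml
    environment:        # forward provider API keys from the host shell
      - ANTHROPIC_API_KEY
      - OPENAI_API_KEY
      - OPENROUTER_API_KEY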
🔧 Configuration
The configuration file (config.yaml) provides comprehensive settings for customizing the MCP Web UI. Here's a detailed breakdown:
Server Configuration
- port: The port on which the server will run (default: 8080)
- logLevel: Logging verbosity (options: debug, info, warn, error; default: info)
- logMode: Log output format (options: json, text; default: text)
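As a sketch, these settings map onto config.yaml like this (the values shown are the documented defaults):
port: 8080       # HTTP port the server listens on
logLevel: info   # debug, info, warn, or error
logMode: text    # json or text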
Prompt Configuration
- systemPrompt: Default system prompt for the AI assistant
- titleGeneratorPrompt: Prompt used to generate chat titles
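For example, the prompt section of config.yaml could be set as follows (both prompt texts are placeholders):
systemPrompt: You are a helpful assistant.
titleGeneratorPrompt: Generate a short, descriptive title for this conversation.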
LLM (Language Model) Configuration
The llm section supports multiple providers with provider-specific configurations:
Common LLM Parameters
- provider: Choose from: ollama, anthropic, openai, openrouter
- model: Specific model name (e.g., 'claude-3-5-sonnet-20241022')
- parameters: Fine-tune model behavior:
  - temperature: Randomness of responses (0.0-1.0)
  - topP: Nucleus sampling threshold
  - topK: Number of highest-probability tokens to keep
  - frequencyPenalty: Reduce repetition of token sequences
  - presencePenalty: Encourage discussing new topics
  - maxTokens: Maximum response length
  - stop: Sequences to stop generation
  - And more provider-specific parameters
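As a rough sketch, an llm block combining these parameters might look like this (the values are illustrative, and key placement should be checked against config.example.yaml):
llm:
  provider: anthropic
  model: claude-3-5-sonnet-20241022
  maxTokens: 1000            # maximum response length
  parameters:
    temperature: 0.7         # 0.0-1.0; higher means more random output
    topP: 0.9                # nucleus sampling threshold
    presencePenalty: 0.1     # encourage discussing new topics
    stop:
      - "END"                # illustrative stop sequence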
Provider-Specific Configurations
- Ollama:
  - host: Ollama server URL (default: http://localhost:11434)
- Anthropic:
  - apiKey: Anthropic API key (can use the ANTHROPIC_API_KEY environment variable)
  - maxTokens: Maximum token limit
  - Note: Stop sequences containing only whitespace are ignored, and whitespace is trimmed from valid sequences, as Anthropic doesn't support whitespace in stop sequences
- OpenAI:
  - apiKey: OpenAI API key (can use the OPENAI_API_KEY environment variable)
  - endpoint: OpenAI API endpoint (default: https://api.openai.com/v1)
  - For alternative OpenAI-compatible APIs, see this discussion thread
- OpenRouter:
  - apiKey: OpenRouter API key (can use the OPENROUTER_API_KEY environment variable)
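For instance, switching to a local Ollama model would only require changing the provider block (the model name is illustrative, and the placement of host should be checked against config.example.yaml):
llm:
  provider: ollama
  model: llama3                    # any model available in your local Ollama instance
  host: http://localhost:11434     # Ollama server URL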
Title Generator Configuration
The genTitleLLM section allows separate configuration for title generation, defaulting to the main LLM if not specified.
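For example, titles could be generated with a smaller model while chatting with a larger one (the model choice is illustrative):
genTitleLLM:
  provider: openai
  model: gpt-3.5-turbo   # a lighter model used only for chat titles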
MCP Server Configurations
- mcpSSEServers: Configure Server-Sent Events (SSE) servers
  - url: SSE server URL
  - maxPayloadSize: Maximum payload size
- mcpStdIOServers: Configure Standard Input/Output servers
  - command: Command to run the server
  - args: Arguments for the server command
Example MCP Server Configurations
SSE Server Example:
mcpSSEServers:
  filesystem:
    url: https://yoursseserver.com
    maxPayloadSize: 1048576 # 1MB
StdIO Server Examples:
- Using the official filesystem MCP server:
mcpStdIOServers:
  filesystem:
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - "/path/to/your/files"
This example can be used as-is, since the official filesystem MCP server is an executable package that can be run with npx; just update the path to point to your desired directory.
- Using go-mcp filesystem MCP server:
mcpStdIOServers:
  filesystem:
    command: go
    args:
      - run
      - github.com/your_username/your_app # Replace with your app
      - -path
      - "/data/mcp/filesystem" # Path to expose to MCP clients
For this example, you'll need to create a Go application that imports the github.com/MegaGrindStone/go-mcp/servers/filesystem package. The flag name (-path here) is entirely up to how you structure your own application; it doesn't have to be called "path". This example is merely a starting point showing one possible implementation, where a flag specifies which directory to expose; you're free to design your application's structure and command-line interface according to your needs.
Example Configuration Snippet
port: 8080
logLevel: info
systemPrompt: You are a helpful assistant.
llm:
  provider: anthropic
  model: claude-3-5-sonnet-20241022
  maxTokens: 1000
  parameters:
    temperature: 0.7
genTitleLLM:
  provider: openai
  model: gpt-3.5-turbo
🏗 Project Structure
- cmd/: Application entry point
- internal/handlers/: Web request handlers
- internal/models/: Data models
- internal/services/: LLM provider integrations
- static/: Static assets (CSS)
- templates/: HTML templates
🤝 Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push and create a Pull Request
📄 License
MIT License