2025-03-31

A web interface that connects local Ollama LLMs to Model Context Protocol (MCP) servers. It lets open-source models use file operations, web search, and reasoning tools similar to those of commercial AI assistants, all running privately on your own hardware.

GitHub: 2 stars · 1 fork · 1 watcher

Ollama-MCP Bridge WebUI

A TypeScript implementation that connects local LLMs (via Ollama) to Model Context Protocol (MCP) servers with a web interface. This bridge allows open-source models to use the same tools and capabilities as Claude, enabling powerful local AI assistants that run entirely on your own hardware.
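At a high level, the bridge advertises the connected MCP servers' tools to Ollama and relays any tool calls the model makes back to the server that owns the tool. The repository's actual code is not reproduced here; the TypeScript sketch below only illustrates that round trip against a local Ollama instance (the askOllama helper and the web_search schema are illustrative assumptions, not part of the project).

// Illustrative sketch only (requires Node 18+ for global fetch): send a chat
// request with a tool definition to a local Ollama instance and inspect any
// tool calls in the reply. The bridge's real implementation differs.
interface ToolCall {
  function: { name: string; arguments: Record<string, unknown> };
}

async function askOllama(prompt: string) {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen2.5-coder:7b-instruct-q4_K_M",
      stream: false,
      messages: [{ role: "user", content: prompt }],
      // In the bridge, tool schemas would be gathered from the connected MCP servers.
      tools: [
        {
          type: "function",
          function: {
            name: "web_search", // hypothetical tool name for illustration
            description: "Search the web for current information",
            parameters: {
              type: "object",
              properties: { query: { type: "string" } },
              required: ["query"],
            },
          },
        },
      ],
    }),
  });
  const data = await response.json();
  const toolCalls: ToolCall[] = data.message?.tool_calls ?? [];
  // A bridge would dispatch each tool call to the MCP server that owns the tool
  // and feed the result back to the model; here we just return what Ollama said.
  return { answer: data.message?.content as string | undefined, toolCalls };
}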

Features

  • Multi-MCP Integration: Connect multiple MCP servers simultaneously
  • Tool Detection: Automatically identifies which tool to use based on queries
  • Web Interface: Clean UI with collapsible tool descriptions
  • Comprehensive Toolset: Filesystem, web search, and reasoning capabilities

Setup

Automatic Installation

The easiest way to set up the bridge is to use the included installation script:

./install.bat

This script will:

  1. Check for and install Node.js if needed
  2. Check for and install Ollama if needed
  3. Install all dependencies
  4. Create the workspace directory (../workspace)
  5. Set up initial configuration
  6. Build the TypeScript project
  7. Download the Qwen model for Ollama

After running the script, you only need to:

  1. Add your API keys to the .env file (the $VARIABLE_NAME references in the config will be replaced with actual values)

Manual Setup

If you prefer to set up manually:

  1. Install Ollama from ollama.com/download
  2. Pull the Qwen model: ollama pull qwen2.5-coder:7b-instruct-q4_K_M
  3. Install dependencies: npm install
  4. Create a workspace directory: mkdir ../workspace
  5. Configure API keys in .env
  6. Build the project: npm run build
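Before building, it can help to confirm that Ollama is reachable and the model from step 2 has been pulled. The following TypeScript check is not part of the repository; it simply queries Ollama's /api/tags endpoint on the default port 11434 (the same base URL used in the configuration below).

// Optional verification step (not part of the project): confirm Ollama is
// running and the Qwen model has been pulled. Requires Node 18+ for fetch.
async function checkOllama(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/tags"); // Ollama's installed-model list
  if (!res.ok) {
    throw new Error("Ollama does not appear to be running on port 11434");
  }
  const { models } = (await res.json()) as { models: { name: string }[] };
  const expected = "qwen2.5-coder:7b-instruct-q4_K_M";
  console.log(
    models.some((m) => m.name === expected)
      ? `${expected} is available.`
      : `${expected} not found; run: ollama pull ${expected}`
  );
}

checkOllama().catch(console.error);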

Configuration

The bridge is configured through two main files:

1. bridge_config.json

This file defines MCP servers, LLM settings, and system prompt. Environment variables are referenced with $VARIABLE_NAME syntax.

Example:

{
  "mcpServers": {
    "filesystem": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-filesystem/dist/index.js",
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/../workspace"
      ],
      "allowedDirectory": "To/Your/Directory/Ollama-MCP-Bridge-WebUI/../workspace"
    },
    "brave-search": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-brave-search/dist/index.js"
      ],
      "env": {
        "BRAVE_API_KEY": "$BRAVE_API_KEY"
      }
    },
    "sequential-thinking": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-sequential-thinking/dist/index.js"
      ]
    }
  },
  "llm": {
    "model": "qwen2.5-coder:7b-instruct-q4_K_M",
    "baseUrl": "http://localhost:11434",
    "apiKey": "ollama",
    "temperature": 0.7,
    "maxTokens": 8000
  },
  "systemPrompt": "You are a helpful assistant that can use various tools to help answer questions. You have access to three main tool groups: 1) Filesystem operations - for working with files and directories, 2) Brave search - for finding information on the web, 3) Sequential thinking for complex problem-solving. When a user asks a question that requires external information, real-time data, or file manipulation, you should use a tool rather than guessing or using only your pre-trained knowledge."
}
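Each entry under mcpServers describes how to launch one MCP server as a child process (command, arguments, optional environment). How the bridge consumes these entries internally is not shown in this README; as an illustration only, the official @modelcontextprotocol/sdk client could connect to a single entry roughly like this (names and versions below are placeholders).

// Illustrative sketch (not the project's actual code): connect to one
// configured MCP server over stdio with the official TypeScript SDK and
// list the tools it exposes.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

interface ServerEntry {
  command: string;
  args: string[];
  env?: Record<string, string>;
}

async function connectServer(name: string, entry: ServerEntry) {
  const transport = new StdioClientTransport({
    command: entry.command,
    args: entry.args,
    env: entry.env,
  });
  const client = new Client({ name: `bridge-${name}`, version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);
  const { tools } = await client.listTools(); // tool schemas later offered to the LLM
  console.log(name, "exposes:", tools.map((t) => t.name).join(", "));
  return client;
}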

2. .env file

This file stores sensitive information like API keys:

# Brave Search API key
BRAVE_API_KEY=your_brave_key_here

The bridge will automatically replace $BRAVE_API_KEY in the configuration with the actual value from your .env file.
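A minimal sketch of how that substitution could work, assuming dotenv and a simple token replacement over the raw config text; the bridge's own loader may be implemented differently.

// Hypothetical illustration of $VARIABLE_NAME substitution; not the project's code.
import { readFileSync } from "node:fs";
import dotenv from "dotenv";

dotenv.config(); // loads BRAVE_API_KEY (and any other keys) from .env into process.env

const raw = readFileSync("bridge_config.json", "utf8");
// Replace every $NAME token with the matching environment variable, if set.
const substituted = raw.replace(/\$([A-Z0-9_]+)/g, (token, name) => process.env[name] ?? token);
const config = JSON.parse(substituted);

console.log(
  config.mcpServers["brave-search"].env.BRAVE_API_KEY.startsWith("$")
    ? "BRAVE_API_KEY was not substituted"
    : "BRAVE_API_KEY loaded"
);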

Usage

Starting the Bridge

Simply run:

./start.bat

This will start the bridge with the web interface.

Web Interface

Open http://localhost:8080 (or the port shown in the console) in your browser to access the web interface.

License

MIT License - See LICENSE file for details


Reviews

user_IlrmDKUi (2025-04-16)

I have been using Ollama-MCP-Bridge-WebUI and it's fantastic! The seamless integration and user-friendly interface make it easy to navigate and utilize. Kudos to Rkm1999 for developing such a robust tool. For anyone using MCP applications, this is a must-have. Highly recommended! Check it out at https://github.com/Rkm1999/Ollama-MCP-Bridge-WebUI.