2025-03-31


Ollama-MCP Bridge WebUI

A TypeScript implementation that connects local LLMs (via Ollama) to Model Context Protocol (MCP) servers with a web interface. This bridge allows open-source models to use the same tools and capabilities as Claude, enabling powerful local AI assistants that run entirely on your own hardware.

Features

  • Multi-MCP Integration: Connect multiple MCP servers simultaneously
  • Tool Detection: Automatically identifies which tool to use based on queries
  • Web Interface: Clean UI with collapsible tool descriptions
  • Comprehensive Toolset: Filesystem, web search, and reasoning capabilities
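
The tool-detection idea above can be illustrated with a small sketch. This is a hypothetical keyword-based router, not the bridge's actual detection logic (which may instead let the LLM decide via tool-call messages); it only shows the concept of mapping a query to one of the three tool groups.

```typescript
// Hypothetical sketch of query-based tool detection; the bridge's real
// logic may differ. Routes a user query to one of the three tool groups,
// or to null when no tool seems needed.
type ToolName = "filesystem" | "brave-search" | "sequential-thinking" | null;

function pickTool(query: string): ToolName {
  const q = query.toLowerCase();
  if (/\b(file|directory|folder|read|write|save)\b/.test(q)) return "filesystem";
  if (/\b(search|latest|news|web)\b/.test(q)) return "brave-search";
  if (/\b(plan|reason|step by step|break down)\b/.test(q)) return "sequential-thinking";
  return null; // answer from the model's own knowledge
}
```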

Setup

Automatic Installation

The easiest way to set up the bridge is to use the included installation script:

./install.bat

This script will:

  1. Check for and install Node.js if needed
  2. Check for and install Ollama if needed
  3. Install all dependencies
  4. Create the workspace directory (../workspace)
  5. Set up initial configuration
  6. Build the TypeScript project
  7. Download the Qwen model for Ollama

After running the script, you only need to:

  1. Add your API keys to the .env file (the $VARIABLE_NAME references in the config will be replaced with actual values)

Manual Setup

If you prefer to set up manually:

  1. Install Ollama from ollama.com/download
  2. Pull the Qwen model: ollama pull qwen2.5-coder:7b-instruct-q4_K_M
  3. Install dependencies: npm install
  4. Create a workspace directory: mkdir ../workspace
  5. Configure API keys in .env
  6. Build the project: npm run build

Configuration

The bridge is configured through two main files:

1. bridge_config.json

This file defines the MCP servers, LLM settings, and system prompt. Environment variables are referenced with the $VARIABLE_NAME syntax.

Example:

{
  "mcpServers": {
    "filesystem": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-filesystem/dist/index.js",
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/../workspace"
      ],
      "allowedDirectory": "To/Your/Directory/Ollama-MCP-Bridge-WebUI/../workspace"
    },
    "brave-search": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-brave-search/dist/index.js"
      ],
      "env": {
        "BRAVE_API_KEY": "$BRAVE_API_KEY"
      }
    },
    "sequential-thinking": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-sequential-thinking/dist/index.js"
      ]
    }
  },
  "llm": {
    "model": "qwen2.5-coder:7b-instruct-q4_K_M",
    "baseUrl": "http://localhost:11434",
    "apiKey": "ollama",
    "temperature": 0.7,
    "maxTokens": 8000
  },
  "systemPrompt": "You are a helpful assistant that can use various tools to help answer questions. You have access to three main tool groups: 1) Filesystem operations - for working with files and directories, 2) Brave search - for finding information on the web, 3) Sequential thinking for complex problem-solving. When a user asks a question that requires external information, real-time data, or file manipulation, you should use a tool rather than guessing or using only your pre-trained knowledge."
}
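
Each mcpServers entry names a command, its arguments, and optional per-server environment variables. Conceptually, the bridge starts one child process per configured server; the following is a minimal sketch of that idea (assuming stdio transport, which MCP servers commonly use), not the bridge's actual code.

```typescript
import { spawn, type ChildProcess } from "node:child_process";

interface McpServerConfig {
  command: string;
  args: string[];
  env?: Record<string, string>;
}

// Hypothetical sketch: start one configured MCP server as a child process,
// inheriting the parent environment plus any per-server variables. A real
// bridge would also wire up JSON-RPC messaging over stdin/stdout.
function launchServer(name: string, cfg: McpServerConfig): ChildProcess {
  const child = spawn(cfg.command, cfg.args, {
    env: { ...process.env, ...cfg.env },
    stdio: ["pipe", "pipe", "inherit"], // stdin/stdout carry the protocol
  });
  child.on("error", (err) => console.error(`[${name}] failed to start:`, err));
  return child;
}
```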

2. .env file

This file stores sensitive information like API keys:

# Brave Search API key
BRAVE_API_KEY=your_brave_key_here

The bridge will automatically replace $BRAVE_API_KEY in the configuration with the actual value from your .env file.
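
A minimal sketch of how such substitution could work, assuming the config is walked recursively after parsing (the bridge's actual implementation may differ):

```typescript
// Hypothetical helper illustrating $VARIABLE_NAME substitution in a parsed
// config object; not the bridge's actual code.
function resolveEnvRefs(
  value: unknown,
  env: Record<string, string | undefined>
): unknown {
  if (typeof value === "string") {
    // Replace each $NAME token with the matching env value, if present.
    return value.replace(/\$([A-Z_][A-Z0-9_]*)/g, (token, name) => env[name] ?? token);
  }
  if (Array.isArray(value)) {
    return value.map((item) => resolveEnvRefs(item, env));
  }
  if (value !== null && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([k, v]) => [
        k,
        resolveEnvRefs(v, env),
      ])
    );
  }
  return value; // numbers, booleans, null pass through unchanged
}
```

Unresolved references are left untouched rather than replaced with empty strings, so a missing key is easy to spot in the expanded config.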

Usage

Starting the Bridge

Simply run:

./start.bat

This will start the bridge with the web interface.

Web Interface

Open http://localhost:8080 (or the port shown in the console) in your browser to access the web interface.

License

MIT License - See LICENSE file for details

相关推荐

  • NiKole Maxwell
  • I craft unique cereal names, stories, and ridiculously cute Cereal Baby images.

  • https://suefel.com
  • Latest advice and best practices for custom GPT development.

  • Yusuf Emre Yeşilyurt
  • I find academic articles and books for research and literature reviews.

  • https://maiplestudio.com
  • Find Exhibitors, Speakers and more

  • Carlos Ferrin
  • Encuentra películas y series en plataformas de streaming.

  • Joshua Armstrong
  • Confidential guide on numerology and astrology, based of GG33 Public information

  • Contraband Interactive
  • Emulating Dr. Jordan B. Peterson's style in providing life advice and insights.

  • Elijah Ng Shi Yi
  • Advanced software engineer GPT that excels through nailing the basics.

  • rustassistant.com
  • Your go-to expert in the Rust ecosystem, specializing in precise code interpretation, up-to-date crate version checking, and in-depth source code analysis. I offer accurate, context-aware insights for all your Rust programming questions.

  • Emmet Halm
  • Converts Figma frames into front-end code for various mobile frameworks.

  • lumpenspace
  • Take an adjectivised noun, and create images making it progressively more adjective!

  • apappascs
  • Découvrez la collection la plus complète et la plus à jour de serveurs MCP sur le marché. Ce référentiel sert de centre centralisé, offrant un vaste catalogue de serveurs MCP open-source et propriétaires, avec des fonctionnalités, des liens de documentation et des contributeurs.

  • Mintplex-Labs
  • L'application tout-en-un desktop et Docker AI avec chiffon intégré, agents AI, constructeur d'agent sans code, compatibilité MCP, etc.

  • modelcontextprotocol
  • Serveurs de protocole de contexte modèle

  • ShrimpingIt
  • Manipulation basée sur Micropython I2C de l'exposition GPIO de la série MCP, dérivée d'Adafruit_MCP230XX

  • n8n-io
  • Plateforme d'automatisation de workflow à code équitable avec des capacités d'IA natives. Combinez le bâtiment visuel avec du code personnalisé, de l'auto-hôte ou du cloud, 400+ intégrations.

  • WangRongsheng
  • 🧑‍🚀 全世界最好的 LLM 资料总结 (数据处理、模型训练、模型部署、 O1 模型、 MCP 、小语言模型、视觉语言模型) | Résumé des meilleures ressources LLM du monde.

  • open-webui
  • Interface AI conviviale (prend en charge Olllama, Openai API, ...)

  • metorial
  • Versions conteneurisées de centaines de serveurs MCP 📡 🧠 🧠

    Reviews

    2 (1)
    Avatar
    user_IlrmDKUi
    2025-04-16

    I have been using Ollama-MCP-Bridge-WebUI and it's fantastic! The seamless integration and user-friendly interface make it easy to navigate and utilize. Kudos to Rkm1999 for developing such a robust tool. For anyone using MCP applications, this is a must-have. Highly recommended! Check it out at https://github.com/Rkm1999/Ollama-MCP-Bridge-WebUI.