2025-04-06

This application demonstrates the use of an MCP server and client in a chat with a local model running via Ollama, with web search integrated through Serper.


🔍 🤖 🌐 Ollama Chat with MCP

A powerful demonstration of integrating local LLMs with real-time web search capabilities using the Model Context Protocol (MCP).

Overview

Ollama Chat with MCP showcases how to extend a local language model's capabilities through tool use. This application combines the power of locally running LLMs via Ollama with up-to-date web search functionality provided by an MCP server.

The project consists of three main components:

  • MCP Web Search Server: Provides web search functionality using the Serper.dev API
  • Terminal Client: A CLI interface for chat and search interactions
  • Web Frontend: A user-friendly Gradio-based web interface

By using this architecture, the application demonstrates how MCP enables local models to access external tools and data sources, significantly enhancing their capabilities.
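As a rough sketch of the server side, the search tool boils down to calling Serper.dev and rendering the JSON response into plain-text context for the model. The function and field names below are illustrative (the field names follow Serper.dev's "organic" result shape), not the project's actual server.py code:

```python
# Illustrative sketch only: not the actual server.py implementation.

def format_search_results(results: list[dict], max_results: int = 5) -> str:
    """Render Serper-style organic results as numbered context lines."""
    lines = []
    for i, item in enumerate(results[:max_results], start=1):
        title = item.get("title", "(no title)")
        link = item.get("link", "")
        snippet = item.get("snippet", "")
        lines.append(f"{i}. {title}\n   {link}\n   {snippet}")
    return "\n\n".join(lines)

sample = [{"title": "Example", "link": "https://example.com", "snippet": "A snippet."}]
print(format_search_results(sample))
```

Capping the number of results keeps the prompt small enough to fit comfortably in a local model's context window.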

Features

  • 🔎 Web-enhanced chat: Access real-time web search results during conversation
  • 🧠 Local model execution: Uses Ollama to run models entirely on your own hardware
  • 🔌 MCP integration: Demonstrates practical implementation of the Model Context Protocol
  • 🌐 Dual interfaces: Choose between terminal CLI or web-based GUI
  • 📊 Structured search results: Clean formatting of web search data for optimal context
  • 🔄 Conversation memory: Maintains context throughout the chat session

Requirements

  • Python 3.11+
  • Ollama installed and running locally
  • A Serper.dev API key (free tier available)
  • Internet connection for web searches

Installation

  1. Clone the repository:

    git clone https://github.com/redbuilding/ollama-chat-with-mcp.git
    cd ollama-chat-with-mcp
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Create a .env file in the project root with your Serper.dev API key:

    SERPER_API_KEY=your_serper_api_key_here
    
  4. Ensure Ollama is installed and running, and that the hardcoded default model (qwen2.5:14b) is available:

    ollama pull qwen2.5:14b
    

Usage

Starting the Web Interface

To use the web-based interface:

python chat_frontend.py

This starts the Gradio web interface, typically accessible at http://localhost:7860.

Using the Terminal Client

To use the command-line interface:

python chat_client.py

Search Commands

In both interfaces, you can use special commands to trigger web searches:

  • Search and summarize: #search for "financial market outlook April 2025"
  • Search and answer a question: #search for "reality TV this week" and what happened recently?

Other Commands

  • Clear conversation history: #clear
  • Exit the application: exit or quit
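A parser for these commands could look roughly like the sketch below. This is a hypothetical illustration; the actual clients may recognize the commands differently:

```python
import re

# Hypothetical command parser; the real chat_client.py may differ.
def parse_command(text: str):
    """Classify a user input line into (kind, payload)."""
    text = text.strip()
    if text.lower() in ("exit", "quit"):
        return ("exit", None)
    if text == "#clear":
        return ("clear", None)
    m = re.match(r'#search for "([^"]+)"(?:\s+and\s+(.+))?$', text)
    if m:
        return ("search", {"query": m.group(1), "question": m.group(2)})
    return ("chat", text)

print(parse_command('#search for "reality TV this week" and what happened recently?'))
```

The optional `and …` clause separates the search query from the follow-up question, so the client can search on one string and answer another.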

How It Works

  1. The MCP server exposes a web search capability as a tool
  2. When a user requests search information, the client sends a query to the MCP server
  3. The server processes the request through Serper.dev and returns formatted results
  4. The client constructs an enhanced prompt including the search results
  5. The local Ollama model receives this prompt and generates an informed response
  6. The response is displayed to the user with search attribution
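Steps 4 and 5 amount to wrapping the user's question and the formatted results into a single prompt for the Ollama model. A minimal sketch (the project's actual prompt template may differ):

```python
# Hypothetical prompt builder; the real clients may use a different template.
def build_search_prompt(question: str, search_context: str) -> str:
    """Embed formatted web search results into the prompt for the local model."""
    return (
        "Use the web search results below to answer the question. "
        "Cite the results you relied on.\n\n"
        f"Search results:\n{search_context}\n\n"
        f"Question: {question}"
    )

prompt = build_search_prompt(
    "What is the financial market outlook for April 2025?",
    "1. Example headline\n   https://example.com\n   Markets were mixed.",
)
print(prompt)
```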

File Structure

  • server.py - MCP server with web search tool
  • chat_client.py - Terminal-based chat client
  • chat_frontend.py - Gradio web interface client
  • requirements.txt - Project dependencies
  • .env - Configuration for API keys (create this file and add your Serper.dev API key)

Customization

  • Change the Ollama model by modifying the model name in the chat client files
  • Adjust the number of search results by changing the max_results parameter
  • Modify the prompt templates to better suit your specific use case
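Gathered together, the customization points above might look like a small settings block near the top of a client file. These names are illustrative; the actual files may hardcode the values inline:

```python
# Illustrative customization points; not the project's actual variable names.
OLLAMA_MODEL = "qwen2.5:14b"   # swap for any model pulled with `ollama pull`
MAX_RESULTS = 5                # how many Serper results to pass as context

def describe_config() -> str:
    """Summarize the active settings, e.g. for a startup log line."""
    return f"model={OLLAMA_MODEL}, max_results={MAX_RESULTS}"

print(describe_config())
```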

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.
