
🔍 🤖 🌐 Ollama Chat with MCP

A powerful demonstration of integrating local LLMs with real-time web search capabilities using the Model Context Protocol (MCP).

Overview

Ollama Chat with MCP showcases how to extend a local language model's capabilities through tool use. This application combines the power of locally running LLMs via Ollama with up-to-date web search functionality provided by an MCP server.

The project consists of three main components:

  • MCP Web Search Server: Provides web search functionality using the Serper.dev API
  • Terminal Client: A CLI interface for chat and search interactions
  • Web Frontend: A user-friendly Gradio-based web interface

This architecture demonstrates how MCP enables a local model to access external tools and data sources, significantly extending its capabilities.
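
To make the server side concrete, here is a minimal sketch of what an MCP web search server can look like, using the FastMCP helper from the official MCP Python SDK and Serper.dev's search endpoint. The tool name web_search and its parameters are illustrative assumptions, not necessarily the identifiers used in server.py:

    # Minimal MCP server sketch (illustrative): one web search tool over stdio
    import os

    import requests
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("web-search")

    @mcp.tool()
    def web_search(query: str, max_results: int = 5) -> str:
        """Search the web via Serper.dev and return formatted results."""
        resp = requests.post(
            "https://google.serper.dev/search",
            headers={"X-API-KEY": os.environ["SERPER_API_KEY"]},
            json={"q": query},
            timeout=10,
        )
        resp.raise_for_status()
        hits = resp.json().get("organic", [])[:max_results]
        return "\n\n".join(
            f"{h.get('title')}\n{h.get('link')}\n{h.get('snippet', '')}"
            for h in hits
        )

    if __name__ == "__main__":
        mcp.run()  # defaults to the stdio transport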

Features

  • 🔎 Web-enhanced chat: Access real-time web search results during conversation
  • 🧠 Local model execution: Uses Ollama to run models entirely on your own hardware
  • 🔌 MCP integration: Demonstrates practical implementation of the Model Context Protocol
  • 🌐 Dual interfaces: Choose between terminal CLI or web-based GUI
  • 📊 Structured search results: Clean formatting of web search data for optimal context
  • 🔄 Conversation memory: Maintains context throughout the chat session

Requirements

  • Python 3.11+
  • Ollama installed and running locally
  • A Serper.dev API key (free tier available)
  • Internet connection for web searches

Installation

  1. Clone the repository:

    git clone https://github.com/redbuilding/ollama-chat-with-mcp.git
    cd ollama-chat-with-mcp
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Create a .env file in the project root with your Serper.dev API key (loaded at startup; see the sketch after these steps):

    SERPER_API_KEY=your_serper_api_key_here
    
  4. Ensure Ollama is installed and that the model hardcoded in the chat clients (qwen2.5:14b by default) is available:

    ollama pull qwen2.5:14b
    
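
The code reads this key at startup; a minimal sketch of that loading step, assuming python-dotenv is used (the repo's exact mechanism may differ):

    # Hypothetical startup snippet: load the Serper key from .env
    import os

    from dotenv import load_dotenv

    load_dotenv()  # reads .env from the project root
    if not os.getenv("SERPER_API_KEY"):
        raise RuntimeError("SERPER_API_KEY is not set; add it to your .env file")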

Usage

Starting the Web Interface

To use the web-based interface:

python chat_frontend.py

This will start the Gradio web interface, typically accessible at http://localhost:7860.
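
For orientation, here is a stripped-down sketch of how a Gradio chat frontend can wrap a local Ollama model; the real chat_frontend.py adds search handling and conversation memory, and the names below are illustrative:

    # Minimal Gradio + Ollama chat sketch (illustrative)
    import gradio as gr
    import ollama

    MODEL = "qwen2.5:14b"  # the model the project hardcodes by default

    def respond(message, history):
        # With type="messages", history is a list of {"role", "content"} dicts
        messages = [{"role": m["role"], "content": m["content"]} for m in history]
        messages.append({"role": "user", "content": message})
        reply = ollama.chat(model=MODEL, messages=messages)
        return reply["message"]["content"]

    gr.ChatInterface(respond, type="messages").launch()  # port 7860 by default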

Using the Terminal Client

To use the command-line interface:

python chat_client.py

Search Commands

In both interfaces, you can use special commands to trigger web searches:

  • Search and summarize: #search for "financial market outlook April 2025"
  • Search and answer a question: #search for "reality TV this week" and what happened recently?
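
Commands like these can be recognized with a small parser; the sketch below is a guess at the shape of that logic, not the repo's actual code:

    # Hypothetical parser for the '#search for "..."' command syntax
    import re

    SEARCH_RE = re.compile(r'#search for\s+"([^"]+)"\s*(.*)', re.IGNORECASE)

    def parse_search(text: str):
        """Return (query, follow_up) for a search command, else None."""
        m = SEARCH_RE.match(text.strip())
        if not m:
            return None
        return m.group(1), m.group(2).strip() or None

    print(parse_search('#search for "reality TV this week" and what happened recently?'))
    # -> ('reality TV this week', 'and what happened recently?')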

Other Commands

  • Clear conversation history: #clear
  • Exit the application: exit or quit

How It Works

  1. The MCP server exposes a web search capability as a tool
  2. When a user requests search information, the client sends a query to the MCP server
  3. The server processes the request through Serper.dev and returns formatted results
  4. The client constructs an enhanced prompt including the search results
  5. The local Ollama model receives this prompt and generates an informed response
  6. The response is displayed to the user with search attribution
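
Condensed into code, steps 2 through 5 might look like the sketch below, which uses the MCP Python SDK's stdio client together with the ollama package; the tool name web_search is an assumption:

    # Sketch of steps 2-5 (illustrative): call the search tool, then the model
    import asyncio

    import ollama
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def answer_with_search(query: str, question: str) -> str:
        # Step 2: connect to the MCP server and send the search query
        server = StdioServerParameters(command="python", args=["server.py"])
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Step 3: the server queries Serper.dev and returns formatted text
                result = await session.call_tool("web_search", {"query": query})
                results_text = result.content[0].text

        # Step 4: build an enhanced prompt around the search results
        prompt = (
            f"Web search results for '{query}':\n\n{results_text}\n\n"
            f"Using these results, answer: {question}"
        )
        # Step 5: the local Ollama model generates an informed response
        reply = ollama.chat(model="qwen2.5:14b",
                            messages=[{"role": "user", "content": prompt}])
        return reply["message"]["content"]

    print(asyncio.run(answer_with_search(
        "financial market outlook April 2025", "What is the outlook?")))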

File Structure

  • server.py - MCP server with web search tool
  • chat_client.py - Terminal-based chat client
  • chat_frontend.py - Gradio web interface client
  • requirements.txt - Project dependencies
  • .env - Configuration for API keys (create this file and add your Serper.dev key)

Customization

  • Change the Ollama model by modifying the model name in the chat client files
  • Adjust the number of search results by changing the max_results parameter
  • Modify the prompt templates to better suit your specific use case
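
As an illustration, these are the kinds of values to look for in chat_client.py and chat_frontend.py; the identifiers below are hypothetical stand-ins, not the repo's actual names:

    # Hypothetical customization points (actual identifiers may differ)
    MODEL_NAME = "llama3.1:8b"  # any model you have pulled with Ollama
    MAX_RESULTS = 3             # fewer search results -> shorter prompt context

    PROMPT_TEMPLATE = (
        "Answer the question using the web results below, citing "
        "the result titles you relied on.\n\n"
        "Results:\n{results}\n\n"
        "Question: {question}"
    )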

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.
