
🔍 🤖 🌐 Ollama Chat with MCP
A powerful demonstration of integrating local LLMs with real-time web search capabilities using the Model Context Protocol (MCP).
Overview
Ollama Chat with MCP showcases how to extend a local language model's capabilities through tool use. This application combines the power of locally running LLMs via Ollama with up-to-date web search functionality provided by an MCP server.
The project consists of three main components:
- MCP Web Search Server: Provides web search functionality using the Serper.dev API
- Terminal Client: A CLI interface for chat and search interactions
- Web Frontend: A user-friendly Gradio-based web interface
By using this architecture, the application demonstrates how MCP enables local models to access external tools and data sources, significantly enhancing their capabilities.
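To make that flow concrete, here is a minimal client-side sketch using the official MCP Python SDK: it spawns the server as a subprocess and invokes a search tool over stdio. The tool name `web_search` and its `query` argument are illustrative assumptions; the real names are defined in server.py.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch server.py as a subprocess and speak MCP over stdio.
server_params = StdioServerParameters(command="python", args=["server.py"])

async def demo_search(query: str) -> str:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()  # MCP handshake
            # Tool name and arguments are hypothetical; see server.py for the real ones.
            result = await session.call_tool("web_search", {"query": query})
            return result.content[0].text

if __name__ == "__main__":
    print(asyncio.run(demo_search("financial market outlook April 2025")))
```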
Features
- 🔎 Web-enhanced chat: Access real-time web search results during conversation
- 🧠 Local model execution: Uses Ollama to run models entirely on your own hardware
- 🔌 MCP integration: Demonstrates practical implementation of the Model Context Protocol
- 🌐 Dual interfaces: Choose between terminal CLI or web-based GUI
- 📊 Structured search results: Clean formatting of web search data for optimal context
- 🔄 Conversation memory: Maintains context throughout the chat session
Requirements
- Python 3.11+
- Ollama installed and running locally
- A Serper.dev API key (free tier available)
- Internet connection for web searches
Installation
- Clone the repository:
  git clone https://github.com/redbuilding/ollama-chat-with-mcp.git
  cd ollama-chat-with-mcp
- Install dependencies:
  pip install -r requirements.txt
- Create a `.env` file in the project root with your Serper.dev API key:
  SERPER_API_KEY=your_serper_api_key_here
- Ensure Ollama is installed and the model hardcoded in the chat clients (default `qwen2.5:14b`) has been pulled:
  ollama pull qwen2.5:14b
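Before launching either interface, a quick sanity check along these lines can confirm the key is readable and Ollama is running. It assumes the `requests` and `python-dotenv` packages and Ollama's default local API port (11434); adjust if your setup differs.

```python
import os

import requests
from dotenv import load_dotenv

load_dotenv()  # reads SERPER_API_KEY from .env
assert os.getenv("SERPER_API_KEY"), "SERPER_API_KEY missing from .env"

# Ollama's local REST API lists pulled models at /api/tags.
tags = requests.get("http://localhost:11434/api/tags").json()
models = [m["name"] for m in tags.get("models", [])]
print("qwen2.5:14b pulled:", "qwen2.5:14b" in models)
```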
Usage
Starting the Web Interface
To use the web-based interface:
python chat_frontend.py
This will start the Gradio web interface, typically accessible at http://localhost:7860
Using the Terminal Client
To use the command-line interface:
python chat_client.py
Search Commands
In both interfaces, you can use special commands to trigger web searches:
- Search and summarize:
  `#search for "financial market outlook April 2025"`
- Search and answer a question:
  `#search for "reality TV this week" and what happened recently?`
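As a rough idea of how these commands could be split into a search query and an optional follow-up question, here is a hypothetical parser; the actual logic lives in the client files and may differ.

```python
import re

# Matches: #search for "<query>" [and <question>]
SEARCH_RE = re.compile(r'^#search for\s+"([^"]+)"(?:\s+and\s+(.+))?$', re.IGNORECASE)

def parse_search(line: str):
    """Return (query, question) for a search command, or None otherwise."""
    match = SEARCH_RE.match(line.strip())
    if not match:
        return None
    # question is None for plain search-and-summarize commands
    return match.group(1), match.group(2)

print(parse_search('#search for "financial market outlook April 2025"'))
print(parse_search('#search for "reality TV this week" and what happened recently?'))
```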
Other Commands
- Clear conversation history:
  `#clear`
- Exit the application:
  `exit` or `quit`
How It Works
1. The MCP server exposes a web search capability as a tool
2. When a user requests search information, the client sends a query to the MCP server
3. The server processes the request through Serper.dev and returns formatted results
4. The client constructs an enhanced prompt that includes the search results
5. The local Ollama model receives this prompt and generates an informed response
6. The response is displayed to the user with search attribution
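For steps 1–3 on the server side, a minimal sketch using the MCP Python SDK's FastMCP helper and Serper.dev's search endpoint might look like this. The tool name, result formatting, and `max_results` default are assumptions; server.py holds the project's actual implementation.

```python
import os

import requests
from dotenv import load_dotenv
from mcp.server.fastmcp import FastMCP

load_dotenv()
mcp = FastMCP("web-search")

@mcp.tool()
def web_search(query: str, max_results: int = 5) -> str:
    """Search the web via Serper.dev and return formatted results."""
    resp = requests.post(
        "https://google.serper.dev/search",
        headers={"X-API-KEY": os.environ["SERPER_API_KEY"]},
        json={"q": query, "num": max_results},
        timeout=15,
    )
    resp.raise_for_status()
    hits = resp.json().get("organic", [])[:max_results]
    # One compact block per hit so the model can quote and attribute sources.
    return "\n\n".join(
        f"{h.get('title')}\n{h.get('link')}\n{h.get('snippet')}" for h in hits
    )

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```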
File Structure
- `server.py` - MCP server with web search tool
- `chat_client.py` - Terminal-based chat client
- `chat_frontend.py` - Gradio web interface client
- `requirements.txt` - Project dependencies
- `.env` - Configuration for API keys (create this file and add your Serper key)
Customization
- Change the Ollama model by modifying the model name in the chat client files
- Adjust the number of search results by changing the `max_results` parameter
- Modify the prompt templates to better suit your specific use case
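Assuming the clients call the model through the `ollama` Python package (they may instead use Ollama's REST API directly), switching models is a one-line change along these lines:

```python
import ollama

MODEL = "llama3.1:8b"  # any model you have pulled; replaces the hardcoded "qwen2.5:14b"

response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Summarize these search results: ..."}],
)
print(response["message"]["content"])
```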
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.