Ollama MCP (Model Context Protocol)
Ollama MCP is a tool for connecting Ollama-based language models with external tools and services using the Model Context Protocol (MCP). This integration enables LLMs to interact with various systems like Git repositories, shell commands, and other tool-enabled services.
Features
- Seamless integration between Ollama language models and MCP servers
- Support for Git operations through MCP Git server
- Extensible tool management system
- Interactive command-line assistant interface
Installation
- Ensure you have Python 3.13+ installed
- Clone this repository
- Install dependencies:
# Create a virtual environment
uv venv
# Activate the virtual environment
source .venv/bin/activate
# Install the package in development mode
uv pip install -e .
Usage
Running the Git Assistant
uv run main.py
This will start an interactive CLI where you can ask the assistant to perform Git operations.
To run the tests:
pytest -xvs tests/test_ollama_toolmanager.py
Extending with Custom Tools
You can extend the system by:
- Creating new tool wrappers
- Registering them with the OllamaToolManager (a minimal sketch follows this list)
- Connecting to different MCP servers
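A rough illustration of what such a tool wrapper could look like is shown below. This is a hypothetical sketch only: the import path and the register_tool signature are assumptions about the OllamaToolManager API, not the documented interface of this repository.
# Hypothetical sketch: the import path and register_tool signature are
# assumptions, not the documented API of this repository.
import subprocess

from ollama_mcp_client import OllamaToolManager  # assumed import path

def run_shell(command: str) -> str:
    """Run a shell command and return its combined stdout/stderr."""
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return result.stdout + result.stderr

# Register the wrapper so the model can invoke it by name (assumed method).
tool_manager = OllamaToolManager()
tool_manager.register_tool(
    name="run_shell",
    description="Execute a shell command and return its output",
    function=run_shell,
)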
Components
- OllamaToolManager: Manages tool registrations and execution
- MCPClient: Handles communication with MCP servers
- OllamaAgent: Orchestrates Ollama LLM and tool usage
Examples
# Creating a Git-enabled agent
from mcp import StdioServerParameters

git_params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-git", "--repository", "/path/to/repo"],
    env=None,
)

# Connect and register tools (run this inside an async function)
async with MCPClient(git_params) as client:
    # Register tools with the agent
    # Use the agent for Git operations
    ...
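For orientation, the fragment above can be expanded into an end-to-end sketch that ties the components together. Treat it as a hypothetical illustration only: StdioServerParameters comes from the mcp package, but the import path for MCPClient, OllamaToolManager, and OllamaAgent, and every name marked "assumed" below (get_available_tools, register_mcp_tool, run), are placeholders rather than the confirmed API of this project.
# Hypothetical end-to-end sketch; names marked "assumed" are placeholders.
import asyncio

from mcp import StdioServerParameters
from ollama_mcp_client import MCPClient, OllamaToolManager, OllamaAgent  # assumed import path

async def main():
    # Parameters for launching the MCP Git server over stdio
    git_params = StdioServerParameters(
        command="uvx",
        args=["mcp-server-git", "--repository", "/path/to/repo"],
        env=None,
    )

    async with MCPClient(git_params) as client:
        # Discover the tools exposed by the MCP Git server (assumed method)
        tools = await client.get_available_tools()

        # Register each MCP tool so the model can call it (assumed method)
        manager = OllamaToolManager()
        for tool in tools:
            manager.register_mcp_tool(tool, client)

        # Ask the agent a Git question; it decides when to call the registered
        # tools (assumed constructor and method)
        agent = OllamaAgent(model="llama3.2", tool_manager=manager)
        reply = await agent.run("Summarize the most recent commit")
        print(reply)

asyncio.run(main())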
Requirements
- Python 3.13+
- MCP 1.5.0+
- Ollama 0.4.7+