2025-04-14

MCP client for local Ollama models


Ollama MCP (Model Context Protocol)

Ollama MCP is a tool for connecting Ollama-based language models with external tools and services using the Model Context Protocol (MCP). This integration enables LLMs to interact with various systems like Git repositories, shell commands, and other tool-enabled services.

Features

  • Seamless integration between Ollama language models and MCP servers
  • Support for Git operations through MCP Git server
  • Extensible tool management system
  • Interactive command-line assistant interface

Installation

  1. Ensure you have Python 3.13+ installed
  2. Clone this repository
  3. Install dependencies:
# Create a virtual environment
uv venv
# Activate the virtual environment
source .venv/bin/activate
# Install the package in development mode
uv pip install -e .

Usage

Running the Git Assistant

uv run main.py

Running tests

pytest -xvs tests/test_ollama_toolmanager.py

This will start an interactive CLI where you can ask the assistant to perform Git operations.

Extending with Custom Tools

You can extend the system by:

  1. Creating new tool wrappers
  2. Registering them with the OllamaToolManager
  3. Connecting to different MCP servers
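The registration pattern behind these steps can be sketched in standalone form. Note this is an illustrative sketch only: the `ToolManagerSketch` class and its `register`/`execute` methods are stand-ins, and the actual `OllamaToolManager` interface in this repository may differ.

```python
import asyncio
import inspect

class ToolManagerSketch:
    """Illustrative stand-in for OllamaToolManager (real API may differ)."""

    def __init__(self):
        self._tools = {}

    def register(self, name, fn, description=""):
        # Store the callable plus metadata the model can use to pick tools
        self._tools[name] = {
            "fn": fn,
            "description": description,
            "signature": str(inspect.signature(fn)),
        }

    async def execute(self, name, **kwargs):
        # Look up the registered tool and invoke it, awaiting coroutines
        fn = self._tools[name]["fn"]
        result = fn(**kwargs)
        return await result if inspect.iscoroutine(result) else result

def shell_echo(text: str) -> str:
    # A trivial tool wrapper; a real one would call a shell or an MCP server
    return f"echo: {text}"

manager = ToolManagerSketch()
manager.register("echo", shell_echo, description="Echo text back")
print(asyncio.run(manager.execute("echo", text="hi")))  # → echo: hi
```

A custom tool wrapper is then just a function (sync or async) registered under a name; connecting to a different MCP server means registering the tools that server advertises.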

Components

  • OllamaToolManager: Manages tool registrations and execution
  • MCPClient: Handles communication with MCP servers
  • OllamaAgent: Orchestrates Ollama LLM and tool usage
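How these components fit together can be sketched as a simplified, standalone agent turn: the model's reply either carries tool calls (dispatched through the tool manager) or plain text. This is a hypothetical simplification; the real `OllamaAgent`, `OllamaToolManager`, and `MCPClient` interfaces may differ.

```python
# Simplified sketch of one agent turn: dispatch any tool calls in the
# model reply, otherwise return the model's text answer directly.
def run_agent_turn(model_reply: dict, tools: dict) -> str:
    calls = model_reply.get("tool_calls", [])
    if not calls:
        return model_reply["content"]      # plain answer, no tools needed
    results = []
    for call in calls:
        fn = tools[call["name"]]           # look up the registered tool
        results.append(str(fn(**call["arguments"])))
    return "\n".join(results)

# A fake registered tool and a fake model reply, for illustration only
tools = {"git_status": lambda repo: f"clean working tree in {repo}"}
reply = {"tool_calls": [{"name": "git_status",
                         "arguments": {"repo": "/path/to/repo"}}]}
print(run_agent_turn(reply, tools))  # → clean working tree in /path/to/repo
```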

Examples

# Creating a Git-enabled agent
# (StdioServerParameters comes from the mcp package; MCPClient is this project's client)
from mcp import StdioServerParameters

git_params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-git", "--repository", "/path/to/repo"],
    env=None
)

# Connect and register tools
async with MCPClient(git_params) as client:
    # Register the server's tools with the agent,
    # then use the agent for Git operations
    ...

Requirements

  • Python 3.13+
  • MCP 1.5.0+
  • Ollama 0.4.7+
