
Ollama MCP Server


An MCP (Model Context Protocol) server for Ollama that enables seamless integration between locally running Ollama models and MCP-compatible applications such as Claude Desktop.

Features

  • List available Ollama models
  • Pull new models from Ollama
  • Chat with models using Ollama's chat API
  • Get detailed model information
  • Automatic port management
  • Environment variable configuration

Prerequisites

  • Node.js (v16 or higher)
  • npm
  • Ollama installed and running locally
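
A quick way to confirm these prerequisites are in place (the curl command assumes Ollama is listening on its default port, 11434):

# Check that Node.js and npm are available and recent enough
node --version   # should report v16 or newer
npm --version

# Confirm Ollama is running locally by listing installed models
curl http://localhost:11434/api/tags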

Installation

Installing via Smithery

To install the Ollama MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @rawveg/ollama-mcp --client claude

Manual Installation

Install globally via npm:

npm install -g @rawveg/ollama-mcp

Installing in Other MCP Applications

To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:

{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}

The settings file location varies by application:

  • Claude Desktop: claude_desktop_config.json in the Claude app data directory
  • Cline: cline_mcp_settings.json in the VS Code global storage
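
Claude Desktop and most MCP clients also accept an optional env map alongside command and args. If your client supports it, a variant like the following sets the server's port and Ollama endpoint via the PORT and OLLAMA_API variables described under Environment Variables below (the values shown are illustrative):

{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ],
      "env": {
        "PORT": "3457",
        "OLLAMA_API": "http://localhost:11434"
      }
    }
  }
}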

Usage

Starting the Server

Simply run:

ollama-mcp

The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:

PORT=3457 ollama-mcp

Environment Variables

  • PORT: Server port (default: 3456). Can be used both when running directly and during Smithery installation:
    # When running directly
    PORT=3457 ollama-mcp
    
    # When installing via Smithery
    PORT=3457 npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
    
  • OLLAMA_API: Ollama API endpoint (default: http://localhost:11434)
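
Both variables can be combined. For example, to run on a different port while pointing at an Ollama instance on another machine (the host address here is purely illustrative):

PORT=3457 OLLAMA_API=http://192.168.1.50:11434 ollama-mcp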

API Endpoints

  • GET /models - List available models
  • POST /models/pull - Pull a new model
  • POST /chat - Chat with a model
  • GET /models/:name - Get model details
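
For debugging, the endpoints above can be exercised directly with curl. The request bodies below are a sketch that assumes the server mirrors Ollama's own API shapes (a model name for pulls; model plus messages for chat), and llama3.2 is just an example model:

# List available models
curl http://localhost:3456/models

# Pull a new model (body shape assumed)
curl -X POST http://localhost:3456/models/pull \
  -H "Content-Type: application/json" \
  -d '{"name": "llama3.2"}'

# Chat with a model (body shape assumed)
curl -X POST http://localhost:3456/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hello"}]}'

# Get details for a specific model
curl http://localhost:3456/models/llama3.2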

Development

  1. Clone the repository:
     git clone https://github.com/rawveg/ollama-mcp.git
     cd ollama-mcp
  2. Install dependencies:
     npm install
  3. Build the project:
     npm run build
  4. Start the server:
     npm start
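
To smoke-test a local build, start the server in one terminal and hit the list-models endpoint from another (assumes Ollama is running locally and the default port 3456 is free):

# Terminal 1
npm start

# Terminal 2
curl http://localhost:3456/models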

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT

