
Ollama MCP Server
An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.
Features
- List available Ollama models
- Pull new models from Ollama
- Chat with models using Ollama's chat API
- Get detailed model information
- Automatic port management
- Environment variable configuration
Prerequisites
- Node.js (v16 or higher)
- npm
- Ollama installed and running locally
Installation
Installing via Smithery
To install Ollama MCP Server for Claude Desktop automatically via Smithery:
```shell
npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
```
Manual Installation
Install globally via npm:
```shell
npm install -g @rawveg/ollama-mcp
```
Installing in Other MCP Applications
To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
```
The settings file location varies by application:
- Claude Desktop: `claude_desktop_config.json` in the Claude app data directory
- Cline: `cline_mcp_settings.json` in the VS Code global storage
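If the server needs to run on a non-default port, many MCP clients (including Claude Desktop) allow per-server environment variables in the same settings file. A sketch, assuming your client supports an `env` block (verify against your client's documentation):

```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": ["-y", "@rawveg/ollama-mcp"],
      "env": {
        "PORT": "3457"
      }
    }
  }
}
```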
Usage
Starting the Server
Simply run:

```shell
ollama-mcp
```

The server will start on port 3456 by default. You can specify a different port using the `PORT` environment variable:

```shell
PORT=3457 ollama-mcp
```
Environment Variables
- `PORT`: Server port (default: 3456). Can be used both when running directly and when installing via Smithery:

  ```shell
  # When running directly
  PORT=3457 ollama-mcp

  # When installing via Smithery
  PORT=3457 npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
  ```

- `OLLAMA_API`: Ollama API endpoint (default: `http://localhost:11434`)
API Endpoints
- `GET /models` - List available models
- `POST /models/pull` - Pull a new model
- `POST /chat` - Chat with a model
- `GET /models/:name` - Get model details
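As a sketch of how a client might exercise these endpoints over HTTP: the code below assumes the server is running on the default port 3456 and that the chat request body mirrors Ollama's own chat API (`model` plus a `messages` array); both are assumptions to verify against the source.

```typescript
// Minimal client sketch for the ollama-mcp HTTP endpoints.
// Assumes the server listens on localhost:3456 (the default port).
const BASE = "http://localhost:3456";

// Pure helper: build a chat request payload.
// Field names are assumed to follow Ollama's chat API shape.
function buildChatBody(model: string, prompt: string) {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
  };
}

// List the models the server exposes via GET /models.
async function listModels(): Promise<unknown> {
  const res = await fetch(`${BASE}/models`);
  if (!res.ok) throw new Error(`GET /models failed: ${res.status}`);
  return res.json();
}

// Send a single-turn chat request via POST /chat.
async function chat(model: string, prompt: string): Promise<unknown> {
  const res = await fetch(`${BASE}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatBody(model, prompt)),
  });
  if (!res.ok) throw new Error(`POST /chat failed: ${res.status}`);
  return res.json();
}
```

Keeping the payload construction in a separate helper makes it easy to adjust if the actual request schema differs from Ollama's.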
Development
- Clone the repository:

  ```shell
  git clone https://github.com/rawveg/ollama-mcp.git
  cd ollama-mcp
  ```

- Install dependencies:

  ```shell
  npm install
  ```

- Build the project:

  ```shell
  npm run build
  ```

- Start the server:

  ```shell
  npm start
  ```
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT