# Ollama MCP Server
An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.
## Features
- List available Ollama models
- Pull new models from Ollama
- Chat with models using Ollama's chat API
- Get detailed model information
- Automatic port management
- Environment variable configuration
## Prerequisites
- Node.js (v16 or higher)
- npm
- Ollama installed and running locally
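
Before installing, you can confirm that Ollama is actually running and reachable by querying its local API directly (`/api/tags` is Ollama's endpoint for listing locally installed models; this check is a suggestion, not a requirement of the server):

```sh
# Quick sanity check: returns the locally installed models if Ollama is up
curl http://localhost:11434/api/tags
```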
## Installation

### Installing via Smithery

To install Ollama MCP Server for Claude Desktop automatically via Smithery:

```sh
npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
```
### Manual Installation

Install globally via npm:

```sh
npm install -g @rawveg/ollama-mcp
```
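
You can verify the global install with npm before wiring the server into a client:

```sh
# Confirm the package is installed globally and show its version
npm ls -g @rawveg/ollama-mcp
```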
### Installing in Other MCP Applications

To install the Ollama MCP Server in other MCP-compatible applications (such as Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
```
The settings file location varies by application:

- Claude Desktop: `claude_desktop_config.json` in the Claude app data directory
- Cline: `cline_mcp_settings.json` in the VS Code global storage
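
If the settings file already defines other servers, merge this entry into the existing `mcpServers` object rather than replacing the whole file. A sketch of what that looks like (the `some-other-server` entry is a hypothetical placeholder):

```json
{
  "mcpServers": {
    "some-other-server": {
      "command": "npx",
      "args": ["-y", "some-other-server"]
    },
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": ["-y", "@rawveg/ollama-mcp"]
    }
  }
}
```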
## Usage

### Starting the Server

Simply run:

```sh
ollama-mcp
```

The server starts on port 3456 by default. You can specify a different port with the `PORT` environment variable:

```sh
PORT=3457 ollama-mcp
```
### Environment Variables

- `PORT`: Server port (default: 3456). Can be used both when running directly and during Smithery installation:

  ```sh
  # When running directly
  PORT=3457 ollama-mcp

  # When installing via Smithery
  PORT=3457 npx -y @smithery/cli install @rawveg/ollama-mcp --client claude
  ```

- `OLLAMA_API`: Ollama API endpoint (default: `http://localhost:11434`)
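
The two variables can be combined, for example to run on a custom port against an Ollama instance on another machine (the host address below is purely illustrative):

```sh
# Serve on port 3457, talking to a remote Ollama instance
PORT=3457 OLLAMA_API=http://192.168.1.10:11434 ollama-mcp
```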
## API Endpoints

- `GET /models` - List available models
- `POST /models/pull` - Pull a new model
- `POST /chat` - Chat with a model
- `GET /models/:name` - Get model details
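
Once the server is running, the endpoints can be exercised directly with curl. The request bodies for the POST endpoints below are assumptions based on Ollama's own API shapes; check the source for the exact schema:

```sh
# List available models
curl http://localhost:3456/models

# Get details for a specific model
curl http://localhost:3456/models/llama3

# Pull a new model (body shape assumed to mirror Ollama's pull API)
curl -X POST http://localhost:3456/models/pull \
  -H "Content-Type: application/json" \
  -d '{"name": "llama3"}'

# Chat with a model (body shape assumed to mirror Ollama's chat API)
curl -X POST http://localhost:3456/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "Hello!"}]}'
```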
## Development

- Clone the repository:

  ```sh
  git clone https://github.com/rawveg/ollama-mcp.git
  cd ollama-mcp
  ```

- Install dependencies:

  ```sh
  npm install
  ```

- Build the project:

  ```sh
  npm run build
  ```

- Start the server:

  ```sh
  npm start
  ```
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

MIT