TypeScript MCP Agent with Ollama Integration
This project demonstrates integration between Model Context Protocol (MCP) servers and Ollama, allowing AI models to interact with various tools through a unified interface.
✨ Features
- Supports multiple MCP servers (both uvx and npx tested)
- Built-in support for file system operations and web research
- Easy configuration through `mcp-config.json`, similar to `claude_desktop_config.json`
- Interactive chat interface with Ollama integration, designed to work with any MCP tools
- Standalone demo mode for testing web and filesystem tools without an LLM
🚀 Getting Started
1. Prerequisites:
   - Node.js (version 18 or higher)
   - Ollama installed and running
2. Install the MCP tools you want to use globally:
   ```bash
   # For filesystem operations
   npm install -g @modelcontextprotocol/server-filesystem
   # For web research
   npm install -g @mzxrai/mcp-webresearch
   ```
3. Clone and install:
   ```bash
   git clone https://github.com/ausboss/mcp-ollama-agent.git
   cd mcp-ollama-agent
   npm install
   ```
4. Configure your tools and a tool-capable Ollama model in `mcp-config.json`:
   ```json
   {
     "mcpServers": {
       "filesystem": {
         "command": "npx",
         "args": ["@modelcontextprotocol/server-filesystem", "./"]
       },
       "webresearch": {
         "command": "npx",
         "args": ["-y", "@mzxrai/mcp-webresearch"]
       }
     },
     "ollama": {
       "host": "http://localhost:11434",
       "model": "qwen2.5:latest"
     }
   }
   ```
5. Run the demo to test the filesystem and webresearch tools without an LLM:
   ```bash
   npx tsx ./src/demo.ts
   ```
   Or start the chat interface with Ollama (see the sketch below for how tool calls flow through Ollama):
   ```bash
   npm start
   ```
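Under the hood, the chat interface advertises each MCP server's tools to Ollama through its function-calling API and forwards any tool calls back to the server. A minimal sketch of that flow, assuming the `@modelcontextprotocol/sdk` and `ollama` npm packages; the client name, prompt, and layout below are illustrative, and the actual code in this repository is organized differently:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import ollama from "ollama";

// Launch one MCP server from mcp-config.json (filesystem, in this case) over stdio.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["@modelcontextprotocol/server-filesystem", "./"],
});
const client = new Client({ name: "mcp-ollama-agent", version: "0.1.0" });
await client.connect(transport);

// Convert the server's tool list into Ollama's function-calling format.
const { tools } = await client.listTools();
const ollamaTools = tools.map((t) => ({
  type: "function" as const,
  function: {
    name: t.name,
    description: t.description ?? "",
    parameters: t.inputSchema as any, // JSON Schema provided by the MCP server
  },
}));

// Ask the model a question; if it decides to use a tool, forward the call to MCP.
const response = await ollama.chat({
  model: "qwen2.5:latest",
  messages: [{ role: "user", content: "What files are in the current directory?" }],
  tools: ollamaTools,
});

for (const call of response.message.tool_calls ?? []) {
  const result = await client.callTool({
    name: call.function.name,
    arguments: call.function.arguments,
  });
  console.log(`${call.function.name} ->`, JSON.stringify(result.content));
}
```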
⚙️ Configuration
- MCP Servers: Add any MCP-compatible server to the `mcpServers` section
- Ollama: Configure the host and model (the model must support function calling)
- Supports both Python (uvx) and Node.js (npx) MCP servers, as in the example below
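For example, a Python-based server launched with uvx can sit alongside the Node.js servers in the same file; the `fetch` entry below assumes the reference `mcp-server-fetch` package is available through uvx:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-filesystem", "./"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  },
  "ollama": {
    "host": "http://localhost:11434",
    "model": "qwen2.5:latest"
  }
}
```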
💡 Example Usage
This example uses the `qwen2.5:latest` model.
Chat started. Type "exit" to end the conversation.
You: can you use your list directory tool to see whats in test-directory then use your read file tool to read it to me?
Model is using tools to help answer...
Using tool: list_directory
With arguments: { path: 'test-directory' }
Tool result: [ { type: 'text', text: '[FILE] test.txt' } ]
Assistant:
Model is using tools to help answer...
Using tool: read_file
With arguments: { path: 'test-directory/test.txt' }
Tool result: [ { type: 'text', text: 'rosebud' } ]
Assistant: The content of the file `test.txt` in the `test-directory` is:
rosebud
You: thanks
Assistant: You're welcome! If you have any other requests or need further assistance, feel free to ask.
System Prompts
Some local models may need help with tool selection. Customize the system prompt in `ChatManager.ts` to improve tool usage.
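For example, a prompt along these lines can nudge smaller models toward calling tools instead of answering from memory; the variable name and exact wiring are illustrative, not the project's actual code:

```typescript
// Hypothetical example; adapt to how ChatManager.ts actually builds its messages.
const SYSTEM_PROMPT = `You are an assistant with access to MCP tools.
When a request involves files or web content, call the matching tool rather
than answering from memory. After a tool returns, summarize its result for
the user in plain language.`;
```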
🤝 Contributing
Contributions welcome! Feel free to submit issues or pull requests.
Related Recommendations
I craft unique cereal names, stories, and ridiculously cute Cereal Baby images.
I find academic articles and books for research and literature reviews.
Evaluator for marketplace product descriptions, checks for relevancy and keyword stuffing.
Confidential guide on numerology and astrology, based of GG33 Public information
Emulating Dr. Jordan B. Peterson's style in providing life advice and insights.
Advanced software engineer GPT that excels through nailing the basics.
Discover the most comprehensive and up-to-date collection of MCP servers on the market. This repository serves as a centralized hub, offering an extensive catalog of open-source and proprietary MCP servers, complete with features, documentation links, and contributors.
The all-in-one desktop and Docker AI application with built-in RAG, AI agents, a no-code agent builder, MCP compatibility, and more.
MicroPython-based I2C manipulation of the MCP-series GPIO expander, derived from AdaFruit_MCP230xx.
Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or cloud, 400+ integrations.
🧑‍🚀 A summary of the world's best LLM resources (data processing, model training, model deployment, O1 models, MCP, small language models, vision language models).
A curated list of Model Context Protocol (MCP) servers.
Mirror of https://github.com/agentience/practices_mcp_server