TypeScript MCP Agent with Ollama Integration

This project demonstrates integration between Model Context Protocol (MCP) servers and Ollama, allowing AI models to interact with various tools through a unified interface.

✨ Features

  • Supports multiple MCP servers (both uvx- and npx-based servers tested)
  • Built-in support for file system operations and web research
  • Easy configuration through mcp-config.json, similar to claude_desktop_config.json
  • Interactive chat interface backed by Ollama; should work with any MCP tools
  • Standalone demo mode for testing the web and filesystem tools without an LLM

🚀 Getting Started

  1. Prerequisites:

    • Node.js (version 18 or higher)

    • Ollama installed and running

    • Install the MCP tools globally that you want to use:

      # For filesystem operations
      npm install -g @modelcontextprotocol/server-filesystem
      
      # For web research
      npm install -g @mzxrai/mcp-webresearch
      
  2. Clone and install:

    git clone https://github.com/ausboss/mcp-ollama-agent.git
    cd mcp-ollama-agent
    npm install
    
    
  3. Configure your tools and a tool-capable Ollama model in mcp-config.json:

    {
      "mcpServers": {
        "filesystem": {
          "command": "npx",
          "args": ["@modelcontextprotocol/server-filesystem", "./"]
        },
        "webresearch": {
          "command": "npx",
          "args": ["-y", "@mzxrai/mcp-webresearch"]
        }
      },
      "ollama": {
        "host": "http://localhost:11434",
        "model": "qwen2.5:latest"
      }
    }
    
  4. Run the demo to test filesystem and webresearch tools without an LLM:

    npx tsx ./src/demo.ts
    
  5. Or start the chat interface with Ollama:

    npm start
    

⚙️ Configuration

  • MCP Servers: Add any MCP-compatible server to the mcpServers section
  • Ollama: Configure host and model (must support function calling)
  • Supports both Python (uvx) and Node.js (npx) MCP servers
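For example, a Python-based MCP server launched via uvx could sit alongside the npx entries in the same `mcpServers` section. The server name and package below are placeholders, not a tested configuration:

```json
{
  "mcpServers": {
    "some-python-server": {
      "command": "uvx",
      "args": ["some-mcp-server-package"]
    }
  }
}
```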

💡 Example Usage

This example uses the model qwen2.5:latest:

Chat started. Type "exit" to end the conversation.
You: can you use your list directory tool to see whats in test-directory then use your read file tool to read it to me?
Model is using tools to help answer...
Using tool: list_directory
With arguments: { path: 'test-directory' }
Tool result: [ { type: 'text', text: '[FILE] test.txt' } ]
Assistant:
Model is using tools to help answer...
Using tool: read_file
With arguments: { path: 'test-directory/test.txt' }
Tool result: [ { type: 'text', text: 'rosebud' } ]
Assistant: The content of the file `test.txt` in the `test-directory` is:
rosebud
You: thanks
Assistant: You're welcome! If you have any other requests or need further assistance, feel free to ask.
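The loop behind a transcript like this is simple: ask the model, execute any tool calls it returns, append the results to the conversation, and repeat until the reply contains no tool calls. A dependency-injected sketch of that loop (the names `runTurn`, `ChatFn`, and `CallToolFn` are assumptions for illustration, not this project's code):

```typescript
type Msg = { role: "user" | "assistant" | "tool"; content: string };
type ToolCall = { function: { name: string; arguments: Record<string, unknown> } };
// `chat` stands in for an Ollama chat call; `callTool` for an MCP client's tool call.
type ChatFn = (messages: Msg[]) => Promise<{ content: string; tool_calls?: ToolCall[] }>;
type CallToolFn = (name: string, args: Record<string, unknown>) => Promise<string>;

// Keep querying the model until it answers without requesting any tools.
async function runTurn(messages: Msg[], chat: ChatFn, callTool: CallToolFn): Promise<string> {
  for (;;) {
    const reply = await chat(messages);
    if (!reply.tool_calls || reply.tool_calls.length === 0) return reply.content;
    messages.push({ role: "assistant", content: reply.content });
    for (const call of reply.tool_calls) {
      const result = await callTool(call.function.name, call.function.arguments);
      messages.push({ role: "tool", content: result });
    }
  }
}
```

Keeping the model and tool calls behind function parameters like this makes the loop easy to test with stubs, without a running Ollama instance.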

System Prompts

Some local models may need help with tool selection. Customize the system prompt in ChatManager.ts to improve tool usage.
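One common approach is to prepend a system message nudging the model toward correct tool use before each request. The constant and helper below are a hypothetical sketch, not the actual contents of ChatManager.ts:

```typescript
type ChatMessage = { role: "system" | "user" | "assistant" | "tool"; content: string };

// Hypothetical system prompt steering a local model toward correct tool use.
const SYSTEM_PROMPT = [
  "You are a helpful assistant with access to external tools.",
  "When a question requires file or web access, call the matching tool",
  "instead of guessing. Use exact tool names and valid JSON arguments.",
].join(" ");

// Prepend the system prompt unless the conversation already starts with one.
function withSystemPrompt(messages: ChatMessage[]): ChatMessage[] {
  if (messages[0]?.role === "system") return messages;
  return [{ role: "system", content: SYSTEM_PROMPT }, ...messages];
}
```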

🤝 Contributing

Contributions welcome! Feel free to submit issues or pull requests.
