Model Context Protocol (MCP) Server for dify workflows

A simple implementation of an MCP server for Dify. It exposes Dify workflows as MCP tools, so any MCP client can invoke a workflow through a tool call.
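
Under the hood, each tool call is forwarded to Dify's workflow-run API using one of the configured app secret keys. A rough sketch of that underlying request (not the server's own code; the inputs field is a placeholder for your workflow's input variables):

import requests

DIFY_BASE_URL = "https://cloud.dify.ai/v1"  # same value as dify_base_url in config.yaml
APP_SK = "app-sk1"                          # secret key of one Dify workflow app

# Dify runs a workflow via POST /workflows/run, authenticated with the app SK.
resp = requests.post(
    f"{DIFY_BASE_URL}/workflows/run",
    headers={"Authorization": f"Bearer {APP_SK}"},
    json={
        "inputs": {"query": "hello"},   # placeholder: use your workflow's input fields
        "response_mode": "blocking",
        "user": "mcp-client",
    },
    timeout=60,
)
print(resp.json())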

🔨 Installation

The server can be installed via Smithery or manually. A config.yaml is required in both cases, so prepare it before installation.

Prepare config.yaml

Before using the MCP server, prepare a config.yaml that stores your dify_base_url and dify_app_sks. An example config looks like this:

dify_base_url: "https://cloud.dify.ai/v1"
dify_app_sks:
  - "app-sk1"
  - "app-sk2"

You can run the following command in your terminal to quickly create a configuration file:

mkdir -p ~/tools && cat > ~/tools/config.yaml <<EOF
dify_base_url: "https://cloud.dify.ai/v1"
dify_app_sks:
  - "app-sk1"
  - "app-sk2"
EOF

Each SK (app secret key) corresponds to a different Dify workflow.
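
As a rough sketch of how the two keys are used together (not necessarily the server's actual code), the server reads the file pointed to by CONFIG_PATH and can expose one tool per configured SK:

import os
import yaml  # pip install pyyaml

# CONFIG_PATH is the same environment variable used in the client configs below.
config_path = os.environ.get("CONFIG_PATH", os.path.expanduser("~/tools/config.yaml"))
with open(config_path, "r", encoding="utf-8") as f:
    config = yaml.safe_load(f)

base_url = config["dify_base_url"]  # shared by every workflow
app_sks = config["dify_app_sks"]    # one secret key per Dify workflow app

# Each SK authenticates exactly one workflow, so one MCP tool can be exposed per SK.
for sk in app_sks:
    print(f"workflow app reachable at {base_url} with key {sk[:8]}...")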

Installing via Smithery

Smithery is a tool that installs the Dify MCP server automatically. To install the Dify MCP Server for Claude Desktop via Smithery:

Tip: npx has shipped with npm since version 5.2. Mac users can run brew install node to get npm and npx.

npx -y @smithery/cli install dify-mcp-server --client claude

In addition to claude, the clients cline, windsurf, roo-cline, witsy, enconvo, and cursor are also supported.

Manual Installation

❓ If you haven't installed uv or uvx yet, you can do it quickly with the following command:

curl -Ls https://astral.sh/uv/install.sh | sh

✅ Method 1: Use uv (local clone + uv start)

You can also run the Dify MCP server manually in your client. The client config should follow this format (replace ${DIFY_MCP_SERVER_PATH} with the path to your local clone and $CONFIG_PATH with the path to your config.yaml):

{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uv",
      "args": [
        "--directory", "${DIFY_MCP_SERVER_PATH}",
        "run", "dify_mcp_server"
      ],
      "env": {
        "CONFIG_PATH": "$CONFIG_PATH"
      }
    }
  }
}

Example config:

{
  "mcpServers": {
    "dify-mcp-server": {
      "command": "uv",
      "args": [
        "--directory", "/Users/lyx/Downloads/dify-mcp-server",
        "run", "dify_mcp_server"
      ],
      "env": {
        "CONFIG_PATH": "/Users/lyx/Downloads/config.yaml"
      }
    }
  }
}

✅ Method 2: Use uvx (no need to clone code, recommended)

"mcpServers": {
  "dify-mcp-server": {
    "command": "uvx",
      "args": [
        "--from","git+https://github.com/YanxingLiu/dify-mcp-server","dify_mcp_server"
      ],
    "env": {
       "CONFIG_PATH": "/Users/lyx/Downloads/config.yaml"
    }
  }
}

Enjoy it

Finally, you can use the Dify tools in any client that supports MCP.
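
For instance, a standalone script using the official MCP Python SDK can launch the server over stdio and call one of its workflow tools. This is a sketch that assumes the uvx launch from Method 2; the tool name and arguments are placeholders, so discover the real ones via list_tools:

import asyncio
import os
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch dify_mcp_server the same way the client configs above do.
    params = StdioServerParameters(
        command="uvx",
        args=["--from", "git+https://github.com/YanxingLiu/dify-mcp-server", "dify_mcp_server"],
        env={**os.environ, "CONFIG_PATH": os.path.expanduser("~/tools/config.yaml")},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])  # one tool per configured workflow
            result = await session.call_tool(
                tools.tools[0].name,              # placeholder: pick your workflow tool
                arguments={"query": "hello"},     # placeholder: match the tool's input schema
            )
            print(result)

asyncio.run(main())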

