
dify-mcp-server
A Model Context Protocol server for Dify
dify-server MCP Server
A Model Context Protocol server that integrates the Dify AI API
This is a TypeScript-based MCP server that provides Ant Design business component code generation by integrating the Dify AI API. It demonstrates the following core MCP concepts:
- Integrating the Dify AI API for chat-completion functionality
- Supporting text and image input
- Handling streaming responses
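The streaming responses mentioned above arrive as server-sent events. The sketch below shows one way to accumulate the answer text from such a stream; the `event`/`answer` field names follow Dify's documented `/v1/chat-messages` streaming format, but treat them as assumptions to verify against your Dify version.

```typescript
// Hypothetical sketch: collecting answer text from a Dify SSE payload.
// Field names are assumptions based on Dify's public streaming API docs.
interface DifyStreamEvent {
  event: string;    // e.g. "message", "message_end"
  answer?: string;  // incremental text carried by "message" events
}

// Walk each "data: ..." line, parse it, and append incremental answers.
function collectAnswer(sse: string): string {
  let answer = "";
  for (const line of sse.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const evt = JSON.parse(line.slice(6)) as DifyStreamEvent;
    if (evt.event === "message" && evt.answer) answer += evt.answer;
  }
  return answer;
}

const sample =
  'data: {"event":"message","answer":"Hello"}\n' +
  'data: {"event":"message","answer":" world"}\n' +
  'data: {"event":"message_end"}\n';

console.log(collectAnswer(sample)); // → "Hello world"
```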
Features
Tools
- antd-component-codegen-mcp-tool
  - Generates Ant Design business component code
  - Accepts text input plus an optional image
  - Handles image file uploads
  - Supports streaming responses from the Dify AI API
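As an MCP tool, the generator above would be advertised to clients through a tool descriptor with a JSON Schema for its inputs. The descriptor below is an illustrative sketch: the schema structure follows the MCP specification, but the property names (`text`, `imagePath`) are assumptions, not this server's actual contract.

```typescript
// Hypothetical tool descriptor, as it might appear in a tools/list response.
// Input property names are illustrative assumptions.
const toolDescriptor = {
  name: "antd-component-codegen-mcp-tool",
  description: "Generate Ant Design business component code via the Dify AI API",
  inputSchema: {
    type: "object",
    properties: {
      text: { type: "string", description: "Natural-language component requirements" },
      imagePath: { type: "string", description: "Optional path to a reference image" },
    },
    required: ["text"],
  },
};

console.log(JSON.stringify(toolDescriptor.inputSchema.required)); // → ["text"]
```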
Development
Install dependencies:
npm install
Development mode (rebuilds automatically):
npm run watch
Build the server:
npm run build
Installation
Integrating with Continue
Add the following configuration to ~/.continue/config.json:
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "node",
          "args": ["your/path/dify-server/build/index.js"],
          "env": {
            "DIFY_API_KEY": "***"
          }
        }
      }
    ]
  }
}
Integrating with Cline
Add the following configuration to your/path/cline_mcp_settings.json:
{
  "mcpServers": {
    "dify-server": {
      "command": "node",
      "args": ["your/path/dify-server/build/index.js"],
      "env": {
        "DIFY_API_KEY": "***"
      }
    }
  }
}
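The DIFY_API_KEY configured above authenticates requests to Dify, including the image upload step that precedes attaching an image to a chat message. The sketch below builds such an upload request; the endpoint and field names follow Dify's public `/v1/files/upload` API docs, but verify them against your own Dify deployment.

```typescript
// Hypothetical sketch: building a multipart upload request for Dify's
// /v1/files/upload endpoint. Endpoint and field names are assumptions
// based on Dify's public API documentation. Requires Node 18+ (global
// FormData/Blob).
function buildUploadRequest(apiKey: string, image: Blob, user: string) {
  const form = new FormData();
  form.append("file", image, "screenshot.png");
  form.append("user", user); // Dify requires a stable end-user identifier
  return {
    url: "https://api.dify.ai/v1/files/upload",
    init: {
      method: "POST",
      // Content-Type is set automatically by FormData (multipart boundary)
      headers: { Authorization: `Bearer ${apiKey}` },
      body: form,
    },
  };
}

const req = buildUploadRequest("***", new Blob([new Uint8Array(4)]), "mcp-user");
console.log(req.url); // → https://api.dify.ai/v1/files/upload
```

The returned object can be passed to fetch as `fetch(req.url, req.init)`; the response would contain the file id to reference in a subsequent chat-messages call.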
Debugging
Because MCP servers communicate over standard input/output (stdio), debugging can be difficult. We recommend the MCP Inspector, which you can start with:
npm run inspector
The Inspector provides a URL for a browser-based debugging tool.