MCP n8n API Server
An MCP server that provides an interface for interacting with n8n workflows through the Model Context Protocol (MCP).
Features
- List all n8n workflows
- Trigger specific workflows with custom data
- Integration with Claude Desktop and other MCP clients
Installation
Global Installation (Recommended for Remote Usage)
npm install -g @ahmad.soliman/mcp-n8n-server
Then configure your n8n connection:
- Create a `.env` file in your working directory
- Add your n8n API information (see the Configuration section below)
Using with npx (No Installation Required)
You can run the server directly with npx:
npx -y @ahmad.soliman/mcp-n8n-server
Local Installation
git clone https://github.com/ahmadsoliman/mcp-n8n-server.git
cd mcp-n8n-server
npm install
Configuration
Create a `.env` file with the following variables:
# n8n Host URL (required)
N8N_HOST_URL=https://your-n8n-instance.com
# n8n Project ID (optional - only needed for cloud instances)
PROJECT_ID=your_project_id_here
# n8n API Key (required)
N8N_API_KEY=your_api_key_here
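Before wiring the server into an MCP client, it can help to confirm these values actually reach your n8n instance. The snippet below is a minimal sketch, not part of this package: it assumes Node 18+ (built-in `fetch`), the `dotenv` package for loading `.env`, and that your n8n version exposes its public REST API at `/api/v1/workflows` authenticated via the `X-N8N-API-KEY` header — verify these against your n8n setup.

```typescript
// check-n8n.ts — illustrative sketch: verify the .env values can reach your n8n instance.
// Assumes Node 18+ (built-in fetch), dotenv, and n8n's public REST API at /api/v1/workflows.
import "dotenv/config";

const host = process.env.N8N_HOST_URL;
const apiKey = process.env.N8N_API_KEY;

if (!host || !apiKey) {
  throw new Error("N8N_HOST_URL and N8N_API_KEY must be set in .env");
}

const response = await fetch(`${host.replace(/\/$/, "")}/api/v1/workflows`, {
  headers: { "X-N8N-API-KEY": apiKey },
});

if (!response.ok) {
  throw new Error(`n8n API returned ${response.status} ${response.statusText}`);
}

const { data } = await response.json();
console.log(`Connected. Found ${data?.length ?? 0} workflows.`);
```

Run it with any ES-module-aware TypeScript runner; if it prints a workflow count, the same values will work in the MCP client configurations below.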
Usage
As a Remote MCP Server
After installing globally, you can use it as a remote MCP server with Claude AI:
- Configure Claude AI to use this as a remote MCP server using the following JSON configuration:
{
"mcpServers": {
"n8n": {
"command": "npx",
"args": ["-y", "@ahmad.soliman/mcp-n8n-server"],
"env": {
"N8N_HOST_URL": "",
"PROJECT_ID": "",
"N8N_API_KEY": ""
}
}
}
}
- Add the following to your prompt or instructions to Claude:
You have access to a remote MCP server for n8n integration. Use it to:
- List all n8n workflows
- Trigger webhooks and workflows
- Get information about available webhooks
As a Local MCP Server
You can run the server locally and connect to it from Claude Desktop:
# Start the server
npm start
Then configure Claude Desktop to use this MCP server:
Edit `~/Library/Application Support/Claude/claude_desktop_config.json`:
{
"mcpServers": {
"n8n": {
"command": "npx",
"args": ["-y", "@ahmad.soliman/mcp-n8n-server"],
"env": {
"N8N_HOST_URL": "",
"PROJECT_ID": "",
"N8N_API_KEY": ""
}
}
}
}
Alternatively, if you've cloned the repository locally:
{
"mcpServers": {
"n8n-server": {
"command": "node",
"args": ["/ABSOLUTE/PATH/TO/server-n8n/build/index.js"]
}
}
}
Setup
- Install dependencies:
npm install
- Configure environment variables:
  - Copy `.env.example` to `.env` (if not already done)
  - Update the following variables in `.env`:
    - `N8N_API_URL`: Your n8n instance URL (default: http://localhost:5678)
    - `N8N_API_KEY`: Your n8n API key
- Start the server:
# Start the MCP server (for integration with Claude Desktop and other MCP clients)
npm run mcp
For development with auto-reload:
npm run dev
MCP Tools (for LLM Integration)
The MCP server exposes the following tools for use with Claude Desktop or other MCP clients:
List Workflows
The `list-workflows` tool returns a list of all available n8n workflows.
List Workflow Webhooks
The `list-workflow-webhooks` tool returns all webhooks from a specific workflow.
Parameters:
- `id`: The ID of the workflow to get webhooks from
Call Webhook (GET)
The `call-webhook-get` tool calls a webhook with a GET request.
Parameters:
- `url`: The webhook URL to call
Call Webhook (POST)
The `call-webhook-post` tool calls a webhook with a POST request.
Parameters:
- `url`: The webhook URL to call
- `data`: Data to send in the POST request body
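Besides Claude Desktop, any MCP client can spawn this server over stdio and call these tools programmatically. The sketch below is illustrative only: it assumes the official `@modelcontextprotocol/sdk` TypeScript client, and while the tool and parameter names come from the list above, the host URL, API key, and webhook URL are placeholders you would replace with your own values.

```typescript
// mcp-client-example.ts — hedged sketch of calling this server's tools from a TypeScript MCP client.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server the same way the Claude Desktop config does (via npx).
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@ahmad.soliman/mcp-n8n-server"],
  env: {
    // Pass through PATH etc. so `npx` can be resolved, then add the n8n settings (placeholders).
    ...(process.env as Record<string, string>),
    N8N_HOST_URL: "https://your-n8n-instance.com",
    N8N_API_KEY: "your_api_key_here",
  },
});

const client = new Client({ name: "n8n-example-client", version: "0.1.0" });
await client.connect(transport);

// Discover the tools the server exposes (list-workflows, call-webhook-post, ...).
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

// Call list-workflows (takes no arguments).
const workflows = await client.callTool({ name: "list-workflows", arguments: {} });
console.log(workflows.content);

// Call a webhook with a POST body via call-webhook-post (URL is a placeholder).
const result = await client.callTool({
  name: "call-webhook-post",
  arguments: {
    url: "https://your-n8n-instance.com/webhook/your-webhook-path",
    data: { example: "payload" },
  },
});
console.log(result.content);

await client.close();
```

Run it with an ES-module-aware runner that supports top-level await; the same `callTool` pattern applies to `list-workflow-webhooks` and `call-webhook-get` with their respective parameters.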