MCP Crew AI Server
MCP Crew AI Server is a lightweight Python-based server designed to run, manage and create CrewAI workflows. This project leverages the Model Context Protocol (MCP) to communicate with Large Language Models (LLMs) and tools such as Claude Desktop or Cursor IDE, allowing you to orchestrate multi-agent workflows with ease.
Features
- Automatic Configuration: Automatically loads agent and task configurations from two YAML files (agents.yml and tasks.yml), so you don't need to write custom code for basic setups.
- Command Line Flexibility: Pass custom paths to your configuration files via command line arguments (--agents and --tasks).
- Seamless Workflow Execution: Easily run pre-configured workflows through the MCP run_workflow tool.
- Local Development: Run the server locally in STDIO mode, making it ideal for development and testing.
Installation
There are several ways to install the MCP Crew AI server:
Option 1: Install from PyPI (Recommended)
pip install mcp-crew-ai
Option 2: Install from GitHub
pip install git+https://github.com/adam-paterson/mcp-crew-ai.git
Option 3: Clone and Install
git clone https://github.com/adam-paterson/mcp-crew-ai.git
cd mcp-crew-ai
pip install -e .
Requirements
- Python 3.11+
- MCP SDK
- CrewAI
- PyYAML
Configuration
- agents.yml: Define your agents with roles, goals, and backstories.
- tasks.yml: Define tasks with descriptions, expected outputs, and assign them to agents.
Example agents.yml:
zookeeper:
  role: Zookeeper
  goal: Manage zoo operations
  backstory: >
    You are a seasoned zookeeper with a passion for wildlife conservation...
Example tasks.yml:
write_stories:
  description: >
    Write an engaging zoo update capturing the day's highlights.
  expected_output: 5 engaging stories
  agent: zookeeper
  output_file: zoo_report.md
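For orientation, the two YAML files above map naturally onto CrewAI's own objects. The sketch below is only an illustration of that mapping, not the server's actual loading code; it assumes the crewai and PyYAML packages listed under Requirements, and the file paths are placeholders.

```python
import yaml
from crewai import Agent, Crew, Process, Task

# Load the example configuration files (paths are illustrative).
with open("agents.yml") as f:
    agent_configs = yaml.safe_load(f)
with open("tasks.yml") as f:
    task_configs = yaml.safe_load(f)

# Build one CrewAI agent per YAML entry, keyed by its name (e.g. "zookeeper").
agents = {
    name: Agent(role=cfg["role"], goal=cfg["goal"], backstory=cfg["backstory"])
    for name, cfg in agent_configs.items()
}

# Build the tasks and attach each one to its assigned agent.
tasks = [
    Task(
        description=cfg["description"],
        expected_output=cfg["expected_output"],
        agent=agents[cfg["agent"]],
        output_file=cfg.get("output_file"),
    )
    for cfg in task_configs.values()
]

# Run everything sequentially, matching the server's default process type.
crew = Crew(agents=list(agents.values()), tasks=tasks, process=Process.sequential)
print(crew.kickoff())
```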
Usage
Once installed, you can run the MCP Crew AI server using either of these methods:
Standard Python Command
mcp-crew-ai --agents path/to/agents.yml --tasks path/to/tasks.yml
Using UV Execution (uvx)
For a more streamlined experience, you can use the UV execution command:
uvx mcp-crew-ai --agents path/to/agents.yml --tasks path/to/tasks.yml
Or run just the server directly:
uvx mcp-crew-ai-server
This will start the server using default configuration from environment variables.
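An MCP client such as Claude Desktop or Cursor normally launches and drives the server for you. For programmatic use, the sketch below shows one way to start the server over STDIO and call its run_workflow tool with the official MCP Python SDK. The configuration paths are placeholders, and the empty argument set passed to run_workflow is an assumption; list the tools first to inspect the actual input schema.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server over STDIO; the flags mirror the CLI options below.
    # The example paths are placeholders for your own configuration files.
    server = StdioServerParameters(
        command="mcp-crew-ai",
        args=["--agents", "examples/agents.yml", "--tasks", "examples/tasks.yml"],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes (e.g. run_workflow)
            # and inspect their input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Assumption: run_workflow needs no required arguments here;
            # adjust based on the schema reported by list_tools().
            result = await session.call_tool("run_workflow", arguments={})
            print(result.content)


asyncio.run(main())
```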
Command Line Options
- --agents: Path to the agents YAML file (required)
- --tasks: Path to the tasks YAML file (required)
- --topic: The main topic for the crew to work on (default: "Artificial Intelligence")
- --process: Process type to use (choices: "sequential" or "hierarchical", default: "sequential")
- --verbose: Enable verbose output
- --variables: JSON string or path to JSON file with additional variables to replace in YAML files
- --version: Show version information and exit
Advanced Usage
You can also provide additional variables to be used in your YAML templates:
mcp-crew-ai --agents examples/agents.yml --tasks examples/tasks.yml --topic "Machine Learning" --variables '{"year": 2025, "focus": "deep learning"}'
These variables will replace placeholders in your YAML files. For example, {topic} will be replaced with "Machine Learning" and {year} with "2025".
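To make the templating behaviour concrete, the snippet below mimics the effect of that substitution on a small YAML fragment. It is a minimal illustration using Python's built-in string formatting, not the server's actual implementation; the fragment and placeholder names are examples only.

```python
# A YAML fragment with {placeholder} variables, as they might appear in tasks.yml.
yaml_template = (
    "description: >\n"
    "  Write an engaging update on {topic} trends in {year},\n"
    "  with a focus on {focus}.\n"
)

# Values taken from --topic and --variables in the command above.
variables = {"topic": "Machine Learning", "year": 2025, "focus": "deep learning"}

print(yaml_template.format(**variables))
```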
Contributing
Contributions are welcome! Please open issues or submit pull requests with improvements, bug fixes, or new features.
Licence
This project is licensed under the MIT Licence. See the LICENSE file for details.
Happy workflow orchestration!