# 🚀 ⚡️ locust-mcp-server
A Model Context Protocol (MCP) server implementation for running Locust load tests. This server enables seamless integration of Locust load testing capabilities with AI-powered development environments.

## ✨ Features
- Simple integration with the Model Context Protocol framework
- Support for headless and UI modes
- Configurable test parameters (users, spawn rate, runtime)
- Easy-to-use API for running Locust load tests
- Real-time test execution output
- HTTP/HTTPS protocol support out of the box
- Custom task scenarios support

## 🔧 Prerequisites
Before you begin, ensure you have the following installed:
- Python 3.13 or higher
- `uv` package manager (see the installation guide)

## 📦 Installation

- Clone the repository:

```bash
git clone https://github.com/qainsights/locust-mcp-server.git
```

- Install the required dependencies:

```bash
uv pip install -r requirements.txt
```

- Set up environment variables (optional) by creating a `.env` file in the project root:

```bash
LOCUST_HOST=http://localhost:8089  # Default host for your tests
LOCUST_USERS=3                     # Default number of users
LOCUST_SPAWN_RATE=1                # Default user spawn rate
LOCUST_RUN_TIME=10s                # Default test duration
```
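
The server can pick these values up at startup as its defaults. A minimal sketch of that pattern, assuming python-dotenv is used (the loading code below is illustrative, not the project's confirmed implementation):

```python
# Illustrative sketch of loading the .env defaults above.
# Assumption: python-dotenv is used; the real server may load these differently.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

DEFAULT_HOST = os.getenv("LOCUST_HOST", "http://localhost:8089")
DEFAULT_USERS = int(os.getenv("LOCUST_USERS", "3"))
DEFAULT_SPAWN_RATE = int(os.getenv("LOCUST_SPAWN_RATE", "1"))
DEFAULT_RUN_TIME = os.getenv("LOCUST_RUN_TIME", "10s")
```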

## 🚀 Getting Started

- Create a Locust test script (e.g., `hello.py`):

```python
import time

from locust import HttpUser, task, between


class QuickstartUser(HttpUser):
    wait_time = between(1, 5)

    @task
    def hello_world(self):
        self.client.get("/hello")
        self.client.get("/world")

    @task(3)
    def view_items(self):
        for item_id in range(10):
            self.client.get(f"/item?id={item_id}", name="/item")
            time.sleep(1)

    def on_start(self):
        self.client.post("/login", json={"username": "foo", "password": "bar"})
```
- Configure the MCP server using the below specs in your favorite MCP client (Claude Desktop, Cursor, Windsurf, and more), adjusting the `uv` path and project directory to match your machine:

```json
{
  "mcpServers": {
    "locust": {
      "command": "/Users/naveenkumar/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/naveenkumar/Gits/locust-mcp-server",
        "run",
        "locust_server.py"
      ]
    }
  }
}
```
- Now ask the LLM to run the test, e.g. `run locust test for hello.py`. The Locust MCP server will use the following tool to start the test:
  - `run_locust`: Run a test with configurable options for headless mode, host, runtime, users, and spawn rate

## 📝 API Reference

### Run Locust Test

```python
run_locust(
    test_file: str,
    headless: bool = True,
    host: str = "http://localhost:8089",
    runtime: str = "10s",
    users: int = 3,
    spawn_rate: int = 1
)
```

Parameters:

- `test_file`: Path to your Locust test script
- `headless`: Run in headless mode (True) or with UI (False)
- `host`: Target host to load test
- `runtime`: Test duration (e.g., "30s", "1m", "5m")
- `users`: Number of concurrent users to simulate
- `spawn_rate`: Rate at which users are spawned
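
These parameters map naturally onto Locust's standard command-line options. As a rough sketch of that correspondence, here is a hypothetical helper (`build_locust_command` is an illustrative name, not the server's actual code; the CLI flags themselves are standard Locust options):

```python
# Hypothetical helper: shows how run_locust's parameters correspond to
# Locust's standard CLI flags. Illustrative only; the server's real
# internals may differ.
def build_locust_command(test_file: str, headless: bool = True,
                         host: str = "http://localhost:8089",
                         runtime: str = "10s", users: int = 3,
                         spawn_rate: int = 1) -> list[str]:
    cmd = ["locust", "-f", test_file, "--host", host,
           "--users", str(users), "--spawn-rate", str(spawn_rate)]
    if headless:
        # A headless run needs an explicit duration, otherwise Locust
        # keeps going until interrupted.
        cmd += ["--headless", "--run-time", runtime]
    return cmd


print(" ".join(build_locust_command("hello.py")))
# locust -f hello.py --host http://localhost:8089 --users 3 --spawn-rate 1 --headless --run-time 10s
```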

## ✨ Use Cases

- LLM-powered results analysis
- Effective debugging with the help of an LLM

## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.