
BloodHound MCP
BloodHound MCP (Model Context Protocol) is an innovative extension of the BloodHound tool, designed to enable Large Language Models (LLMs) to interact with and analyze Active Directory (AD) and Azure Active Directory (AAD) environments through natural language queries. By leveraging the power of LLMs, BloodHound MCP allows users to perform complex queries and retrieve insights from their AD/AAD environments using simple, conversational commands.
Features
- Natural Language Queries: Use conversational language to query your AD/AAD environment without needing to write Cypher queries manually.
- LLM-Powered Analysis: Harness the capabilities of Large Language Models to interpret and execute queries on your behalf.
- Seamless Integration: Works with existing BloodHound data stored in Neo4j, providing a user-friendly interface for complex analysis.
- Customizable: Easily configure the system to work with your specific environment and tools.
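Under the hood, a conversational request is answered by Cypher run against the BloodHound Neo4j graph. A hypothetical sketch of the kind of query an LLM might generate for "who are the members of Domain Admins?" (the function name is illustrative; the `User`/`Group` labels and `MemberOf` edge come from BloodHound's standard schema):

```python
def members_of_group_query(group_name: str) -> str:
    """Build a Cypher query for transitive membership of a group.

    Illustrative only: BloodHound stores group names upper-cased,
    e.g. 'DOMAIN ADMINS@CORP.LOCAL'.
    """
    return (
        "MATCH (u:User)-[:MemberOf*1..]->(g:Group) "
        f"WHERE g.name = '{group_name.upper()}' "
        "RETURN u.name"
    )
```

In practice the server would pass the group name as a Cypher parameter rather than interpolating it into the string, but the shape of the query is the same.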
Configure the MCP Server
```json
{
  "mcpServers": {
    "BloodHound": {
      "name": "BloodHound",
      "isActive": true,
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp[cli],neo4j",
        "mcp",
        "run",
        "<PATH_TO_THE_PROJECT>server.py"
      ],
      "env": {
        "BLOODHOUND_URI": "bolt://localhost:7687",
        "BLOODHOUND_USERNAME": "neo4j",
        "BLOODHOUND_PASSWORD": "bloodhound"
      }
    }
  }
}
```
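Before wiring the server into an MCP client, it can help to confirm that the Bolt endpoint in `BLOODHOUND_URI` is reachable. A minimal stdlib-only sketch (the helper names are ours; 7687 is Neo4j's default Bolt port, matching the sample config):

```python
import socket
from urllib.parse import urlparse

def bolt_endpoint(uri: str) -> tuple[str, int]:
    """Split a bolt:// URI into (host, port); Neo4j's Bolt default is 7687."""
    parsed = urlparse(uri)
    return parsed.hostname or "localhost", parsed.port or 7687

def is_reachable(uri: str, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the Bolt endpoint succeeds."""
    host, port = bolt_endpoint(uri)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If `is_reachable("bolt://localhost:7687")` returns False, check that Neo4j is running and that the BloodHound data has been loaded before debugging the MCP configuration itself.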
Usage
Configuration
To customize BloodHound MCP, update the configuration file in your MCP-supported tool. Key settings include:
- Neo4j Database Connection:
  - `BLOODHOUND_URI`: The URI of your Neo4j database (e.g., bolt://localhost:7687).
  - `BLOODHOUND_USERNAME`: Your Neo4j username.
  - `BLOODHOUND_PASSWORD`: Your Neo4j password.
- Server Settings: Adjust the command and args to match your environment and tool requirements.
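The settings above are plain environment variables, so the server side can resolve them with simple fallbacks to the sample-config defaults. A minimal sketch (the function name is an assumption, not part of the project's API):

```python
import os

def bloodhound_settings(env=None):
    """Read the Neo4j connection settings the MCP server expects,
    falling back to the defaults shown in the sample configuration."""
    if env is None:
        env = os.environ
    return {
        "uri": env.get("BLOODHOUND_URI", "bolt://localhost:7687"),
        "username": env.get("BLOODHOUND_USERNAME", "neo4j"),
        "password": env.get("BLOODHOUND_PASSWORD", "bloodhound"),
    }
```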
Contributing
We welcome contributions to BloodHound MCP! To get involved:
- Fork the Repository: Create your own copy on GitHub.
- Create a Branch: Work on your feature or fix in a new branch.
- Submit a Pull Request: Include a clear description of your changes.
Special Thanks
Custom queries from: https://github.com/CompassSecurity/BloodHoundQueries