
Brest MCP Server
Table of Contents
- Description
- Technologies
- Installation
- Usage
- Development
- Troubleshooting
- Contributing
- License
Description
Brest MCP Server is a server implementation of the Model Context Protocol (MCP) for the Brest region. It provides a robust infrastructure for managing MCP-based interactions and includes an MCP inspector for debugging and monitoring.
The goal of this project is to facilitate the deployment and management of MCP services with a focus on simplicity and reliability.
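The project's actual server code is not reproduced here, but the sketch below shows what a minimal MCP server looks like when built with the official mcp Python SDK's FastMCP helper, which matches the Python/uv stack listed under Technologies. The server name and the city_info tool are hypothetical placeholders for illustration, not part of Brest MCP Server's real API.

```python
# Minimal MCP server sketch, assuming the official `mcp` Python SDK.
# The `city_info` tool is a hypothetical placeholder, not the project's real API.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("brest-mcp")  # server name chosen for illustration


@mcp.tool()
def city_info(topic: str) -> str:
    """Return a short placeholder answer about the Brest region."""
    return f"No data available yet for topic: {topic}"


if __name__ == "__main__":
    mcp.run()  # stdio transport by default; the SDK also supports SSE
```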
Technologies
- Language: Python 3.12.3 or compatible
- Dependency management: uv
- Environment: virtual environment managed by uv
- Inspector: MCP Inspector via Node.js (npx)
- Node.js: required to run the MCP Inspector
Installation
To install and configure Brest MCP Server locally, follow these steps:
- Install uv (if not already installed):
  - On Linux/macOS:
    curl -LsSf https://astral.sh/uv/install.sh | sh
  - On Windows:
    powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
- Clone the repository:
  git clone https://github.com/Nijal-AI/Brest-mcp-server.git
  cd Brest-mcp-server
- Create and activate the virtual environment:
  uv venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
- Install the dependencies:
  uv sync
Usage
To run the server locally, proceed as follows:
- Ensure the virtual environment is activated:
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
- Start the server with the MCP Inspector (a programmatic client sketch follows the example output below):
  npx @modelcontextprotocol/inspector uv run brest-mcp
- Access the MCP Inspector in your browser:
  - Proxy: http://localhost:3000
  - Web interface: http://localhost:5173

Example output:
Starting MCP inspector...
Proxy server listening on port 3000
🔍 MCP Inspector is up and running at http://localhost:5173 🚀
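Once the server is running, it can also be queried programmatically rather than through the Inspector UI. The sketch below is a hedged example assuming the official mcp Python SDK and the SDK's default SSE endpoint path (/sse) on port 3001; the exact URL exposed by brest-mcp is an assumption and may differ.

```python
# Hedged client sketch: connect to the server over SSE and list its tools.
# Assumes the official `mcp` Python SDK; the /sse path on port 3001 is an assumption.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    async with sse_client("http://localhost:3001/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()  # complete the MCP handshake
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


asyncio.run(main())
```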
Development
For developers wishing to contribute or work on advanced features, follow these additional steps:
- Ensure the virtual environment is set up and the dependencies are installed:
  uv venv
  uv sync
- Use the MCP Inspector to debug and monitor the server:
  npx @modelcontextprotocol/inspector uv run brest-mcp
- Refer to the pyproject.toml file for details on dependencies and configuration (an illustrative sketch follows this list).
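For orientation, a pyproject.toml for this kind of uv-managed MCP project typically looks like the sketch below. The version, dependency list, and entry point shown are illustrative assumptions; the repository's own pyproject.toml is authoritative.

```toml
# Illustrative sketch only -- the repository's pyproject.toml is authoritative.
[project]
name = "brest-mcp"
version = "0.1.0"              # placeholder version
requires-python = ">=3.12"
dependencies = [
    "mcp",                     # assumed MCP SDK dependency
]

[project.scripts]
brest-mcp = "brest_mcp:main"   # hypothetical entry point behind `uv run brest-mcp`
```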
Troubleshooting
- Error ECONNREFUSED 127.0.0.1:3001: ensure that brest-mcp is running and listening on port 3001 (SSE), and verify that the port is not already in use by another process (a quick port-check sketch follows this list).
- Corrupted dependencies: delete the .venv folder and the uv.lock file, then recreate the environment:
  uv venv
  uv sync
Contributing
Contributions are welcome! To propose changes, please follow the guidelines in the CONTRIBUTING.md file.
License
This project is licensed under the MIT License. See the LICENSE file for details.