
Brest MCP Server

Table of Contents

  • Description
  • Technologies
  • Installation
  • Usage
  • Development
  • Troubleshooting
  • Contributing
  • License

Description

Brest MCP Server is a server implementation of the Model Context Protocol (MCP) for the Brest region. It provides a robust infrastructure for managing MCP-based interactions and includes an MCP inspector for debugging and monitoring.

The goal of this project is to facilitate the deployment and management of MCP services with a focus on simplicity and reliability.
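
To give a concrete feel for what such a server exposes, the sketch below shows a minimal MCP tool definition using the official MCP Python SDK (FastMCP). This is only an illustration under that assumption; the tool name and its behaviour are hypothetical and are not taken from this repository.

from mcp.server.fastmcp import FastMCP

# Hypothetical example server; the real Brest MCP Server defines its own tools.
mcp = FastMCP("brest-demo")

@mcp.tool()
def greet(name: str) -> str:
    """Placeholder tool that returns a short greeting."""
    return f"Hello from Brest, {name}!"

if __name__ == "__main__":
    # Runs over stdio by default, so an MCP client or the Inspector can drive it.
    mcp.run()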

Technologies

  • Language: Python 3.12.3 or compatible
  • Dependency Management: uv
  • Environment: Virtual environment managed by uv
  • Inspector: MCP Inspector via Node.js (npx)
  • Node.js: required to run the MCP Inspector

Installation

To install and configure Brest MCP Server locally, follow these steps:

  1. Install uv (if not already installed):

    • On Linux/macOS:
      curl -LsSf https://astral.sh/uv/install.sh | sh
      
    • On Windows:
      powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
      
  2. Clone the repository:

    git clone https://github.com/Nijal-AI/Brest-mcp-server.git
    cd Brest-mcp-server
    
  3. Create and activate the virtual environment:

    uv venv
    source .venv/bin/activate  # On Windows: .venv\Scripts\activate
    
  4. Install the dependencies:

    uv sync
    

Usage

To run the server locally, proceed as follows:

  1. Ensure the virtual environment is activated:

    source .venv/bin/activate  # On Windows: .venv\Scripts\activate
    
  2. Start the server with the MCP Inspector:

    npx @modelcontextprotocol/inspector uv run brest-mcp
    
  3. Access the MCP Inspector in your browser:

    • Proxy: http://localhost:3000
    • Web interface: http://localhost:5173

Example output:

Starting MCP inspector...
Proxy server listening on port 3000
🔍 MCP Inspector is up and running at http://localhost:5173 🚀
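
Beyond the web interface, the server can also be exercised programmatically. The sketch below uses the MCP Python SDK's stdio client to launch the server with uv and list its tools; it assumes the mcp package is installed and that uv run brest-mcp starts the server over stdio, as in the steps above.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumption: `uv run brest-mcp` starts the server on stdio from the project root.
server = StdioServerParameters(command="uv", args=["run", "brest-mcp"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())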

Development

For developers wishing to contribute or work on advanced features, follow these additional steps:

  1. Ensure the virtual environment is set up and dependencies are installed:

    uv venv
    uv sync
    
  2. Use the MCP Inspector to debug and monitor the server:

    npx @modelcontextprotocol/inspector uv run brest-mcp
    
  3. Refer to the pyproject.toml file for details on dependencies and configurations.

Troubleshooting

  • Error ECONNREFUSED 127.0.0.1:3001: Ensure that brest-mcp is running and listening on port 3001 (SSE), and verify that the port is not already in use by another process (see the port-check sketch after this list).
  • Corrupted dependencies: Delete the .venv folder and uv.lock file, then recreate the environment:
    uv venv
    uv sync
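
If you need to confirm whether anything is actually listening on the SSE port, the small check below uses only the Python standard library; the port number (3001) is taken from the error message above, and nothing else is assumed about the server.

import socket

# Try to open a TCP connection to the SSE port mentioned in the error message.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(1)
    result = sock.connect_ex(("127.0.0.1", 3001))

print("Port 3001 is accepting connections." if result == 0 else "Nothing is listening on port 3001.")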
    

Contributing

Contributions are welcome! To propose changes, please follow the guidelines in the CONTRIBUTING.md file.

License

This project is licensed under the MIT License. See the LICENSE file for details.
