
Discord-AIBot

Discord bot for supporting AI/LLM chat applications powered by the Model Context Protocol (MCP), allowing for numerous integrations

For general MCP resources, see Arkestra:cookbook/mcp/README.md

Configuration

cp config/example.main.toml config/main.toml

Then edit main.toml as needed. It specifies your LLM endpoint and model context resources such as MCP servers.
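As a rough sketch of what this configuration covers, the snippet below shows an OpenAI-compatible LLM endpoint plus an MCP server entry. The section and key names here are hypothetical placeholders, not the real schema; follow config/example.main.toml for the actual field names.

# Hypothetical sketch only -- the real schema is in config/example.main.toml
[llm_endpoint]                             # hypothetical section name
base_url = "http://localhost:1234/v1"      # e.g. a local mlx-omni-server (see below)
model = "mlx-community/Llama-3.2-3B-Instruct-4bit"

[[mcp_servers]]                            # hypothetical; e.g. pulled in via toys.b4a.toml
url = "http://127.0.0.1:8902/sse"          # SSE endpoint of the demo MCP server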

Running

If you included toys.b4a.toml in your main.toml, you'll need to have that MCP server running. In a separate terminal, cd into demo_server, then run:

uv pip install -Ur requirements.txt
uvicorn toy_mcp_server:create_app --factory --host 127.0.0.1 --port 8902

Make sure you've set up any other MCP servers or resources you've specified in your B4A. Now you can run the bot.

# Assumes you've exported DISCORD_TOKEN="YOUR_TOKEN"
python mcp_discord_bot.py --discord-token $DISCORD_TOKEN --config-path config

Structlog/rich tracebacks can be elaborate, so there is a --classic-tracebacks option to tame them.

For very copious logging you can add --loglevel DEBUG.
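For example, combining both options on one command line:

python mcp_discord_bot.py --discord-token $DISCORD_TOKEN --config-path config --classic-tracebacks --loglevel DEBUG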

Note: you can use environment variables rather than --discord-token and --config-path:

export AIBOT_DISCORD_TOKEN="YOUR_TOKEN"
export AIBOT_DISCORD_CONFIG_PATH="./config"
python mcp_discord_bot.py # Reads from env vars

Implementation notes

Using mlx_lm.server

uv pip install mlx-omni-server
mlx-omni-server --port 1234
# uv pip install mlx mlx_lm
# mlx_lm.server --model mlx-community/Llama-3.2-3B-Instruct-4bit --port 1234

Note: with mlx-omni-server we ran into RuntimeError: Failed to generate completion: generate_step() got an unexpected keyword argument 'user'

Fixed with this patch:

diff --git a/chat/mlx/mlx_model.py b/chat/mlx/mlx_model.py
index da7aef5..094ae9c 100644
--- a/chat/mlx/mlx_model.py
+++ b/chat/mlx/mlx_model.py
@@ -45,6 +45,9 @@ class MLXModel(BaseTextModel):
 
     def _get_generation_params(self, request: ChatCompletionRequest) -> Dict[str, Any]:
         params = request.get_extra_params()
+        # Exclude user. See #37
+        if "user" in params:
+            del params["user"]
         known_params = {
             "top_k",
             "min_tokens_to_keep",

There are many local MLX models from which you can pick.

Checking MCP servers

For SSE servers, you can check with curl, e.g.

curl -N http://localhost:8901/sse
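If you'd rather script the check, here is a minimal Python sketch; it assumes the httpx package is installed and that the server exposes its SSE stream at /sse:

# Minimal SSE reachability check (sketch; assumes httpx is installed)
import httpx

url = "http://localhost:8901/sse"
with httpx.stream("GET", url, timeout=10) as resp:
    resp.raise_for_status()            # non-2xx means the server answered but refused
    print("Connected:", resp.status_code)
    for line in resp.iter_lines():     # read the first event line, then stop
        if line:
            print(line)
            break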

