2025-04-14


Discord-AIBot

Discord bot for supporting AI/LLM chat applications powered by the Model Context Protocol (MCP), allowing for numerous integrations

For general MCP resources, see Arkestra:cookbook/mcp/README.md

Configuration

cp config/example.main.toml config/main.toml

Then edit main.toml as needed. It specifies your LLM endpoint and model context resources such as MCP servers.
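To give a sense of what goes in that file, here is a purely hypothetical sketch; the actual schema is defined by config/example.main.toml, and every key name below (endpoint, base_url, model, mcp_server, url) is an assumption for illustration, not the project's real layout:

```toml
# Hypothetical sketch only — consult config/example.main.toml for the real schema
[endpoint]
base_url = "http://localhost:1234/v1"  # OpenAI-compatible LLM endpoint
model = "mlx-community/Llama-3.2-3B-Instruct-4bit"

# One entry per MCP server the bot should connect to
[[mcp_server]]
name = "toys"
url = "http://localhost:8902/sse"
```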

Running

If you included toys.b4a.toml in your main.toml, you'll need to have that MCP server running. In a separate terminal, cd demo_server, then run:

uv pip install -Ur requirements.txt
uvicorn toy_mcp_server:create_app --factory --host 127.0.0.1 --port 8902

Make sure you set up any other MCP servers or other resources you've specified in your B4A config. Now you can run the bot.

# Assumes you've exported DISCORD_TOKEN="YOUR_TOKEN"
python mcp_discord_bot.py --discord-token $DISCORD_TOKEN --config-path config

structlog/rich tracebacks can be elaborate, so there is a --classic-tracebacks option to tame them.

For very copious logging, you can add --loglevel DEBUG.

Note: you can use environment variables rather than --discord-token and --config-path:

export AIBOT_DISCORD_TOKEN="YOUR_TOKEN"
export AIBOT_DISCORD_CONFIG_PATH="./config"
python mcp_discord_bot.py # Reads from env vars
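The precedence above (CLI flag first, then AIBOT_-prefixed environment variable) can be sketched as follows; resolve_settings is an illustrative helper written for this sketch, not the bot's actual code:

```python
import argparse
import os

def resolve_settings(argv=None, env=None):
    """Resolve the Discord token and config path, preferring CLI flags
    and falling back to AIBOT_-prefixed environment variables."""
    env = os.environ if env is None else env
    parser = argparse.ArgumentParser()
    parser.add_argument('--discord-token', default=env.get('AIBOT_DISCORD_TOKEN'))
    parser.add_argument('--config-path', default=env.get('AIBOT_DISCORD_CONFIG_PATH', './config'))
    args = parser.parse_args(argv)
    if not args.discord_token:
        # Neither the flag nor the env var supplied a token
        parser.error('Discord token required (--discord-token or AIBOT_DISCORD_TOKEN)')
    return args.discord_token, args.config_path
```

With this pattern, an explicit --discord-token always wins over the environment, which keeps one-off overrides easy while letting deployments rely on env vars.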

Implementation notes

Using mlx_lm.server

uv pip install mlx-omni-server
mlx-omni-server --port 1234
# uv pip install mlx mlx_lm
# mlx_lm.server --model mlx-community/Llama-3.2-3B-Instruct-4bit --port 1234

Note: with mlx-omni-server we ran into RuntimeError: Failed to generate completion: generate_step() got an unexpected keyword argument 'user'

Fixed with this patch:

diff --git a/chat/mlx/mlx_model.py b/chat/mlx/mlx_model.py
index da7aef5..094ae9c 100644
--- a/chat/mlx/mlx_model.py
+++ b/chat/mlx/mlx_model.py
@@ -45,6 +45,9 @@ class MLXModel(BaseTextModel):
 
     def _get_generation_params(self, request: ChatCompletionRequest) -> Dict[str, Any]:
         params = request.get_extra_params()
+        # Exclude user. See #37
+        if "user" in params:
+            del params["user"]
         known_params = {
             "top_k",
             "min_tokens_to_keep",

There are many local MLX models from which you can pick.

Checking MCP servers

For SSE servers, you can check with curl, e.g.

curl -N http://localhost:8901/sse
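curl -N just streams the raw Server-Sent Events. If you'd rather check from Python, a minimal SSE reader looks like the sketch below; parse_sse_event and check_sse are illustrative helpers (not part of this project), and the URL is whatever your server exposes:

```python
import urllib.request

def parse_sse_event(lines):
    """Parse one Server-Sent Event from an iterable of text lines.
    Returns (event_name, data) per the SSE wire format, where a blank
    line terminates the event."""
    event, data = 'message', []
    for line in lines:
        line = line.rstrip('\n')
        if not line:  # blank line dispatches the event
            break
        if line.startswith('event:'):
            event = line[len('event:'):].strip()
        elif line.startswith('data:'):
            data.append(line[len('data:'):].strip())
    return event, '\n'.join(data)

def check_sse(url, timeout=5):
    """Open an SSE endpoint and return its first event (MCP SSE servers
    typically send an initial event telling the client where to POST)."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return parse_sse_event(line.decode('utf-8') for line in resp)

# Example (requires the server to be running):
# check_sse('http://localhost:8901/sse')
```

If the connection hangs or returns HTTP errors instead of a first event, the server is likely not up or not speaking SSE on that path.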

