# MCP Server for Agent8

A server implementing the Model Context Protocol (MCP) to support Agent8 SDK development. Developed with TypeScript and pnpm, supporting stdio and SSE transports.
## Features

This Agent8 MCP Server implements the following MCP specification capabilities:
### Prompts

- **System Prompt for Agent8 SDK**: Provides optimized guidelines for Agent8 SDK development through the `system-prompt-for-agent8-sdk` prompt template.

### Tools

- **Code Examples Search**: Retrieves relevant Agent8 game development code examples from a vector database using the `search_code_examples` tool.
- **Game Resource Search**: Searches for game development assets (sprites, animations, sounds, etc.) using semantic similarity matching via the `search_game_resources` tool.
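Under the MCP specification, a client invokes tools like these via JSON-RPC 2.0 `tools/call` requests. A minimal sketch of such a request (the `query` argument name is an assumption for illustration; consult the tool's published input schema):

```typescript
// Hypothetical JSON-RPC 2.0 request an MCP client would send to invoke
// the search_code_examples tool. The "query" argument name is an
// assumption, not taken from the server's actual schema.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_code_examples",
    arguments: { query: "2D platformer character movement" },
  },
};

// Serialize for sending over the stdio or SSE transport.
console.log(JSON.stringify(request));
```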
## Installation

```bash
# Install dependencies
pnpm install

# Build
pnpm build
```
## Using Docker

You can run this application using Docker in several ways:

### Option 1: Pull from GitHub Container Registry (Recommended)

```bash
# Pull the latest image
docker pull ghcr.io/planetarium/mcp-agent8:latest

# Run the container
docker run -p 3333:3333 --env-file .env ghcr.io/planetarium/mcp-agent8:latest
```

### Option 2: Build Locally

```bash
# Build the Docker image
docker build -t agent8-mcp-server .

# Run the container with environment variables
docker run -p 3333:3333 --env-file .env agent8-mcp-server
```
### Docker Environment Configuration

There are three ways to configure environment variables when running with Docker:

1. Using `--env-file` (Recommended):

   ```bash
   # Create and configure your .env file first
   cp .env.example .env
   nano .env

   # Run with the .env file
   docker run -p 3000:3000 --env-file .env agent8-mcp-server
   ```

2. Using individual `-e` flags:

   ```bash
   docker run -p 3000:3000 \
     -e SUPABASE_URL=your_supabase_url \
     -e SUPABASE_SERVICE_ROLE_KEY=your_service_role_key \
     -e OPENAI_API_KEY=your_openai_api_key \
     -e MCP_TRANSPORT=sse \
     -e PORT=3000 \
     -e LOG_LEVEL=info \
     agent8-mcp-server
   ```

3. Using Docker Compose (for development/production setups):

   The project includes a pre-configured `docker-compose.yml` file with:

   - Automatic port mapping from the .env configuration
   - Environment variable loading
   - Volume mounting for data persistence
   - Container auto-restart policy
   - Health check configuration

   To run the server:

   ```bash
   docker compose up
   ```

   To run in detached mode:

   ```bash
   docker compose up -d
   ```
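The actual `docker-compose.yml` ships with the repository; the sketch below only illustrates what a configuration with the listed features might look like. The service name, volume path, and health-check endpoint here are assumptions, not taken from the project's file:

```yaml
# Illustrative sketch only; see the repository's docker-compose.yml
# for the real configuration.
services:
  mcp-agent8:
    image: ghcr.io/planetarium/mcp-agent8:latest
    env_file: .env                          # environment variable loading
    ports:
      - "${PORT:-3000}:${PORT:-3000}"       # port mapping from .env
    volumes:
      - ./data:/app/data                    # data persistence (path assumed)
    restart: unless-stopped                 # auto-restart policy
    healthcheck:                            # health check (endpoint assumed)
      test: ["CMD", "wget", "-qO-", "http://localhost:3000/"]
      interval: 30s
      timeout: 5s
      retries: 3
```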
**Required Environment Variables:**

- `SUPABASE_URL`: Supabase URL for database connection
- `SUPABASE_SERVICE_ROLE_KEY`: Supabase service role key for authentication
- `OPENAI_API_KEY`: OpenAI API key for AI functionality
The Dockerfile uses a multi-stage build process to create a minimal production image:
- Uses Node.js 20 Alpine as the base image for smaller size
- Separates build and runtime dependencies
- Only includes necessary files in the final image
- Exposes port 3000 by default
## Usage

### Command Line Options

```bash
# View help
pnpm start --help

# View version information
pnpm start --version
```
Supported options:

- `--debug`: Enable debug mode
- `--transport <type>`: Transport type (stdio or sse), default: stdio
- `--port <number>`: Port to use for SSE transport, default: 3000
- `--log-destination <dest>`: Log destination (stdout, stderr, file, none)
- `--log-file <path>`: Path to log file (when log-destination is file)
- `--log-level <level>`: Log level (debug, info, warn, error), default: info
- `--env-file <path>`: Path to .env file
### Using Environment Variables

The server supports configuration via environment variables, which can be set directly or via a `.env` file.

1. Create a `.env` file in the project root (see `.env.example` for reference):

   ```bash
   # Copy the example file
   cp .env.example .env

   # Edit the .env file with your settings
   nano .env
   ```

2. Run the server (it will automatically load the `.env` file):

   ```bash
   pnpm start
   ```

3. Or specify a custom path to the `.env` file:

   ```bash
   pnpm start --env-file=/path/to/custom/.env
   ```
### Configuration Priority

The server uses the following priority order when determining configuration values:

1. Command line arguments (highest priority)
2. Environment variables (from the `.env` file or system environment)
3. Default values (lowest priority)

This allows you to set a baseline configuration in your `.env` file while overriding specific settings via command line arguments when needed.
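The priority chain can be sketched as a simple resolver; this is a minimal illustration of the rule, not the server's actual implementation:

```typescript
// Resolve one config value: CLI argument > environment variable > default.
// Nullish coalescing falls through to the next source only when the
// higher-priority source was not provided at all.
function resolveConfig(
  cliValue: string | undefined,
  envValue: string | undefined,
  defaultValue: string,
): string {
  return cliValue ?? envValue ?? defaultValue;
}

// e.g. --port was not passed, but PORT=4000 is set in the environment:
console.log(resolveConfig(undefined, "4000", "3000")); // "4000"
```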
### Supported Environment Variables

| Variable | Description | Default |
|---|---|---|
| `MCP_TRANSPORT` | Transport type (stdio or sse) | stdio |
| `PORT` | Port to use for SSE transport | 3000 |
| `LOG_LEVEL` | Log level (debug, info, warn, error) | info |
| `LOG_DESTINATION` | Log destination (stdout, stderr, file, none) | stderr (for stdio transport), stdout (for sse transport) |
| `LOG_FILE` | Path to log file (when `LOG_DESTINATION` is file) | (none) |
| `DEBUG` | Enable debug mode (true/false) | false |
| `SUPABASE_URL` | Supabase URL for database connection | (required) |
| `SUPABASE_SERVICE_ROLE_KEY` | Supabase service role key for authentication | (required) |
| `OPENAI_API_KEY` | OpenAI API key for AI functionality | (required) |
| `ENABLE_ALL_TOOLS` | Enable or disable all tools globally | true |
| `ENABLE_VECTOR_SEARCH_TOOLS` | Enable or disable all vector search tools | true |
| `ENABLE_CINEMATIC_TOOLS` | Enable or disable all cinematic tools | true |
| `ENABLE_CODE_EXAMPLE_SEARCH_TOOL` | Enable or disable the code example search tool | true |
| `ENABLE_GAME_RESOURCE_SEARCH_TOOL` | Enable or disable the game resource search tool | true |
**Tool Activation Priority**: The tool activation settings follow this priority order:

1. Individual tool settings (e.g., `ENABLE_CODE_EXAMPLE_SEARCH_TOOL`)
2. Tool group settings (e.g., `ENABLE_VECTOR_SEARCH_TOOLS`)
3. Global tool setting (`ENABLE_ALL_TOOLS`)

For example, if you set `ENABLE_ALL_TOOLS=false` but `ENABLE_VECTOR_SEARCH_TOOLS=true`, only vector search tools will be enabled while other tools remain disabled. Similarly, individual tool settings override their respective group settings.
Examples:

```bash
# Enable only vector search tools
ENABLE_ALL_TOOLS=false
ENABLE_VECTOR_SEARCH_TOOLS=true

# Disable a specific tool while keeping others enabled
ENABLE_ALL_TOOLS=true
ENABLE_CODE_EXAMPLE_SEARCH_TOOL=false
```
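The "most specific setting wins" rule can be expressed as a small resolver; a minimal sketch of the documented behavior, not the server's actual code:

```typescript
// Individual tool setting > tool group setting > global setting.
// Undefined means "not set", so resolution falls through to the
// next-broader level.
function isToolEnabled(
  individual: boolean | undefined,
  group: boolean | undefined,
  globalSetting: boolean,
): boolean {
  return individual ?? group ?? globalSetting;
}

// ENABLE_ALL_TOOLS=false but ENABLE_VECTOR_SEARCH_TOOLS=true,
// with no individual override => vector search tools are enabled:
console.log(isToolEnabled(undefined, true, false)); // true
```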
### Using Stdio Transport

```bash
# Build and run
pnpm build
pnpm start --transport=stdio
```

### Using SSE Transport

```bash
# Build and run (default port: 3000)
pnpm build
pnpm start --transport=sse --port=3000
```

### Debug Mode

```bash
# Run in debug mode
pnpm start --debug
```
### Available Prompts

- `systemprompt-agent8-sdk`
## Client Integration

### Using with Claude Desktop

1. Add the following to the Claude Desktop configuration file (`claude_desktop_config.json`):

   ```json
   {
     "mcpServers": {
       "Agent8": {
         "command": "npx",
         "args": ["--yes", "agent8-mcp-server"]
       }
     }
   }
   ```

2. Restart Claude Desktop
## Adding New Prompts

Add new prompts to the `registerSamplePrompts` method in the `src/prompts/provider.ts` file.
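A registration might look roughly like the following. This is a hypothetical sketch with local stand-in types; the actual provider API in `src/prompts/provider.ts` may differ, so treat it only as a shape to aim for:

```typescript
// Hypothetical prompt-registration sketch. These types are local
// stand-ins, not the actual Agent8 provider API.
interface PromptDefinition {
  name: string;
  description: string;
  render: () => string;
}

const prompts = new Map<string, PromptDefinition>();

function registerPrompt(def: PromptDefinition): void {
  prompts.set(def.name, def);
}

// Example registration of a new prompt (name is made up):
registerPrompt({
  name: "my-custom-prompt",
  description: "Guidelines for a custom Agent8 workflow",
  render: () => "You are an assistant that ...",
});

console.log(prompts.has("my-custom-prompt")); // true
```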
## License

MIT