
MCP Server for Agent8

A server implementing the Model Context Protocol (MCP) to support Agent8 SDK development. Built with TypeScript and pnpm, it supports both stdio and SSE transports.

Features

This Agent8 MCP Server implements the following MCP specification capabilities:

Prompts

  • System Prompt for Agent8 SDK: Provides optimized guidelines for Agent8 SDK development through the system-prompt-for-agent8-sdk prompt template.

Tools

  • Code Examples Search: Retrieves relevant Agent8 game development code examples from a vector database using the search_code_examples tool.
  • Game Resource Search: Searches for game development assets (sprites, animations, sounds, etc.) using semantic similarity matching via the search_game_resources tool.

Installation

# Install dependencies
pnpm install

# Build
pnpm build

Using Docker

You can run this application with Docker in either of two ways:

Option 1: Pull from GitHub Container Registry (Recommended)

# Pull the latest image
docker pull ghcr.io/planetarium/mcp-agent8:latest

# Run the container
docker run -p 3333:3333 --env-file .env ghcr.io/planetarium/mcp-agent8:latest

Option 2: Build Locally

# Build the Docker image
docker build -t agent8-mcp-server .

# Run the container with environment variables
docker run -p 3333:3333 --env-file .env agent8-mcp-server

Docker Environment Configuration

There are three ways to configure environment variables when running with Docker:

  1. Using --env-file (Recommended):

    # Create and configure your .env file first
    cp .env.example .env
    nano .env
    
    # Run with .env file
    docker run -p 3000:3000 --env-file .env agent8-mcp-server
    
  2. Using individual -e flags:

    docker run -p 3000:3000 \
      -e SUPABASE_URL=your_supabase_url \
      -e SUPABASE_SERVICE_ROLE_KEY=your_service_role_key \
      -e OPENAI_API_KEY=your_openai_api_key \
      -e MCP_TRANSPORT=sse \
      -e PORT=3000 \
      -e LOG_LEVEL=info \
      agent8-mcp-server
    
  3. Using Docker Compose (for development/production setup):

    The project includes a pre-configured docker-compose.yml file with:

    • Automatic port mapping from .env configuration
    • Environment variables loading
    • Volume mounting for data persistence
    • Container auto-restart policy
    • Health check configuration

    To run the server:

    docker compose up
    

    To run in detached mode:

    docker compose up -d
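
    Because the compose file includes a health check and a restart policy, you can verify the container's status and follow its logs with standard Docker Compose commands:

    # Check container status (the STATUS column reports health)
    docker compose ps

    # Follow the server logs
    docker compose logs -f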
    

Required Environment Variables:

  • SUPABASE_URL: Supabase URL for database connection
  • SUPABASE_SERVICE_ROLE_KEY: Supabase service role key for authentication
  • OPENAI_API_KEY: OpenAI API key for AI functionality
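
For a quick start, a minimal .env might look like the sketch below. The required values are placeholders to replace with your own credentials; the optional entries use settings documented in the table further down (here, SSE on port 3000):

# Required (placeholder values)
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
OPENAI_API_KEY=your_openai_api_key

# Optional overrides (see the Supported Environment Variables table below)
MCP_TRANSPORT=sse
PORT=3000
LOG_LEVEL=info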

The Dockerfile uses a multi-stage build process to create a minimal production image:

  • Uses Node.js 20 Alpine as the base image for a smaller image size
  • Separates build and runtime dependencies
  • Only includes necessary files in the final image
  • Exposes port 3000 by default
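
If you build the image locally, you can confirm that the multi-stage build keeps the final image small:

# Inspect the locally built image (tag taken from the build command above)
docker images agent8-mcp-server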

Usage

Command Line Options

# View help
pnpm start --help

# View version information
pnpm start --version

Supported options:

  • --debug: Enable debug mode
  • --transport <type>: Transport type (stdio or sse), default: stdio
  • --port <number>: Port to use for SSE transport, default: 3000
  • --log-destination <dest>: Log destination (stdout, stderr, file, none)
  • --log-file <path>: Path to log file (when log-destination is file)
  • --log-level <level>: Log level (debug, info, warn, error), default: info
  • --env-file <path>: Path to .env file
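
These options can be combined. For example, the following command (using only the flags listed above, with an illustrative log file path) runs the server over SSE with debug-level logging written to a file:

# SSE transport with file logging (log file path is illustrative)
pnpm start --transport=sse --port=3000 --log-level=debug --log-destination=file --log-file=./agent8-mcp.log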

Using Environment Variables

The server supports configuration via environment variables, which can be set directly or via a .env file.

  1. Create a .env file in the project root (see .env.example for reference):

    # Copy the example file
    cp .env.example .env

    # Edit the .env file with your settings
    nano .env

  2. Run the server (it will automatically load the .env file):

    pnpm start

  3. Or specify a custom path to the .env file:

    pnpm start --env-file=/path/to/custom/.env

Configuration Priority

The server uses the following priority order when determining configuration values:

  1. Command line arguments (highest priority)
  2. Environment variables (from .env file or system environment)
  3. Default values (lowest priority)

This allows you to set baseline configuration in your .env file while overriding specific settings via command line arguments when needed.
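
For example, if your .env file sets PORT=3000, you can still override it for a single run from the command line, since command line arguments have the highest priority:

# .env sets PORT=3000; the command line flag wins for this run
pnpm start --transport=sse --port=3001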

Supported Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| MCP_TRANSPORT | Transport type (stdio or sse) | stdio |
| PORT | Port to use for SSE transport | 3000 |
| LOG_LEVEL | Log level (debug, info, warn, error) | info |
| LOG_DESTINATION | Log destination (stdout, stderr, file, none) | stderr (for stdio transport), stdout (for sse transport) |
| LOG_FILE | Path to log file (when LOG_DESTINATION is file) | (none) |
| DEBUG | Enable debug mode (true/false) | false |
| SUPABASE_URL | Supabase URL for database connection | (required) |
| SUPABASE_SERVICE_ROLE_KEY | Supabase service role key for authentication | (required) |
| OPENAI_API_KEY | OpenAI API key for AI functionality | (required) |
| ENABLE_ALL_TOOLS | Enable or disable all tools globally | true |
| ENABLE_VECTOR_SEARCH_TOOLS | Enable or disable all vector search tools | true |
| ENABLE_CINEMATIC_TOOLS | Enable or disable all cinematic tools | true |
| ENABLE_CODE_EXAMPLE_SEARCH_TOOL | Enable or disable code example search tool | true |
| ENABLE_GAME_RESOURCE_SEARCH_TOOL | Enable or disable game resource search tool | true |

Tool Activation Priority: The tool activation settings follow this priority order:

  1. Individual tool settings (e.g., ENABLE_CODE_EXAMPLE_SEARCH_TOOL)
  2. Tool group settings (e.g., ENABLE_VECTOR_SEARCH_TOOLS)
  3. Global tool setting (ENABLE_ALL_TOOLS)

For example, if you set ENABLE_ALL_TOOLS=false but ENABLE_VECTOR_SEARCH_TOOLS=true, only vector search tools will be enabled while other tools remain disabled. Similarly, individual tool settings override their respective group settings.

Examples:

# Enable only vector search tools
ENABLE_ALL_TOOLS=false
ENABLE_VECTOR_SEARCH_TOOLS=true

# Disable a specific tool while keeping others enabled
ENABLE_ALL_TOOLS=true
ENABLE_CODE_EXAMPLE_SEARCH_TOOL=false
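
Because individual tool settings take precedence over their group and global settings, you can also disable a group while re-enabling a single tool inside it:

# Individual setting overrides its group: only the code example search tool is enabled
ENABLE_ALL_TOOLS=false
ENABLE_VECTOR_SEARCH_TOOLS=false
ENABLE_CODE_EXAMPLE_SEARCH_TOOL=true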

Using Stdio Transport

# Build and run
pnpm build
pnpm start --transport=stdio

Using SSE Transport

# Build and run (default port: 3000)
pnpm build
pnpm start --transport=sse --port=3000
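
To sanity-check that the SSE server is reachable, you can open a streaming connection with curl. The /sse path below is an assumption based on common MCP SSE server conventions, not something confirmed by this documentation; check the server logs or source for the actual endpoint:

# Assumes the SSE endpoint is exposed at /sse (unverified; adjust as needed)
curl -N http://localhost:3000/sse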

Debug Mode

# Run in debug mode
pnpm start --debug
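
Equivalently, debug mode can be enabled through the DEBUG environment variable (the --debug flag still wins if both are set, per the configuration priority above):

# Enable debug mode via environment variable for this run
DEBUG=true pnpm start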

Available Prompts

  • systemprompt-agent8-sdk

Client Integration

Using with Claude Desktop

  1. Add the following to the Claude Desktop configuration file (claude_desktop_config.json):
{
  "mcpServers": {
    "Agent8": {
      "command": "npx",
      "args": ["--yes", "agent8-mcp-server"]
    }
  }
}
  2. Restart Claude Desktop

Adding New Prompts

Add new prompts to the registerSamplePrompts method in the src/prompts/provider.ts file.

License

MIT
