mcp.rs

A Rust implementation of the Model Context Protocol (MCP), providing a standardized way for AI models to access external context and resources.

Quickstart

Client

# List resources
cargo run --bin client list-resources

# Read a specific file
cargo run --bin client read-resource -u "file:///path/to/file"

# Use a prompt
cargo run --bin client get-prompt -n "code_review" -a '{"code": "fn main() {}", "language": "rust"}'

# Call a tool
cargo run --bin client -- --server "http://127.0.0.1:3000" call-tool --name "file_system" --args '{"operation": "read_file", "path": "Config.toml"}'

# Set log level
cargo run --bin client set-log-level -l "debug"

# Use SSE transport
cargo run --bin client -t sse -s http://localhost:3000 list-resources
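Under the hood, each of these commands is framed as a JSON-RPC 2.0 request and sent over the chosen transport. A minimal sketch of that framing (the method name follows the MCP specification; the hand-rolled serialization is illustrative only, a real client would use a JSON library):

```rust
// Build a JSON-RPC 2.0 request string by hand (illustrative; the actual
// client serializes with a JSON library rather than string formatting).
fn jsonrpc_request(id: u64, method: &str, params: &str) -> String {
    format!(
        r#"{{"jsonrpc":"2.0","id":{},"method":"{}","params":{}}}"#,
        id, method, params
    )
}

fn main() {
    // Roughly what `call-tool --name "file_system" ...` puts on the wire:
    let req = jsonrpc_request(
        1,
        "tools/call",
        r#"{"name":"file_system","arguments":{"operation":"read_file","path":"Config.toml"}}"#,
    );
    println!("{}", req);
}
```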

Server

# Run with test config
cargo run --bin server -- --config "../servers/test.json"

Overview

mcp.rs is a high-performance, type-safe Rust implementation of the Model Context Protocol, designed to enable seamless communication between AI applications and their integrations. It provides a robust foundation for building MCP servers that can expose various types of resources (files, data, APIs) to AI models.

Features

  • Multiple Transport Types:

    • Standard Input/Output (stdio) transport for CLI tools
    • HTTP with Server-Sent Events (SSE) for web integrations
    • Extensible transport system for custom implementations
  • Resource Management:

    • File system resource provider
    • Resource templating support
    • Real-time resource updates
    • Resource subscription capabilities
  • Flexible Configuration:

    • YAML/JSON configuration files
    • Environment variable overrides
    • Command-line arguments
    • Sensible defaults
  • Security:

    • Built-in access controls
    • Path traversal protection
    • Rate limiting
    • CORS support
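The path traversal protection listed above can be pictured as a purely lexical check; this is a sketch, not the crate's actual implementation (a real provider would likely also canonicalize against the filesystem):

```rust
use std::path::{Component, Path, PathBuf};

/// Resolve `requested` against `root`, rejecting any path whose `..`
/// components would climb out of the root. Returns None on traversal.
fn resolve_under_root(root: &Path, requested: &str) -> Option<PathBuf> {
    let mut resolved = PathBuf::new();
    for component in Path::new(requested).components() {
        match component {
            Component::Normal(part) => resolved.push(part),
            Component::ParentDir => {
                // Popping an empty path means the request escapes the root.
                if !resolved.pop() {
                    return None;
                }
            }
            // Ignore `.`; strip absolute prefixes so the path stays relative.
            Component::CurDir | Component::RootDir | Component::Prefix(_) => {}
        }
    }
    Some(root.join(resolved))
}

fn main() {
    let root = Path::new("./resources");
    assert!(resolve_under_root(root, "docs/readme.md").is_some());
    assert!(resolve_under_root(root, "../etc/passwd").is_none());
    assert!(resolve_under_root(root, "docs/../../secret").is_none());
    println!("traversal checks passed");
}
```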

Installation

Add mcp.rs to your project's Cargo.toml:

[dependencies]
mcp = "0.1.0"

Quick Start

  1. Create a basic MCP server:
use mcp::{McpServer, ServerConfig};

#[tokio::main]
async fn main() -> Result<(), mcp::error::McpError> {
    // Create server with default configuration
    let server = McpServer::new(ServerConfig::default());
    
    // Run the server
    server.run().await
}
  2. Configure via command line:
# Run with stdio transport
mcp-server -t stdio

# Run with SSE transport on port 3000
mcp-server -t sse -p 3000

# Enable debug logging
mcp-server -l debug
  3. Or use a configuration file:
server:
  name: "my-mcp-server"
  version: "1.0.0"
  transport: sse
  port: 3000

resources:
  root_path: "./resources"
  allowed_schemes:
    - file
  max_file_size: 10485760

security:
  enable_auth: false
  allowed_origins:
    - "*"

logging:
  level: "info"
  format: "pretty"

Architecture

mcp.rs follows a modular architecture:

  • Transport Layer: Handles communication between clients and servers
  • Protocol Layer: Implements the MCP message format and routing
  • Resource Layer: Manages access to external resources
  • Configuration: Handles server settings and capabilities
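The transport layer's extensibility can be sketched as a small trait over raw, already-serialized MCP messages. The names below are illustrative, not the crate's actual API; the in-memory loopback shows how a custom transport slots in:

```rust
use std::collections::VecDeque;

/// Illustrative transport abstraction: moves serialized MCP
/// messages between a client and a server.
trait Transport {
    fn send(&mut self, message: String) -> Result<(), String>;
    fn receive(&mut self) -> Option<String>;
}

/// In-memory loopback transport, handy for tests: everything sent
/// is queued and received back in order.
struct LoopbackTransport {
    queue: VecDeque<String>,
}

impl Transport for LoopbackTransport {
    fn send(&mut self, message: String) -> Result<(), String> {
        self.queue.push_back(message);
        Ok(())
    }
    fn receive(&mut self) -> Option<String> {
        self.queue.pop_front()
    }
}

fn main() {
    let mut t = LoopbackTransport { queue: VecDeque::new() };
    t.send(r#"{"jsonrpc":"2.0","method":"ping"}"#.to_string()).unwrap();
    assert_eq!(
        t.receive().as_deref(),
        Some(r#"{"jsonrpc":"2.0","method":"ping"}"#)
    );
    println!("loopback transport ok");
}
```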

Configuration Options

| Option | Description | Default |
| --- | --- | --- |
| `transport` | Transport type (`stdio`, `sse`) | `stdio` |
| `port` | Server port for network transports | `3000` |
| `log_level` | Logging level | `info` |
| `resource_root` | Root directory for resources | `./resources` |
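Since configuration can come from command-line arguments, environment variables, a file, or defaults, a common precedence is CLI over environment over defaults. A sketch of that resolution for the port option (the variable name `MCP_PORT` is an assumption for illustration, not a documented name):

```rust
/// Resolve the server port with a common precedence: an explicit CLI
/// value wins, then a value read from the environment, then the
/// documented default of 3000.
fn resolve_port(cli_port: Option<u16>, env_port: Option<&str>) -> u16 {
    cli_port
        .or_else(|| env_port.and_then(|v| v.parse().ok()))
        .unwrap_or(3000)
}

fn main() {
    // Environment lookup (variable name assumed for this sketch).
    let env_port = std::env::var("MCP_PORT").ok();

    // A CLI flag overrides everything; otherwise fall through.
    let port = resolve_port(Some(4000), env_port.as_deref());
    assert_eq!(port, 4000);

    // No CLI flag, no environment variable: the default applies.
    assert_eq!(resolve_port(None, None), 3000);
    println!("resolved port: {}", port);
}
```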

API Reference

For detailed API documentation, run:

cargo doc --open

Examples (TODO)

Check out the /examples directory for:

  • Basic server implementation
  • Custom resource provider
  • Configuration examples
  • Integration patterns

Contributing

Contributions are welcome! Please read our Contributing Guidelines before submitting pull requests.

Development Requirements

  • Rust 1.70 or higher
  • Cargo
  • Optional: Docker for containerized testing

Running Tests

# Run all tests
cargo test

# Run with logging
RUST_LOG=debug cargo test

License

This project is licensed under the MIT License.

Acknowledgments

This implementation is based on the Model Context Protocol specification and inspired by the reference implementation.
