
R-Server MCP
A specialized Model Context Protocol (MCP) server that enables AI models to generate data visualizations using R's ggplot2 library and execute R scripts.
Overview
This MCP server provides a streamlined interface for creating statistical visualizations and executing R scripts without requiring direct access to an R environment. It exposes two MCP tools:
- render_ggplot: Generates visualizations from R code containing ggplot2 commands
- execute_r_script: Executes any R script and returns the text output
Features
- ggplot2 Rendering: Execute R code containing ggplot2 commands and return the resulting visualization
- R Script Execution: Execute any R script and return the text output
- Format Options: Support for PNG, JPEG, PDF, and SVG output formats
- Customization: Control image dimensions and resolution
- Error Handling: Clear error messages for invalid R code or rendering failures
- MCP Protocol Compliance: Full implementation of the Model Context Protocol
- Docker Integration: Secure execution of R code in isolated containers
Requirements
- Go 1.22 or later
- R 4.0 or later with the ggplot2 package (a quick check command follows this list)
- Docker (for containerized execution)
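A quick, hedged way to confirm the R side of these requirements, assuming Rscript is on your PATH (this one-liner is not part of the repository's tooling):
# Print the R version and the installed ggplot2 version; errors if ggplot2 is missing
Rscript -e 'cat(R.version.string, "\n"); print(packageVersion("ggplot2"))'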
Building
# Build the Docker image
task docker:build
# Run the server in Docker
task docker:run
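These tasks are defined in the repository's Taskfile. If Task is not installed, a rough raw-Docker equivalent is sketched below; the image name r-server is an assumption, not taken from the Taskfile:
# Build the image from the repository root
docker build -t r-server .
# Run it with stdin kept open (-i) so MCP's stdio transport works
docker run -i --rm r-server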
Using Docker with stdin/stdout
The server can be run in Docker while preserving stdin/stdout communication, which is essential for MCP:
# Build and run using docker-compose
./start_server.sh --docker
Or set an environment variable:
USE_DOCKER=true ./start_server.sh
This approach ensures that stdin and stdout are properly connected between the host and the container, allowing seamless MCP communication.
Usage
MCP Integration
To use this server with an MCP client, configure it in your MCP settings file:
Local Execution
{
  "mcpServers": {
    "r-server": {
      "command": "/path/to/r-server",
      "disabled": false,
      "autoApprove": []
    }
  }
}
Docker Execution
{
  "mcpServers": {
    "r-server": {
      "command": "/path/to/start_server.sh",
      "args": ["--docker"],
      "disabled": false,
      "autoApprove": []
    }
  }
}
The MCP client will automatically communicate with the server using stdio transport, which is the recommended approach for stability and reliability. The dockerized version maintains this communication pattern while providing isolation and dependency management.
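As a hedged smoke test of the stdio transport outside any client, you can pipe a single MCP initialize request into the binary and check that a JSON-RPC response comes back on stdout. The request shape follows the MCP specification; this one-liner is not part of the repository:
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' | /path/to/r-server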
License
Creative Commons Attribution-NonCommercial 4.0 International (CC-BY-NC 4.0)
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
See the LICENSE file for details.