
mcp_safe_local_python_executor
Stdio MCP server wrapping the custom Python runtime (LocalPythonExecutor) from Hugging Face's `smolagents` framework. The runtime combines ease of setup (compared to Docker, VMs, or cloud runtimes) with safeguards that limit the operations and imports allowed inside the runtime.
Safe Local Python Executor
An MCP server (stdio transport) that wraps Hugging Face's LocalPythonExecutor from the smolagents framework. It is a custom Python runtime that provides basic isolation/security when running Python code generated by LLMs locally, and it does not require Docker or a VM.
This package exposes the Python executor via MCP (Model Context Protocol) as a tool for LLM apps such as Claude Desktop, Cursor, or any other MCP-compatible client.
In the case of Claude Desktop, this tool is an easy way to add a missing Code Interpreter (available as a plugin in ChatGPT for quite a while already).
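For illustration, here is a minimal sketch of how such a server could be wired together using the FastMCP helper from the official `mcp` Python SDK and smolagents' LocalPythonExecutor. This is not the repository's actual mcp_server.py, and the LocalPythonExecutor constructor and call signature are assumptions that may differ between smolagents versions.

```python
# Hypothetical sketch, not this repository's actual mcp_server.py.
# Assumes the FastMCP helper from the official `mcp` SDK and that
# LocalPythonExecutor can be called with a code string; the exact
# smolagents API may differ between versions.
from mcp.server.fastmcp import FastMCP
from smolagents.local_python_executor import LocalPythonExecutor

mcp = FastMCP("safe-local-python-executor")
executor = LocalPythonExecutor(additional_authorized_imports=[])

@mcp.tool()
def run_python(code: str) -> str:
    """Execute Python code inside the restricted LocalPythonExecutor runtime."""
    result = executor(code)  # assumed to return the evaluation result/logs
    return str(result)

if __name__ == "__main__":
    mcp.run(transport="stdio")
```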
Features
- Exposes a `run_python` tool for safer execution of Python code compared to direct use of Python's `eval()`
- Run via `uv` in a Python venv
- No file I/O operations are allowed
- Restricted list of imports (see the illustration after this list):
  - collections
  - datetime
  - itertools
  - math
  - queue
  - random
  - re
  - stat
  - statistics
  - time
  - unicodedata
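As a hypothetical illustration of what these restrictions mean for code sent to the `run_python` tool (exact rejection messages depend on the smolagents version):

```python
# Hypothetical code snippets a client might send to run_python;
# exact rejection messages depend on the smolagents version.

# Accepted: math is on the import whitelist and no file I/O is used.
allowed_code = """
import math
print(math.factorial(5))   # prints 120
"""

# Rejected: os is not on the whitelist, so the import is blocked.
blocked_import = """
import os
os.listdir(".")
"""

# Rejected: open() is not exposed, so file I/O fails as well.
blocked_file_io = """
open("notes.txt", "w").write("hello")
"""
```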
Security
Be careful with executing code produced by an LLM on your machine; stay away from MCP servers that run Python via the command line or using `eval()`. The safest option is a VM or a Docker container, though that takes some effort to set up, consumes resources, and is slower. There are also third-party services providing Python runtimes, though they require registration, API keys, etc.
LocalPythonExecutor provides a good balance between direct use of the local Python environment (which is easier to set up) and remote execution in a Docker container, a VM, or a third-party service (which is safer). The Hugging Face team has invested time into creating a quick and safe option for running LLM-generated code used by their code agents. This MCP server builds upon it:
> To add a first layer of security, code execution in smolagents is not performed by the vanilla Python interpreter. We have re-built a more secure LocalPythonExecutor from the ground up.
Read more in the smolagents secure code execution docs.
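To make the contrast concrete, here is a sketch under the assumption that LocalPythonExecutor raises on disallowed operations; the import path and call signature may vary between smolagents versions.

```python
# Plain eval() runs whatever it is given, including OS access:
eval("__import__('os').system('echo this could have been something destructive')")

# Sketch of the safer path (assumed smolagents API; details may vary):
from smolagents.local_python_executor import LocalPythonExecutor

executor = LocalPythonExecutor(additional_authorized_imports=[])
try:
    executor("__import__('os').system('echo blocked')")
except Exception as exc:
    # The restricted runtime raises instead of executing the os call.
    print(f"Rejected by LocalPythonExecutor: {exc}")
```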
Installation and Execution
- Install `uv` (e.g. `brew install uv` on macOS, or follow the official docs)
- Clone the repo and change into its directory: `cd mcp_safe_local_python_executor`
- Start the server from the command line: `uv run mcp_server.py`. The venv will be created automatically and the dependencies (smolagents, mcp) will be installed.
Configuring Claude Desktop
- Make sure you have Claude for Desktop installed (download from claude.ai)
- Edit your Claude for Desktop configuration file:
  - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
  - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
  - Or open Claude Desktop -> Settings -> Developer -> click the "Edit Config" button
- Add the following configuration:
```json
{
  "mcpServers": {
    "safe-local-python-executor": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp_local_python_executor/",
        "run",
        "mcp_server.py"
      ]
    }
  }
}
```
- Restart Claude for Desktop
- The Python executor tool will now be available in Claude (you'll see a hammer icon in the message input field)
Example Prompts
Once configured, you can use prompts like:
- "Calculate the factorial of 5 using Python"
- "Create a list of prime numbers up to 100"
- "Solve this equation (use Python): x^2 + 5x + 6 = 0"
Development
Clone the repo, then use `uv` to create a venv, install the dev dependencies, and run the tests:
```bash
uv venv .venv
uv sync --group dev
python -m pytest tests/
```
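If you extend the server, a test sketch might look like the following; the module and function names (`mcp_server.run_python`) are assumptions about the repo layout rather than its actual test suite:

```python
# Hypothetical pytest sketch; assumes mcp_server exposes a run_python
# callable, which may not match the repository's actual structure.
import pytest

import mcp_server


def test_run_python_executes_whitelisted_code():
    result = mcp_server.run_python("import math\nprint(math.factorial(5))")
    assert "120" in str(result)


def test_run_python_blocks_disallowed_import():
    with pytest.raises(Exception):
        mcp_server.run_python("import os\nos.listdir('.')")
```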