
mcpx-py
Python client library for https://mcp.run: call portable and secure tools from your AI agents and applications
mcpx-py
A Python library for interacting with LLMs using mcp.run tools
Features
AI Provider Support
mcpx-py supports all models supported by PydanticAI.
Dependencies
- uv
- npm
- ollama (optional)
mcp.run Setup
You will need to get an mcp.run session ID by running:
npx --yes -p @dylibso/mcpx gen-session --write
This will generate a new session and write the session ID to a configuration file that can be used by mcpx-py.
If you need to store the session ID in an environment variable you can run gen-session without the --write flag:
npx --yes -p @dylibso/mcpx gen-session
which should output something like:
Login successful!
Session: kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
Then set the MCP_RUN_SESSION_ID environment variable:
$ export MCP_RUN_SESSION_ID=kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
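If the session ID is supplied through the environment rather than the config file, it can help to check that it is actually set before constructing a client. A minimal sketch using only the standard library (the error message is illustrative):
import os

# Fail fast with a clear message if the session ID is missing
session_id = os.environ.get("MCP_RUN_SESSION_ID")
if not session_id:
    raise RuntimeError(
        "MCP_RUN_SESSION_ID is not set; run `npx --yes -p @dylibso/mcpx gen-session`"
    )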
Python Usage
Installation
Using uv:
uv add mcpx-py
Or pip:
pip install mcpx-py
Example code
from mcpx_py import Chat

llm = Chat("claude-3-5-sonnet-latest")
# Or OpenAI
# llm = Chat("gpt-4o")
# Or Ollama
# llm = Chat("ollama:qwen2.5")
# Or Gemini
# llm = Chat("gemini-2.0-flash")

response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
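Because Chat accepts the model as a plain string, the provider can be swapped without code changes, for example by reading the model name from an environment variable. A small sketch (the MODEL variable name is just an illustration):
import os
from mcpx_py import Chat

# Any model string supported by PydanticAI works here
model = os.environ.get("MODEL", "claude-3-5-sonnet-latest")
llm = Chat(model)
response = llm.send_message_sync("summarize the contents of example.com")
print(response.data)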
It's also possible to get structured output by setting result_type:
from mcpx_py import Chat, BaseModel, Field
from typing import List

class Summary(BaseModel):
    """
    A summary of some longer text
    """

    source: str = Field("The source of the original_text")
    original_text: str = Field("The original text to be summarized")
    items: List[str] = Field("A list of summary points")

llm = Chat("claude-3-5-sonnet-latest", result_type=Summary)
response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
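With result_type set, response.data is an instance of the Summary model rather than plain text, so its fields can be used directly. A short usage sketch following the example above:
summary = response.data
print(summary.source)
for point in summary.items:
    print("-", point)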
More examples can be found in the examples/ directory.
Command Line Usage
Installation
uv tool install mcpx-py
From git:
uv tool install git+https://github.com/dylibso/mcpx-py
Or from the root of the repo:
uv tool install .
uvx
mcpx-client can also be executed without being installed using uvx:
uvx --from mcpx-py mcpx-client
Or from git:
uvx --from git+https://github.com/dylibso/mcpx-py mcpx-client
Running
Get usage/help
mcpx-client --help
Chat with an LLM
mcpx-client chat
List tools
mcpx-client list
Call a tool
mcpx-client tool eval-js '{"code": "2+2"}'
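The tool argument is a single JSON object, so it can also be built programmatically and passed to the same CLI. A minimal Python sketch around the eval-js call shown above (the JavaScript snippet is only an example):
import json
import subprocess

# Build the JSON payload for the eval-js tool and invoke the documented CLI
payload = json.dumps({"code": "Array.from({length: 3}, (_, i) => i * i)"})
subprocess.run(["mcpx-client", "tool", "eval-js", payload], check=True)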
LLM Configuration
Provider Setup
Claude
- Sign up for an Anthropic API account at https://console.anthropic.com
- Get your API key from the console
- Set the environment variable:
ANTHROPIC_API_KEY=your_key_here
OpenAI
- Create an OpenAI account at https://platform.openai.com
- Generate an API key in your account settings
- Set the environment variable:
OPENAI_API_KEY=your_key_here
Gemini
- Create a Gemini account at https://aistudio.google.com
- Generate an API key in your account settings
- Set the environment variable:
GEMINI_API_KEY=your_key_here
Ollama
- Install Ollama from https://ollama.ai
- Pull your desired model:
ollama pull llama3.2
- No API key needed - runs locally
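After pulling the model, it can be selected with the ollama: prefix shown in the earlier example:
from mcpx_py import Chat

# Uses the locally pulled Ollama model; no API key required
llm = Chat("ollama:llama3.2")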
Llamafile
- Download a Llamafile model from https://github.com/Mozilla-Ocho/llamafile/releases
- Make the file executable:
chmod +x your-model.llamafile
- Run in JSON API mode:
./your-model.llamafile --json-api --host 127.0.0.1 --port 8080
- Use with the OpenAI provider pointing to http://localhost:8080
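A hedged sketch of that last step, assuming the OpenAI provider picks up the standard OPENAI_BASE_URL and OPENAI_API_KEY environment variables of the OpenAI client; the /v1 suffix and the dummy key are assumptions, since llamafile does not validate the key:
import os
from mcpx_py import Chat

# Point the OpenAI-compatible client at the local llamafile server
os.environ["OPENAI_BASE_URL"] = "http://localhost:8080/v1"  # assumed /v1 suffix
os.environ["OPENAI_API_KEY"] = "sk-no-key-required"         # dummy value; not checked by llamafile

llm = Chat("gpt-4o")  # model name is only a label here; llamafile serves whatever model it was started with
response = llm.send_message_sync("say hello")
print(response.data)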