mcpx-py
Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps
claude mcp add --transport stdio \
  --env MCP_RUN_SESSION_ID="<session ID generated by mcpx gen-session via MCP.run>" \
  dylibso-mcpx-py -- python -m mcpx_py
How to use
mcpx-py is a Python client library that integrates with MCP.run to enable LLM interactions through a unified, scriptable interface. It exposes a Chat class that can connect to multiple providers (including Claude, OpenAI, Gemini, and Ollama models) as well as tools for structured outputs.

The typical workflow starts with generating or loading an MCP.run session ID and exporting it as an environment variable so mcpx-py can authenticate and route requests through MCP.run. You can then install mcpx-py and import Chat to instantiate an LLM client by passing the identifier string for your desired model. The library also supports advanced usage such as specifying a result_type to get structured output, letting you model responses as custom data classes.

In addition to Python usage, MCP.run tooling can be accessed via the mcpx-client CLI (uvx mcpx-client) to list tools, perform chat, or call specific tools, making mcpx-py useful in both programmatic and scriptable workflows.
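The structured-output flow described above can be sketched roughly as follows. This is a hedged illustration, not the definitive API: the Summary schema and its field names are invented for this example, and the exact result_type parameter and response shape should be confirmed against the mcpx-py README.

```python
from pydantic import BaseModel


# Hypothetical response schema -- the field names here are illustrative
class Summary(BaseModel):
    title: str
    key_points: list[str]


if __name__ == "__main__":
    # Requires mcpx-py installed and MCP_RUN_SESSION_ID exported
    from mcpx_py import Chat

    # result_type (assumed parameter name) asks the library to parse
    # model responses into Summary instances instead of raw text
    llm = Chat("claude-3-5-sonnet-latest", result_type=Summary)
    response = llm.send_message_sync("summarize the contents of example.com")
    print(response.data)  # expected to be a Summary instance
```

Defining the schema as a Pydantic model is what lets the library validate the LLM's output before your code touches it.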
How to install
Prerequisites:
- Python 3.8+ and pip
- Node.js and npm (for MCP.run tooling and optional CLI workflows via npx/uvx)
- uv (the fast Python package and project manager from Astral) for optional uv-based installation paths
Step 1: Install the mcpx-py Python package
- Using pip:
pip install mcpx-py
- Or via uv if you prefer uv-managed tooling (requires uv to be installed):
uv add mcpx-py
Step 2: Prepare MCP.run session
- Generate a new session and write it to a config file:
npx --yes -p @dylibso/mcpx gen-session --write
- If you prefer not to write to disk, generate the session and set it in an environment variable:
npx --yes -p @dylibso/mcpx gen-session
- Then set the session ID (example shown; replace with the actual session value you received):
export MCP_RUN_SESSION_ID=kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
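Before making any calls, the session setup above can be sanity-checked from Python. This is a minimal stdlib-only sketch; the helper name get_session_id is invented for this example:

```python
import os


def get_session_id() -> str:
    """Return the MCP.run session ID from the environment, or fail loudly."""
    session_id = os.environ.get("MCP_RUN_SESSION_ID")
    if not session_id:
        raise RuntimeError(
            "MCP_RUN_SESSION_ID is not set; run "
            "'npx --yes -p @dylibso/mcpx gen-session' and export the value"
        )
    return session_id
```

Failing early with a clear message is easier to debug than an opaque authentication error from a later request.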
Step 3: Verify installation
- Open a Python shell or script and try importing and using the library:
from mcpx_py import Chat
llm = Chat("claude-3-5-sonnet-latest")
response = llm.send_message_sync("summarize the contents of example.com")
print(response.data)
Optional: If you want to use the MCP.run CLI directly, ensure npm and uv (for uvx) are installed and accessible, then run commands like uvx mcpx-client or the tool invocations described in the notes below.
Additional notes
Environment variables and configuration:
- MCP_RUN_SESSION_ID must be set for mcpx-py to authenticate with MCP.run
- Ollama is optional and can be installed to provide local model access; configure accordingly if using Ollama models
- If you use uvx for CLI access, you can run mcpx-client commands like mcpx-client chat, mcpx-client list, or mcpx-client tool eval-js
Common issues:
- If the session ID is invalid or expired, re-run the session generation step via MCP.run and update MCP_RUN_SESSION_ID
- Ensure network access to MCP.run endpoints and any provider API keys (OpenAI, Anthropic, Gemini) as required by the selected provider
- When using structured outputs, ensure your data model is properly defined and serializable by Pydantic
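To illustrate the Pydantic point above with a standalone sketch (independent of mcpx-py itself; the ToolResult model is invented for this example): a result model should validate cleanly and round-trip through JSON.

```python
from pydantic import BaseModel, ValidationError


# Hypothetical result model -- name and fields are illustrative
class ToolResult(BaseModel):
    name: str
    score: float


# Well-formed data validates and serializes without issue
result = ToolResult(name="search", score=0.9)
print(result.model_dump_json())  # prints the JSON representation

# Malformed data is rejected before it reaches your application logic
try:
    ToolResult(name="search", score="not-a-number")
except ValidationError as e:
    print("rejected:", e.error_count(), "error(s)")
```

If your model fails this kind of local round-trip, structured output through the library will fail in the same way.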