
hotpath-rs

Rust Performance Profiler & Channels Monitoring Toolkit (TUI, MCP)

Installation
Run this command in your terminal to add the MCP server to Claude Code. The `--env` flags are optional; `HOTPATH_LOG_LEVEL` accepts debug|info|warn|error and defaults to info. Note that the env flags belong to `claude mcp add` and must come before the `--` separator, which marks the start of the command to launch:

claude mcp add --transport stdio pawurb-hotpath-rs \
  --env HOTPATH_CONFIG="<optional runtime config override>" \
  --env HOTPATH_LOG_LEVEL=info \
  -- docker run -i -e HOTPATH_CONFIG -e HOTPATH_LOG_LEVEL pawurb/hotpath-rs

How to use

hotpath-rs exposes a built-in MCP server that lets AI agents query real-time profiling data from Rust programs instrumented with Hotpath. Through this interface, external agents (LLMs or orchestration tools) can request live performance metrics, allocation data, and data-flow statistics while the target application is running. Typical use cases include comparing performance across code revisions, monitoring live systems under load, and driving optimization decisions with concrete timing and memory numbers. To interact with the server, connect an MCP client to the endpoint exposed by the container and issue standard Model Context Protocol requests for timing, allocation, and thread metrics. The MCP interface complements Hotpath's live TUI and static reports by providing streaming, programmatic access to profiling data for automation and dashboards.
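For context, an instrumented target program typically looks like the sketch below. The `#[hotpath::main]` / `#[hotpath::measure]` attribute names and the `hotpath` feature gate are assumptions about the crate's API, so check the repo before copying:

```rust
// Sketch: a program instrumented with hotpath's attribute macros.
// Assumption (not confirmed by this page): the crate exposes
// `#[hotpath::main]` and `#[hotpath::measure]`, gated here behind a
// `hotpath` cargo feature so instrumentation compiles away when the
// feature is disabled.

#[cfg_attr(feature = "hotpath", hotpath::measure)]
fn compute(n: u64) -> u64 {
    // Stand-in workload whose timing hotpath would record.
    (0..n).sum()
}

#[cfg_attr(feature = "hotpath", hotpath::main)]
fn main() {
    let total = compute(1_000);
    println!("{total}"); // prints 499500
}
```

Built without the feature (`cargo run`), the `cfg_attr` attributes disappear and the program runs uninstrumented; `cargo run --features hotpath` would enable collection.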

How to install

Prerequisites:

  • Docker installed on your machine with access to run containers
  • Optional: a host with network access to the container registry

Install and run:

  1. Pull and run the Hotpath MCP server container:

     # Start the MCP-enabled Hotpath server (uses Docker)
     docker run -i pawurb/hotpath-rs

  2. If you need to customize configuration, set environment variables (example):

     # Example: override logging and config via env vars
     docker run -i \
       -e HOTPATH_LOG_LEVEL=debug \
       -e HOTPATH_CONFIG="{...}" \
       pawurb/hotpath-rs

  3. Connect an MCP client to the running server. The MCP protocol is served over the container's default endpoint (check the container logs for the exact host port and path). For a local dev setup, you can also adapt this to a Docker Compose file or a Kubernetes deployment to expose the MCP endpoint to your tooling.
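As a concrete example of what an MCP client sends, the protocol is JSON-RPC 2.0 over the chosen transport. A session starts with an `initialize` handshake followed by `tools/list` to discover what the server exposes (message shapes follow the Model Context Protocol spec; the specific tool names Hotpath provides are not documented on this page, so list them rather than assuming):

```json
{"jsonrpc": "2.0", "id": 1, "method": "initialize",
 "params": {"protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"}}}

{"jsonrpc": "2.0", "id": 2, "method": "tools/list"}
```

The `tools/list` response enumerates the profiling queries (timing, allocation, thread metrics) the server makes available, which you then invoke via `tools/call`.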

Prerequisites for alternative install methods (not required for Docker):

  • Rust toolchain for building from source
  • Familiarity with Cargo feature flags to enable profiling components

If you prefer to build locally from source, clone the repo, install Rust, and run with the appropriate features:

cargo run --features="hotpath" --bin hotpath
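If you are instead instrumenting your own application, the usual pattern is to gate the profiler behind a Cargo feature so release builds can compile it out entirely. A sketch of the manifest wiring (the version is a placeholder and the feature name is an assumption; consult the crate docs):

```toml
# Cargo.toml sketch: hotpath as an optional dependency behind a feature flag
[dependencies]
hotpath = { version = "*", optional = true }  # version is a placeholder

[features]
hotpath = ["dep:hotpath"]
```

With this layout, `cargo build` produces an uninstrumented binary, while `cargo run --features hotpath` enables the profiling components.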

Additional notes

Notes and tips:

  • The MCP server is designed to stream profiling data to AI agents in real time; ensure your network policy allows the MCP endpoint to be reached by the client.
  • When running in containers, consider mounting a volume for logs and enabling verbose logging during debugging.
  • If your environment uses a proxy or restricted network, configure Docker’s daemon or the host network to allow the MCP endpoint exposure.
  • The MCP interface follows the Model Context Protocol conventions; refer to the MCP client docs for message shapes and query types (timing, alloc, threads, and data flow metrics).
  • If you experience high overhead in production, tweak the Hotpath feature flags to disable nonessential instrumentation when needed to minimize impact.
