mcp-reticle
Reticle intercepts, visualizes, and profiles JSON-RPC traffic between your LLM client and MCP servers in real time, with minimal latency overhead. Stop debugging blind. Start seeing everything.
claude mcp add --transport stdio soth-ai-mcp-reticle mcp-reticle run --name reticle -- <your-mcp-server-command>
How to use
Reticle is a proxy + UI that lets you observe, correlate, and profile MCP traffic in real time. It can wrap an existing MCP server that communicates over stdio, then expose a live view of requests, notifications, and responses, along with latency and token estimates. You can also run Reticle in log-only mode or proxy HTTP-based MCP servers to inspect traffic via the UI. Use the wrap command to start Reticle around your MCP server so you can see everything as it flows between your client and server. The included UI supports visualizing the transport layers (stdio, HTTP/SSE, WebSocket, etc.) and exporting captured sessions for later analysis. The CLI also offers a headless telemetry hub for collecting and streaming metrics.
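The traffic Reticle correlates is ordinary JSON-RPC. As a sketch of what an intercepting proxy has to distinguish, here is a hypothetical classifier for newline-delimited JSON-RPC messages (the function and message contents are illustrative, not Reticle's internals):

```python
import json

def classify(line: str) -> str:
    """Classify one JSON-RPC message: a request carries both an id and a
    method, a notification carries a method but no id, and a response
    carries an id with a result or error."""
    msg = json.loads(line)
    if "method" in msg:
        return "request" if "id" in msg else "notification"
    if "result" in msg or "error" in msg:
        return "response"
    return "unknown"

# Example MCP-style messages flowing between client and server
req = '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
note = '{"jsonrpc":"2.0","method":"notifications/progress","params":{"progress":0.5}}'
resp = '{"jsonrpc":"2.0","id":1,"result":{"tools":[]}}'
print(classify(req), classify(note), classify(resp))  # request notification response
```

This request/notification/response distinction is what lets a proxy pair requests with their responses and attribute latency to individual calls.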
How to install
Prerequisites:
- Node.js and npm (recommended): install from https://nodejs.org
- Optional: Python and pip if you plan to use the Python client or examples
- Optional: Homebrew for macOS users
Installation options:
- Install the MCP proxy UI globally via npm: npm install -g mcp-reticle
- Install via pip (Python): pip install mcp-reticle
- Install via Homebrew (macOS): brew install labterminal/mcp-reticle/mcp-reticle
- Build from source (advanced):
  git clone https://github.com/labterminal/mcp-reticle.git
  cd mcp-reticle
  just build
Basic usage after installation:
- Wrap an MCP server: mcp-reticle run --name my-server -- <your-server-command>
- Launch the UI: mcp-reticle ui
- Run in log-only mode (no UI): mcp-reticle run --log -- <your-server-command>
- Proxy an HTTP-based MCP server: mcp-reticle proxy --name api --upstream http://localhost:8080 --listen 3001
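Conceptually, log-only mode amounts to timestamping each message and appending it to a capture before forwarding it unchanged. A minimal sketch of that capture step (the field names and layout here are assumptions, not Reticle's actual log format):

```python
import json, time

def log_message(direction: str, raw: str, log: list) -> dict:
    """Record one JSON-RPC message with its direction and wall-clock
    capture time, as a log-only proxy might before passing it through."""
    entry = {
        "ts": time.time(),       # capture time (seconds since epoch)
        "dir": direction,        # "client->server" or "server->client"
        "msg": json.loads(raw),  # parsed JSON-RPC payload
    }
    log.append(entry)
    return entry

session = []
log_message("client->server", '{"jsonrpc":"2.0","id":7,"method":"ping"}', session)
log_message("server->client", '{"jsonrpc":"2.0","id":7,"result":{}}', session)
print(len(session), session[0]["msg"]["method"])  # 2 ping
```

Keeping the raw payload alongside a timestamp and direction is enough to reconstruct the full conversation later, which is what makes headless capture useful for postmortem debugging.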
Additional notes
Tips and notes:
- Reticle supports multiple transports: stdio, Streamable HTTP, WebSocket, and HTTP/SSE. Ensure your MCP server is reachable by Reticle through the chosen transport.
- When wrapping a server, you can give it a descriptive name with --name for easier identification in the UI.
- The --log flag enables a lightweight, non-UI mode suitable for headless logging environments.
- For large sessions, export logs from the UI or the CLI to share with team members.
- If you encounter transport-specific issues, consult the CLI reference and wiki for transport configuration tips and troubleshooting steps.
- The current project ships as both an npm and a Python package; choose the installation method that aligns with your environment and workflow.
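Once a session is exported, per-call latency falls out of pairing each response id with its request timestamp. A sketch of that computation, assuming a simple list-of-entries export (the real export schema may differ):

```python
def latencies(entries):
    """Compute per-request latency in seconds by matching JSON-RPC
    response ids back to their originating requests' timestamps."""
    pending = {}  # request id -> timestamp it was sent
    out = {}      # request id -> observed round-trip latency
    for e in entries:
        msg, ts = e["msg"], e["ts"]
        if "method" in msg and "id" in msg:
            pending[msg["id"]] = ts
        elif "id" in msg and ("result" in msg or "error" in msg):
            if msg["id"] in pending:
                out[msg["id"]] = ts - pending.pop(msg["id"])
    return out

capture = [
    {"ts": 10.0,  "msg": {"jsonrpc": "2.0", "id": 1, "method": "tools/call"}},
    {"ts": 10.25, "msg": {"jsonrpc": "2.0", "id": 1, "result": {"ok": True}}},
]
print(latencies(capture))  # {1: 0.25}
```

Notifications are skipped entirely since they have no id and therefore no response to pair with.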