proxypin
An MCP tool for automated traffic analysis.
claude mcp add --transport stdio lunfengchen-proxypin-mcp-server python proxypin_mcp_server.py \
  --env MCP_PORT="17777" \
  --env PROXY_PIN_HOST="127.0.0.1"
(MCP_PORT is the MCP communication port for ProxyPin; PROXY_PIN_HOST is the ProxyPin host, which defaults to localhost.)
How to use
ProxyPin MCP Server lets your AI workflows call into ProxyPin's HTTP interception features. It supports managing captured requests with advanced search and detail views, replaying captured requests, generating code samples (Python/JS/cURL) or raw curl commands, and applying or inspecting rewrite rules for requests and responses. You can also manage JavaScript scripts that run during request handling, control the ProxyPin proxy itself, map hosts, and import/export HAR data for analysis. The MCP interface is designed so your AI assistant can automate tasks such as filtering captured traffic, extracting API endpoints, comparing requests, and exporting analytics.

To use it, start the ProxyPin application as usual, then start the MCP server (the Python script proxypin_mcp_server.py) so the MCP endpoint on port 17777 is available. In your IDE or automation config, define an MCP entry (as shown in the readme) that points to your Python executable and the proxypin_mcp_server.py file, and enable the tools you want auto-approved for execution.
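As an illustration, an MCP entry in an IDE or assistant config could look like the following sketch (the server name "proxypin" and the assumption that python is on your PATH are placeholders; adjust paths and env values to your setup):

```json
{
  "mcpServers": {
    "proxypin": {
      "command": "python",
      "args": ["proxypin_mcp_server.py"],
      "env": {
        "MCP_PORT": "17777",
        "PROXY_PIN_HOST": "127.0.0.1"
      }
    }
  }
}
```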
How to install
Prerequisites:
- Python 3.10 or newer
- ProxyPin itself installed and reachable, with its MCP endpoint on port 17777 for MCP communication
- Python dependencies fastmcp and requests
Install steps:
- Install Python 3.10+ on your system (from python.org or your OS package manager).
- Install required Python dependencies (if not included in the script's environment):
pip install --upgrade pip
pip install fastmcp requests
- Obtain the proxypin MCP server script and integrator:
- Download proxypin_mcp_server.py (the Python MCP implementation) from the release or repository location.
- Ensure the file is accessible in your working directory or provide the full path.
- Start the MCP server:
python proxypin_mcp_server.py
- Ensure ProxyPin is running and listening for traffic on the default capture port (9099) and that the MCP endpoint is exposed on port 17777 as configured.
Note: If you plan to customize ports, adjust the MCP_PORT environment variable or the script's configuration accordingly.
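Before wiring the server into an IDE, it can help to confirm both default ports are reachable. The sketch below assumes the default local setup (127.0.0.1, MCP on 17777, capture proxy on 9099); is_port_open is a helper written here for illustration, not part of the ProxyPin scripts:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Check the MCP endpoint (17777) and the ProxyPin capture proxy (9099).
    for label, port in [("MCP endpoint", 17777), ("capture proxy", 9099)]:
        status = "reachable" if is_port_open("127.0.0.1", port) else "NOT reachable"
        print(f"{label} on port {port}: {status}")
```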
Additional notes
Tips and common considerations:
- ProxyPin defaults: ProxyPin captures traffic on port 9099 (the app-side capture proxy) and exposes the MCP endpoint on port 17777 for AI-driven calls. Do not confuse the two ports.
- Ensure your AI toolset can reach 127.0.0.1:17777 or the host-mapped address if deploying remotely.
- The MCP config in IDEs should reference the Python executable and the proxypin_mcp_server.py script, and you can selectively auto-approve the available tools as needed.
- If you run into connection failures, verify that both ProxyPin’s capture proxy and the MCP server are running, and that there is no firewall blocking port 17777.
- Additional ProxyPin behavior (e.g., system_proxy, ssl_capture) can be tuned via the set_config tool once the MCP server is running.
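Over MCP's standard JSON-RPC transport, such a set_config invocation would be a tools/call request along these lines (the argument names beyond system_proxy and ssl_capture, and their boolean types, are assumptions; check the tool's actual schema via tools/list):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "set_config",
    "arguments": {
      "system_proxy": true,
      "ssl_capture": true
    }
  }
}
```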
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP