ida-headless
Headless IDA Pro binary analysis via Model Context Protocol
claude mcp add --transport stdio zboralski-ida-headless-mcp -- docker run -i ida-headless-mcp
How to use
This MCP server provides headless IDA Pro binary analysis over the Model Context Protocol. A Go-based MCP server coordinates session lifecycle, tools, and IPC, spawning one Python worker per analysis session and communicating with it over a Unix socket. The server exposes roughly 52 MCP tools through a Connect RPC interface. Clients such as Claude Desktop, Claude Code, or the MCP Inspector connect over HTTP/SSE, and their tool calls are proxied to the Python worker that drives IDA through idalib. A typical workflow opens a binary, runs automatic analysis, queries entry points, decompiles or inspects functions, and then closes the session to release resources. The architecture supports concurrent multi-session use with process isolation, configurable session timeouts (default 4 hours), and paginated results for long listings.
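The session lifecycle described above can be sketched as a sequence of MCP `tools/call` requests. The request envelope follows the standard MCP JSON-RPC shape; the tool names and arguments used here (`open_binary`, `decompile_function`, `close_session`, `session_id`) are illustrative assumptions, not the server's documented tool list:

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "open_binary", "arguments": {"path": "/tmp/target.bin"}}}

{"jsonrpc": "2.0", "id": 2, "method": "tools/call",
 "params": {"name": "decompile_function", "arguments": {"session_id": "s1", "address": "0x401000"}}}

{"jsonrpc": "2.0", "id": 3, "method": "tools/call",
 "params": {"name": "close_session", "arguments": {"session_id": "s1"}}}
```

Closing the session explicitly matters here: each session holds a dedicated Python worker process, so leaving sessions open ties up resources until the timeout fires.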
How to install
Prerequisites:
- IDA Pro 9.0+ (or IDA Essential 9.2+)
- IDA as a Library (idalib): install via the setup script ./scripts/setup_idalib.sh
- Go 1.21+ and protoc tools: make install-tools
- Python 3.10+ with dependencies: pip3 install -r python/requirements.txt
- Optional: Il2CppDumper for Unity analysis
- Optional: unflutter for Flutter/Dart analysis
Installation steps:
- Clone the repository and navigate to the project root:
  git clone <repo-url>
  cd ida-headless-mcp
- Run the setup script to prepare idalib and dependencies:
  make setup
- Manual setup (alternative):
  ./scripts/setup_idalib.sh   # Set up idalib (requires IDA Pro/Essential 9.x)
  make install-python         # Install Python dependencies
  make build                  # Build Go server
- Start the server locally (see usage docs for port and config):
  ./bin/ida-mcp-server
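On the client side, a Claude Desktop `mcpServers` entry for the Docker-based run might look like the following. This is a sketch matching the quickstart command; the `"ida-headless"` key is an arbitrary local name, and the entry assumes the `ida-headless-mcp` image has already been built or pulled:

```json
{
  "mcpServers": {
    "ida-headless": {
      "command": "docker",
      "args": ["run", "-i", "ida-headless-mcp"]
    }
  }
}
```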
Practical notes:
- The server exposes port 17300 by default; you can modify via config.json, environment variables, or --port flag.
- If building manually, ensure Docker is available if you intend to use the docker-based mcp_config example.
Additional notes
Tips and common issues:
- Ensure IDA Pro and idalib are correctly installed and accessible to the worker processes.
- If the Python worker fails to start, verify that idalib is properly installed and that the system libraries required by IDA are present.
- Session timeouts can be tuned via CLI flags or environment variables (e.g., IDA_MCP_SESSION_TIMEOUT_MIN).
- When using Docker-based runs, ensure the image ida-headless-mcp is built or pulled before starting the server.
- For troubleshooting ports, use lsof -ti:<port> to identify conflicts and consider starting on an alternate port if needed.
- The server supports both HTTP (for Claude/CLI) and SSE endpoints; ensure your client connects to http://host:17300/ and can handle streaming responses if using the streamable transport.
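The port-conflict check above can be scripted as a quick preflight. This is a minimal sketch; `IDA_MCP_PORT` is just a local shell variable for convenience, not a documented server setting:

```shell
# Check whether the server's default port (17300) is already taken.
# IDA_MCP_PORT is a local convenience variable, not a documented env var.
PORT="${IDA_MCP_PORT:-17300}"
if command -v lsof >/dev/null 2>&1 && lsof -ti:"${PORT}" >/dev/null 2>&1; then
  echo "port ${PORT} is in use; try starting with --port $((PORT + 1))"
else
  echo "port ${PORT} appears free"
fi
```

If the port is taken, the same `--port` flag mentioned above lets you move the server without touching config.json.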