
BinAssistMCP

Binary Ninja plugin to provide MCP functionality.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio jtang613-binassistmcp python -m binassist_mcp

How to use

BinAssistMCP provides an MCP 2025-11-25 compliant bridge between Binary Ninja and large language models. It exposes a suite of 36 tools wrapped around the Binary Ninja API, organized into categories such as Binary Management, Code Analysis, Cross-References, Comments, Variables, Types, Function Discovery, Symbol Management, Binary Information, Data Analysis, Navigation, and Task Management. Clients connect via Server-Sent Events (SSE) or Streamable HTTP transports to perform analysis, retrieve metadata, and drive guided prompts.

The server also offers 8 browsable resources that expose binary metadata without invoking tools, enabling faster context provisioning for LLM-driven workflows. Seven prompts provide guided workflows for common reverse-engineering tasks, such as analyzing a function, tracing data flow, and documenting code. Multi-binary sessions and an LRU-backed analysis cache help scale analysis across multiple binaries while keeping interactions with the LLM responsive.
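To illustrate the idea behind the LRU-backed analysis cache, here is a minimal, hypothetical sketch in Python; it is not BinAssistMCP's actual implementation, just the standard least-recently-used eviction pattern applied to per-binary analysis results:

```python
from collections import OrderedDict

class AnalysisCache:
    """Illustrative LRU cache: evicts the least-recently-used binary's results."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self._entries = OrderedDict()

    def get(self, binary_id):
        if binary_id not in self._entries:
            return None
        self._entries.move_to_end(binary_id)  # mark as most recently used
        return self._entries[binary_id]

    def put(self, binary_id, analysis):
        if binary_id in self._entries:
            self._entries.move_to_end(binary_id)
        self._entries[binary_id] = analysis
        if len(self._entries) > self.capacity:
            self._entries.popitem(last=False)  # evict least recently used

cache = AnalysisCache(capacity=2)
cache.put("a.bin", {"functions": 10})
cache.put("b.bin", {"functions": 20})
cache.get("a.bin")                    # refresh a.bin
cache.put("c.bin", {"functions": 5})  # capacity exceeded: b.bin is evicted
print(cache.get("b.bin"))             # → None
```

The same principle explains the memory note below: whichever binary's analysis has gone unused longest is the first to be dropped when the cache fills.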

To use BinAssistMCP, start the server and connect your MCP client to the appropriate transport (SSE or Streamable HTTP). Use the 36 tools to query binary state, retrieve code in multiple views (decompile, hlil, mlil, llil, disasm, pseudo_c), inspect cross-references, manage comments and variables, examine types and structures, enumerate functions, locate symbols, fetch imports/exports/strings, and view memory layout. For guided workflows, invoke the pre-built prompts such as analyze_function, identify_vulnerability, document_function, trace_data_flow, or trace_network_data to drive structured analysis and generate human-readable output and documentation.
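At the protocol level, every tool invocation is a JSON-RPC tools/call request. The sketch below builds one in Python; the tool name "decompile" and its argument names are assumptions for illustration, not confirmed BinAssistMCP identifiers:

```python
import json

# Hypothetical tools/call request. The tool name and arguments shown here
# are placeholders; consult the server's tool listing for the real schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "decompile",
        "arguments": {"binary": "target.bndb", "function": "main", "view": "hlil"},
    },
}

print(json.dumps(request, indent=2))
```

In practice an MCP client library constructs and sends these messages for you; the point is only that each of the 36 tools is addressed by name with a JSON arguments object.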

The server is designed for multi-binary sessions with asynchronous task support, so long-running operations can run without blocking the client. Outputs are delivered through the chosen transport, with the MCP protocol providing tool annotations and resource metadata to aid LLMs in producing accurate results and documentation.
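A client typically handles such long-running operations by polling task status until a terminal state is reached. The following is a minimal sketch of that loop; get_task_status is a stand-in for whatever status tool the server actually exposes, and here it is backed by a fake in-memory task store:

```python
import time

# Fake task store standing in for the server's async task system:
# each call to get_task_status advances the task one state.
_TASKS = {"task-1": iter(["pending", "running", "done"])}

def get_task_status(task_id):
    """Hypothetical stand-in for a task-status query against the server."""
    return next(_TASKS[task_id])

status = get_task_status("task-1")
while status not in ("done", "failed"):
    time.sleep(0)  # a real client would back off between polls
    status = get_task_status("task-1")

print(status)  # → done
```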

How to install

Prerequisites:

  • Python 3.8+ (recommended 3.9+)
  • Binary Ninja installed (with Python API available)
  • Network access for dependencies

Installation steps:

  1. Clone the repository:

     git clone https://github.com/your-org/jtang613-binassistmcp.git
     cd jtang613-binassistmcp

  2. Create and activate a Python virtual environment (optional but recommended):

     python -m venv venv
     source venv/bin/activate   # on Windows use: venv\Scripts\activate

  3. Install Python dependencies:

     pip install -r requirements.txt

  4. Configure dependencies for Binary Ninja if needed (install Binary Ninja API for your environment).

    • Ensure the binaryninja module is importable in the Python environment that runs the server.
  5. Run the MCP server:

     python -m binassist_mcp

     Alternatively, if a launcher script is provided:

     python server.py

  6. Verify the server starts and listens on the configured port (commonly 8000, or as set in your configuration).

  7. Connect your MCP client to the server, for example with the claude mcp add command shown in the Installation section above.
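For clients that are configured via a JSON file rather than a CLI, an SSE entry might look roughly like the fragment below. This is a sketch: the key names, port 8000, and the /sse path are assumptions that vary by client and by how you configured the server.

```json
{
  "mcpServers": {
    "binassistmcp": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```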

Additional notes

Tips and considerations:

  • The server uses SSE and Streamable HTTP transports; choose the transport best suited for your client.
  • If you encounter issues with Binary Ninja integration, ensure the Binary Ninja Python API is accessible to the Python runtime running the server.
  • The 8 MCP resources are cached data endpoints; use them to pre-fetch binary metadata before tool-based queries to reduce latency.
  • When running multiple binaries, the multi-binary session manager will help isolate contexts; monitor the LRU cache behavior for memory considerations.
  • For long-running analyses, utilize the async task system to avoid blocking the MCP client; you can query task status to monitor progress.
  • If environment variables are required (e.g., API keys for LLM providers or Binary Ninja paths), expose them via the env block in mcp_config and document their purpose in your deployment docs.
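As a sketch of that last point, a stdio-launch entry with an env block might look like the following; the variable name BINARYNINJA_PATH is hypothetical, so substitute whatever your deployment actually requires:

```json
{
  "mcpServers": {
    "binassistmcp": {
      "command": "python",
      "args": ["-m", "binassist_mcp"],
      "env": {
        "BINARYNINJA_PATH": "/opt/binaryninja"
      }
    }
  }
}
```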
