
AIDA

AI-Driven Security Assessment - Connect AI to 400+ pentesting tools via MCP

Installation

Add the MCP server to Claude Code by running the following command in your terminal:

claude mcp add --transport stdio vasco0x4-aida /bin/bash /absolute/path/to/AIDA/start_mcp.sh

How to use

AIDA exposes an MCP server that lets an AI client drive a full-fledged pentesting lab powered by Exegol and the 400+ tools it contains. Through the MCP interface, you can load and manage assessments, create and annotate findings as cards, collect reconnaissance data, and trigger commands from within the AI workflow. The MCP tools cover executing arbitrary Exegol commands, running quick scans (nmap, gobuster, ffuf, nuclei, etc.), enumerating subdomains, analyzing SSL/TLS configurations, and managing credentials. This setup enables your AI to perform end-to-end security engagements, from recon to exploitation, while automatically documenting findings and linking discoveries in an attack-chain workflow.
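At the protocol level, MCP tool invocations are JSON-RPC 2.0 messages. The sketch below prints what a `tools/call` request might look like; the tool name `quick_scan` and its arguments are illustrative assumptions, not tool names confirmed by AIDA's docs:

```shell
# Illustrative MCP tools/call request (JSON-RPC 2.0 over stdio).
# "quick_scan" and its arguments are hypothetical, not AIDA's actual tool names.
cat <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "quick_scan",
    "arguments": { "target": "10.0.0.5", "scanner": "nmap" }
  }
}
EOF
```

Your AI client builds and sends messages like this for you; seeing the shape is mainly useful when debugging a misbehaving MCP connection.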

How to install

Prerequisites:

  • Docker Desktop installed on the host (or a compatible container runtime)
  • Git installed to clone the repository
  • Optional: Python, if you plan to use the AIDA CLI wrapper (aida.py)
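The prerequisites above can be verified with a short shell loop; the binary names are the common ones (adjust if you use the `docker compose` v2 plugin or a differently named Python):

```shell
# Report which prerequisite tools are on PATH; prints one line per tool.
for tool in docker git python3; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```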

Recommended installation approaches:

Path A — Docker-based deployment (recommended for quick start):

# Clone the repository
git clone https://github.com/Vasco0x4/AIDA.git
cd AIDA

# Start the platform with Docker Compose
docker-compose up -d

# Open the dashboard (default port 5173)
# In your browser: http://localhost:5173
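As an optional sanity check after `docker-compose up -d`, the snippet below probes the default dashboard port and prints a status line either way (5173 is the default mentioned above; adjust if you changed it):

```shell
# Probe the dashboard port; never fails, just reports status.
if command -v curl >/dev/null 2>&1 && curl -sf http://localhost:5173/ >/dev/null; then
  echo "dashboard reachable"
else
  echo "dashboard not reachable yet (is docker-compose up?)"
fi
```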

Path B — Local development (Python FastAPI backend with MCP server):

# Prerequisites
# Ensure Python 3.9+ is installed

# Create and activate a virtual environment
python3 -m venv venv
source venv/bin/activate

# Install dependencies
pip install -r backend/requirements.txt

# Run the MCP server locally (typical entry might be aida.py or uvicorn)
# Example (adjust to your setup):
python backend/main.py
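Before starting the backend, it can help to confirm the interpreter meets the Python 3.9+ requirement noted above; a minimal check:

```shell
# Verify the active interpreter satisfies the documented 3.9+ requirement.
python3 - <<'EOF'
import sys
print("python %d.%d" % sys.version_info[:2])
print("version ok" if sys.version_info >= (3, 9) else "version too old")
EOF
```

Run this with the virtual environment activated so it checks the interpreter the backend will actually use.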

Path C — Using the MCP config example (as shown in the docs):

# Ensure MCP server is reachable (adjust paths accordingly)
# The following is a placeholder example matching the MCP config structure:
"""mcpServers": {
  "aida-mcp": {
    "command": "/bin/bash",
    "args": ["/absolute/path/to/AIDA/start_mcp.sh"]
  }
}"""

Prerequisites summary:

  • Ensure you have access to the AIDA repository and the start_mcp.sh script or equivalent entrypoint used to boot the MCP integration.
  • If using the Docker-based approach, ensure Docker Compose is properly configured as in docker-compose.yml and any required environment variables are set in .env or the compose file.

After installation, follow the Quick Start steps in the README to launch the platform and connect your AI client via MCP.

Additional notes

Tips and caveats:

  • Alpha release: run locally behind a firewall; avoid publicly exposing the UI without proper authentication and access controls.
  • The MCP config example uses an absolute path to start_mcp.sh; customize this to your deployment layout.
  • If you encounter MCP connection issues, verify that the aida-mcp server is reachable from the AI client and that the MCP version is compatible with your client.
  • The AIDA dashboard stores findings as cards; make sure to configure the database (and migrations) according to Docs/ARCHITECTURE.md and your deployment environment.
  • When using external AI clients, ensure MCP tooling permissions align with your security policy to prevent unintended command execution.
  • Check the .env file for defaults and update database credentials before deployment.
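As a rough guard against shipping placeholder credentials, a heuristic grep over `.env` can flag obvious defaults (the value list is an assumption; extend it to match your conventions):

```shell
# Flag lines in .env whose value is an obvious placeholder credential.
# Prints matches with line numbers, or a fallback message if none (or no .env).
grep -nEi '=(changeme|password|admin)[[:space:]]*$' .env 2>/dev/null \
  || echo "no obvious default credentials found (or no .env file)"
```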
