nuclei-api
Nuclei API is an advanced, AI-augmented platform for automated vulnerability scanning and template management, built around Nuclei.
claude mcp add --transport stdio miladhzzzz-nuclei-api -- \
  docker run -i \
  --env REDIS_URL="redis://localhost:6379/0" \
  --env OLLAMA_HOST="http://localhost:11434" \
  --env NUCLEI_API_HOST="http://localhost:8000" \
  miladhzzzz/nuclei-api:latest
How to use
Nuclei API provides a REST API to manage vulnerability detection templates, generate AI-assisted Nuclei templates, validate and refine them, and run scans against targets. It orchestrates an asynchronous pipeline using Celery with Redis as the broker and cache, and leverages Ollama for LLM-driven template generation and refinement. You can interact with the API to fetch vulnerabilities, generate templates with AI assistance, upload your own YAML templates, and trigger scans on IPs or domains. The MCP integration exposes tool-like endpoints that allow an LLM agent to call into the Nuclei scanning and template management capabilities, enabling AI-powered security workflows. The platform runs inside Docker containers and can be accessed through its API surface or the provided frontend when deployed with the accompanying stack.
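As a rough sketch, driving the pipeline from the command line could look like the following. The endpoint paths and payload fields below are assumptions for illustration, not the documented contract; check the route definitions under app/api for the actual routes and schemas.

```shell
# Hypothetical calls against a local deployment (paths and fields are assumptions).
API=http://localhost:8000

# Fetch the vulnerabilities the platform tracks
curl -s "$API/vulnerabilities" || true

# Ask the AI pipeline to generate a template for a vulnerability
curl -s -X POST "$API/generate" \
  -H "Content-Type: application/json" \
  -d '{"cve": "CVE-2021-44228"}' || true

# Trigger a scan against a target domain or IP
curl -s -X POST "$API/scan" \
  -H "Content-Type: application/json" \
  -d '{"target": "scanme.example.com"}' || true
```

Because template generation and scanning run asynchronously through Celery, responses for these calls would typically return a task identifier to poll rather than the final result.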
How to install
Prerequisites
- Docker and Docker Compose
- Python 3.8+ (for development or backend scripts) and Node.js 18+ (for frontend if needed)
- Redis (or Dockerized Redis as part of the stack)
- Ollama (or a compatible LLM API) for template generation
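A quick, best-effort sanity check for the tooling above before starting the stack (Redis and Ollama reachability is easiest to confirm once their containers are up):

```shell
# Best-effort check that the required tooling is on PATH.
missing=0
for cmd in docker git; do
  command -v "$cmd" >/dev/null 2>&1 || { echo "missing: $cmd"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "tooling OK" || echo "install the missing tools first"
```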
Step-by-step installation
- Clone the repository

  git clone <repository-url>
  cd nuclei-api
- Install and start the stack with Docker Compose (recommended for a production-like setup)

  docker compose up -d
- Optional: Run locally for development (fast bootstrap)

  Start the Redis and Ollama containers needed by the app:

  docker compose -f docker-compose.yml up -d redis ollama

  Run the API using uvicorn:

  uvicorn app.main:app --reload --host 0.0.0.0 --port 8080
- Access the API and frontend
- API: http://localhost:8000 (Docker deployment) or http://localhost:8080 (local uvicorn)
- Frontend (if enabled): http://localhost:3000
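When running the API outside Docker (the uvicorn path above), export the same settings the docker run example passes via --env, assuming the app reads these variables locally as well:

```shell
# Defaults matching the Docker example; adjust for your environment.
export REDIS_URL="redis://localhost:6379/0"
export OLLAMA_HOST="http://localhost:11434"
export NUCLEI_API_HOST="http://localhost:8000"
```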
Additional notes
Tips and common issues:
- Ensure Redis is reachable at the configured URL (default redis://localhost:6379/0) since Celery relies on Redis for broker and results backend.
- Ollama must be running and accessible to generate/refine templates; verify the Ollama endpoint and model availability.
- When using Docker Compose, you can scale Celery workers as needed with docker compose up -d --scale celery_worker=3 to handle higher loads.
- For MCP integration, expose the tool manifest and ensure LLM agents can discover and call the available nuclei-api routes (NucleiRoutes, PipelineRoutes, MCP Routes).
- If you’re implementing a custom frontend, the API exposes endpoints for template generation, storage, and scans; review the API structure under app/api for exact routes and payloads.
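For the MCP integration, a client-side registration equivalent to the claude mcp add command above can be sketched as a .mcp.json entry (standard MCP client config shape; the server name is whatever you chose when registering):

```json
{
  "mcpServers": {
    "miladhzzzz-nuclei-api": {
      "command": "docker",
      "args": [
        "run", "-i",
        "-e", "REDIS_URL=redis://localhost:6379/0",
        "-e", "OLLAMA_HOST=http://localhost:11434",
        "-e", "NUCLEI_API_HOST=http://localhost:8000",
        "miladhzzzz/nuclei-api:latest"
      ]
    }
  }
}
```

Note that the environment variables are passed as docker -e arguments here so they reach the container itself, not just the docker CLI process.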
Related MCP Servers
Unified-Tool-Graph
Instead of dumping 1000+ tools into a model’s prompt and expecting it to choose wisely, the Unified MCP Tool Graph equips your LLM with structure, clarity, and relevance. It fixes tool confusion, prevents infinite loops, and enables modular, intelligent agent workflows.
mcp-cyberbro
Using MCP is fun with Cyberbro!
alris
Alris is an AI automation tool that transforms natural language commands into task execution.
mcp-jira-stdio
MCP server for Jira integration with stdio transport. Issue management, project tracking, and workflow automation via Model Context Protocol.
Gemini-Vuln-Scanner
Vulnerability Scanning and Reconnaissance App with Gemini integrated workflow
idb
An open-source MCP server and Python library that wraps Facebook IDB to control iOS simulators for automation. Built by AskUI.