nmap
An example MCP server with a couple of Nmap scans as tools.
claude mcp add --transport stdio jarrodcoulter-nmap-mcp-server -- docker run -i nmap-mcp-server
How to use
This MCP server packages an Nmap-based toolkit inside a container and exposes it through the MCP interface so an OpenAI-powered agent can invoke network scanning tools. The server runs in Docker and provides the following tools, which the agent can request to perform targeted scans from the host where Docker is running:
- ping_host
- scan_network
- all_scan_network
- all_ports_scan_network
- smb_share_enum_scan
You can interact with the Nmap tools indirectly through the Gradio-based frontend, or via the OpenAI agent workflow, which decides when to call a specific tool based on your query. To use it, ensure the container image nmap-mcp-server is built and running, then issue MCP tool calls from your agent or UI to execute the corresponding Nmap tasks (e.g., a quick ping, port scans, or SMB enumeration).
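As a rough sketch of what a tool invocation looks like on the wire, an MCP client sends a `tools/call` JSON-RPC request to the server over stdio. The tool name below comes from this server's tool list; the `target` argument key and the address are illustrative assumptions, not taken from the repo:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call JSON-RPC request as a JSON string."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        # "name"/"arguments" follow the MCP tools/call schema;
        # the "target" key is an assumed parameter name.
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(msg)

# Example: ask the server to ping a host (placeholder target).
request = make_tool_call(1, "ping_host", {"target": "192.168.1.10"})
```

In practice the agent framework builds and sends these messages for you; the sketch only shows the request shape.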
How to install
Prerequisites:
- Docker: Install and run Docker (latest).
- Python 3.9+ (for the app and tooling) and optionally a Python virtual environment.
- OpenAI API Key: Set OPENAI_API_KEY in your environment.
Installation steps:
- Clone the repository:
git clone <your-repository-url>
cd <your-repository-directory>
- Set the OpenAI API key:
# Linux/macOS
export OPENAI_API_KEY='your_api_key_here'
# Windows (Command Prompt)
set OPENAI_API_KEY=your_api_key_here
# Windows (PowerShell)
$env:OPENAI_API_KEY='your_api_key_here'
- Build the Nmap Docker image (as described in the repo):
# In the directory containing Dockerfile and nmap-server files
docker build -t nmap-mcp-server .
- Run the application via Docker MCP server:
# Ensure Docker is running, then start the container for the MCP server
# This may be managed by your MCP orchestration; examples:
# docker run -d --name nmap-mcp-server -e OPENAI_API_KEY="$OPENAI_API_KEY" nmap-mcp-server
- Start the Python application (if you are running the app locally with Gradio UI and agent integration):
python app.py
- Access the Gradio UI in your browser as configured by the app and begin interacting with the agent, which will call the Nmap MCP tools as needed.
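Conceptually, the agent side maps the tool name chosen by the model to the corresponding MCP call. A hypothetical dispatch sketch (the tool names are this server's; the handler bodies are illustrative stubs, not the repo's implementation):

```python
def dispatch_tool(name: str, arguments: dict) -> str:
    """Route a model-selected tool name to a handler (stubs for illustration)."""
    handlers = {
        "ping_host": lambda a: f"ping {a['target']}",
        "scan_network": lambda a: f"scan {a['target']}",
        "all_ports_scan_network": lambda a: f"all-ports scan {a['target']}",
    }
    if name not in handlers:
        raise ValueError(f"unknown tool: {name}")
    return handlers[name](arguments)
```

In the real app each handler would forward the call to the containerized MCP server rather than return a string.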
Additional notes
Tips and considerations:
- Ensure the Docker daemon is running and that you have permission to run containers.
- The MCP server image should be built with the correct MCP package name; if you customize the Dockerfile, adjust the image tag accordingly (the example uses nmap-mcp-server).
- Set OPENAI_API_KEY in the environment before starting the app to enable the OpenAI agent workflow.
- If you encounter tool invocation failures, check that the Docker container is running and that the agent can reach the container runtime.
- The filesystem MCP server is available via npx in this setup; the primary capability here is the containerized Nmap toolkit exposed through MCP.
- When testing scans, use non-destructive or limited-scope scans first to avoid noisy network traffic during development.
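The idea behind limited-scope testing is to start with a non-intrusive host-discovery sweep before escalating to full port scans. A sketch of the corresponding Nmap argument lists (the flags are standard Nmap options; the function and mode names are ours, not taken from the repo's code):

```python
def nmap_args(mode: str, target: str) -> list[str]:
    """Build an Nmap argument list for a given scan mode.

    Modes map to standard Nmap flags:
      ping      -> -sn  (host discovery only, no port scan)
      quick     -> -F   (scan the 100 most common ports)
      all_ports -> -p-  (scan all 65535 TCP ports)
    """
    flags = {"ping": ["-sn"], "quick": ["-F"], "all_ports": ["-p-"]}
    if mode not in flags:
        raise ValueError(f"unknown mode: {mode}")
    return ["nmap", *flags[mode], target]
```

Starting with `ping` and `quick` modes keeps traffic low while you verify that tool invocation works end to end.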
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents via an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP