llamator
MCP server for llamator: automate LLM red teaming workflows
claude mcp add --transport stdio romiconez-llamator-mcp-server
docker compose up --build
How to use
The LLAMATOR MCP server provides a Streamable HTTP transport layered on top of a FastAPI application. It exposes two integration surfaces: an HTTP API for submitting and monitoring LLAMATOR-based red-teaming runs, and an MCP interface mounted inside the FastAPI app that lets external tooling invoke LLAMATOR runs as MCP tools. Jobs are orchestrated asynchronously via ARQ with Redis, and artifacts produced by LLAMATOR are stored in MinIO and retrieved via presigned URLs. You can use the HTTP API to create runs, check status, and fetch artifacts, or use the MCP tools create_llamator_run and get_llamator_run to integrate LLAMATOR runs into your tooling pipelines over the standardized MCP protocol.
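As a rough sketch of what an HTTP API call might look like: the snippet below builds a run-creation request with Python's stdlib. The endpoint path (`/v1/runs`) and the payload field names are assumptions for illustration only, not confirmed API details; consult the service's OpenAPI docs for the real schema.

```python
import json
import urllib.request

# Hypothetical sketch: endpoint path and payload fields are assumptions,
# not confirmed API details -- check the service's OpenAPI docs.
BASE_URL = "http://localhost:8000"

payload = {
    "attacks": ["prompt_injection"],  # assumed field: which LLAMATOR attacks to run
    "num_attempts": 3,                # assumed field: attempts per attack
}

req = urllib.request.Request(
    f"{BASE_URL}/v1/runs",            # assumed path; see the OpenAPI spec for the actual one
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "X-API-Key": "your-api-key",  # only needed if API key protection is enabled
    },
    method="POST",
)

# urllib.request.urlopen(req) would submit the run; you would then poll the
# run's status endpoint until completion and follow the presigned artifacts
# URL returned in the response.
```

The same `X-API-Key` header applies to any protected endpoint, so centralizing header construction like this keeps HTTP and MCP calls consistent.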
To use the MCP interface, connect to the MCP endpoint mounted at /mcp in the running FastAPI service. The exposed tools include create_llamator_run, which submits a run, waits for completion, and returns aggregated metrics along with an artifacts download URL when available; and get_llamator_run, which retrieves the metrics for a finished run and the optional artifacts URL. The API supports optional API key protection via X-API-Key and exposes OpenAPI for exploration. The complete stack relies on Redis for durable state, MinIO for artifacts, and a configurable OpenAI-like model interface for LLAMATOR execution.
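Under the Streamable HTTP transport, MCP traffic is JSON-RPC 2.0. The sketch below constructs a `tools/call` message for create_llamator_run; the tool name comes from this README, but the argument names inside `arguments` are illustrative assumptions — use a `tools/list` request to discover the real input schema.

```python
import json

# JSON-RPC 2.0 "tools/call" message as carried by the Streamable HTTP transport.
# Argument names under "arguments" are assumptions; query "tools/list" for the
# actual input schema exposed by the server.
call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_llamator_run",
        "arguments": {
            "attacks": ["prompt_injection"],  # assumed argument name
        },
    },
}

body = json.dumps(call)
# This body would be POSTed to the /mcp endpoint with an
# "Accept: application/json, text/event-stream" header, per the
# Streamable HTTP transport specification.
```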
How to install
Prerequisites:
- Docker and Docker Compose installed on your machine
- Basic familiarity with running services in containers
Installation steps:
- Clone the repository:
  git clone <repository-url> llamator-mcp-server
  cd llamator-mcp-server
- Copy the example environment configuration (if provided) and tailor it to your environment. For local development, you can typically copy the example env file:
  cp .env.example .env
  Then edit .env to configure the Redis DSN, MinIO, API keys, etc.
- Start the services with Docker Compose:
  docker compose up --build
- Verify the services are running:
  - HTTP API: http://localhost:8000
  - MinIO S3 endpoint: http://localhost:9000
  - MinIO console: http://localhost:9001
- Optional: run the local development stack without Docker (if you prefer a Python-based setup):
  - Install dependencies: poetry install
  - Run the API server: uvicorn llamator_mcp_server.main:app --host 0.0.0.0 --port 8000
  - Run the ARQ worker for LLAMATOR executions: arq llamator_mcp_server.worker_settings.WorkerSettings
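The env file edited during installation might look like the sketch below. The LLAMATOR_MCP_ prefix is documented in the notes further down, but the specific variable names and values here are assumptions; see DOCUMENTATION.md for the authoritative reference.

```shell
# Hypothetical .env sketch; the variable names after the LLAMATOR_MCP_ prefix
# are assumptions -- consult DOCUMENTATION.md for the real reference.
LLAMATOR_MCP_REDIS_DSN=redis://redis:6379/0
LLAMATOR_MCP_MINIO_ENDPOINT=http://minio:9000
LLAMATOR_MCP_API_KEY=change-me
```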
Prerequisites recap:
- Docker and Docker Compose for the recommended Docker-based deployment
- Optional Python tooling (Poetry, uvicorn) for local development without containers
Additional notes
- The MCP interface uses Streamable HTTP transport; ensure your tooling can send/receive the expected RPC payloads.
- If you enable API key protection, supply X-API-Key headers when invoking both HTTP and MCP endpoints.
- Artifacts are stored under MinIO and accessed via presigned URLs; the API does not redirect for artifact downloads.
- Healthcheck endpoint is typically at /v1/health on the HTTP API; monitor /metrics for Prometheus metrics as needed.
- Environment variables prefixed with LLAMATOR_MCP_ control Redis, MinIO, API security, and LLAMATOR execution endpoints; see DOCUMENTATION.md for a complete reference.
- When running locally with Docker Compose, ensure port mappings (default 8000, 9000, 9001) do not conflict with other services.