frama-c
MCP server for Frama-C — enabling AI agents to interact with static analysis and formal verification
claude mcp add --transport stdio lihaokun-frama-c-mcp-server -- docker run -i --env SOCKET="/tmp/frama-c.sock" lihaokun/frama-c-mcp-server --socket /tmp/frama-c.sock
How to use
This MCP server acts as a bridge between an AI agent and Frama-C, enabling formal verification of C programs through EVA (abstract interpretation), WP (deductive proof), and ACSL annotation reasoning. The Rust-based server speaks JSON-RPC over stdio upstream and talks to Frama-C downstream over a custom Unix socket protocol. Agents can load or reload C sources, run EVA to surface potential runtime errors, and then progressively add ACSL annotations for the WP prover to verify. The available tools cover loading, analysis, result querying, and navigation, supporting iterative workflows in which EVA findings are refined with ACSL annotations and then discharged by WP proofs. The server also exposes status, alarms, call graphs, and annotation statuses to guide the agent through verification tasks.
Usage flow (typical):
- Start Frama-C with a server socket (frama-c your_program.c -server-socket /tmp/frama-c.sock).
- Start the MCP server with the appropriate socket path (as configured in the deployment, e.g., --socket /tmp/frama-c.sock).
- Use MCP tools to iterate: reload_project to load C sources, run_eva to detect potential issues, get_eva_alarms to inspect alarms, and investigate_alarm for root-cause analysis.
- If needed, inject ACSL annotations, reload the project, then run_wp to prove correctness and finally query get_wp_goals to confirm proof statuses.
- This enables an iterative loop: EVA uncovers problems, ACSL annotations address them, and WP validates the annotations.
How to install
Prerequisites:
- Frama-C >= 31.0 (Gallium)
- Rust toolchain (edition 2021) and cargo
- OCaml >= 4.14 with opam (for Frama-C tooling if building from source)
- Build the MCP server (example for a Rust-based binary):
# Fetch Rust dependencies and build the release binary
cargo build --release
- Run Frama-C server (on host): Frama-C must be started first to listen on a socket, e.g.:
frama-c your_program.c -server-socket /tmp/frama-c.sock
- Run the MCP server (example using the compiled binary):
./target/release/frama-c-mcp-server --socket /tmp/frama-c.sock
- (If using Docker) Start the MCP server container with the host socket mounted into the container (adjust volumes and mounts for your environment), and make sure both sides agree on the socket path, e.g. /tmp/frama-c.sock or a mapped equivalent.
Prerequisites note: Ensure the Frama-C server is reachable at the specified socket path before starting the MCP server. If you are using Docker, you may need to adapt the socket sharing and path mapping to your host environment.
Additional notes
Tips and caveats:
- Frama-C uses a custom binary protocol for its downstream communication; commands include GET/SET/EXEC/POLL and a framing scheme with S or L prefixes. Coordinate SET and EXEC with POLL as described in Frama-C Ivette-style workflows.
- AST reload requires a proper sequence: setFiles([]) → setFiles(files) → compute. Re-loading with identical setFiles may be a no-op, so follow the two-step reload approach.
- WP markers must be introduced via PVDecl markers (e.g., #v<vid>) and require that printDeclaration is called to register labels before starting proofs.
- The iterative workflow typically cycles through EVA findings, ACSL annotation injection, reloading, and WP verification. The example workflow in the README demonstrates a complete loop validated by test suites.
- If you switch environments (e.g., from local to Docker), ensure that the socket paths used by Frama-C and the MCP server are accessible to each other, and align any environment variables accordingly.