
dotnet-dump

🤖 An AI-native MCP server for .NET memory dump investigations. Leverage LLMs (Claude/Gemini) to analyze heaps, threads, and exceptions using natural language. Built with ClrMD, token-efficient paging, and Docker support for cross-architecture (AMD64/ARM64) post-mortem analysis.

Installation
Run this command in your terminal to add the MCP server to Claude Code (adjust the mounted dumps directory and dump filename for your setup):

claude mcp add --transport stdio frankbolero-dotnet-dump-mcp -- docker run --rm -i -v /Users/yourname/dumps:/dumps -e DUMP_PATH=/dumps/crash.core dotnet-dump-mcp-server
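
For MCP clients that use a JSON configuration file (such as Claude Desktop or the Gemini CLI), an equivalent entry might look like the sketch below. The server name and paths are placeholders to adjust for your setup, not values taken from this project:

```json
{
  "mcpServers": {
    "dotnet-dump": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-v", "/Users/yourname/dumps:/dumps",
        "-e", "DUMP_PATH=/dumps/crash.core",
        "dotnet-dump-mcp-server"
      ]
    }
  }
}
```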

How to use

This MCP server wraps the Microsoft.Diagnostics.Runtime (ClrMD) library to help an AI agent inspect and diagnose .NET memory dumps. It exposes tools for heap analysis, object inspection, thread and stack analysis, and module visibility, enabling tasks like identifying memory leaks, analyzing object references, and understanding CLR state at the time of the dump. The server is designed to be invoked by MCP clients (e.g., Claude CLI, Gemini CLI, Zed Agent, or Cursor) and supports loading a dump path via an environment variable or via the load_dump tool after startup. Typical usage is to start the server in a container or locally, then issue MCP tool calls such as dump_heap, dump_obj, clr_threads, and clr_stack to interrogate the dump and correlate findings with AI-generated insights.
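
Once connected, the client drives the server through standard MCP tool calls (JSON-RPC 2.0). As an illustration, a request to run the dump_heap tool might look like the fragment below; the argument name shown is an assumption for illustration, not taken from this server's actual tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "dump_heap",
    "arguments": { "statOnly": true }
  }
}
```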

How to install

Prerequisites:

  • Docker installed (for the recommended option)
  • Optional: .NET SDK (if you prefer running locally instead of Docker)
  • Access to a memory dump file (.dmp or .core)

Option A - Docker (recommended):

  1. Build the image (once you have a Dockerfile configured): docker build -t dotnet-dump-mcp-server .

  2. Run the server with a mounted dumps directory (adjust paths as needed). The DUMP_PATH environment variable is optional; if omitted, you can load a dump later using the load_dump tool:

     docker run --rm -i \
       -v "/path/to/your/dumps:/dumps" \
       -e DUMP_PATH=/dumps/your_dump.core \
       dotnet-dump-mcp-server

  3. If you are on a Mac with Linux dumps, add --platform linux/amd64 to the run command to avoid architecture mismatches:

     docker run --rm -i --platform linux/amd64 \
       -v "/path/to/your/dumps:/dumps" \
       -e DUMP_PATH=/dumps/your_dump.core \
       dotnet-dump-mcp-server
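
The two run variants above can be combined into a small wrapper. The script below is a hypothetical sketch (DUMPS_DIR and DUMP_FILE are placeholder variables, not part of this project): it adds --platform linux/amd64 automatically on ARM hosts such as Apple Silicon, and prints the command for review instead of executing it:

```shell
#!/bin/sh
# Placeholder inputs; override via environment variables.
DUMPS_DIR="${DUMPS_DIR:-/path/to/your/dumps}"
DUMP_FILE="${DUMP_FILE:-your_dump.core}"

# On ARM hosts, run the (typically amd64) analysis image under emulation.
PLATFORM_FLAG=""
case "$(uname -m)" in
  arm64|aarch64) PLATFORM_FLAG="--platform linux/amd64" ;;
esac

# Build the command as a string and print it, so it can be reviewed
# (or piped to sh) rather than executed blindly.
CMD="docker run --rm -i $PLATFORM_FLAG -v $DUMPS_DIR:/dumps -e DUMP_PATH=/dumps/$DUMP_FILE dotnet-dump-mcp-server"
echo "$CMD"
```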

Option B - Local (dotnet):

  1. Ensure a matching .NET SDK (8, 9, or 10) is installed and your OS matches the dump's origin.

  2. Run the server project directly (example for net9.0):

     export DUMP_PATH="/path/to/your/dump.core"
     dotnet run --project src/DotNetDump.Server/DotNetDump.Server.csproj --framework net9.0

  3. If running on Windows PowerShell:

     $env:DUMP_PATH = "C:\path\to\your\dump.dmp"
     dotnet run --project src\DotNetDump.Server\DotNetDump.Server.csproj --framework net9.0
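
For the local route, a small pre-flight check can save a failed startup. This is a hypothetical sketch (the check_dump helper and the default path are illustrative, not part of this project): it verifies the dump file exists and that a dotnet SDK is on PATH before launching the server:

```shell
#!/bin/sh
# Hypothetical helper: succeed only if the dump file exists.
check_dump() {
  if [ ! -f "$1" ]; then
    echo "dump not found: $1" >&2
    return 1
  fi
  return 0
}

DUMP_PATH="${DUMP_PATH:-/path/to/your/dump.core}"
if check_dump "$DUMP_PATH" && command -v dotnet >/dev/null 2>&1; then
  export DUMP_PATH
  dotnet run --project src/DotNetDump.Server/DotNetDump.Server.csproj --framework net9.0
else
  echo "set DUMP_PATH to a real dump and/or install a .NET SDK first" >&2
fi
```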

Additional notes


  • The DUMP_PATH environment variable is optional. If you omit it at startup, you can later load a dump using the load_dump tool exposed by the MCP server.
  • When using Docker on Apple Silicon with Linux dumps, consider adding --platform linux/amd64 to ensure compatible architecture.
  • The server exposes a family of tools for analysis: dump_heap, list_objects, ee_heap, dump_obj, gc_root, gchandles, clr_threads, clr_stack, thread_pool, clr_modules, and sync_blk. These can be combined with the MCP client to construct rich prompts for the AI agent.
  • If your dumps are large, some operations may run long; you can increase the MCP client timeout (e.g., to 600000 ms, i.e. 10 minutes) to accommodate lengthy processing.
  • Ensure your container has access to any necessary native dependencies required by ClrMD and the host OS compatibility with the dump.
  • For MCP testing, the MCP Inspector (e.g., via npx @modelcontextprotocol/inspector) launches a browser UI that lets you invoke server tools directly and view their outputs.
