ummon
The semantic layer for software engineering: Connect code to meaning, build on understanding
claude mcp add --transport stdio nayshins-ummon docker run -i ummon
How to use
Ummon is a code analysis tool that builds knowledge graphs from codebases to enable advanced understanding and querying of software systems. It supports multiple languages (Rust, Python, JavaScript, Java) and provides:
- A knowledge graph of code entities and their relationships.
- A querying system that accepts natural language (for example: "show me authentication-related functions") and structured queries (select functions where name like 'auth%'), with output formats including text, JSON, CSV, and tree.
- A relevance agent that surfaces the files most pertinent to a given change or task.
- Domain model extraction that links business concepts to implementation details, helping both humans and AI assistants reason about the codebase.
Indexing can update the graph incrementally or perform a full rebuild.
To use Ummon, start by indexing a codebase to build the knowledge graph. You can then query the graph or use the relevance-assist features to identify files related to a proposed change: for example, list all functions, find relationships between entities, or generate AI-assisted recommendations. Depending on your needs, you can enable domain extraction or switch between incremental and full-rebuild indexing. Before querying, ensure that the API key for LLM services is available via the configured environment variable (OPENROUTER_API_KEY is used for LLM access).
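A minimal sketch of that workflow follows. The subcommand names come from the examples in this page; the --format flag and the key placeholder are assumptions, so check `ummon --help` for the authoritative spellings:

```shell
# Make the LLM API key available before running queries or domain extraction
export OPENROUTER_API_KEY="your-key-here"

# Build the knowledge graph for a codebase (incremental by default)
ummon index /path/to/codebase

# Natural-language query
ummon query "show me authentication-related functions"

# Structured query; the --format flag name is an assumption
ummon query "select functions where name like 'auth%'" --format json
```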
How to install
Prerequisites:
- A system with Rust toolchain installed (recommended via rustup) or Docker if you prefer containerized usage.
- Basic familiarity with the command line.
Option 1: Install from source (Rust)
- Install Rust and Cargo: follow https://www.rust-lang.org/tools/install
- Build and install Ummon:
  cargo install ummon
- Run Ummon locally (examples):
  cargo run -- index /path/to/codebase
  cargo run -- query "show all authentication functions"
Option 2: Run via Docker (containerized)
- Ensure Docker is installed and running.
- Pull and run the Ummon image (example):
  docker pull ummon/ummon
  docker run -it --rm -v /path/to/codebase:/code ummon/ummon index /code
Notes:
- The project is in early development and APIs may change. Refer to the official docs for the latest usage and flags.
- Environment variables such as OPENROUTER_API_KEY are required for LLM-related features when enabling queries and domain extraction.
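When running the containerized server, the key must be forwarded into the container. One way, assuming the image name from the Docker example above, is Docker's -e flag:

```shell
# Forward the host's OPENROUTER_API_KEY into the container
export OPENROUTER_API_KEY="your-key-here"
docker run -i --rm -e OPENROUTER_API_KEY ummon/ummon
```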
Additional notes
- Environment variable: OPENROUTER_API_KEY is required for the LLM services used by queries and domain extraction.
- Ummon supports incremental indexing by default; use the --full option to force a complete rebuild of the knowledge graph.
- When indexing large codebases, consider enabling domain extraction with --enable-domain-extraction and setting a domain directory with --domain-dir.
- For best results, ensure you have read access to the codebase and provide a reasonable path filter or limit to avoid excessive processing.
- If you encounter parsing issues for a specific language, check for updates in the corresponding language parser and consult the knowledge-graph and query-system documentation for language-specific notes.
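The indexing options mentioned above can be combined; a sketch, where /path/to/codebase and ./domain are placeholder paths:

```shell
# Force a complete rebuild instead of the default incremental update
ummon index /path/to/codebase --full

# Enable domain model extraction and choose where domain artifacts are written
ummon index /path/to/codebase --enable-domain-extraction --domain-dir ./domain
```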
Related MCP Servers
git
An MCP (Model Context Protocol) server enabling LLMs and AI agents to interact with Git repositories. Provides tools for comprehensive Git operations including clone, commit, branch, diff, log, status, push, pull, merge, rebase, worktree, tag management, and more, via the MCP standard. STDIO & HTTP.
agentql
Model Context Protocol server that integrates AgentQL's data extraction capabilities.
neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and a professional CLI. Build, test, and deploy AI applications with multiple AI providers.
codemesh
The Self-Improving MCP Server - Agents write code to orchestrate multiple MCP servers with intelligent TypeScript execution and auto-augmentation
mode-manager
MCP Memory Agent Server - A VS Code chatmode and instruction manager with library integration
notification
A Model Context Protocol server that allows AI agents to play a notification sound via a tool when a task is completed.