cortex-ast
A Model Context Protocol (MCP) server and Omni-AST engine that lets AI agents parse complex codebases, run secure cross-project operations, and fetch token-optimized rules on demand.
claude mcp add --transport stdio cortex-works-cortex-ast /path/to/cortexast
How to use
CortexAST is a production-grade MCP server that provides AI agents with semantic code navigation, AST-aware time travel, and self-evolving WASM parsers. It offers tools to explore codebases, analyze symbols, manage AST languages, and run diagnostics—all through standardized MCP actions. Typical usage involves running the CortexAST server as an MCP backend and configuring your MCP client to send actions like map_overview, find_usages, save_checkpoint, compare_checkpoint, and language management commands. This server shines when paired with CortexSync-backed workflows for memory retrieval, rules, and cross-project context, enabling robust, AST-aware agent behavior across large codebases.
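For example, an MCP client invokes these actions with standard JSON-RPC `tools/call` messages. The sketch below shows the general shape for a find_usages call; the argument name ("symbol") is an assumption for illustration — query the server's `tools/list` response for the actual input schema of each tool.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "find_usages",
    "arguments": { "symbol": "parse_module" }
  }
}
```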
How to install
Prerequisites:
- Rust 1.80+ (stable toolchain)
- Cargo (comes with Rust)
- Optional: Ollama or LM Studio for Auto-Healer features
Installation steps:
- Clone the repository:
  git clone https://github.com/DevsHero/CortexAST
- Build the project in release mode:
  cd CortexAST
  cargo build --release
- Run the MCP server (stdio) from the built binary:
  ./target/release/cortexast
Configuration:
- Use the MCP config example and place it at ~/.cursor/mcp.json or your preferred MCP config location, e.g.:
{
  "mcpServers": {
    "cortexast": {
      "command": "/path/to/cortexast",
      "args": []
    }
  }
}
Notes:
- Ensure /path/to/cortexast points to the compiled binary and is executable.
- When running in a container or production environment, you may adapt the command/args to match your deployment method.
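As a quick smoke test of the stdio transport, you can speak JSON-RPC to the binary directly. The sketch below is a minimal illustration, not part of CortexAST itself: the binary path is a placeholder, and the protocolVersion value is an assumption — check your MCP client's negotiated revision.

```python
import json
import os
import subprocess

def build_request(req_id, method, params):
    """Frame a JSON-RPC 2.0 request as one newline-terminated line,
    which is how MCP stdio servers read messages."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    return json.dumps(msg) + "\n"

init = build_request(1, "initialize", {
    "protocolVersion": "2024-11-05",  # assumed MCP protocol revision
    "capabilities": {},
    "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
})

BINARY = "/path/to/cortexast"  # adjust to your compiled binary
if os.path.exists(BINARY):
    proc = subprocess.Popen([BINARY], stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE, text=True)
    proc.stdin.write(init)
    proc.stdin.flush()
    print(proc.stdout.readline())  # server's initialize response
```

If the server answers the initialize request with a well-formed JSON-RPC response, the stdio wiring in your MCP config should work the same way.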
Additional notes
Tips and common considerations:
- CortexAST supports self-evolving WASM language parsers loaded at runtime; ensure your environment allows wasm module downloads if you enable dynamic language support.
- For cross-project workflows, pair CortexAST with CortexSync to leverage memory retriever, rule fetching, and network discovery commands.
- If you encounter issues with planning or parsing, consult the Chronos AST snapshot tooling to compare pre/post-change states.
- The MCP config uses stdio; if you run cortexast as a background service, you may need to adapt the command to launch in the foreground for MCP integration or expose a suitable IPC interface.
- Prerequisites mention Rust 1.80+; keep your toolchain up to date to ensure compatibility with WASM hot-reload and language support features.
Related MCP Servers
awesome-claude-skills
A curated list of awesome Claude Skills, resources, and tools for customizing Claude AI workflows
cursor-talk-to-figma
TalkToFigma: MCP integration between AI agents (Cursor, Claude Code) and Figma, letting agents read designs and modify them programmatically.
ios-simulator-skill
An iOS Simulator Skill for Claude Code. Use it to optimise Claude's ability to build, run, and interact with your apps without using up any of the available token/context budget.
shodh-memory
Cognitive memory for AI agents — learns from use, forgets what's irrelevant, strengthens what matters. Single binary, fully offline.
flowlens
FlowLens is an open-source MCP server that gives your coding agent (Claude Code, Cursor, Copilot, Codex) full browser context for in-depth debugging and regression testing.
rtfmbro
rtfmbro provides always-up-to-date, version-specific package documentation as context for coding agents. An alternative to context7