cie
Code Intelligence Engine — indexes your codebase and gives AI assistants deep understanding via MCP (semantic search, call graphs, 20+ tools)
claude mcp add --transport stdio kraklabs-cie cie --mcp
How to use
CIE runs as an MCP server that exposes a suite of 20+ code-intelligence tools, all operating locally on your machine. When you enable MCP mode, you can query and orchestrate tools like semantic search, call graph tracing, endpoint discovery, and code understanding directly from your MCP client (e.g., Claude Code, Cursor, or any other compatible client). The server indexes your local codebase and provides fast, private analysis; semantic search becomes available once you configure an embedding provider. Tools fall into six categories: navigation and search, call graph analysis, code understanding, HTTP/API discovery, security and verification, and system utilities. This enables workflows such as finding a function by meaning, tracing how a function is called across files, listing all API endpoints, and retrieving function source code, all through a consistent MCP interface.
To use the MCP server, start it in MCP mode and point your MCP client at the cie server with the provided command and arguments. Once running, you can invoke tools like cie_semantic_search for meaning-based code search, cie_trace_path to map call paths, cie_list_endpoints to enumerate HTTP endpoints, cie_get_function_code to fetch a function’s source, and cie_index_status to monitor indexing health. Embedding-backed semantic features require an embedding provider (such as Ollama); the structural tools work offline without one, keeping analysis of your codebase private and local.
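For clients configured through a JSON file rather than the CLI, the `claude mcp add` snippet at the top of this page corresponds to a stanza like the following. The field names follow Claude Code's `.mcp.json` convention; other MCP clients use similar command/args fields, so treat this as a sketch and adapt it to your client's config format:

```json
{
  "mcpServers": {
    "kraklabs-cie": {
      "command": "cie",
      "args": ["--mcp"]
    }
  }
}
```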
How to install
Prerequisites:
- A modern operating system (macOS, Linux, or Windows with WSL).
- curl or a similar network tool.
- Optional: Homebrew for macOS/Linux package management.
Installation steps (choose one):
- Homebrew (macOS/Linux):
brew tap kraklabs/cie
brew install cie
- Install Script (recommended):
curl -sSL https://raw.githubusercontent.com/kraklabs/cie/main/install.sh | sh
- Manual / GitHub Releases:
Download the latest binary from the GitHub Releases page and place it in your PATH.
# Example (adjust for OS/Arch):
curl -L https://github.com/kraklabs/cie/releases/latest/download/cie-linux-amd64 -o cie
chmod +x cie
sudo mv cie /usr/local/bin/
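Whichever method you choose, it is worth confirming the binary is reachable before wiring it into a client. A minimal POSIX-sh check (this only tests for presence on PATH; it assumes nothing about cie's own flags beyond the `--mcp` shown above):

```shell
# Verify that the cie binary is reachable on PATH after installation
if command -v cie >/dev/null 2>&1; then
  echo "cie found at: $(command -v cie)"
else
  echo "cie not found on PATH - check your install location" >&2
fi
```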
After installation, you can run the MCP server using the command and arguments shown in the snippet at the top of this page (cie --mcp, registered via claude mcp add).
Additional notes
- CIE supports both embedding-backed semantic search and pure structural tools; embeddings are required for cie_semantic_search, while other tools (grep, call graph, function finder, etc.) work without embeddings.
- MCP mode is designed for integration with Claude Code and other MCP clients; ensure your client is configured to launch the cie MCP server with the provided command and arguments.
- If you run into indexing slowness, verify your local disk performance (RocksDB backend) and consider indexing smaller submodules first. Embedding provider availability (e.g., Ollama) can affect semantic capabilities; ensure the provider is running and accessible at the configured base_url.
- Use cie_index_status to monitor index health and stats, and cie_reset --yes if you need to purge and re-index.
- For updates and troubleshooting, refer to the official repository documentation and Tools Reference in docs/tools-reference.md.
Related MCP Servers
axon
Graph-powered code intelligence engine — indexes codebases into a knowledge graph, exposed via MCP tools for AI agents and a CLI for developers.
mpm-vibe-coding
MPM is an MCP context-engineering layer for Vibe Coding, focused on three delivery bottlenecks: context loss, uncontrolled changes, and non-verifiable outcomes.
google-ai-mode
MCP server for free Google AI Mode search with citations. Query optimization, CAPTCHA handling, multi-agent support. Works with Claude Code, Cursor, Cline, Windsurf.
crawl4ai
🕷️ A lightweight Model Context Protocol (MCP) server that exposes Crawl4AI web scraping and crawling capabilities as tools for AI agents. Similar to Firecrawl's API but self-hosted and free. Perfect for integrating web scraping into your AI workflows with OpenAI Agents SDK, Cursor, Claude Code, and other MCP-compatible tools.
omega-memory
Persistent memory for AI coding agents
web-developer
A Model Context Protocol (MCP) server that provides web development tools for AI assistants. Enables browser automation, DOM inspection, network monitoring, and console analysis through Playwright.