code_intelligence_mcp_server
This server indexes your codebase locally to provide fast, semantic, and structure-aware code navigation to tools like Claude Code, OpenCode, Trae, and Cursor.
Quick Start:
claude mcp add --transport stdio iceinvein-code_intelligence_mcp_server npx -y @iceinvein/code-intelligence-mcp
How to use
This MCP server indexes a codebase locally to enable fast, semantic code search and navigation for LLM-assisted tooling. It builds a local knowledge graph of symbols, types, and call relationships, then serves search results for tools like Claude Code, OpenCode, Trae, and Cursor. The server combines keyword search (via Tantivy) with vector search (via LanceDB and embeddings) and provides human-friendly symbol descriptions generated by a local LLM. It supports multi-repo indexing, on-device model usage, and per-repo leadership handling to optimize indexing and avoid redundant work. After starting, you can connect your MCP clients to the server to perform semantic searches, retrieve definitions, references, call graphs, type graphs, and impact analyses across your codebase.
Available capabilities include: search_code for semantic symbol lookup, get_definition and find_references for symbol navigation, get_call_hierarchy and get_type_graph for call/type graphs, find_affected_code and trace_data_flow for impact analysis, and additional tooling for repository-aware queries. The server can index multiple repositories, watch for file changes on macOS, and run entirely locally with embedding and LLM models resident on the host.
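Clients invoke these capabilities through the standard MCP tools/call request over JSON-RPC 2.0. The sketch below is illustrative only: the argument name "query" is an assumption, not a documented schema; use tools/list to discover each tool's actual input schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_code",
    "arguments": { "query": "where are auth tokens refreshed?" }
  }
}
```

MCP clients such as Claude Code and Cursor construct these requests for you; you normally only phrase the question and the client picks the tool.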
How to install
Prerequisites:
- Node.js (for npx usage) or a system capable of running npm packages via npx.
- Internet access to fetch the MCP package on first run.
Installation steps:
- Ensure Node.js and npm are installed. Verify with: node -v and npm -v
- Install or run the MCP server via npx (no local Rust toolchain required): npx -y @iceinvein/code-intelligence-mcp. The first run will download the MCP package and initialize with defaults; you can customize the configuration in your MCP config file as needed.
- (Optional) To pin a specific version, replace the package spec with a version tag, e.g. @iceinvein/code-intelligence-mcp@1.2.3.
- For standalone server mode or advanced usage, refer to the package's documentation for environment variables and flags that control host, port, and indexing behavior.
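Put together, a first run and a version-pinned registration look like the following (1.2.3 is a placeholder version tag, not a known release):

```shell
# Confirm the toolchain, then run the server once to download and initialize it
node -v
npm -v
npx -y @iceinvein/code-intelligence-mcp

# Register a pinned version with Claude Code so repeat runs don't silently upgrade
claude mcp add --transport stdio iceinvein-code_intelligence_mcp_server \
  npx -y @iceinvein/code-intelligence-mcp@1.2.3
```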
Notes:
- The server stores data under a user directory by default, including embeddings (~531MB) and LLM model data (~1.1GB); ensure sufficient disk space.
- When using Claude Code or OpenCode clients, update their MCP settings to point at the appropriate command (as shown in the Quick Start).
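For clients configured through a JSON settings file (e.g. Cursor's mcpServers configuration), the equivalent of the Quick Start command is roughly the fragment below; the "code-intelligence" key is an arbitrary name you choose:

```json
{
  "mcpServers": {
    "code-intelligence": {
      "command": "npx",
      "args": ["-y", "@iceinvein/code-intelligence-mcp"]
    }
  }
}
```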
Additional notes
Tips and common considerations:
- The MCP server supports multi-repo indexing; in standalone mode the embedding model is shared across repos to reduce duplication.
- On macOS, file watching uses the notify crate with FSEvents for instant re-indexing; ensure appropriate permissions for directory monitoring.
- If you encounter network or download issues, verify your environment has access to npm registry and, if behind a proxy, configure npm accordingly.
- The server can run in local standalone mode or per-client stdio transport mode; use the mode that best fits your workflow and resource constraints.
- Environment variables and CLI flags can override defaults; consult the project docs for a complete list of tunables (host, port, embeddings backend, device, etc.).