
code_intelligence_mcp_server

This server indexes your codebase locally to provide fast, semantic, structure-aware code navigation to tools like Claude Code, OpenCode, Trae, and Cursor.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio iceinvein-code_intelligence_mcp_server npx -y @iceinvein/code-intelligence-mcp

How to use

This MCP server indexes a codebase locally to enable fast, semantic code search and navigation for LLM-assisted tooling. It builds a local knowledge graph of symbols, types, and call relationships, then serves search results to clients like Claude Code, OpenCode, Trae, and Cursor. Search combines keyword matching (via Tantivy) with vector search (via LanceDB and embeddings), and a local LLM generates human-friendly symbol descriptions.

The server supports multi-repo indexing, on-device model usage, and per-repo leadership handling to optimize indexing and avoid redundant work. Once it is running, connect your MCP clients to perform semantic searches and retrieve definitions, references, call graphs, type graphs, and impact analyses across your codebase.

Available capabilities include: search_code for semantic symbol lookup, get_definition and find_references for symbol navigation, get_call_hierarchy and get_type_graph for call/type graphs, find_affected_code and trace_data_flow for impact analysis, and additional tooling for repository-aware queries. The server can index multiple repositories, watch for file changes on macOS, and run entirely locally with embedding and LLM models resident on the host.
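Under the hood, an MCP client invokes these capabilities as JSON-RPC 2.0 tools/call requests over the server's transport. A minimal sketch of such a message for the search_code tool is below; the message framing follows the MCP specification, and the "query" argument name is an assumption for illustration — your client builds these requests for you.

```shell
# Build a JSON-RPC 2.0 tools/call request for the search_code tool.
# The tool name comes from the capability list above; the argument shape
# ("query") is illustrative, not taken from this package's docs.
request='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_code","arguments":{"query":"parse config file"}}}'
printf '%s\n' "$request"
```

In normal use you never write these messages by hand; connecting a client such as Claude Code handles the protocol end to end.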

How to install

Prerequisites:

  • Node.js with npm (npx ships with npm 5.2+).
  • Internet access to fetch the MCP package on first run.

Installation steps:

  1. Ensure Node.js and npm are installed. You can verify with node -v and npm -v.

  2. Install or run the MCP server via npx (no local Rust toolchain required): npx -y @iceinvein/code-intelligence-mcp

    The first run will download the MCP package and initialize with defaults. You can customize the configuration in your MCP config file as needed.

  3. (Optional) If you prefer to pin a specific version, replace the package spec with a version tag, e.g. @iceinvein/code-intelligence-mcp@1.2.3.

  4. For standalone server mode or advanced usage, refer to the package's documentation for environment variables and flags that control host, port, and indexing behavior.
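The preflight in step 1 can be scripted. The sketch below assumes a POSIX shell and reports whether each required tool is on PATH rather than failing mid-script:

```shell
# Report whether each required tool is available on PATH (POSIX sh).
have() { command -v "$1" >/dev/null 2>&1; }
for tool in node npm; do
  if have "$tool"; then echo "ok: $tool"; else echo "missing: $tool"; fi
done
```

If either tool is reported missing, install Node.js first; npx is bundled with npm.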

Notes:

  • The server stores data under a user directory (default), including embeddings (~531MB) and LLM model data (~1.1GB). Ensure sufficient disk space.
  • When using Claude/OpenCode clients, update their MCP settings to point at the appropriate command (as shown in the Installation section above).
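The model data above (~531MB of embeddings plus ~1.1GB of LLM weights) adds up to roughly 1.6GB under the default data directory, so it is worth checking free space on the home volume before the first run:

```shell
# Show free space on the volume holding the user's home directory,
# where the server stores its embeddings and model data by default.
df -h "$HOME"
```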

Additional notes

Tips and common considerations:

  • The MCP server supports multi-repo indexing; in standalone mode, the embedding model is shared across repos to reduce duplication.
  • On macOS, file watching uses the notify crate with FSEvents for instant re-indexing; ensure appropriate permissions for directory monitoring.
  • If you encounter network or download issues, verify your environment has access to npm registry and, if behind a proxy, configure npm accordingly.
  • The server can run in local standalone mode or per-client stdio transport mode; use the mode that best fits your workflow and resource constraints.
  • Environment variables and CLI flags can override defaults; consult the project docs for a complete list of tunables (host, port, embeddings backend, device, etc.).
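As the last tip notes, defaults can be overridden via the environment. The variable names below (CODE_INTEL_HOST, CODE_INTEL_PORT) are hypothetical placeholders, not documented settings — look up the real tunables in the project docs before relying on any of them:

```shell
# Hypothetical illustration only: these variable names are placeholders,
# not documented by the package. Substitute the real names from the docs.
export CODE_INTEL_HOST=127.0.0.1
export CODE_INTEL_PORT=8731
# npx -y @iceinvein/code-intelligence-mcp   # would pick up the overrides
```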
