octocode
Semantic code searcher and codebase utility
claude mcp add --transport stdio muvon-octocode cargo run --bin octocode-mcp
How to use
Octocode includes a built-in MCP server that exposes AI-assisted code understanding through the Model Context Protocol. To enable the MCP capabilities, start the server using the provided binary (for Rust builds this is typically the octocode-mcp binary). Once running, you can interact with the MCP endpoints to trigger AI-assisted code analysis, memory-enabled semantic search, and LSP-based code intelligence within your editor or IDE. The MCP integration is designed to work alongside Octocode’s index/search features, enabling context-aware code completion, smart assistance during reviews, and conversational queries about your codebase.
Typical workflow:
- Start the MCP server to boot the LSP and AI services for your project.
- Configure your editor's MCP client to communicate with the running MCP endpoint and fetch context-aware suggestions, code snippets, and relationship graphs derived from your repository.
- Leverage the memory system and graph-based reasoning to surface connections between files, modules, and functions, improving navigation and understanding of large codebases.
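As an illustration of the workflow above, many MCP clients (Claude Desktop, for example) register stdio servers through a JSON manifest. A minimal sketch is shown below; the exact key names your client expects may differ, and the --path flag mirrors the run examples in the install section:

```json
{
  "mcpServers": {
    "muvon-octocode": {
      "command": "octocode-mcp",
      "args": ["--path", "/path/to/your/project"]
    }
  }
}
```

Consult your client's own MCP documentation for where this file lives and which fields it supports.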
How to install
Prerequisites:
- Rust toolchain (rustc and cargo) installed on your system
- git installed
- Optional: Voyage API key for embeddings and optional OpenRouter API key for enhanced AI features
Installation steps:
- Clone the repository or download the source:
  git clone https://github.com/Muvon/octocode.git
  cd octocode
- Build the MCP server (Rust/Cargo-based):
cargo build --release
or install the MCP binary if provided by the project:
cargo install --path . --bin octocode-mcp
- Ensure dependencies and environment are set:
- Set required API keys if you plan to use embeddings or AI features:
  export VOYAGE_API_KEY="your-voyage-api-key"
  export OPENROUTER_API_KEY="your-openrouter-api-key" (optional)
- Run the MCP server (example):
From the project root, using the built binary:
cargo run --bin octocode-mcp -- --path /path/to/your/project
Or, if installed:
octocode-mcp --path /path/to/your/project
- Verify the server is listening on the configured port and connect your editor/clients using the MCP protocol endpoints defined by the project documentation.
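The environment and installation steps above can be sketched as a single shell snippet. The API key values are placeholders, and the binary lookup is just a convenience check, not part of Octocode itself:

```shell
# Optional API keys for embedding/AI features (placeholder values, not real keys).
export VOYAGE_API_KEY="your-voyage-api-key"
export OPENROUTER_API_KEY="your-openrouter-api-key"   # optional

# Locate the octocode-mcp binary if it has already been installed.
OCTOCODE_BIN="$(command -v octocode-mcp || true)"
if [ -n "$OCTOCODE_BIN" ]; then
  echo "Found octocode-mcp at $OCTOCODE_BIN"
else
  echo "octocode-mcp not on PATH; install it with: cargo install --path . --bin octocode-mcp"
fi
```

Running this before starting the server makes it obvious whether a missing binary or a missing key is the problem.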
Additional notes
Notes and tips:
- The MCP server follows a local-first design: it runs on your machine and does not access external networks for search unless you configure embedding providers such as Voyage or OpenRouter.
- Ensure your VOYAGE_API_KEY is set if you plan to use embedding-based features; tokens are typically metered by the provider.
- If you encounter port conflicts, change the MCP server binding port in the configuration or environment variables as documented in the MCP integration guide.
- The server exposes LSP capabilities for language-aware assistance; configure your editor's LSP client to point to the MCP server endpoint for real-time code intelligence.
- Check logs for graph construction and memory system operations to troubleshoot index or semantic search behavior.
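When checking logs as suggested above, one cheap first step is raising the log verbosity. Many Rust services honour the conventional RUST_LOG environment variable; whether octocode-mcp does is an assumption, not something confirmed by this document:

```shell
# Assumption: octocode-mcp uses a RUST_LOG-aware logger (common for Rust binaries, unconfirmed here).
export RUST_LOG=debug
echo "Log level set to: $RUST_LOG"
# octocode-mcp --path /path/to/your/project   # then watch stderr for graph construction and memory messages
```

If the variable has no effect, fall back to whatever logging configuration the project documentation describes.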
Related MCP Servers
gemini-cli
An open-source AI agent that brings the power of Gemini directly into your terminal.
grepai
Semantic Search & Call Graphs for AI Agents (100% Local)
Mantic.sh
A structural code search engine for AI agents.
ai-trader
Backtrader-powered backtesting framework for algorithmic trading, featuring 20+ strategies, multi-market support, CLI tools, and an integrated MCP server for professional traders.
automagik-genie
🧞 Automagik Genie – bootstrap, update, and roll back AI agent workspaces with a single CLI + MCP toolkit.
context-sync
Local persistent memory store for LLM applications including continue.dev, cursor, claude desktop, github copilot, codex, antigravity, etc.