
srag

Semantic code search and RAG system written in Rust, with tree-sitter chunking, an MCP server for IDE integration, prompt-injection detection, and secret redaction

Installation
Run this command in your terminal to add the MCP server to Claude Code (the `--env` flags belong before the `--` separator so they are passed to Claude Code rather than to npx; both values are optional placeholders):

claude mcp add --transport stdio wrxck-srag \
  --env API_KEY="<LLM provider API key, if using external providers>" \
  --env SRAG_CONFIG="<path to custom config, if needed>" \
  -- npx @modelcontextprotocol/inspector srag mcp

How to use

srag provides an MCP server that exposes semantic code search across all indexed repositories. It runs as an MCP service over stdio and communicates via JSON-RPC on stdin/stdout, letting you query embeddings, perform semantic searches, and inspect project conventions through a structured set of tools. Claude Code integration is built in, so your agent can automatically discover and use srag's MCP tools when available.

The available tools cover listing indexed projects, semantic code search, finding similar code, symbol lookup, file retrieval, project pattern analysis, full-text search, and call graph queries, all aimed at helping an AI coding assistant reference your actual codebase rather than generate from scratch. In practice, you start the server via the MCP launcher (for example, the inspector wrapper via npx) and then call methods such as list_projects, search_code, get_file, or get_project_patterns from your agent or tooling.
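As a sketch of what calling one of these tools looks like from your own tooling, the snippet below builds a standard MCP JSON-RPC `tools/call` request for `search_code`. The tool name comes from the list above; the argument names (`query`) are illustrative assumptions, not taken from srag's actual schema:

```python
import json


def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request for the MCP tools/call method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


# A semantic-search request you would write to the server's stdin.
# The "query" argument name is illustrative, not srag's documented schema.
request = make_tool_call(1, "search_code", {"query": "retry logic for HTTP calls"})
print(request)
```

The same helper works for any of the other tools (`get_file`, `find_callers`, and so on) by swapping the tool name and arguments.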

When using the MCP tools, consider starting with search_code or get_project_patterns to quickly understand what patterns and conventions exist across your indexed repositories. For deeper investigations, find_similar_code can surface near-duplicates to inform refactoring decisions, while find_callers and find_callees help you trace function usage across the codebase. The integration is designed to minimize token usage by returning structured results that can be directly consumed by an LLM, improving both speed and relevance of generated code.
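If you want to drive the server directly rather than through Claude Code, a minimal stdio client spawns the process and sends the standard MCP initialize handshake before any tool calls. This is a sketch under the assumption that `srag` is on your PATH and that the server speaks newline-delimited JSON-RPC, per the MCP stdio transport; only the request builder is exercised without a running server:

```python
import json
import subprocess


def initialize_request(request_id: int = 1) -> str:
    """Standard MCP initialize request, sent before any tools/call."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "srag-demo-client", "version": "0.0.1"},
        },
    })


def spawn_server() -> subprocess.Popen:
    """Launch srag's MCP server as a stdio child process (assumes srag on PATH)."""
    return subprocess.Popen(
        ["srag", "mcp"],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )


if __name__ == "__main__":
    proc = spawn_server()                      # requires a working srag install
    proc.stdin.write(initialize_request() + "\n")
    proc.stdin.flush()
    print(proc.stdout.readline())              # server's initialize response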

How to install

Prerequisites:

  • Rust toolchain installed
  • Python 3.10+ installed
  • Optional: Node.js environment if you plan to use the MCP wrapper via npx

Installation steps:

  1. Install prerequisites (Rust and Python):
    • Linux/macOS: install Rust and Python with your preferred method (the prerequisite notes below mention Homebrew for macOS)
    • Windows: use WSL and follow the Linux instructions
  2. Run the install script from the repository root, as described in the README; it builds the srag binary and sets up the Python ML backend:
    • bash ./install.sh
  3. Ensure your PATH includes the local bin path if needed. For Linux/macOS, you may need:
    • echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
    • source ~/.bashrc
  4. If you plan to use MCP via the Node.js-based inspector wrapper, ensure Node.js is installed and you can run:
    • npx @modelcontextprotocol/inspector srag mcp
  5. Index and interact with your projects using the provided CLI (examples from the README):
    • srag index /path/to/repo
    • srag sync
    • srag watch
    • srag chat
  6. Start the MCP server (as shown in the repository):
    • npx @modelcontextprotocol/inspector srag mcp
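After running the steps above, a quick way to verify the pieces are in place is to check that the expected executables resolve on PATH. This is a generic sanity check, not an srag-specific command:

```python
import shutil


def check_tools(names=("srag", "npx")) -> dict:
    """Map each expected executable name to whether it resolves on PATH."""
    return {name: shutil.which(name) is not None for name in names}


# On a complete install both entries should be True; a False for "srag"
# usually means ~/.local/bin is missing from PATH (see step 3 above).
print(check_tools())
```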

Prerequisite notes:

  • The Install section assumes Rust and Python are installed and that you have the necessary system dependencies to build the Rust binary and Python ML backend.
  • For macOS, you may install Rust and Python via Homebrew if you don’t already have them.
  • If you’re using external LLM providers, place your API key in api_key.txt within the config directory or set the corresponding environment variable as described in the config documentation.

Additional notes

Tips and notes:

  • The MCP server communicates over stdio; your client should speak JSON-RPC on stdin/stdout and expect no interactive terminal output when the server is run directly.
  • Config file is located at ~/.config/srag/config.toml on Linux or ~/Library/Application Support/srag/config.toml on macOS; you can adjust LLM provider settings, context sizes, and ignore patterns there.
  • External API keys for LLM providers can be supplied via api_key.txt in the config directory or through environment variables as described in the config guide.
  • The installer script installs the binary to a typical user-local path (~/.local/bin) and may require PATH adjustments if that directory isn’t in your PATH.
  • The MCP tools you’ll commonly use are list_projects, search_code, find_similar_code, search_symbols, get_file, get_project_patterns, text_search, find_callers, and find_callees. Most workflows involve indexing projects first, then querying through one-shot or interactive commands via the MCP-enabled agent.
  • If you encounter issues with indexing or querying, check that the Python ML backend is running and reachable by the Rust CLI, and verify that the config.toml is correctly placed and readable.
