
grepai

Semantic Search & Call Graphs for AI Agents (100% Local)

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio yoanbernabeu-grepai \
  --env INDEX_DIR="path to store the local index (optional)" \
  --env WATCH_PATHS="comma-separated paths to index (optional)" \
  --env EMBEDDING_PROVIDER="your embedding provider (e.g., ollama, lmstudio, openai); omit to use the default" \
  -- docker run -i yoanbernabeu/grepai:latest

How to use

grepai is a privacy-first semantic code search tool that can be exposed as an MCP tool, letting your AI agents run semantic queries directly against your codebase. It uses vector embeddings to capture code meaning, so natural-language queries such as "authentication logic" or "trace who calls handleUserSession" return relevant code snippets or references.

The MCP integration lets an agent invoke grepai as a tool within a broader workflow, so it can search, index, and trace code usage from inside an automated agent session. You run the grepai server in a container, and your MCP-enabled agent sends commands to initialize, index, and search your repository content. The tool supports indexing (with watching for changes), semantic search, and call-graph tracing, all of which can run locally if you prefer privacy.
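Under the hood, an MCP client talks to the server with JSON-RPC `tools/call` requests over stdio. As an illustration only (the actual tool name and argument schema are defined by grepai; `search`, `query`, and `limit` here are assumptions), a semantic search request from an agent might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "authentication logic", "limit": 5 }
  }
}
```

The MCP client your agent runs in constructs and sends these requests for you; you normally never write them by hand.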

How to install

Prerequisites:

  • Docker installed on the host where the MCP server will run
  • Optional: an embedding provider (e.g., Ollama, LM Studio, or OpenAI) configured and accessible

Installation steps:

  1. Ensure Docker is running on your machine
  2. Pull and run the grepai MCP service (as a container):
# Run the grepai MCP server container (interactive)
docker run -it --rm \
  -v /path/to/your/code:/workspace \
  -e EMBEDDING_PROVIDER=ollama \
  yoanbernabeu/grepai:latest
  3. If your environment requires a specific embedding provider, install and configure it per its documentation (for example, Ollama) and ensure the provider is reachable from the container.
  4. In your MCP configuration, reference the server as shown in the mcp_config example and start the MCP workflow that includes the grepai tool.
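For Claude Code, the mcp_config entry can also live in a project-level `.mcp.json` file. The sketch below is equivalent to the `claude mcp add` command above; the server name and env value are illustrative, not defaults shipped by grepai:

```json
{
  "mcpServers": {
    "yoanbernabeu-grepai": {
      "command": "docker",
      "args": ["run", "-i", "yoanbernabeu/grepai:latest"],
      "env": {
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}
```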

Notes:

  • If you need to customize paths to index, use environment variables or bind-mounts to expose your codebase to the container.
  • Ensure network access from the MCP host to the container if they are running on separate networks.
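Putting the two notes above together, a sketch of a customized invocation, assembling the env vars and bind-mount into one docker command before running it. All paths and values here are illustrative placeholders, not grepai defaults:

```shell
# Assemble a grepai docker invocation with a bind-mounted codebase and the
# env vars described above. CODE_DIR and INDEX_DIR are illustrative.
CODE_DIR=/path/to/your/code        # codebase exposed to the container
INDEX_DIR=/workspace/.grepai-index # optional: where the local index lives

docker_cmd="docker run -i --rm \
  -v ${CODE_DIR}:/workspace \
  -e INDEX_DIR=${INDEX_DIR} \
  -e WATCH_PATHS=/workspace \
  -e EMBEDDING_PROVIDER=ollama \
  yoanbernabeu/grepai:latest"

# Print the command for review before executing it.
echo "$docker_cmd"
```

Reviewing the assembled command with `echo` before running it is a cheap way to catch a wrong path or a missing env var.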

Additional notes

Tips and common considerations:

  • When using a local embedding provider, re-index after significant changes so the index stays in sync with your codebase and search results remain relevant.
  • If you encounter performance issues, adjust the indexing cadence (watch vs. manual indexing) and consider narrowing the indexed scope to relevant packages.
  • Ensure the MCP client and grepai tool agree on data formats for search queries and results.
  • For privacy-sensitive codebases, prefer local embeddings and local indexing to avoid sending code to external services.
  • The EMBEDDING_PROVIDER environment variable is optional depending on how you configure grepai; set it to ollama, lmstudio, or openai as needed.
