blz
Local-first, line-accurate search for blazing-fast lookups of llms.txt documentation. Human-friendly, Agent-ready.
claude mcp add --transport stdio outfitter-dev-blz blz mcp-server
How to use
BLZ provides an MCP server for fast, local, line-accurate documentation search over llms.txt data. The server exposes the same search and TOC-navigation features as the CLI, so AI agents can query and retrieve exact line ranges (e.g., bun:304-324) with minimal latency. Once the server is running, connect to it from your AI tooling to issue search requests, browse documentation structure, and receive structured, citation-rich results suitable for inclusion in agent prompts or LLM reasoning. A typical workflow: start the server, then use your agent's MCP integration to run searches, fetch JSON results, and feed them into downstream tasks.
To use the BLZ MCP capabilities, start the server via the BLZ CLI and invoke its commands from your agent environment. The server supports exact line citations, TOC navigation with a tree view, language filtering, and offline-ready indexing. Your agent requests data through standard MCP protocol messages and receives structured responses containing matches, citations, and optional metadata. If you already use BLZ from the CLI, you can expose the same capabilities to your agent by targeting the mcp-server subcommand and communicating with it over stdio.
Note: in agent setups, you typically run the server as a background process and communicate with it via the MCP protocol. The server aims to provide fast, deterministic search results with line-precise citations, enabling robust offline usage and efficient retrieval for AI workflows.
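MCP's stdio transport exchanges newline-delimited JSON-RPC 2.0 messages, so a client begins by sending an initialize request to the server's stdin and reading the response from its stdout. The sketch below is illustrative, not an official client: the client name, version, and protocol version string are assumptions, and the handshake shape follows the general MCP specification rather than anything BLZ-specific.

```python
import json
import subprocess

def make_initialize(request_id: int = 1) -> dict:
    """Build an MCP initialize request (JSON-RPC 2.0)."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Protocol version and client identity are illustrative.
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }

def handshake(command: list[str]) -> dict:
    """Spawn a stdio MCP server and exchange one newline-delimited message."""
    proc = subprocess.Popen(
        command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
    )
    proc.stdin.write(json.dumps(make_initialize()) + "\n")
    proc.stdin.flush()
    response = json.loads(proc.stdout.readline())
    proc.terminate()
    return response

# Example (requires blz on PATH):
# print(handshake(["blz", "mcp-server"]))
```

In practice your MCP client library performs this handshake for you; the sketch only shows what travels over the stdio pipe.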
How to install
Prerequisites:
- Rust toolchain (stable) and Cargo (only needed when building from source), or a prebuilt BLZ binary installed via the official install script
- Access to the internet for initial downloads
- Install BLZ (CLI) and ensure the mcp-server binary is available
- Quick install (macOS/Linux):
curl -fsSL https://blz.run/install.sh | sh
- Verify installation and that the blz binary is in PATH:
which blz
blz --version
- Install from source (optional, for development or custom builds)
# Clone the repository
git clone https://github.com/outfitter-dev/blz
cd blz
# Install the CLI from source
cargo install --path crates/blz-cli
- Ensure the MCP server binary is available
- The MCP server component is exposed via the blz CLI as a server subcommand. Confirm you can list commands or start the server:
blz mcp-server --help
- Run and test the MCP server locally
- Start the server (as a background process if desired):
blz mcp-server &
- Or run in foreground for testing and capture logs:
blz mcp-server --log-level debug
- Optional: set environment variables for tuning
- Example (adjust as needed):
export BLZ_MCP_DEBUG=true
export BLZ_INDEX_PATH="$HOME/.local/share/blz/index"
- Integrate with your MCP client
- Use the mcp-server as described in your agent setup, typically pointing your MCP client to the server's stdio interface.
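Stdio-based MCP clients are generally configured with a command plus arguments. Assuming your client uses the common `mcpServers` config shape (as Claude-style clients do; your client's exact file layout may differ), the entry for BLZ might look like:

```python
import json

# Hypothetical client config entry; the surrounding file layout depends
# on your MCP client. The "mcpServers" key follows the shape used by
# Claude-style clients.
config = {
    "mcpServers": {
        "blz": {
            "command": "blz",
            "args": ["mcp-server"],
        }
    }
}

print(json.dumps(config, indent=2))
```

The client spawns the `command` with `args` and speaks the MCP protocol over the child process's stdin/stdout.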
Additional notes
Tips and common considerations:
- BLZ performs local indexing of llms.txt sources; ensure sources are downloaded and indexed before enabling MCP queries for best performance.
- The BLZ MCP server supports language filtering by default; you can disable it with appropriate flags if multi-language content is required.
- If you encounter slow searches after initial indexing, verify cache and source freshness (use blz refresh to reindex sources).
- For reproducible results, rely on the exact line citations (e.g., bun:304-324) returned by the MCP server; they improve traceability in agent outputs.
- When deploying in multi-user or cloud environments, consider configuring per-user scopes and proper sandboxing for the server process to avoid cross-user data access issues.
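Citations such as bun:304-324 split naturally into a source alias and a line range, which is handy when an agent needs to re-fetch or display the cited span. A small helper, assuming the alias:start-end format shown above (the server's exact output format is an assumption here):

```python
def parse_citation(citation: str) -> tuple[str, int, int]:
    """Split a citation like 'bun:304-324' into (alias, start_line, end_line)."""
    alias, _, span = citation.partition(":")
    start, _, end = span.partition("-")
    # A single-line citation like 'bun:42' has no '-'; reuse start as end.
    return alias, int(start), int(end or start)

alias, start, end = parse_citation("bun:304-324")
# alias == "bun", start == 304, end == 324
```

Keeping the parsed range alongside the match text lets downstream steps cite the exact lines they used.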