
cortex-scout

An advanced web extraction and meta-search engine for AI agents. It features native parallel searching, Human-in-the-Loop (HITL) authentication fallback, and LLM-optimized data synthesis for deep web research.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio cortex-works-cortex-scout \
  --env RUST_LOG=warn \
  --env SEARCH_ENGINES=google,bing,duckduckgo,brave \
  --env LANCEDB_URI=/YOUR_PATH/cortex-scout/lancedb \
  --env HTTP_TIMEOUT_SECS=30 \
  --env MAX_CONTENT_CHARS=10000 \
  --env IP_LIST_PATH=/YOUR_PATH/cortex-scout/ip.txt \
  --env PROXY_SOURCE_PATH=/YOUR_PATH/cortex-scout/proxy_source.json \
  -- /YOUR_PATH/cortex-scout/mcp-server/target/release/cortex-scout-mcp

How to use

CortexScout is a Rust-based search and web extraction engine that can operate as an MCP stdio server with an optional HTTP interface. It provides web search, token-efficient fetching, bounded crawling, schema-driven extraction, anti-bot handling with Chromium CDP rendering, and HITL fallbacks when needed.

The MCP interface exposes tools such as web_search, web_fetch, web_crawl, extract_fields, memory_search, and deep_research, enabling multi-hop research, structured extraction, and robust retrieval under challenging bot protections. To integrate, configure CortexScout as an MCP server and issue requests through the MCP protocol; you can also run the optional HTTP server to monitor health and route traffic through a URL endpoint. Tools such as visual_scout (HITL visuals) and non_robot_search (last-resort rendering) provide a safety net for protected targets.
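As a sketch of what a tool call looks like on the wire, the snippet below builds the JSON-RPC `tools/call` message an MCP client would pipe to the stdio binary. The tool name web_search comes from the list above; the argument names `query` and `max_results` are assumptions and may differ from the server's actual input schema.

```shell
# Hypothetical JSON-RPC request for the web_search tool.
# The argument names ("query", "max_results") are assumptions, not confirmed schema.
REQ='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"web_search","arguments":{"query":"rust async runtimes","max_results":5}}}'

# An MCP client writes this (newline-delimited) to the server's stdin, e.g.:
#   echo "$REQ" | /YOUR_PATH/cortex-scout/mcp-server/target/release/cortex-scout-mcp

# Sanity-check that the message is well-formed JSON and names the intended tool.
echo "$REQ" | python3 -c 'import json,sys; print(json.load(sys.stdin)["params"]["name"])'  # → web_search
```

In practice an MCP client such as Claude Code constructs and sends these messages for you; the raw form is shown only to clarify what travels over stdio.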

How to install

Prerequisites:

  • Rust toolchain (rustup, cargo)
  • A Unix-like environment (Linux/macOS) or Windows with WSL

Installation steps:

  1. Install Rust and cargo if not already installed:
# Install Rust (works on all platforms)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
export PATH="$HOME/.cargo/bin:$PATH"
  2. Clone the repository and navigate to the mcp-server directory:
git clone https://github.com/cortex-works/cortex-scout.git
cd cortex-scout/mcp-server
  3. Build the project (release for production):
cargo build --release
  4. Run the MCP server via the configured command (as defined in your MCP config):
# The MCP integration typically invokes the cortex-scout-mcp binary with the environment and arguments specified in the MCP config
  5. Optional: if the HTTP server is enabled, verify it is up:
curl http://localhost:5000/health
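The one-shot curl above can be wrapped in a small polling helper so scripts wait for the server to become ready. This is a sketch assuming the optional HTTP interface is enabled on port 5000 as shown; adjust the host and port to your setup.

```shell
# Poll the health endpoint (a sketch; assumes the optional HTTP server on port 5000).
# Returns 0 as soon as /health responds, 1 after the given number of attempts.
wait_for_health() {
  attempts="$1"
  for _ in $(seq "$attempts"); do
    if curl -fsS --max-time 2 http://localhost:5000/health >/dev/null 2>&1; then
      return 0
    fi
    sleep 1
  done
  return 1
}

# Usage: wait_for_health 10 && echo "cortex-scout HTTP server is up"
```

A bounded retry like this is preferable to a fixed `sleep` before first use, since cold starts (e.g. first LanceDB open) can vary in duration.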

Additional notes

  • Use RUST_LOG=warn to avoid overwhelming MCP clients with logs on stderr.
  • Adjust HTTP_TIMEOUT_SECS and MAX_CONTENT_CHARS to balance latency and data volume for your workloads.
  • If using HITL or deep research features, ensure LANCEDB_URI points to a writable LanceDB instance location.
  • For Windows environments, consider using the alternative command format that does not rely on the env wrapper; see docs/IDE_SETUP.md for details.
  • LLM-backed deep research (e.g. via OpenAI or OpenRouter) can be enabled by passing environment variables such as DEEP_RESEARCH_ENABLED and the related synthesis settings in the same launch configuration.
  • Always validate your IP_LIST_PATH and PROXY_SOURCE_PATH files to ensure proper proxy rotation and request pacing.
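As a small pre-flight sketch for that last point, the helper below checks that a configured file exists and is non-empty before launching the server. The paths shown are the placeholders from the install command; the actual contents and formats of ip.txt and proxy_source.json are not validated here.

```shell
# Pre-flight: verify a config file exists and is non-empty (format not validated).
check_file() {
  [ -s "$1" ] || { echo "missing or empty: $1" >&2; return 1; }
}

# Placeholder paths from the install command above:
#   check_file /YOUR_PATH/cortex-scout/ip.txt
#   check_file /YOUR_PATH/cortex-scout/proxy_source.json
```

Running a check like this before `claude mcp add` (or before restarting the server) surfaces path typos immediately instead of as opaque proxy-rotation failures at request time.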
