
DeepWideResearch

Agentic RAG for any scenario. Customize sources, depth, and width

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio puppyone-ai-deepwideresearch \
  --env EXA_API_KEY="your_exa_key_or_empty" \
  --env TAVILY_API_KEY="your_tavily_key_or_empty" \
  --env OPENROUTER_API_KEY="your_openrouter_key" \
  -- python main.py

How to use

Open Deep Wide Research is an MCP-native agent designed for Retrieval-Augmented Generation (RAG) with a focus on agentic reasoning. It consists of a Python backend that you can run locally or on your own infrastructure, and a frontend chat interface for interactive exploration. The system lets you tune the depth of reasoning (Deep) and the breadth of sources (Wide) to balance quality, latency, and cost. Supplying API keys for OpenRouter and at least one of Exa or Tavily enables access to LLM providers and specialized search capabilities. The Model Context Protocol (MCP) enables seamless integration with your data sources and tools, so you can plug in internal knowledge bases, APIs, and custom search engines while keeping costs visible through the Deep × Wide settings.
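The Deep × Wide idea can be sketched as a simple loop: "deep" controls how many rounds of agentic refinement run, and "wide" controls how many sources each round pulls in. This is an illustrative assumption about the design, not the backend's actual orchestration code:

```python
# Illustrative sketch of the Deep x Wide trade-off; not the project's
# actual implementation. search() and llm() stand in for real providers.

def research(query, search, llm, deep=2, wide=3):
    notes = []
    for _ in range(deep):                    # Deep: rounds of reasoning
        notes.extend(search(query, k=wide))  # Wide: sources per round
        query = llm(f"Refine the query given: {notes}")
    return llm(f"Answer using: {notes}")

# Stub providers so the sketch runs standalone.
stub_search = lambda q, k: [f"snippet({q!r})[{i}]" for i in range(k)]
stub_llm = lambda prompt: "refined query" if prompt.startswith("Refine") else "final answer"

print(research("What is MCP?", stub_search, stub_llm))  # final answer
```

Raising `deep` adds refinement rounds (more latency and tokens); raising `wide` adds sources per round (broader coverage, more retrieval cost).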

To use the server, start the backend (and the frontend if you want the UI). The backend exposes an API that the frontend consumes, and both components can be deployed together or separately (API-only or full stack). When running locally, you can interact with the chat interface to issue queries, configure sources, and observe predicted costs per response based on your Deep × Wide configuration.

How to install

Prerequisites:

  • Python 3.9+ and Node.js 18+ installed on your machine
  • API keys: OpenRouter (required), and at least one of Exa or Tavily
  • Recommended model: open-o4mini

Installation steps:

  1. Clone the repository:

     git clone <repository-url>
     cd <repository-directory>

  2. Backend setup:

    • Copy the env template: cp deep_wide_research/env.example deep_wide_research/.env
    • Edit the copied .env and set your keys:

      deep_wide_research/.env

      OPENROUTER_API_KEY=your_key

      # At least one of the following:
      EXA_API_KEY=your_exa_key
      TAVILY_API_KEY=your_tavily_key
    • Create and activate a virtual environment, then install dependencies:

      python -m venv deep-wide-research
      source deep-wide-research/bin/activate
      pip install -r requirements.txt
    • Start the backend server: python main.py
  3. Frontend setup (optional, for UI):

    • Copy the frontend env template: cp chat_interface/env.example chat_interface/.env.local
    • Install and run the frontend:

      cd chat_interface
      npm install
      npm run dev
    • Open the app at http://localhost:3000
  4. Docker (production) option:

    • docker-compose up -d
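Before starting the backend, a quick pre-flight check can confirm the interpreter meets the documented minimum (Python 3.9+). This sketch is not part of the repository:

```python
# Pre-flight check before `python main.py`: fail early if the interpreter
# is older than the documented minimum (Python 3.9+).
import sys

if sys.version_info < (3, 9):
    raise SystemExit(f"Python 3.9+ required, found {sys.version.split()[0]}")
print("Python version OK:", sys.version.split()[0])
```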

Additional notes

Environment variables and configuration:

  • OPENROUTER_API_KEY: required; grants access to LLM providers via OpenRouter.
  • EXA_API_KEY or TAVILY_API_KEY: at least one is required to enable search/QA capabilities from Exa or Tavily. You can switch between providers as needed.
  • The MCP config exposes a single server named 'Open Deep Wide Research'. You can rename or add additional MCP servers if you host multiple MCP-driven agents.
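The key requirements above can be sanity-checked before launch. This is a sketch of the documented rules, not the project's own validation code; pass it a dict such as `dict(os.environ)`:

```python
# Sketch of the documented key requirements: OPENROUTER_API_KEY is
# mandatory, and at least one of EXA_API_KEY / TAVILY_API_KEY must be set.

def check_keys(env: dict) -> list:
    """Return a list of configuration problems (empty means OK)."""
    problems = []
    if not env.get("OPENROUTER_API_KEY"):
        problems.append("OPENROUTER_API_KEY is required")
    if not (env.get("EXA_API_KEY") or env.get("TAVILY_API_KEY")):
        problems.append("set at least one of EXA_API_KEY or TAVILY_API_KEY")
    return problems

print(check_keys({"OPENROUTER_API_KEY": "sk-..."}))
```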

Common issues and tips:

  • If the backend won’t start, ensure Python 3.9+ is used and that dependencies are installed from requirements.txt.
  • If the frontend cannot connect, verify that the backend is running and that the API URL is correctly configured in the frontend env.
  • When using Docker, ensure docker-compose is installed and that you have network access to pull images from the required registries.
  • The Deep × Wide settings directly influence run-time cost; adjust the Deep and Wide values to balance latency, data source coverage, and overall cost per response.
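A back-of-envelope model shows how Deep and Wide multiply into cost. The formula and prices below are illustrative assumptions, not the project's actual pricing:

```python
# Illustrative cost model: assume each reasoning round (deep) processes
# each source (wide). Token counts and prices are made-up placeholders.

def estimate_cost(deep: int, wide: int,
                  tokens_per_source: int = 2_000,
                  usd_per_1k_tokens: float = 0.002) -> float:
    return deep * wide * tokens_per_source / 1_000 * usd_per_1k_tokens

for deep, wide in [(1, 2), (2, 3), (4, 8)]:
    print(f"deep={deep} wide={wide} -> ~${estimate_cost(deep, wide):.3f}")
```

Because the two settings multiply, doubling both roughly quadruples token usage; widen first for coverage, deepen only when answers need more refinement.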
