
deep-research

A Model Context Protocol (MCP) compliant server designed for comprehensive web research. It uses Tavily's Search and Crawl APIs to gather detailed information on a given topic, then structures this data in a format perfect for LLMs to create high-quality markdown documents.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio pinkpixel-dev-deep-research-mcp npx -y @pinkpixel/deep-research-mcp \
  --env CRAWL_LIMIT="maximum URLs to crawl per source (default: 10)" \
  --env CRAWL_TIMEOUT="timeout in seconds for crawl requests (default: 180)" \
  --env SEARCH_TIMEOUT="timeout in seconds for search requests (default: 60)" \
  --env TAVILY_API_KEY="your Tavily API key (required)" \
  --env CRAWL_MAX_DEPTH="maximum crawl depth (default: 1)" \
  --env FILE_WRITE_ENABLED="true to enable file writing capability (default: false)" \
  --env MAX_SEARCH_RESULTS="maximum search results to retrieve (default: 7)" \
  --env ALLOWED_WRITE_PATHS="comma-separated allowed directories (default: user home directory)" \
  --env DOCUMENTATION_PROMPT="Optional custom documentation prompt" \
  --env FILE_WRITE_LINE_LIMIT="maximum lines per file write operation (default: 200)"
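Only TAVILY_API_KEY is actually required; every other variable falls back to the default noted above. Those fallbacks can be sketched with POSIX parameter expansion (illustrative only, not the server's own startup code):

```shell
# Illustrative: mirrors the documented default for each --env flag.
# Unset the variables first so the fallbacks are what gets printed.
unset CRAWL_LIMIT CRAWL_TIMEOUT SEARCH_TIMEOUT MAX_SEARCH_RESULTS CRAWL_MAX_DEPTH

CRAWL_LIMIT="${CRAWL_LIMIT:-10}"               # max URLs crawled per source
CRAWL_TIMEOUT="${CRAWL_TIMEOUT:-180}"          # crawl timeout, seconds
SEARCH_TIMEOUT="${SEARCH_TIMEOUT:-60}"         # search timeout, seconds
MAX_SEARCH_RESULTS="${MAX_SEARCH_RESULTS:-7}"  # search results retrieved
CRAWL_MAX_DEPTH="${CRAWL_MAX_DEPTH:-1}"        # crawl depth

echo "$CRAWL_LIMIT $CRAWL_TIMEOUT $SEARCH_TIMEOUT $MAX_SEARCH_RESULTS $CRAWL_MAX_DEPTH"
```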

How to use

Deep Research MCP Server is a Node.js-based MCP service that orchestrates Tavily-powered web search and crawling to assemble structured research output optimized for large language models. It performs multi-step information gathering, combines findings from multiple sources, and packages the results, along with a detailed documentation prompt, into a JSON payload suitable for downstream tools and LLMs. You interact with the server by launching it via npx (or a global install) and providing your Tavily API key. The output is machine-friendly by design, enabling automated documentation generation and long-form markdown production. Configuration hooks let you override the default documentation prompt and control the timing, depth, and scope of search and crawl operations to fit your use case.
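For MCP clients configured through a JSON file (for example, Claude Desktop's claude_desktop_config.json), a typical entry looks like the sketch below; the server name ("deep-research") and the exact file location depend on your client, and the key value is a placeholder:

```json
{
  "mcpServers": {
    "deep-research": {
      "command": "npx",
      "args": ["-y", "@pinkpixel/deep-research-mcp"],
      "env": {
        "TAVILY_API_KEY": "tvly-YOUR-KEY"
      }
    }
  }
}
```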

How to install

Prerequisites:

  • Node.js 18.x or later
  • npm (comes with Node.js) or Yarn

Installation steps:

  1. Quick run via npx (no global install), recommended for quick use:

     npx @pinkpixel/deep-research-mcp

  2. Global installation (optional):

     npm install -g @pinkpixel/deep-research-mcp

     Then run with:

     deep-research-mcp

  3. Local project integration or development:

     git clone https://github.com/your-username/deep-research-mcp.git
     cd deep-research-mcp
     npm install

Configure environment variables as needed (see mcp_config for details).
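When running from a local clone, the environment variables can simply be exported in the shell before starting the server. The values here are illustrative placeholders:

```shell
# Illustrative environment for a development run.
export TAVILY_API_KEY="tvly-YOUR-KEY"   # required for search and crawl
export MAX_SEARCH_RESULTS=5             # smaller scope for quick testing

# Optional: override how the final documentation is generated.
export DOCUMENTATION_PROMPT="Summarize findings as a technical brief."

# Start the server once the variables are in place:
# npx @pinkpixel/deep-research-mcp
```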

Additional notes

Tips and notes:

  • You must provide a Tavily API key via TAVILY_API_KEY. This is required for search and crawl operations.
  • The DOCUMENTATION_PROMPT env var can customize how the final markdown/documentation is generated; if omitted, a comprehensive default prompt is used.
  • Use the file-writing controls (FILE_WRITE_ENABLED with ALLOWED_WRITE_PATHS) to write research outputs to a specific folder if you need persistent storage beyond the default behavior.
  • Timeout and performance settings (SEARCH_TIMEOUT, CRAWL_TIMEOUT, MAX_SEARCH_RESULTS, CRAWL_MAX_DEPTH, CRAWL_LIMIT) help tailor the tool for slower networks or larger research scopes.
  • If you encounter permission issues with file writing (FILE_WRITE_ENABLED), ensure ALLOWED_WRITE_PATHS includes the directories you intend to write to and that the process has write permissions.
  • Ensure your Tavily API key is kept secure and not committed to version control.
  • For debugging, you can temporarily set verbose logging in your own environment to trace the calls made by the tool.
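The permission point above can be illustrated with a sketch of an allow-list check: a write target must sit under one of the comma-separated directories in ALLOWED_WRITE_PATHS. This is a hypothetical illustration of the concept, not the server's actual implementation:

```shell
# Hypothetical allow-list check (not the server's real code):
# the target path must live under one of the allowed directories.
ALLOWED_WRITE_PATHS="$HOME/research,$HOME/notes"
target="$HOME/research/report.md"

allowed=no
OLD_IFS=$IFS
IFS=','
for dir in $ALLOWED_WRITE_PATHS; do
  case "$target" in
    "$dir"/*) allowed=yes ;;   # target is inside this allowed directory
  esac
done
IFS=$OLD_IFS

echo "$allowed"
```

A write to, say, /etc/report.md would leave `allowed` at `no`, which is the kind of failure to check for when FILE_WRITE_ENABLED writes are rejected.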
