
anycrawl

AnyCrawl MCP Server, providing Scrape, Crawl, and SERP tools.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio any4ai-anycrawl-mcp-server \
  --env ANYCRAWL_API_KEY="YOUR-API-KEY" -- npx -y anycrawl-mcp

How to use

AnyCrawl MCP Server provides web scraping, crawling, and search capabilities through the MCP (Model Context Protocol) interface. It supports multiple transport modes, including STDIO, MCP (streamable HTTP), and SSE, making it suitable for local tooling, Cursor integration, and web-based clients. The server can extract content from single URLs, crawl entire sites with configurable depth and page limits, and fetch search-engine results (SERP). Output formats are flexible, including Markdown, HTML, plain text, screenshots, and structured JSON, enabling easy consumption by LLMs and downstream tools. To use it, provide your AnyCrawl API key and choose the desired transport mode (MCP, SSE, or STDIO) via environment variables or npm scripts. The server is cloud-ready and can also be self-hosted behind an Nginx proxy for production deployments.
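For STDIO use, an MCP client such as Claude Desktop or Cursor is typically pointed at the server with an `mcpServers` config entry along these lines. This is a hedged sketch: the server name `anycrawl` is arbitrary, and `YOUR-API-KEY` is a placeholder.

```json
{
  "mcpServers": {
    "anycrawl": {
      "command": "npx",
      "args": ["-y", "anycrawl-mcp"],
      "env": {
        "ANYCRAWL_API_KEY": "YOUR-API-KEY"
      }
    }
  }
}
```

The client spawns the server as a subprocess and speaks MCP over stdin/stdout, so no port needs to be exposed for this mode.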

How to install

Prerequisites:

  • Node.js (LTS) installed on your system
  • npm or pnpm
  • An AnyCrawl API key (required for cloud/self-hosted API access)

Installation steps:

  1. Install globally via npm:
npm install -g anycrawl-mcp-server
  2. Set your API key (cloud or self-hosted) and run the MCP server:
ANYCRAWL_API_KEY=YOUR-API-KEY anycrawl-mcp
  3. Alternatively, run via npx without a global install:
ANYCRAWL_API_KEY=YOUR-API-KEY npx -y anycrawl-mcp
  4. Optional: run in MCP (streamable HTTP) or SSE mode via environment variables or npm scripts, as described in the server docs.
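The launch steps above can be wrapped so a missing key fails fast. The sketch below is a hypothetical wrapper script (not part of anycrawl-mcp) that guards the invocation; the real launch line is left commented out so the guard stands alone.

```shell
#!/bin/sh
# Hedged sketch (hypothetical wrapper, not part of anycrawl-mcp): refuse to
# start the server when ANYCRAWL_API_KEY is missing, so a bad deployment
# fails fast instead of erroring on the first API call.
start_anycrawl() {
  if [ -z "${ANYCRAWL_API_KEY:-}" ]; then
    echo "error: ANYCRAWL_API_KEY is not set" >&2
    return 1
  fi
  echo "starting anycrawl-mcp"
  # exec npx -y anycrawl-mcp   # real launch would replace the echo above
}

ANYCRAWL_API_KEY="YOUR-API-KEY" start_anycrawl
```

Failing early on a missing key is cheaper to diagnose than an authentication error surfacing later inside an MCP tool call.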

Notes:

  • Make sure your API key is kept secure and not committed to source control.
  • If behind a proxy or firewall, ensure outbound access to AnyCrawl services is allowed.
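One common way to keep the key out of source control, as the first note advises, is an untracked `.env` file sourced before launch. A minimal sketch (the `.env` filename is a convention, and the file should be listed in `.gitignore`):

```shell
#!/bin/sh
# Sketch: store the key in an untracked .env file rather than in a script
# that gets committed. Add ".env" to .gitignore.
cat > .env <<'EOF'
ANYCRAWL_API_KEY=YOUR-API-KEY
EOF

# Load the variables into the current shell before launching the server.
set -a        # export every variable defined while sourcing
. ./.env
set +a

echo "key loaded: ${ANYCRAWL_API_KEY}"
```

After sourcing, the exported variable is visible to any child process, so `anycrawl-mcp` or `npx -y anycrawl-mcp` can be launched directly.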

Additional notes

Tips and considerations:

  • Environment variables: ANYCRAWL_API_KEY is required for authentication; ANYCRAWL_BASE_URL can be set for self-hosted scenarios. For local development, you can also set ANYCRAWL_HOST and ANYCRAWL_PORT to customize listening address/port.
  • Transport modes: STDIO is best for local tooling, MCP is suitable for JSON-based clients, and SSE is ideal for web apps and browser integrations.
  • If deploying with Docker Compose, ensure MCP and SSE ports (default 3000 and 3001) are exposed and properly proxied behind Nginx with API key-prefixed paths.
  • Rate limits and credits: when using the cloud service, monitor your API credits; the AnyCrawl free tier includes 1,500 credits for crawling pages.
  • Client integrations (Cursor, Claude, and others): the server exposes endpoints at the /mcp or /sse paths; ensure clients are configured for the correct transport and include the API key in headers or query parameters as required by your client.
  • Troubleshooting: Check health endpoint when SSE is enabled to verify server responsiveness; review environment variable configurations if connectivity fails.
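The Nginx deployment mentioned above can be sketched roughly as follows. This is a hedged example, not the project's shipped config: ports 3000 and 3001 are the defaults cited earlier, while the hostname, TLS setup, and exact path scheme (including any API-key-prefixed paths) are assumptions to adapt to your deployment.

```nginx
# Hedged sketch: reverse-proxy the MCP (3000) and SSE (3001) ports.
server {
    listen 443 ssl;
    server_name mcp.example.com;   # placeholder hostname

    location /mcp {
        proxy_pass http://127.0.0.1:3000;
    }

    location /sse {
        proxy_pass http://127.0.0.1:3001;
        proxy_http_version 1.1;
        proxy_set_header Connection '';
        proxy_buffering off;   # disable buffering so SSE events stream through
    }
}
```

Disabling `proxy_buffering` on the SSE location matters: with buffering on, Nginx holds event data until its buffer fills, which stalls the stream.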
