
ScrapeGraph MCP Server

Installation
Run this command in your terminal to add the MCP server to Claude Code (replace YOUR-SGAI-API-KEY with your actual ScrapeGraph API key):

claude mcp add --transport stdio --env SCRAPEGRAPH_API_KEY="YOUR-SGAI-API-KEY" scrapegraphai-scrapegraph-mcp -- npx -y @smithery/cli@latest run @ScrapeGraphAI/scrapegraph-mcp --config "{\"scrapegraphApiKey\":\"YOUR-SGAI-API-KEY\"}"

How to use

ScrapeGraph MCP Server provides a collection of enterprise-grade web scraping tools that are accessible to language models via the MCP interface. The server exposes eight core tools designed for structured data extraction, web content transformation, and multi-page crawling, covering both simple and advanced use cases:

  • markdownify (convert pages to markdown)
  • smartscraper (AI-assisted data extraction with support for infinite scrolling)
  • searchscraper (AI-powered web search results)
  • scrape (basic JS-rendered page fetch)
  • sitemap (URL discovery from sitemaps)
  • smartcrawler_initiate (begin multi-page crawls)
  • smartcrawler_fetch_results (poll for crawl results)
  • agentic_scrapper (agent-based, multi-step scraping workflows)

You can invoke these tools from any MCP-compatible client (Claude Desktop, Cursor, or others) by specifying the server, the tool, and its required parameters. The server handles API-key-based authentication and returns output in markdown, JSON, or a custom schema, depending on the tool and prompt.
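Under the hood, an MCP client invokes one of these tools with a JSON-RPC tools/call request. A minimal sketch of such a payload for smartscraper follows; the user_prompt and website_url argument names, and the example URL, are illustrative assumptions, and the exact wrapper your client sends will differ:

```python
import json

# Sketch of an MCP JSON-RPC payload for invoking the smartscraper tool.
# The "method" and "params" structure follows the MCP tools/call convention;
# the argument names and values below are illustrative, not authoritative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "smartscraper",
        "arguments": {
            "user_prompt": "Extract the article title and author",  # assumed parameter name
            "website_url": "https://example.com/post",              # assumed parameter name
        },
    },
}
print(json.dumps(request, indent=2))
```

In practice your MCP client library builds this envelope for you; you only supply the tool name and arguments.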

How to install

Prerequisites:

  • Node.js installed on the host (for npx usage)
  • Access to a ScrapeGraph API key

Install via Smithery (recommended):

  1. Ensure Node.js is installed (it ships with npx). You can optionally install the Smithery CLI globally with npm install -g @smithery/cli, though npx -y will fetch it on demand.
  2. Run the automated install for the ScrapeGraph MCP: npx -y @smithery/cli install @ScrapeGraphAI/scrapegraph-mcp --client claude

Configure the MCP (example using Claude Desktop):

  1. Obtain your ScrapeGraph API key from the ScrapeGraph Dashboard.
  2. Create or update your Claude Desktop MCP config to include:
{
  "mcpServers": {
    "@ScrapeGraphAI-scrapegraph-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@smithery/cli@latest",
        "run",
        "@ScrapeGraphAI/scrapegraph-mcp",
        "--config",
        "\"{\\\"scrapegraphApiKey\\\":\\\"YOUR-SGAI-API-KEY\\\"}\""
      ]
    }
  }
}
  3. Replace YOUR-SGAI-API-KEY with your actual API key, save the configuration, and restart Claude Desktop to apply the changes.

Alternatively, for remote usage you can connect a hosted MCP or use the provided remote configuration example in the README.

Additional notes

Tips and considerations:

  • Use environment variables to store your API key securely, e.g., SCRAPEGRAPH_API_KEY, and reference them in your local config where supported.
  • The eight tools provide different cost/credit profiles; monitor credits for long-running crawls or high-volume extractions.
  • For multi-page crawls (smartcrawler_initiate), you’ll receive a request_id to poll with smartcrawler_fetch_results to retrieve structured results when complete.
  • If you run into connectivity or authentication issues, verify that the API key is correctly configured and that the MCP server is reachable from your client.
  • When using remote hosting, consider adding the Authorization header or API key in the config as shown in the README remote examples.
  • Output formats may vary by tool; specify markdown, JSON, or a custom schema in prompts as needed.
