ScrapeGraph MCP Server
claude mcp add --transport stdio scrapegraphai-scrapegraph-mcp npx -y @smithery/cli@latest run @ScrapeGraphAI/scrapegraph-mcp --config "{\"scrapegraphApiKey\":\"YOUR-SGAI-API-KEY\"}" \
  --env SCRAPEGRAPH_API_KEY="YOUR-SGAI-API-KEY"
Replace YOUR-SGAI-API-KEY with your actual API key.
How to use
ScrapeGraph MCP Server provides a collection of enterprise-grade web scraping tools accessible to language models via the MCP interface. The server exposes eight core tools for structured data extraction, web content transformation, and multi-page crawling, covering both simple and advanced use cases:
- markdownify: convert pages to markdown
- smartscraper: AI-assisted data extraction, with support for infinite scrolling
- searchscraper: AI-powered web search results
- scrape: basic fetch of JavaScript-rendered pages
- sitemap: URL discovery from sitemaps
- smartcrawler_initiate: begin multi-page crawls
- smartcrawler_fetch_results: poll for crawl results
- agentic_scrapper: agent-based, multi-step scraping workflows
You can invoke these tools from any MCP-compatible client (Claude Desktop, Cursor, and others) by specifying the server, the tool, and the required parameters. The server handles API-key-based authentication and returns output in markdown, JSON, or a custom schema, depending on the tool and prompt.
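Under the hood, an MCP client invokes a tool by sending a JSON-RPC `tools/call` request over the transport. As an illustrative sketch, a smartscraper call might look like the following (the argument names `user_prompt` and `website_url` are assumptions; check the tool's schema as reported by the server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "smartscraper",
    "arguments": {
      "user_prompt": "Extract the product name and price",
      "website_url": "https://example.com/product"
    }
  }
}
```

In practice your MCP client constructs this request for you; the fragment is shown only to clarify what "specifying the server and tool with the required parameters" means at the protocol level.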
How to install
Prerequisites:
- Node.js installed on the host (for npx usage)
- Access to a ScrapeGraph API key
Install via Smithery (recommended):
- Ensure you have npx and Smithery CLI installed:
- npm install -g @smithery/cli
- Run the automated install for the ScrapeGraph MCP: npx -y @smithery/cli install @ScrapeGraphAI/scrapegraph-mcp --client claude
Configure the MCP (example using Claude Desktop):
- Obtain your ScrapeGraph API key from the ScrapeGraph Dashboard.
- Create or update your Claude Desktop MCP config to include:
{
"mcpServers": {
"@ScrapeGraphAI-scrapegraph-mcp": {
"command": "npx",
"args": [
"-y",
"@smithery/cli@latest",
"run",
"@ScrapeGraphAI/scrapegraph-mcp",
"--config",
"{\"scrapegraphApiKey\":\"YOUR-SGAI-API-KEY\"}"
]
}
}
}
- Replace YOUR-SGAI-API-KEY with your actual API key. Save the configuration and restart Claude Desktop to apply changes.
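Because the `--config` argument is itself a JSON string embedded inside a JSON array, hand-escaping the quotes is error-prone. A minimal sketch of generating the config entry programmatically (the server name and argument order mirror the example above; nothing here is an official tool):

```python
import json

# The --config argument is a JSON string, so produce it with json.dumps
# instead of escaping quotes by hand.
api_key = "YOUR-SGAI-API-KEY"  # replace with your real key
config_arg = json.dumps({"scrapegraphApiKey": api_key})

server_entry = {
    "mcpServers": {
        "@ScrapeGraphAI-scrapegraph-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "@smithery/cli@latest",
                "run",
                "@ScrapeGraphAI/scrapegraph-mcp",
                "--config",
                config_arg,
            ],
        }
    }
}

# Print the block ready to paste into the Claude Desktop config file.
print(json.dumps(server_entry, indent=2))
```

Generating the entry this way guarantees the nested string round-trips: `json.loads(config_arg)` returns the original object, which is exactly what the Smithery CLI parses on startup.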
Alternatively, for remote usage you can connect to a hosted MCP server or use the remote configuration example provided in the README.
Additional notes
Tips and considerations:
- Use environment variables to store your API key securely, e.g., SCRAPEGRAPH_API_KEY, and reference them in your local config where supported.
- The eight tools provide different cost/credit profiles; monitor credits for long-running crawls or high-volume extractions.
- For multi-page crawls (smartcrawler_initiate), you’ll receive a request_id to poll with smartcrawler_fetch_results to retrieve structured results when complete.
- If you run into connectivity or authentication issues, verify that the API key is correctly configured and that the MCP server is reachable from your client.
- When using remote hosting, consider adding the Authorization header or API key in the config as shown in the README remote examples.
- Output formats may vary by tool; specify markdown, JSON, or a custom schema in prompts as needed.
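The initiate/poll pattern for multi-page crawls can be sketched as a generic polling helper. Here `fetch_results` stands in for whatever call your MCP client makes to the smartcrawler_fetch_results tool; its signature and the `status`/`results` fields are assumptions, not the documented response shape:

```python
import time

def poll_crawl(fetch_results, request_id, interval=2.0, max_attempts=30):
    """Poll a crawl job until it completes or the attempt budget runs out."""
    for _ in range(max_attempts):
        result = fetch_results(request_id)
        if result.get("status") == "completed":
            return result
        time.sleep(interval)
    raise TimeoutError(f"crawl {request_id} did not complete in time")

# Demonstration with a stubbed fetcher that completes on the third poll.
calls = {"n": 0}

def fake_fetch(request_id):
    calls["n"] += 1
    if calls["n"] < 3:
        return {"status": "processing"}
    return {"status": "completed", "results": [{"url": "https://example.com"}]}

done = poll_crawl(fake_fetch, "req-123", interval=0.01)
```

A fixed polling interval is the simplest choice; for long-running crawls, exponential backoff reduces wasted requests and credit consumption.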