scrapi
MCP server for using ScrAPI to scrape web pages.
claude mcp add --transport stdio deventerprisesoftware-scrapi-mcp \
  --env SCRAPI_API_KEY="<YOUR_API_KEY>" \
  -- docker run -i --rm -e SCRAPI_API_KEY deventerprisesoftware/scrapi-mcp
How to use
This MCP server exposes ScrAPI as an MCP service for scraping web pages. It provides two tools:
- scrape_url_html: returns the raw HTML of the target URL, useful when you need the full DOM structure or must preserve complex layouts.
- scrape_url_markdown: returns the page content as Markdown, useful when you want clean, readable text extraction.
Both tools accept a required url parameter and an optional browserCommands parameter: a JSON-encoded array of browser commands to run against the page before scraping (e.g., clicking buttons, filling forms, waiting for elements, scrolling). This lets you handle pages protected by bot detection, rendered through dynamic loading, or gated by geolocation restrictions by simulating user interactions before extraction.
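As a sketch of how a client might invoke these tools: the tool names and the url/browserCommands parameters come from the description above, while the JSON-RPC framing is the generic MCP tools/call shape, and the command-object keys are illustrative assumptions rather than the authoritative ScrAPI schema.

```python
import json

def build_scrape_request(tool: str, url: str, browser_commands=None) -> dict:
    """Build an MCP tools/call request for one of the ScrAPI tools.

    tool must be "scrape_url_html" or "scrape_url_markdown".
    browser_commands, if given, is a list of command objects that is
    JSON-encoded into the optional browserCommands string parameter.
    """
    if tool not in ("scrape_url_html", "scrape_url_markdown"):
        raise ValueError(f"unknown tool: {tool}")
    arguments = {"url": url}
    if browser_commands is not None:
        # browserCommands is a JSON-encoded *string*, not a nested object
        arguments["browserCommands"] = json.dumps(browser_commands)
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Example: scrape as Markdown after waiting briefly for dynamic content.
# The {"command": "wait", "value": ...} shape is an assumption.
request = build_scrape_request(
    "scrape_url_markdown",
    "https://example.com",
    browser_commands=[{"command": "wait", "value": 2000}],
)
print(json.dumps(request, indent=2))
```

A real client would send this request over the server's stdio transport; the sketch only shows how the arguments are assembled.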
How to install
Prerequisites:
- Docker installed and running (recommended) or Node.js with npm for NPX usage.
- Optional: ScrAPI API key if you need higher concurrency or quota.
Option A: Run via Docker (recommended for quick setup)
- Ensure Docker is installed and running:
- macOS: Docker Desktop
- Windows: Docker Desktop
- Linux: install docker and start the daemon
- Run the MCP server using the official image:
docker run -it --rm \
-e SCRAPI_API_KEY=<YOUR_API_KEY> \
deventerprisesoftware/scrapi-mcp
- Provide your ScrAPI API key via environment variable SCRAPI_API_KEY to enable authenticated usage.
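For clients configured through a JSON file (e.g., Claude Desktop's claude_desktop_config.json), the Docker invocation above can be registered as follows. The "scrapi" server name is an arbitrary label, and the mcpServers layout is the standard MCP client convention rather than anything ScrAPI-specific:

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "SCRAPI_API_KEY", "deventerprisesoftware/scrapi-mcp"],
      "env": {
        "SCRAPI_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
```

The bare `-e SCRAPI_API_KEY` flag tells Docker to forward the variable from the environment the client sets via the `env` block.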
Option B: Run via NPX (Node.js environment required)
- Ensure Node.js and npm are installed.
- Install and run the MCP server with NPX:
npx -y @deventerprisesoftware/scrapi-mcp
- Set the SCRAPI_API_KEY environment variable if you have a key; without one, the server runs with limited free usage.
Option C: Build locally from source (if you clone the repo)
- Ensure Node.js and npm are installed.
- Install dependencies and start the server as documented in the repo's package.json scripts.
- Configure environment variables (e.g., SCRAPI_API_KEY) as needed.
Additional notes
- If you run behind a firewall or proxy, ensure outbound access to ScrAPI services is allowed.
- The SCRAPI_API_KEY env var is optional for limited usage but required for higher concurrency or quotas.
- For the Docker setup, you can customize the environment variable mapping or add other ScrAPI-related config as needed.
- The browserCommands input should be a JSON array string, with each item representing a command such as click, input, scroll, wait, waitfor, or javascript as documented by ScrAPI.
- If you encounter rate limits, consider upgrading your ScrAPI plan or introducing throttling in your client.
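To illustrate the browserCommands format described above, the sequence below covers the listed command types and serializes them into the JSON array string the tools expect. The "command", "selector", and "value" keys are illustrative assumptions; consult the ScrAPI documentation for the exact schema.

```python
import json

# Illustrative command sequence; field names are assumptions, not the
# authoritative ScrAPI schema.
commands = [
    {"command": "waitfor", "selector": "#cookie-banner"},         # wait for an element to appear
    {"command": "click", "selector": "#accept-cookies"},          # dismiss a banner
    {"command": "input", "selector": "#search", "value": "mcp"},  # fill a form field
    {"command": "scroll", "value": 1000},                         # scroll down the page
    {"command": "wait", "value": 500},                            # pause (milliseconds)
]

# The browserCommands parameter is a JSON-encoded *string*, so serialize
# the list before passing it to scrape_url_html or scrape_url_markdown.
browser_commands = json.dumps(commands)
print(browser_commands)
```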
Related MCP Servers
frontmcp
TypeScript-first framework for the Model Context Protocol (MCP). You write clean, typed code; FrontMCP handles the protocol, transport, DI, session/auth, and execution flow.
shinzo-ts
TypeScript SDK for MCP server observability, built on OpenTelemetry. Gain insight into agent usage patterns, contextualize tool calls, and analyze server performance across platforms. Integrate with any OpenTelemetry ingest service including the Shinzo platform.
n8n-workflow-builder
MCP server that lets an LLM in agent mode build n8n workflows for you
openai-agent-dotnet
Sample to create an AI Agent using OpenAI models with any MCP server running on Azure Container Apps
mcp-bun
Bun JavaScript runtime MCP server for AI agents
CyberSecurity
Model Context Protocol Server For Cyber Security