xcatcher-manifest
Agent-first Remote MCP for X (Twitter) batch crawling with x402 USDC top-up (Base/Solana). Includes OpenAPI + copy-paste ADK/curl E2E examples.
claude mcp add --transport stdio lvpiggyqq-xcatcher-mcp-manifest \
  -- docker run -i --env XCATCHER_API_KEY="<your xcatcher.top API key>" \
  ghcr.io/example/xcatcher-manifest:latest
How to use
Xcatcher is a REST-based service for X (Twitter) batch crawling: you top up points on the Xcatcher platform via x402 on Solana, obtain an API key, create crawl tasks, poll task status, and download XLSX results. This MCP manifest exposes the same capabilities as standard HTTP calls that you can script with curl, jq, and base64, as described in the repo. A typical workflow is: request a quote for points, submit a payment proof, acquire an API key, then programmatically create and track crawl tasks. The endpoints live under the base URL https://xcatcher.top, with the API base at https://xcatcher.top/api/v1 and an optional health check at https://xcatcher.top/mcp/health. Store the API key securely and pass it as a Bearer token for authenticated actions such as checking balance, creating tasks, and downloading results.
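If you script the same REST calls from Python rather than curl, the authenticated-request pattern looks like the sketch below. The `/me` path and Bearer scheme come from the examples in this manifest; everything else (helper name, payload handling) is illustrative, and the request is built but not sent here.

```python
import json
import os
import urllib.request

API_BASE = "https://xcatcher.top/api/v1"  # API base from the manifest

def authed_request(path, api_key, payload=None):
    """Build a Request carrying the Bearer token; the caller decides when to send it."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(API_BASE + path, data=data)
    req.add_header("Authorization", "Bearer " + api_key)
    if data is not None:
        req.add_header("Content-Type", "application/json")
    return req

# Example: the balance check described above (not sent in this sketch).
req = authed_request("/me", os.environ.get("XCATCHER_API_KEY", "<your-api-key>"))
print(req.full_url)
print(req.get_header("Authorization"))
```

Sending the request is then `urllib.request.urlopen(req)` once a real API key is set.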
How to install
Prerequisites:
- curl
- jq
- base64 (often provided by coreutils)
- Optional: docker if you choose the containerized deployment
Installation steps:
1. Install prerequisites
   - macOS: brew install curl jq coreutils
   - Debian/Ubuntu: sudo apt-get update && sudo apt-get install -y curl jq coreutils
   - Windows: use Windows Subsystem for Linux, or PowerShell equivalents for curl and jq
2. Choose a deployment method
   - Docker (recommended if you want isolation):
     a) Ensure Docker is installed and running
     b) Run the MCP manifest container (replace the image if needed): docker run -it --rm ghcr.io/example/xcatcher-manifest:latest
   - Non-Docker (if supported by the manifest):
     a) Ensure your environment can run the provided server binary/module
     b) Follow the server's start command documented in the manifest
3. Configure environment variables
   - Set XCATCHER_API_KEY to the API key obtained from the xcatcher.top workflow (Step 4 in the manifest): export XCATCHER_API_KEY="<your-api-key>"
4. Validate health (optional)
   - curl -sS https://xcatcher.top/mcp/health
   - A healthy deployment returns an OK or similar status object
5. Test a basic authenticated call
   - Example: check your balance after obtaining an API key (replace with the actual endpoint if yours differs):
     curl -sS https://xcatcher.top/api/v1/me -H "Authorization: Bearer $XCATCHER_API_KEY" | jq .
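After creating a task you poll its status until it finishes. The polling logic can be sketched independently of the HTTP layer; note that the `state` field and the `completed`/`failed` values below are assumptions, not confirmed by the manifest, and the fetcher here is a stub standing in for a GET against the tasks endpoint.

```python
import time

def wait_for_task(fetch_status, task_id, timeout=300.0, interval=2.0):
    """Poll until the task reports a terminal state.

    fetch_status is any callable returning a status dict, e.g. a wrapper
    around GET /api/v1/tasks/{id} with the Bearer token attached.
    Field names ('state', 'completed', 'failed') are assumed for illustration.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status(task_id)
        if status.get("state") in ("completed", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError("task %s did not finish within %ss" % (task_id, timeout))

# Stubbed fetcher: the task reaches a terminal state on the third poll.
calls = iter([{"state": "queued"}, {"state": "running"}, {"state": "completed"}])
result = wait_for_task(lambda _id: next(calls), "t-123", interval=0.01)
print(result)
```

Injecting the fetcher keeps the backoff/timeout logic testable without touching the network.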
Additional notes
Notes and tips:
- XCATCHER_API_KEY is required for authenticated endpoints like /me, /tasks, and /tasks/{id}/download. Obtain it via the provided steps in the manifest (Step 4).
- Quotes expire quickly; plan to proceed with payment and API key acquisition promptly after generating a quote (Step 1).
- When creating crawl tasks, you must supply an idempotency_key and a users array. Reusing the same idempotency_key will avoid duplicate tasks.
- If you encounter 401 or 402 errors, re-run steps 1–4 to refresh your quote, payment proof, and API key.
- Result downloads are protected; pass the Bearer token when fetching task results.
- For health and troubleshooting, refer to the in-manifest guidance and the health endpoint.
- The provided mcp_config uses a container-based deployment example; replace with your actual deployment method and image as needed.
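Because retries after a 401/402 should not create duplicate tasks, it helps to derive the idempotency_key deterministically from the task inputs rather than generating a fresh random key on each attempt. A sketch, where `idempotency_key` and `users` come from the notes above and the key-derivation scheme is an assumption of this example:

```python
import hashlib
import json

def task_payload(users):
    """Build a create-task body with a deterministic idempotency_key.

    Deriving the key from the sorted inputs means re-sending the same
    request (e.g. after refreshing the API key) reuses the same key,
    so the server can deduplicate. The hashing scheme is illustrative.
    """
    key_material = json.dumps({"users": sorted(users)}, sort_keys=True)
    return {
        "idempotency_key": hashlib.sha256(key_material.encode()).hexdigest(),
        "users": users,
    }

p1 = task_payload(["jack", "elonmusk"])
p2 = task_payload(["elonmusk", "jack"])  # same users, different order
print(p1["idempotency_key"] == p2["idempotency_key"])
```

A random UUID also works as an idempotency_key, but then the caller must persist it across retries; a derived key carries that state implicitly.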
Related MCP Servers
code-mode
🔌 Plug-and-play library to enable agents to call MCP and UTCP tools via code execution.
crawl4ai
🕷️ A lightweight Model Context Protocol (MCP) server that exposes Crawl4AI web scraping and crawling capabilities as tools for AI agents. Similar to Firecrawl's API but self-hosted and free. Perfect for integrating web scraping into your AI workflows with OpenAI Agents SDK, Cursor, Claude Code, and other MCP-compatible tools.
crawlbase
Crawlbase MCP Server connects AI agents and LLMs with real-time web data. It powers Claude, Cursor, and Windsurf integrations with battle-tested web scraping, JavaScript rendering, and anti-bot protection enabling structured, live data inside your AI workflows.
uxc
Universal API calling CLI for URL-first discovery and invocation across OpenAPI, gRPC, GraphQL, MCP, and JSON-RPC.
mcp-json-yaml-toml
A structured data reader and writer like 'jq' and 'yq' for AI Agents
pumpclaw
Free token launcher for AI agents on Base. Uniswap V4. 80% creator fees. LP locked forever. Deploy via Farcaster (@clawd), CLI, or smart contract.