# Google Search Console Automation via Rube MCP
Automate Google Search Console operations through Composio's Google Search Console toolkit via Rube MCP.
Toolkit docs: composio.dev/toolkits/google_search_console
## Prerequisites

- Rube MCP must be connected (`RUBE_SEARCH_TOOLS` available)
- Active Google Search Console connection via `RUBE_MANAGE_CONNECTIONS` with toolkit `google_search_console`
- Always call `RUBE_SEARCH_TOOLS` first to get current tool schemas
## Setup
Get Rube MCP: Add https://rube.app/mcp as an MCP server in your client configuration. No API keys needed — just add the endpoint and it works.
- Verify Rube MCP is available by confirming `RUBE_SEARCH_TOOLS` responds
- Call `RUBE_MANAGE_CONNECTIONS` with toolkit `google_search_console`
- If the connection is not ACTIVE, follow the returned auth link to complete setup
- Confirm the connection status shows ACTIVE before running any workflows
## Tool Discovery

Always discover available tools before executing workflows:

```
RUBE_SEARCH_TOOLS: queries=[{"use_case": "search performance, URL inspection, sitemaps, and indexing status", "known_fields": ""}]
```
This returns:
- Available tool slugs for Google Search Console
- Recommended execution plan steps
- Known pitfalls and edge cases
- Input schemas for each tool
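Once discovered, the returned tool list can be filtered for the slug you need before execution. A minimal sketch — the response shape and slugs below are illustrative, not the real `RUBE_SEARCH_TOOLS` schema:

```python
# Illustrative discovery response; real fields come from RUBE_SEARCH_TOOLS.
discovery = {
    "tools": [
        {"slug": "GSC_SEARCH_ANALYTICS", "description": "Query performance data"},
        {"slug": "GSC_URL_INSPECTION", "description": "Inspect a URL's index status"},
    ]
}

def find_slug(discovery, keyword):
    """Pick the first discovered tool whose description mentions `keyword`."""
    for tool in discovery["tools"]:
        if keyword.lower() in tool["description"].lower():
            return tool["slug"]
    return None

slug = find_slug(discovery, "performance")  # → "GSC_SEARCH_ANALYTICS"
```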
## Core Workflows
### 1. Discover Available Google Search Console Tools

```yaml
RUBE_SEARCH_TOOLS:
  queries:
    - use_case: "list all available Google Search Console tools and capabilities"
```
Review the returned tools, their descriptions, and input schemas before proceeding.
### 2. Execute Google Search Console Operations
After discovering tools, execute them via:
```yaml
RUBE_MULTI_EXECUTE_TOOL:
  tools:
    - tool_slug: "<discovered_tool_slug>"
      arguments: { <schema-compliant arguments> }
  memory: {}
  sync_response_to_workbench: false
```
### 3. Multi-Step Workflows
For complex workflows involving multiple Google Search Console operations:
- Search for all relevant tools: `RUBE_SEARCH_TOOLS` with a specific use case
- Execute prerequisite steps first (e.g., fetch before update)
- Pass data between steps using tool responses
- Use `RUBE_REMOTE_WORKBENCH` for bulk operations or data processing
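A fetch-before-update chain might look like the following sketch. Everything here is hypothetical: `execute` stands in for `RUBE_MULTI_EXECUTE_TOOL`, and the tool slugs, argument names, and canned responses are illustrative, not real schemas.

```python
def execute(tool_slug, arguments, _fake_responses={
    # Canned responses standing in for real tool output.
    "GSC_LIST_SITEMAPS": {"sitemaps": [{"path": "/sitemap.xml"}]},
    "GSC_SUBMIT_SITEMAP": {"successful": True},
}):
    return _fake_responses[tool_slug]

# Step 1: fetch existing sitemaps first (the prerequisite step).
existing = execute("GSC_LIST_SITEMAPS", {"site_url": "https://example.com"})
paths = {s["path"] for s in existing["sitemaps"]}

# Step 2: only submit if not already present, passing data from step 1.
if "/sitemap-news.xml" not in paths:
    outcome = execute("GSC_SUBMIT_SITEMAP",
                      {"site_url": "https://example.com",
                       "feedpath": "/sitemap-news.xml"})
```

The point of the pattern is the data flow: step 2's decision is derived from step 1's response rather than assumed.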
## Common Patterns

### Search Before Action

Always search for existing resources before creating new ones to avoid duplicates.

### Pagination

Many list operations support pagination. Check responses for `next_cursor` or `page_token` and continue fetching until the results are exhausted.
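The loop can be sketched as follows. This is a generic pagination helper, not a Rube API: `fetch_page` stands in for any paginated list tool (stubbed here), and the cursor field name varies by tool.

```python
def fetch_all(fetch_page):
    """Collect every item from a paginated list operation.

    `fetch_page(cursor)` stands in for a paginated list call and must
    return a dict with "items" and, while more pages remain, a
    "next_cursor" (some tools use "page_token" instead).
    """
    items, cursor = [], None
    while True:
        page = fetch_page(cursor)
        items.extend(page.get("items", []))
        # Stop once the response no longer includes a continuation token.
        cursor = page.get("next_cursor") or page.get("page_token")
        if not cursor:
            break
    return items

# Stubbed three-page response for illustration.
_pages = {
    None: {"items": [1, 2], "next_cursor": "a"},
    "a": {"items": [3], "next_cursor": "b"},
    "b": {"items": [4]},
}

merged = fetch_all(_pages.__getitem__)  # → [1, 2, 3, 4]
```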
### Error Handling

- Check tool responses for errors before proceeding
- If a tool fails, verify the connection is still ACTIVE
- Re-authenticate via `RUBE_MANAGE_CONNECTIONS` if the connection has expired
### Batch Operations

For bulk operations, use `RUBE_REMOTE_WORKBENCH` and call `run_composio_tool()` in a loop, or with a `ThreadPoolExecutor` for parallel execution.
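A sketch of that fan-out pattern — hedged: `run_composio_tool` is stubbed here, since the real helper only exists inside the workbench environment, and the tool slug and argument names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def run_composio_tool(tool_slug, arguments):
    """Stub standing in for the workbench helper of the same name."""
    return {"tool": tool_slug, "arguments": arguments, "successful": True}

def inspect_urls(urls, max_workers=8):
    """Fan a URL-inspection tool call out over a thread pool."""
    def inspect(url):
        # Slug and argument names are illustrative, not a real schema.
        return run_composio_tool("GSC_URL_INSPECTION", {"url": url})
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # map() preserves input order even though calls run in parallel.
        return list(pool.map(inspect, urls))

results = inspect_urls(["https://example.com/a", "https://example.com/b"])
```

Keep `max_workers` modest: Google Search Console endpoints are rate-limited, so more threads is not always faster.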
## Known Pitfalls

- Always search tools first: tool schemas and available operations may change. Never hardcode tool slugs without first discovering them via `RUBE_SEARCH_TOOLS`.
- Check connection status: ensure the Google Search Console connection is ACTIVE before executing any tools. Expired OAuth tokens require re-authentication.
- Respect rate limits: if you receive rate-limit errors, reduce request frequency and implement backoff.
- Validate schemas: always pass strictly schema-compliant arguments. Use `RUBE_GET_TOOL_SCHEMAS` to load full input schemas when `schemaRef` is returned instead of `input_schema`.
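The backoff advice above can be sketched as a generic retry wrapper. This is not part of Rube; `RateLimitError` and the wrapped call are placeholders for whatever your tool invocation actually raises and does.

```python
import time

class RateLimitError(Exception):
    """Placeholder for whatever rate-limit error your tool call raises."""

def with_backoff(call, retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry `call` with exponential backoff on rate-limit errors."""
    for attempt in range(retries):
        try:
            return call()
        except RateLimitError:
            if attempt == retries - 1:
                raise
            # Wait 1s, 2s, 4s, ... between attempts.
            sleep(base_delay * (2 ** attempt))

# Example: a call that is rate-limited twice, then succeeds.
attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise RateLimitError
    return "ok"

result = with_backoff(flaky, sleep=lambda s: None)  # → "ok" on the third try
```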
## Quick Reference

| Operation | Approach |
|---|---|
| Find tools | `RUBE_SEARCH_TOOLS` with a Google Search Console-specific use case |
| Connect | `RUBE_MANAGE_CONNECTIONS` with toolkit `google_search_console` |
| Execute | `RUBE_MULTI_EXECUTE_TOOL` with discovered tool slugs |
| Bulk ops | `RUBE_REMOTE_WORKBENCH` with `run_composio_tool()` |
| Full schema | `RUBE_GET_TOOL_SCHEMAS` for tools with `schemaRef` |
## Source

Skill source: https://github.com/ComposioHQ/awesome-claude-skills/blob/master/composio-skills/google_search_console-automation/SKILL.md

## Overview
This skill automates Google Search Console operations using Composio’s Rube MCP toolkit. It handles search performance data, URL inspection, sitemaps, and indexing status, with an emphasis on always querying current tool schemas first. This keeps GSC workflows consistent, scalable, and auditable.
## How This Skill Works

Begin by verifying Rube MCP connectivity and fetching tool schemas via `RUBE_SEARCH_TOOLS`. Discover available GSC tools with a specific `use_case`, then execute the chosen tool through `RUBE_MULTI_EXECUTE_TOOL`, passing schema-compliant arguments. For bulk tasks, leverage `RUBE_REMOTE_WORKBENCH` and thread pools to run multiple operations in parallel.
## When to Use It

- Automate retrieval of search performance data to monitor trends over time.
- Inspect individual URLs to verify indexing and crawl status.
- Manage and submit sitemaps, then verify indexing status.
- Incorporate GSC tasks into multi-step automated workflows, such as fetching before updating.
- Perform bulk operations on many URLs or sitemaps using remote workbench tools.
## Quick Start

- Step 1: Add the Rube MCP endpoint https://rube.app/mcp to your client and verify `RUBE_SEARCH_TOOLS` is available.
- Step 2: Call `RUBE_SEARCH_TOOLS` to discover Google Search Console tools and their input schemas.
- Step 3: Run a tool with `RUBE_MULTI_EXECUTE_TOOL` using a discovered slug and schema-compliant arguments; for bulk tasks, use `RUBE_REMOTE_WORKBENCH`.
## Best Practices

- Always run `RUBE_SEARCH_TOOLS` first to load current schemas.
- Check that the GSC connection is ACTIVE before executing any tools.
- Handle pagination by checking `next_cursor` or `page_token` and looping until exhausted.
- Validate schemas strictly and prefer `schemaRef`-driven inputs when provided.
- Use `RUBE_REMOTE_WORKBENCH` with a `ThreadPoolExecutor` for bulk operations.
## Example Use Cases

- Discover tools and run a search performance query to compare last week against this week.
- Inspect a batch of URLs to confirm indexing status and crawl errors.
- Submit a sitemap and automatically monitor when it is indexed.
- Fetch indexing status for a list of URLs and export the results to a report.
- Orchestrate a multi-step workflow that fetches performance data, inspects URLs, and updates a status board.