
sql-queries

npx machina-cli add skill phuryn/pm-skills/sql-queries --openclaw
Files (1)
SKILL.md
3.5 KB

SQL Query Generator

Purpose

Transform natural language requirements into optimized SQL queries across multiple database platforms. This skill helps product managers, analysts, and engineers generate accurate queries without manual syntax work.

How It Works

Step 1: Understand Your Database Schema

  • If you provide a schema file (SQL, documentation, or diagram description), I will read and analyze it
  • Extract table names, column definitions, data types, and relationships
  • Identify primary keys, foreign keys, and indexing strategies

Step 2: Process Your Request

  • Clarify the exact data you need to retrieve or analyze
  • Confirm the SQL dialect (BigQuery, PostgreSQL, MySQL, Snowflake, etc.)
  • Ask for any additional requirements (filters, aggregations, sorting)

Step 3: Generate Optimized Query

  • Write efficient SQL that leverages your database structure
  • Include comments explaining complex logic
  • Add performance considerations for large datasets
  • Provide alternative approaches if applicable

Step 4: Explain and Test

  • Explain the query logic in plain English
  • Suggest how to test or validate results
  • Offer tips for performance optimization
  • Generate a test script or sample data on request
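As a sketch of what such a test script might look like, the snippet below builds a small sample table and runs a validation count. The table and column names are illustrative only, not from any real schema:

```sql
-- Hypothetical sample data for validating a query locally
-- (standard SQL; works in SQLite or PostgreSQL)
CREATE TABLE users (
    id         INTEGER PRIMARY KEY,
    email      TEXT NOT NULL,
    created_at TIMESTAMP NOT NULL
);

INSERT INTO users (id, email, created_at) VALUES
    (1, 'a@example.com', '2026-01-05 10:00:00'),
    (2, 'b@example.com', '2025-11-20 09:30:00');

-- Validation: row count should match the inserted sample
SELECT COUNT(*) AS user_count FROM users;  -- expected: 2
```

Running the generated query against a tiny known dataset like this makes it easy to hand-check the results before pointing it at production data.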

Usage Examples

Example 1: Query from Schema File

Upload your database_schema.sql file and say:
"Generate a query to find users who signed up in the last 30 days
and had at least 5 active sessions"
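A generated query for the request above might look roughly like the sketch below. It assumes hypothetical `users` and `sessions` tables with illustrative column names, including an assumed `status` column:

```sql
-- Users who signed up in the last 30 days with at least 5 active sessions
-- (PostgreSQL syntax; adjust date functions for other dialects)
SELECT u.id,
       u.email,
       COUNT(s.id) AS session_count
FROM users u
JOIN sessions s ON s.user_id = u.id
WHERE u.created_at >= NOW() - INTERVAL '30 days'
  AND s.status = 'active'          -- assumes sessions have a status column
GROUP BY u.id, u.email
HAVING COUNT(s.id) >= 5;
```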

Example 2: Query from Diagram Description

"Here's my database: Users table (id, email, created_at), Sessions table
(id, user_id, timestamp, duration). Generate a query for average session
duration per user in January 2026."
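Given the schema described in this example, the generated query might look roughly like this sketch:

```sql
-- Average session duration per user for January 2026
-- (standard SQL; uses the Users/Sessions schema described above)
SELECT u.id,
       u.email,
       AVG(s.duration) AS avg_duration
FROM users u
JOIN sessions s ON s.user_id = u.id
WHERE s.timestamp >= '2026-01-01'
  AND s.timestamp <  '2026-02-01'   -- half-open range avoids edge-case misses
GROUP BY u.id, u.email;
```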

Example 3: Complex Analysis Query

"Create a BigQuery query to analyze our revenue by region and customer tier,
including year-over-year growth rates."
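For a request like this, the output might resemble the BigQuery sketch below. The table `project.dataset.orders` and its columns are hypothetical placeholders:

```sql
-- Revenue by region and customer tier with year-over-year growth
-- (BigQuery syntax; table and column names are hypothetical)
WITH yearly AS (
  SELECT region,
         customer_tier,
         EXTRACT(YEAR FROM order_date) AS yr,
         SUM(revenue) AS total_revenue
  FROM `project.dataset.orders`
  GROUP BY region, customer_tier, yr
)
SELECT region,
       customer_tier,
       yr,
       total_revenue,
       -- SAFE_DIVIDE returns NULL for the first year instead of erroring
       SAFE_DIVIDE(
         total_revenue - LAG(total_revenue) OVER w,
         LAG(total_revenue) OVER w
       ) AS yoy_growth
FROM yearly
WINDOW w AS (PARTITION BY region, customer_tier ORDER BY yr)
ORDER BY region, customer_tier, yr;
```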

Key Capabilities

  • Multi-Dialect Support: Works with BigQuery, PostgreSQL, MySQL, Snowflake, SQL Server
  • File Reading: Reads schema files, SQL dumps, and data documentation
  • Query Optimization: Suggests indexes, partitioning, and performance improvements
  • Explanation: Breaks down queries for learning and documentation
  • Testing: Can generate test queries and sample data scripts
  • Script Execution: Creates executable SQL scripts for your database
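As an illustration of the optimization capability, a suggestion for a frequent per-user session lookup might be an index plus a plan check, sketched below with illustrative names:

```sql
-- Hypothetical composite index to speed up per-user and date-range filters
CREATE INDEX idx_sessions_user_ts ON sessions (user_id, timestamp);

-- Compare the query plan before and after adding the index (PostgreSQL)
EXPLAIN ANALYZE
SELECT user_id, COUNT(*)
FROM sessions
WHERE timestamp >= '2026-01-01'
GROUP BY user_id;
```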

Tips for Best Results

  1. Provide context: Share your database schema or structure
  2. Be specific: Clearly describe what data you need and any filters
  3. Mention database: Specify which SQL dialect you're using
  4. Include constraints: Mention data volume, time ranges, and performance needs
  5. Request format: Ask for the query result format if you need specific output

Output Format

You'll receive:

  • SQL Query: Production-ready SQL code with comments
  • Explanation: What the query does and how it works
  • Performance Notes: Optimization tips and considerations
  • Test Script (if requested): Sample data and validation queries


Source

git clone https://github.com/phuryn/pm-skills

The skill file lives at pm-data-analytics/skills/sql-queries/SKILL.md in that repository.

Overview

The sql-queries skill translates natural language requests into production-ready SQL across BigQuery, PostgreSQL, MySQL, Snowflake, and more. By reading your uploaded schema files or diagrams, it tailors queries to your structures, explains complex logic, and flags performance considerations. It helps product managers, analysts, and engineers generate accurate queries without manual syntax work.

How This Skill Works

Reads and analyzes your uploaded schema to identify tables, keys, and relationships. Interprets the natural-language request, confirms the SQL dialect, and notes required filters or aggregations. Generates optimized, production-ready SQL with inline comments and optional test scripts, plus a plain-English explanation of the logic.

When to Use It

  • You have a schema file and need a query to identify users who signed up in the last 30 days with at least 5 active sessions.
  • You have a diagram description and want per-user metrics like average session duration.
  • You need a BigQuery-style revenue analysis by region and customer tier with YoY growth.
  • You want a test script or sample data to validate results before running real queries.
  • You require guidance on performance implications and optimization options for large datasets.

Quick Start

  1. Upload a schema file or describe your database, and specify the SQL dialect.
  2. State the data you need, including filters, aggregations, and output format.
  3. Receive production-ready SQL with explanations and optional test scripts.

Best Practices

  • Provide your schema or structure so the tool can tailor the query.
  • Be specific about the data you need, including filters, dates, and aggregations.
  • Mention the SQL dialect (BigQuery, PostgreSQL, MySQL, etc.).
  • Include constraints such as data volume, time ranges, and performance needs.
  • Request the desired output format (query only, or query plus explanation and test data).

Example Use Cases

  • Upload your database_schema.sql file and say: Generate a query to find users who signed up in the last 30 days and had at least 5 active sessions.
  • Here's a diagram description: Users(id, email, created_at), Sessions(id, user_id, timestamp, duration). Generate a query for average session duration per user in January 2026.
  • Create a BigQuery query to analyze revenue by region and customer tier, including year-over-year growth rates.
  • Ask for a test script or sample data to validate results before a production run.
  • Request guidance on performance optimization and alternative query approaches for large datasets.

