
verify

npx machina-cli add skill parthalon025/autonomous-coding-toolkit/verify --openclaw
Files (1)
SKILL.md
2.7 KB

Dependencies

  • Bash tool (git, test runners, linters)

Run a verification pass on the work just completed. Do NOT skip steps.

Step 1: Check what changed

  1. git diff --stat — list modified files
  2. git diff --cached --stat — list staged files
  3. If no git repo, list files you created or modified this session
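
The steps above can be sketched in shell; the fallback listing and its 120-minute window are assumptions, not part of the skill:

```shell
# Step 1 sketch: summarize what changed, falling back to a plain file
# listing when there is no git repo. The time window is an arbitrary example.
if git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
  echo "== Unstaged =="; git diff --stat
  echo "== Staged ==";   git diff --cached --stat
else
  echo "No git repo; files modified recently this session:"
  find . -type f -mmin -120 -not -path './.git/*'
fi
```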

Step 2: Run automated checks (if available)

Try each in order, skip if not applicable:

  1. Tests: Look for test runner config. Run tests. Report pass/fail count.
  2. Linter: Look for linter config. Run linter. Report issue count.
  3. Type check: Look for tsconfig.json or mypy.ini. Run type checker.
  4. Build: If there's a build step, run it.
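
A minimal sketch of the probe-and-run loop; the config files and commands below are common examples, not the skill's fixed list:

```shell
# Step 2 sketch: run each check only if its config file exists.
run_if() { [ -e "$1" ] && echo "-> $2" && sh -c "$2"; }

run_if package.json   "npm test"          # JS/TS test runner
run_if pytest.ini     "pytest -q"         # Python tests
run_if .eslintrc.json "npx eslint ."      # linter
run_if tsconfig.json  "npx tsc --noEmit"  # type check
run_if mypy.ini       "mypy ."            # type check
run_if Makefile       "make build"        # build step, if the target exists
echo "automated checks attempted"
```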

Step 2.5: Integration Wiring + Lesson Scanner

Run this step if the session built multiple components across batches.

  1. Integration wiring check: Confirm every shared module built this session is imported/called by its consumer.
  2. Lesson scanner: Dispatch lesson-scanner agent against modified files.
  3. Contract tests: For parallel feature lists, verify a contract test exists.
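
The wiring check can be approximated with grep; the module names and the Python import pattern below are hypothetical placeholders for whatever the session actually built:

```shell
# Step 2.5 sketch: flag any module built this session that nothing imports.
# Replace the module list and the import pattern with your session's reality.
for mod in billing_rules rate_table; do
  if ! grep -rq "import.*$mod" --include='*.py' .; then
    echo "WIRING GAP: $mod is never imported"
  fi
done
```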

Step 3: Pipeline testing (if service has API, UI, or multi-layer data flow)

3a: Horizontal sweep — every endpoint/interface works

Hit every API endpoint, CLI command, and static file with a known input. Confirms the surface exists and responds.
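
A horizontal sweep might look like this; the base URL and endpoint list are assumptions for your service:

```shell
# Step 3a sketch: hit every known surface once and record the status code.
BASE=http://localhost:8000
for path in /health /api/users /api/orders /static/app.js; do
  code=$(curl -s -o /dev/null -w '%{http_code}' "$BASE$path" || true)
  echo "$code  $path"   # 000 means the connection itself failed
done
```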

3b: Vertical trace — one input flows through the entire stack

Submit one real input and trace it through every layer. Confirms data flows end-to-end and state accumulates correctly.
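
A vertical trace for a hypothetical orders service; the URLs, JSON shape, and aggregate field are all assumptions:

```shell
# Step 3b sketch: submit one real input, then confirm it surfaces in every
# layer that should reflect it (the listing and the aggregate).
BASE=http://localhost:8000
curl -s -X POST "$BASE/api/orders" \
     -H 'Content-Type: application/json' -d '{"item":"widget","qty":2}'
curl -s "$BASE/api/orders"  | grep -q widget         || echo "GAP: order not listed"
curl -s "$BASE/api/summary" | grep -q '"orders": *1' || echo "GAP: aggregate not updated"
```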

Why both axes are required

  • Horizontal catches: missing routes, broken static files, schema errors, 500s.
  • Vertical catches: path prefix mismatches, missing state updates, aggregate bugs.

If time-constrained: Run the vertical trace — it catches more integration bugs per minute.

Step 4: Manual verification checklist

For each file changed, verify:

  • Does the change do what was asked?
  • No secrets committed
  • No debug artifacts left
  • File permissions correct
  • If config changed: service reloaded/restarted?
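
Parts of this checklist can be mechanized; the secret and debug patterns below are starter examples, not an exhaustive scan:

```shell
# Step 4 sketch: scan the staged diff for likely secrets, then the tree for
# debug leftovers and shell scripts missing their exec bit.
git diff --cached -U0 2>/dev/null \
  | grep -inE 'api[_-]?key|secret|password|BEGIN [A-Z ]*PRIVATE KEY' \
  && echo "possible secret in staged diff"
grep -rn --include='*.py' -E 'pdb\.set_trace|breakpoint\(\)' . \
  && echo "debug artifacts found"
find . -name '*.sh' ! -perm -u+x -print 2>/dev/null
```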

Step 5: Report

Present as:

VERIFICATION — <date>
Files changed: N
Tests: X passed, Y failed (or N/A)
Lint: X issues (or N/A)
Types: clean (or N/A)
Pipeline (horizontal): X/Y endpoints pass (or N/A)
Pipeline (vertical): data traced input→output / [list gaps] (or N/A)
Manual checks: all clear / [list issues]
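
One way to emit the template with live values; everything except the file count is a placeholder to be filled from the earlier steps:

```shell
# Step 5 sketch: fill the report skeleton. Replace the N/A placeholders
# with real counts gathered in Steps 2-4.
files=$(git diff --name-only HEAD 2>/dev/null | wc -l | tr -d ' ')
cat <<EOF
VERIFICATION — $(date +%F)
Files changed: $files
Tests: N/A
Lint: N/A
Types: N/A
Pipeline (horizontal): N/A
Pipeline (vertical): N/A
Manual checks: all clear
EOF
```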

Anti-patterns

  • NEVER say "looks good" without running actual commands
  • NEVER skip the git diff
  • NEVER declare work complete if any test fails

Source

git clone https://github.com/parthalon025/autonomous-coding-toolkit

The skill itself lives at skills/verify/SKILL.md in that repository.

Overview

The verify skill provides a self-verification checklist to run before marking work complete, committing, or opening PRs. It guides you through diff checks, automated checks, integration steps, and a standardized report to ensure quality.

How This Skill Works

Start by inspecting changes with git diff. Then run available automated checks (tests, linter, type check, build). If applicable, perform integration wiring and lesson scans, followed by pipeline tests and a formal VERIFICATION report.

When to Use It

  • Before declaring work complete on a feature
  • Before committing changes to a branch
  • Before creating a pull request to the main branch
  • After making multi-component changes in a session
  • When a session calls for integration wiring checks, lesson scanning, or contract tests

Quick Start

  1. Step 1: git diff --stat; git diff --cached --stat
  2. Step 2: Run available checks (tests, linter, type check, build) in order
  3. Step 5: Generate VERIFICATION — <date>, Files changed: N, results

Best Practices

  • Always run git diff --stat and git diff --cached --stat
  • Run tests, linters, type checks, and builds when applicable
  • Perform integration wiring and lesson scanning for multi-component work
  • Execute horizontal and vertical pipeline checks if the service has API/UI/data flow
  • Generate and record the VERIFICATION report verbatim (date, counts, results)

Example Use Cases

  • Adding a new user API: verify diff, run tests, lint, type-check, build, and document results
  • Fixing a bug in payments module: ensure no secrets or debug artifacts remain; re-run checks
  • Refactoring across modules: perform integration wiring checks and lesson-scanner run
  • New multi-service feature: run horizontal and vertical pipeline tests and report outcomes
  • Documentation or config tweak: still run checks and capture a VERIFICATION report
