
red-phase-verifier

npx machina-cli add skill smicolon/ai-kit/red-phase-verifier --openclaw

Files (1): SKILL.md (3.8 KB)

Red Phase Verifier

Ensures tests are written BEFORE implementation and fail initially.

Purpose

In TDD:

  1. RED: Write failing tests
  2. GREEN: Implement to pass
  3. REFACTOR: Improve while green

This skill enforces Step 1.

Activation Triggers

This skill activates when:

  • Running /dev-loop command
  • Creating test files before implementation
  • Mentioning "write tests first"

Verification Process

Step 1: Check Implementation Exists

# If implementing UserService.create_user:
# check whether the method already exists and contains real logic

import inspect

from app.services import UserService

method = getattr(UserService, 'create_user', None)
if method is not None:
    source = inspect.getsource(method)
    # A body of just `pass` or `raise NotImplementedError` still
    # counts as unimplemented
    if 'pass' not in source and 'raise NotImplementedError' not in source:
        # Implementation exists!
        print("WARNING: Implementation found before tests")

Step 2: Run Tests

pytest tests/test_new_feature.py -v --tb=short

Step 3: Verify Failures

Expected output:

FAILED tests/test_new_feature.py::test_create - AssertionError
FAILED tests/test_new_feature.py::test_validate - AttributeError

Step 4: Report

RED PHASE VERIFICATION

Tests written: 5
Tests failing: 5

Red phase confirmed!
Implementation may now proceed.

Warning Cases

Case 1: Tests Pass Before Implementation

WARNING: Tests passed before implementation!

Failing tests: 0/5

This indicates:
1. Tests don't test new functionality
2. Tests have trivial assertions
3. Implementation already exists

Action: Regenerate stricter tests

Case 2: Partial Failures

PARTIAL RED PHASE

Tests failing: 3/5
Tests passing: 2/5

Passing tests may be:
1. Testing existing functionality (OK)
2. Trivial assertions (NOT OK)

Review passing tests:
- test_helper_exists: assert helper <- TRIVIAL
- test_constant: assert X == X <- TRIVIAL

Action: Strengthen or remove trivial tests

Case 3: Wrong Failure Type

UNEXPECTED FAILURE TYPE

test_create_user: SyntaxError in test file

Tests should fail due to:
- AssertionError (expected behavior not met)
- AttributeError (method doesn't exist yet)
- NotImplementedError (placeholder)

NOT due to:
- SyntaxError (test file broken)
- ImportError (missing dependency)
- TypeError (wrong arguments)

Action: Fix test file syntax

Enforcement

When red phase not verified:

CANNOT PROCEED TO GREEN PHASE

Red phase requirements not met:
- 2 tests passed before implementation
- 1 test has syntax error

Fix issues, then run verification again.
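The enforcement rule above can be sketched as a simple gate; the counts mirror the hypothetical report format, not a real API:

```python
def can_proceed_to_green(tests_failing: int, tests_total: int,
                         bad_failures: int) -> bool:
    # GREEN is allowed only when every test fails, and none of the
    # failures is a "bad" type (syntax error, broken test setup, ...).
    return tests_failing == tests_total and bad_failures == 0
```

Applied to the report above (3/5 failing, 1 syntax error), the gate returns False and the loop stays in RED.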

Expected Failure Types

Good Failures (Proceed)

  • AssertionError - Assertion failed (expected)
  • AttributeError - Method/attribute doesn't exist yet
  • NotImplementedError - Placeholder implementation
  • ModuleNotFoundError - Module not created yet

Bad Failures (Fix First)

  • SyntaxError - Test code is broken
  • IndentationError - Test code formatting issue
  • NameError - Undefined variable in test
  • TypeError - Wrong arguments in test setup
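The two lists above can be encoded as a small classifier. A sketch, assuming the exception name has already been extracted from the pytest output:

```python
# Exception names that represent an acceptable RED failure.
GOOD_FAILURES = {"AssertionError", "AttributeError",
                 "NotImplementedError", "ModuleNotFoundError"}

# Exception names that mean the test file itself is broken.
BAD_FAILURES = {"SyntaxError", "IndentationError",
                "NameError", "TypeError"}

def classify_failure(exc_name: str) -> str:
    if exc_name in GOOD_FAILURES:
        return "proceed"    # expected RED failure
    if exc_name in BAD_FAILURES:
        return "fix-first"  # repair the test before continuing
    return "review"         # anything else needs a human look
```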

Dev Loop Integration

When integrated with /dev-loop:

  1. Tests written -> Run red phase verifier
  2. If all fail correctly -> Proceed to implementation
  3. If some pass -> Warn and suggest fixes
  4. If wrong failures -> Block until fixed

Example output:

TDD LOOP: RED PHASE

Running red phase verification...

test_create_user: FAILED (AttributeError)
test_validate_email: FAILED (AssertionError)
test_duplicate_email: FAILED (AssertionError)
test_get_user: FAILED (NotImplementedError)
test_list_users: FAILED (NotImplementedError)

Red phase: 5/5 tests failing correctly

Proceeding to GREEN phase...

Source

https://github.com/smicolon/ai-kit/blob/main/packs/django/skills/red-phase-verifier/SKILL.md

Overview

Red Phase Verifier enforces the TDD first step: write failing tests before implementing code. It checks that tests fail initially and reports if red phase is not properly demonstrated. This helps maintain discipline and prevents premature green implementations.

How This Skill Works

The verifier activates during TDD workflows, inspects code for existing implementations, and runs pytest to verify that tests fail as expected. It then prints a red-phase report and guides whether to proceed to green or adjust tests to ensure proper failures.

When to Use It

  • Running the /dev-loop command to validate the red phase
  • Creating test files before implementing the feature
  • Mentioning 'write tests first' or practicing TDD in a PR
  • Verifying that failing tests exist before coding new functionality
  • Enforcing disciplined test-driven development in new features

Quick Start

  1. Check for an existing implementation in the target code and confirm it's incomplete
  2. Run the pytest tests to observe red failures
  3. Review the failure types and ensure all tests fail before proceeding

Best Practices

  • Ensure tests actually fail (not skipped or green) before touching code
  • Look for placeholder implementations (pass, NotImplementedError) and replace them only during the green phase
  • Run pytest with verbose output to see clear failure messages
  • Document expected failure types (e.g., AssertionError, AttributeError, NotImplementedError)
  • If some tests pass, reassess test coverage or tighten tests to enforce red phase

Example Use Cases

  • Kicking off a new feature by writing failing tests before code
  • Detecting a test that raises AttributeError for a missing method
  • Treating a NotImplementedError as confirmation of a placeholder implementation
  • Running a red-phase check in a Django service where tests fail as expected
  • Blocking a PR in a CI pipeline until the red phase is satisfied
