
Spec-Driven Development

Overview

Write a structured specification before writing any code. The spec is the shared source of truth between you and the human engineer — it defines what we're building, why, and how we'll know it's done. Code without a spec is guessing.

When to Use

  • Starting a new project or feature
  • Requirements are ambiguous or incomplete
  • The change touches multiple files or modules
  • You're about to make an architectural decision
  • The task would take more than 30 minutes to implement

When NOT to use: Single-line fixes, typo corrections, or changes where requirements are unambiguous and self-contained.

The Gated Workflow

Spec-driven development has four phases. Do not advance to the next phase until the current one is validated.

SPECIFY ──→ PLAN ──→ TASKS ──→ IMPLEMENT
   │          │        │          │
   ▼          ▼        ▼          ▼
 Human      Human    Human      Human
 reviews    reviews  reviews    reviews

Phase 1: Specify

Start with a high-level vision. Ask the human clarifying questions until requirements are concrete.

Surface assumptions immediately. Before writing any spec content, list what you're assuming:

ASSUMPTIONS I'M MAKING:
1. This is a web application (not native mobile)
2. Authentication uses session-based cookies (not JWT)
3. The database is PostgreSQL (based on existing Prisma schema)
4. We're targeting modern browsers only (no IE11)
→ Correct me now or I'll proceed with these.

Never silently fill in ambiguous requirements. The spec's entire purpose is to surface misunderstandings before code gets written — assumptions are the most dangerous form of misunderstanding.

Write a spec document covering these six core areas:

  1. Objective — What are we building and why? Who is the user? What does success look like?

  2. Commands — Full executable commands with flags, not just tool names.

    Build: npm run build
    Test: npm test -- --coverage
    Lint: npm run lint -- --fix
    Dev: npm run dev
    
  3. Project Structure — Where source code lives, where tests go, where docs belong.

    src/           → Application source code
    src/components → React components
    src/lib        → Shared utilities
    tests/         → Unit and integration tests
    e2e/           → End-to-end tests
    docs/          → Documentation
    
  4. Code Style — One real code snippet showing your style beats three paragraphs describing it. Include naming conventions, formatting rules, and examples of good output.

  5. Testing Strategy — What framework, where tests live, coverage expectations, which test levels for which concerns.

  6. Boundaries — Three-tier system:

    • Always do: Run tests before commits, follow naming conventions, validate inputs
    • Ask first: Database schema changes, adding dependencies, changing CI config
    • Never do: Commit secrets, edit vendor directories, remove failing tests without approval
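
As an example of area 4, here is the kind of snippet a spec might embed. This is a hypothetical sketch assuming a TypeScript codebase; the LineItem type and orderTotalCents function are invented for illustration, not taken from the skill:

```typescript
// Illustrative style sample for a spec's "Code Style" section.
// Conventions shown: named exports, PascalCase types, camelCase
// functions, explicit return types, integer cents for money.
export interface LineItem {
  name: string;
  unitPriceCents: number;
  quantity: number;
}

export function orderTotalCents(items: LineItem[]): number {
  // reduce with an initial value so an empty order totals 0
  return items.reduce(
    (total, item) => total + item.unitPriceCents * item.quantity,
    0,
  );
}
```

A snippet like this answers naming, typing, and formatting questions at a glance, which is why it beats prose descriptions of style.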

Spec template:

# Spec: [Project/Feature Name]

## Objective
[What we're building and why. User stories or acceptance criteria.]

## Tech Stack
[Framework, language, key dependencies with versions]

## Commands
[Build, test, lint, dev — full commands]

## Project Structure
[Directory layout with descriptions]

## Code Style
[Example snippet + key conventions]

## Testing Strategy
[Framework, test locations, coverage requirements, test levels]

## Boundaries
- Always: [...]
- Ask first: [...]
- Never: [...]

## Success Criteria
[How we'll know this is done — specific, testable conditions]

## Open Questions
[Anything unresolved that needs human input]
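
For small tasks, the same template can collapse to a few lines. A hypothetical minimal spec (feature details invented for illustration):

# Spec: Copy-link button in the share menu

## Objective
Users can copy the current page URL from the share menu in one click.

## Success Criteria
- Clicking the button writes the canonical URL to the clipboard and shows a "Copied" toast
- The clipboard call is covered by a unit test

## Boundaries
- Ask first: adding any new clipboard dependency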

Reframe instructions as success criteria. When receiving vague requirements, translate them into concrete conditions:

REQUIREMENT: "Make the dashboard faster"

REFRAMED SUCCESS CRITERIA:
- Dashboard LCP < 2.5s on 4G connection
- Initial data load completes in < 500ms
- No layout shift during load (CLS < 0.1)
→ Are these the right targets?

This lets you loop, retry, and problem-solve toward a clear goal rather than guessing what "faster" means.
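
Once criteria are numeric, they can also be checked mechanically. A minimal sketch, assuming the project collects these metrics somewhere; the Metrics shape, thresholds, and function name below are illustrative, not part of the skill:

```typescript
// Hypothetical budget check: returns the list of criteria a measurement
// fails, so a loop (or CI job) can retry toward the agreed targets.
interface Metrics {
  lcpMs: number;         // Largest Contentful Paint, milliseconds
  initialLoadMs: number; // initial data load, milliseconds
  cls: number;           // Cumulative Layout Shift score
}

const budget: Metrics = { lcpMs: 2500, initialLoadMs: 500, cls: 0.1 };

export function budgetFailures(measured: Metrics): string[] {
  const failures: string[] = [];
  if (measured.lcpMs > budget.lcpMs)
    failures.push(`LCP ${measured.lcpMs}ms exceeds ${budget.lcpMs}ms`);
  if (measured.initialLoadMs > budget.initialLoadMs)
    failures.push(`initial load ${measured.initialLoadMs}ms exceeds ${budget.initialLoadMs}ms`);
  if (measured.cls > budget.cls)
    failures.push(`CLS ${measured.cls} exceeds ${budget.cls}`);
  return failures;
}
```

An empty result means every success criterion is met; a non-empty result names exactly which target to work on next.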

Phase 2: Plan

With the validated spec, generate a technical implementation plan:

  1. Identify the major components and their dependencies
  2. Determine the implementation order (what must be built first)
  3. Note risks and mitigation strategies
  4. Identify what can be built in parallel vs. what must be sequential
  5. Define verification checkpoints between phases

The plan should be reviewable: the human should be able to read it and say "yes, that's the right approach" or "no, change X."
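
A hypothetical plan fragment for the dashboard-performance example above (components, ordering, and risks invented for illustration):

  1. Add server-side caching for the dashboard summary endpoint (no dependencies)
  2. Split the dashboard bundle and lazy-load the chart components (independent of 1; can proceed in parallel)
  3. Reserve layout space for async widgets to eliminate layout shift (depends on 2)
  • Risk: the chart library may not support lazy initialization; verify with a spike before committing to step 2
  • Checkpoint: re-measure LCP and CLS after steps 1 and 2 before starting step 3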

Phase 3: Tasks

Break the plan into discrete, implementable tasks:

  • Each task should be completable in a single focused session
  • Each task has explicit acceptance criteria
  • Each task includes a verification step (test, build, manual check)
  • Tasks are ordered by dependency, not by perceived importance
  • No task should require changing more than ~5 files

Task template:

- [ ] Task: [Description]
  - Acceptance: [What must be true when done]
  - Verify: [How to confirm — test command, build, manual check]
  - Files: [Which files will be touched]
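
A hypothetical filled-in task using this template (paths and numbers invented for illustration):

- [ ] Task: Cache dashboard summary responses for 60 seconds
  - Acceptance: A repeat request within 60s is served from cache and skips the database query
  - Verify: npm test -- --coverage passes; manual check that a second load returns in under 100ms
  - Files: src/lib/cache.ts, src/api/dashboard.ts, tests/dashboard-cache.test.ts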

Phase 4: Implement

Execute tasks one at a time, following the incremental-implementation and test-driven-development skills.

Keeping the Spec Alive

The spec is a living document, not a one-time artifact:

  • Update when decisions change — If you discover the data model needs to change, update the spec first, then implement.
  • Update when scope changes — Features added or cut should be reflected in the spec.
  • Commit the spec — The spec belongs in version control alongside the code.
  • Reference the spec in PRs — Link back to the spec section that each PR implements.

Common Rationalizations

| Rationalization | Reality |
| --- | --- |
| "This is simple, I don't need a spec" | Simple tasks don't need long specs, but they still need acceptance criteria. A two-line spec is fine. |
| "I'll write the spec after I code it" | That's documentation, not specification. The spec's value is in forcing clarity before code. |
| "The spec will slow us down" | A 15-minute spec prevents hours of rework. Waterfall in 15 minutes beats debugging in 15 hours. |
| "Requirements will change anyway" | That's why the spec is a living document. An outdated spec is still better than no spec. |
| "The user knows what they want" | Even clear requests have implicit assumptions. The spec surfaces those assumptions. |

Red Flags

  • Starting to write code without any written requirements
  • Asking "should I just start building?" before clarifying what "done" means
  • Implementing features not mentioned in any spec or task list
  • Making architectural decisions without documenting them
  • Skipping the spec because "it's obvious what to build"

Verification

Before proceeding to implementation, confirm:

  • The spec covers all six core areas
  • The human has reviewed and approved the spec
  • Success criteria are specific and testable
  • Boundaries (Always/Ask First/Never) are defined
  • The spec is saved to a file in the repository
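
The first and last checklist items can be checked mechanically. A sketch, assuming the spec lives at docs/spec.md and uses the headings from the template above; both the path and the heading names are assumptions, and a sample spec is generated here so the check is runnable as-is:

```shell
# Hypothetical pre-implementation check: warn if the spec file is
# missing any core heading. Generate a sample spec first for the demo.
mkdir -p docs
printf '## %s\n' "Objective" "Commands" "Project Structure" "Code Style" \
       "Testing Strategy" "Boundaries" "Success Criteria" > docs/spec.md

missing=0
for heading in "Objective" "Commands" "Project Structure" "Code Style" \
               "Testing Strategy" "Boundaries" "Success Criteria"; do
  grep -q "^## $heading" docs/spec.md || { echo "missing: $heading"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "spec covers all core areas"
```

In practice the printf step would be dropped and the loop pointed at the real spec, for example as a pre-commit hook or CI step.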

Source

https://github.com/addyosmani/agent-skills/blob/main/skills/spec-driven-development/SKILL.md
