
writing-plans

```
npx machina-cli add skill tobyhede/turboshovel/writing-plans --openclaw
```

Files (1): SKILL.md (3.4 KB)

# Writing Plans

## Overview

Write comprehensive implementation plans assuming the engineer has zero context for our codebase and questionable taste. Document everything they need to know: which files to touch for each task, the code itself, docs they might need to check, and how to test it. Give them the whole plan as bite-sized tasks. DRY. YAGNI. TDD. Frequent commits.

Assume they are a skilled developer who knows almost nothing about our toolset or problem domain, and has little grasp of good test design.

Announce at start: "I'm using the writing-plans skill to create the implementation plan."

Context: This should be run in a dedicated worktree (created by brainstorming skill).

Save plans to: .work/YYYY-MM-DD-<feature-name>.md
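The date-stamped filename can be generated mechanically. A minimal shell sketch — the `user-auth` slug is a made-up example:

```shell
# Hypothetical feature slug; plans are saved as .work/YYYY-MM-DD-<feature-name>.md
cd "$(mktemp -d)"
feature="user-auth"
plan_file=".work/$(date +%Y-%m-%d)-${feature}.md"
mkdir -p .work
touch "$plan_file"
echo "$plan_file"
```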

## Bite-Sized Task Granularity

Each step is one action (2-5 minutes):

  • "Write the failing test" - step
  • "Run it to make sure it fails" - step
  • "Implement the minimal code to make the test pass" - step
  • "Run the tests and make sure they pass" - step
  • "Commit" - step
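In a throwaway repo, the five steps above play out roughly like this (assuming `pytest` is available; the file and function names are invented for illustration):

```shell
# One bite-sized task cycle: red test, minimal code, green test, commit.
cd "$(mktemp -d)" && git init -q demo && cd demo
git config user.email dev@example.com && git config user.name dev
cat > test_feature.py <<'EOF'
from feature import add

def test_add():
    assert add(2, 3) == 5
EOF
python3 -m pytest test_feature.py -q && exit 1 || true    # step 2: must fail first
printf 'def add(a, b):\n    return a + b\n' > feature.py  # step 3: minimal code
python3 -m pytest test_feature.py -q                      # step 4: must pass
git add test_feature.py feature.py
git commit -q -m "feat: add add() via failing-test-first cycle"
```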

## Plan Document Header

Every plan MUST start with this header:

```markdown
# [Feature Name] Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use turboshovel:executing-plans to implement this plan task-by-task.

**Goal:** [One sentence describing what this builds]

**Architecture:** [2-3 sentences about approach]

**Tech Stack:** [Key technologies/libraries]

---
```

## Task Structure

### Task N: [Component Name]

**Files:**
- Create: `exact/path/to/file.py`
- Modify: `exact/path/to/existing.py:123-145`
- Test: `tests/exact/path/to/test.py`

**Step 1: Write the failing test**

```python
def test_specific_behavior():
    result = function(input)
    assert result == expected
```

**Step 2: Run test to verify it fails**

Run: `pytest tests/path/test.py::test_name -v`
Expected: FAIL with "function not defined"

**Step 3: Write minimal implementation**

```python
def function(input):
    return expected
```

**Step 4: Run test to verify it passes**

Run: `pytest tests/path/test.py::test_name -v`
Expected: PASS

**Step 5: Commit**

```
git add tests/path/test.py src/path/file.py
git commit -m "feat: add specific feature"
```

## Remember
- Exact file paths always
- Complete code in plan (not "add validation")
- Exact commands with expected output
- Reference relevant skills with @ syntax
- SRP, DRY, YAGNI, TDD, frequent commits

## Execution Handoff

After saving the plan, offer execution choice:

**"Plan complete and saved to `.work/<filename>.md`. Two execution options:**

**1. Subagent-Driven (this session)** - I dispatch fresh subagent per task, review between tasks, fast iteration

**2. Parallel Session (separate)** - Open new session with executing-plans, batch execution with checkpoints

**Which approach?"**

**If Subagent-Driven chosen:**
- **REQUIRED SUB-SKILL:** Use turboshovel:subagent-driven-development
- Stay in this session
- Fresh subagent per task + code review

**If Parallel Session chosen:**
- Guide them to open new session in worktree
- **REQUIRED SUB-SKILL:** New session uses turboshovel:executing-plans

## Source

View on GitHub: https://github.com/tobyhede/turboshovel/blob/main/plugin/skills/writing-plans/SKILL.md

Overview

Writing Plans creates comprehensive implementation plans for engineers who have zero context about the codebase. It details every required file touch, code snippets, tests, docs, and verification steps, enabling rapid, traceable delivery.

How This Skill Works

It assumes the engineer has zero codebase context and produces bite-sized tasks (2–5 minutes) with exact file paths. Plan creation runs in a dedicated worktree and saves to .work/YYYY-MM-DD-<feature-name>.md, ensuring reproducibility. The plan follows a repeatable task structure with tests, minimal code, and clear verification steps.
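Creating the dedicated worktree is a one-liner with `git worktree`. A sketch with assumed repo and branch names:

```shell
# Sketch: an isolated worktree for plan work (repo and branch names are illustrative).
cd "$(mktemp -d)" && git init -q demo && cd demo
git config user.email dev@example.com && git config user.name dev
git commit --allow-empty -q -m "init"
git worktree add -q ../demo-user-auth -b feature/user-auth
mkdir -p ../demo-user-auth/.work   # plans live here
```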

When to Use It

  • Design is complete and you need granular, engineer-ready tasks with exact files, code, and tests.
  • You are onboarding engineers who know almost nothing about the codebase or domain.
  • You want to document every testing, docs, and verification step before touching code.
  • You prefer bite-sized, 2–5 minute tasks to enable frequent commits and fast feedback.
  • You want the plan saved in a dedicated worktree for auditability and handoff.

Quick Start

  1. Announce usage and create a new worktree entry for the feature.
  2. Create the plan header and fill the Goal, Architecture, and Tech Stack sections.
  3. Break work into `Task N:` sections with exact file paths, code, tests, and docs, then save to .work/YYYY-MM-DD-<feature-name>.md.

Best Practices

  • Always start with the standard header and clearly state Goal, Architecture, and Tech Stack.
  • Include exact file paths for every task (Create/Modify/Test) and keep them concrete.
  • Keep tasks atomic (2–5 minutes each) and follow the 5-step task cycle (write test, run fail, implement, run pass, commit).
  • Write tests first (TDD) and ensure each task has a verification step in the plan.
  • Reference related turboshovel skills with @ syntax where relevant (e.g., @turboshovel:executing-plans or @turboshovel:subagent-driven-development).

Example Use Cases

  • Add a new REST endpoint with full plan coverage: exact files, test, docs, and verification.
  • Introduce a feature flag with a complete plan including tests and rollout docs.
  • Refactor a module with a plan that lists all touched files, minimal code changes, and test updates.
  • Implement a logging subsystem with tests, docs, and verification steps.
  • Migrate a database schema with migration script, tests, and rollback documentation.

