capture-lesson
npx machina-cli add skill parthalon025/autonomous-coding-toolkit/capture-lesson --openclaw
Capture Lesson
Overview
Structured process for writing new lessons that enforces the FRAMEWORK.md template, OIL tier rules, category validation, and all three validation scripts before committing. Prevents manual shortcutting that skips recurrence analysis and sustain checks.
When to Use
- After discovering a bug, audit finding, or session insight worth capturing
- When /capture-lesson is invoked
- After a debugging session reveals a repeatable anti-pattern
- When a code review or counter session surfaces a new failure mode
Process (follow this order exactly)
Step 1: Gather Context
Ask the user:
- What happened? — factual description with error messages, data contradictions, numbers
- Which files were involved? — specific paths
- Which cluster does this resemble? — A (Silent Failures), B (Integration Boundary), C (Cold-Start), D (Specification Drift), E (Context & Retrieval), F (Planning & Control Flow), or standalone
Step 2: Draft the Lesson File
Create ~/Documents/docs/lessons/YYYY-MM-DD-short-description.md using the exact FRAMEWORK.md template:
# Lesson: [Short Title]
**Date:** YYYY-MM-DD
**System:** [project name]
**Tier:** observation | insight | lesson
**Category:** [from enum below]
**Keywords:** [comma-separated for grep retrieval]
**Files:** `path/to/file1`, `path/to/file2`
## Observation (What Happened)
[Factual description. Include numbers, error messages, data contradictions.]
## Analysis (Root Cause — 5 Whys)
**Why #1:** [surface cause]
**Why #2:** [why that happened]
**Why #3:** [root cause — deepest controllable cause]
## Corrective Actions
| # | Action | Status | Owner | Evidence |
|---|--------|--------|-------|----------|
| 1 | [specific action] | proposed | [who] | — |
## Ripple Effects
[What other systems/pipelines does this touch?]
## Sustain Plan
- [ ] 7-day check: [what to verify]
- [ ] 30-day check: [confirm no recurrence]
- [ ] Contingency: [if corrective action doesn't hold]
## Key Takeaway
[One sentence. The thing you'd tell someone in 10 seconds.]
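If lessons are drafted often, scaffolding the header from a script avoids field-name typos. Below is a minimal sketch that stamps out the fields above; the new-lesson.sh helper is hypothetical and not part of the toolkit:

```bash
#!/usr/bin/env bash
# Hypothetical scaffold helper (not part of the toolkit): creates a lesson
# file with the FRAMEWORK.md header fields pre-filled.
set -euo pipefail

SLUG="${1:?usage: new-lesson.sh <short-description> <Short Title>}"
TITLE="${2:?usage: new-lesson.sh <short-description> <Short Title>}"
DATE="$(date +%F)"   # YYYY-MM-DD, matching the filename convention
FILE="$HOME/Documents/docs/lessons/${DATE}-${SLUG}.md"

[ -e "$FILE" ] && { echo "refusing to overwrite $FILE" >&2; exit 1; }

cat > "$FILE" <<EOF
# Lesson: ${TITLE}
**Date:** ${DATE}
**System:** [project name]
**Tier:** observation
**Category:** [from enum below]
**Keywords:** [comma-separated for grep retrieval]
**Files:** \`path/to/file1\`
EOF
echo "Drafted $FILE; copy the remaining sections from FRAMEWORK.md."
```

Note the Tier defaults to observation, consistent with the hard gate in Step 3.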
Step 2.5: Infer Scope Tags
Determine the lesson's scope by analyzing its content:
- Check domain signals: Does the lesson reference specific systems?
  - Home Assistant, HA entities, MQTT, Frigate → domain:ha-aria
  - Telegram bot, polling, getUpdates → domain:telegram
  - Notion API, sync, replica → domain:notion
  - Ollama, model loading, queue → domain:ollama
- Check framework signals: Does it reference specific tooling?
  - systemd, journalctl, timers → framework:systemd
  - pytest, fixtures, conftest → framework:pytest
  - Preact, JSX, h() → framework:preact
- Check language signals: What language(s) does it apply to?
  - Python-only patterns → language:python
  - Bash/shell patterns → language:bash
  - JavaScript/TypeScript → language:javascript
- Default to universal if the lesson describes a general principle (error handling, testing, architecture) not specific to any domain/language.
- Propose to user: Present inferred scope tags and ask for confirmation before writing. Example: "Inferred scope: [domain:ha-aria, language:python] — does this look right?"
Add the scope: field to the YAML frontmatter after languages::
scope: [domain:ha-aria, language:python]
Reference: ~/Documents/docs/lessons/TEMPLATE.md § Scope (Project-Level Filtering) for the full tag vocabulary.
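Scope tags make the corpus filterable with plain grep. A retrieval sketch, assuming tags sit on a single scope: line as shown above:

```bash
# List lessons tagged for a given scope (e.g. all Home Assistant lessons).
grep -l 'scope:.*domain:ha-aria' ~/Documents/docs/lessons/*.md

# Combine with a keyword search for narrower retrieval:
grep -l 'scope:.*language:python' ~/Documents/docs/lessons/*.md \
  | xargs grep -li 'mqtt'
```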
Step 3: Validate Tier (HARD GATE)
Enforce OIL taxonomy rules:
| Tier | Requires | Status |
|---|---|---|
| observation | Raw facts only | observed |
| insight | Root cause identified via 5 Whys | analyzed |
| lesson | Corrective action proposed with owner + timeline | proposed |
| lesson_learned | Implementation proof + 30-day sustain evidence | validated |
HARD GATE: Never assign lesson_learned to a new lesson. A new lesson starts at observation, insight, or lesson depending on how far the analysis goes. Promotion to lesson_learned requires sustained evidence over time.
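The gate is easy to automate. An illustrative guard, assuming the **Tier:** field format from the template; this is a sketch, not one of the three shipped validation scripts:

```bash
# Illustrative guard: reject a new lesson whose Tier field claims lesson_learned.
FILE="${1:?usage: tier-gate.sh <lesson-file>}"   # hypothetical helper name
if grep -q '^\*\*Tier:\*\* *lesson_learned' "$FILE"; then
  echo "HARD GATE: a new lesson cannot start at lesson_learned" >&2
  exit 1
fi
```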
Step 4: Validate Category
Category must be exactly one of:
| Category | Scope |
|---|---|
| data-model | Schema, inheritance, data flow |
| registration | Module loading, decorators, imports |
| cold-start | First-run, missing baselines |
| integration | Cross-service, shared state, API contracts |
| deployment | Service config, systemd, env vars |
| monitoring | Alerts, noise suppression, staleness |
| ui | Frontend, data display |
| testing | Coverage gaps, mock masking |
| performance | Resources, memory, scheduling |
| security | Auth, secrets, permissions |
If the lesson doesn't fit any category cleanly, pick the closest match and note the tension in Ripple Effects.
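The same pattern covers the category enum. An illustrative sketch, again assuming the template's field format; category-check.sh is a hypothetical name, not a shipped script:

```bash
# Illustrative check: validate the Category field against the 10-value enum.
FILE="${1:?usage: category-check.sh <lesson-file>}"   # hypothetical helper name
VALID="data-model registration cold-start integration deployment monitoring ui testing performance security"
CAT="$(sed -n 's/^\*\*Category:\*\* *//p' "$FILE" | head -n1)"
case " $VALID " in
  *" $CAT "*) echo "category ok: $CAT" ;;
  *)          echo "invalid category: '$CAT'" >&2; exit 1 ;;
esac
```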
Step 5: Update SUMMARY.md
Edit ~/Documents/docs/lessons/SUMMARY.md:
- Add row to the Quick Reference table with the next sequential number
- Update cluster membership — add the lesson number to the relevant cluster's parenthetical list in the cluster section header
- Update the count in the header line (e.g., "72 lessons" becomes "73 lessons")
- Update tier counts in the Status & Maturity table
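The header count is the field most often left stale. A quick cross-check, assuming lesson files follow the YYYY-MM-DD-short-description.md naming convention; compare the result against the "N lessons" figure in the header:

```bash
# Count lesson files on disk to verify the SUMMARY.md header count.
ls ~/Documents/docs/lessons/ | grep -c '^[0-9]\{4\}-[0-9]\{2\}-[0-9]\{2\}-'
```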
Step 6: Run Validation Scripts
Run each script and address output before proceeding:
# Recurrence analysis — if alert triggers, answer the 4 questions before continuing
bash ~/Documents/scripts/lesson-class-check.sh ~/Documents/docs/lessons/YYYY-MM-DD-short-description.md
# Promotion candidates — informational, report to user
bash ~/Documents/scripts/lesson-promote-check.sh
# Overdue sustain items — informational, report to user
bash ~/Documents/scripts/lessons-sustain-check.sh
If lesson-class-check.sh triggers a recurrence alert, answer these 4 questions before proceeding:
- Why didn't the existing cluster mitigations catch this?
- Is this a new sub-pattern or a gap in existing mitigations?
- Should a new mitigation be added to the cluster?
- Should an existing mitigation be strengthened?
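If the three checks are scripted together, stop on a recurrence alert instead of letting it scroll past. A wrapper sketch, assuming lesson-class-check.sh exits non-zero when it raises an alert (verify the script's actual convention before relying on this):

```bash
#!/usr/bin/env bash
set -euo pipefail
LESSON="$1"   # path to the new lesson file

# Recurrence analysis is the hard stop: answer the 4 questions before continuing.
if ! bash ~/Documents/scripts/lesson-class-check.sh "$LESSON"; then
  echo "Recurrence alert: answer the 4 cluster questions before committing." >&2
  exit 1
fi

# The remaining two checks are informational; surface their output to the user.
bash ~/Documents/scripts/lesson-promote-check.sh
bash ~/Documents/scripts/lessons-sustain-check.sh
```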
Step 7: Commit
Stage and commit with the standard format:
git add ~/Documents/docs/lessons/YYYY-MM-DD-short-description.md ~/Documents/docs/lessons/SUMMARY.md
git commit -m "docs: add lesson #N — short description"
Key References
| File | Purpose |
|---|---|
| ~/Documents/docs/lessons/FRAMEWORK.md | Template and OIL taxonomy |
| ~/Documents/docs/lessons/SUMMARY.md | Lesson index (Quick Reference table + clusters) |
| ~/Documents/scripts/lesson-class-check.sh | Cluster recurrence analysis |
| ~/Documents/scripts/lesson-promote-check.sh | Hookify promotion candidates |
| ~/Documents/scripts/lessons-sustain-check.sh | Overdue sustain items |
Common Mistakes
| Mistake | Fix |
|---|---|
| Assigning lesson_learned to a new lesson | Start at observation, insight, or lesson — promotion requires 30-day evidence |
| Skipping 5 Whys analysis | If tier is insight or higher, 5 Whys is required — at least 2-3 levels deep |
| Using a category not in the enum | Pick the closest match from the 10 valid categories |
| Forgetting to update SUMMARY.md counts | Always update: row count in header, tier counts in Status table, cluster membership lists |
| Skipping lesson-class-check.sh | This is the most important validation — it detects cluster recurrence patterns |
Source
View on GitHub: https://github.com/parthalon025/autonomous-coding-toolkit/blob/main/skills/capture-lesson/SKILL.md
Overview
Capture Lesson provides a structured process for writing new lessons that enforces the FRAMEWORK.md template, OIL tier rules, category validation, and all three validation scripts before committing. It prevents manual shortcutting that skips recurrence analysis and sustain checks.
How This Skill Works
When /capture-lesson is invoked, the user is guided through Step 1 (Gather Context), Step 2 (Draft the Lesson File using the exact FRAMEWORK.md template), Step 2.5 (Infer Scope Tags), and Step 3 (the HARD GATE tier validation that enforces the OIL taxonomy). On successful validation, the lesson is prepared for commit with proper metadata and documentation.
Quick Start
- Step 1: Gather Context
- Step 2: Draft the Lesson File using the FRAMEWORK.md template
- Step 3: Infer Scope Tags and run the three validation scripts before committing
Best Practices
- Always draft using the exact FRAMEWORK.md template to ensure consistency
- Fill Observation with raw facts, including error messages, data, and timestamps
- Use the 5 Whys in Analysis to identify root cause
- Propose concrete Corrective Actions with owner and status
- Run all three validation scripts (and infer scope tags) before committing
Example Use Cases
- Bug in data ingestion caused silent data loss; lesson emphasizes boundary checks and adding tests.
- Audit finding reveals a race condition during session finalization; lesson highlights synchronization and idempotency.
- Flaky API integration during debugging; lesson covers retry strategy, timeouts, and clearer integration boundaries.
- Code review uncovers recurring anti-patterns in control flow; lesson focuses on refactoring planning and clearer architecture.
- Performance regression in caching layer after deployment; lesson notes monitoring, cache warmth, and load testing.