# lesson-learned

`npx machina-cli add skill softaworks/agent-toolkit/lesson-learned --openclaw`

Extract specific, grounded software engineering lessons from actual code changes. Not a lecture -- a mirror. Show the user what their code already demonstrates.
## Before You Begin

Load the principles reference first.

- Read `references/se-principles.md` to have the principle catalog available
- Optionally read `references/anti-patterns.md` if you suspect the changes include areas for improvement
- Determine the scope of analysis (see Phase 1)

Do not proceed until you've loaded at least `se-principles.md`.
## Phase 1: Determine Scope
Ask the user or infer from context what to analyze.
| Scope | Git Commands | When to Use |
|---|---|---|
| Feature branch | `git log main..HEAD --oneline` + `git diff main...HEAD` | User is on a non-main branch (default) |
| Last N commits | `git log --oneline -N` + `git diff HEAD~N..HEAD` | User specifies a range, or on main (default N=5) |
| Specific commit | `git show <sha>` | User references a specific commit |
| Working changes | `git diff` + `git diff --cached` | User says "what about these changes?" before committing |
Default behavior:
- If on a feature branch: analyze branch commits vs main
- If on main: analyze the last 5 commits
- If the user provides a different scope, use that
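The default-scope rules above can be sketched as a small shell helper. This is an illustrative sketch, not part of the skill itself: the function name `determine_scope` is invented here, and it assumes the mainline branch is literally named `main`.

```shell
#!/bin/sh
# Pick the default diff range from the current branch, per the Phase 1 rules.
# Assumption: the mainline branch is named "main".
determine_scope() {
  if [ "$1" = "main" ]; then
    echo "HEAD~5..HEAD"   # on main: fall back to the last 5 commits
  else
    echo "main...HEAD"    # feature branch: diff against the merge base with main
  fi
}

# Only query git when actually inside a repository.
if git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
  branch=$(git rev-parse --abbrev-ref HEAD)
  echo "Analyzing scope: $(determine_scope "$branch")"
fi
```

Note the dot conventions from the table: `git log` conventionally takes the two-dot form (`main..HEAD`, commits on the branch only), while `git diff` uses the three-dot form (`main...HEAD`, changes since the merge base). The sketch returns the diff range only.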
## Phase 2: Gather Changes
- Run `git log` with the determined scope to get the commit list and messages
- Run `git diff` for the full diff of the scope
- If the diff is large (>500 lines), use `git diff --stat` first, then selectively read the top 3-5 most-changed files
- Read commit messages carefully -- they contain intent that raw diffs miss
- Only read changed files. Do not read the entire repo.
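The gathering steps above, including the large-diff fallback, might look like the following sketch. The 500-line threshold comes from the list above; the helper `diff_mode` and the default scope `main...HEAD` are assumptions for illustration, not part of the skill.

```shell
#!/bin/sh
# Gather changes for a scope, summarising first when the diff is large.
scope="${SCOPE:-main...HEAD}"   # assumed default; see Phase 1

# Decide how to read a diff of a given line count (threshold from Phase 2).
diff_mode() {
  if [ "$1" -gt 500 ]; then echo "stat-first"; else echo "full"; fi
}

# Guard: only run git commands inside a repository where "main" resolves.
if git rev-parse --is-inside-work-tree >/dev/null 2>&1 &&
   git rev-parse --verify -q main >/dev/null 2>&1; then
  git log --oneline "$scope"            # commit list and messages (intent)
  lines=$(git diff "$scope" | wc -l)
  if [ "$(diff_mode "$lines")" = "stat-first" ]; then
    git diff --stat "$scope"            # summary first; then read top 3-5 files
  else
    git diff "$scope"                   # small enough to read in full
  fi
fi
```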
## Phase 3: Analyze
Identify the dominant pattern -- the single most instructive thing about these changes.
Look for:
- Structural decisions -- How was the code organized? Why those boundaries?
- Trade-offs made -- What was gained vs. sacrificed? (readability vs. performance, DRY vs. clarity, speed vs. correctness)
- Problems solved -- What was the before/after? What made the "after" better?
- Missed opportunities -- Where could the code improve? (present gently as "next time, consider...")
Map findings to specific principles from `references/se-principles.md`. Be specific -- quote actual code, reference actual file names and line changes.
## Phase 4: Present the Lesson
Use this template:

```markdown
## Lesson: [Principle Name]

**What happened in the code:**
[2-3 sentences describing the specific change, referencing files and commits]

**The principle at work:**
[1-2 sentences explaining the SE principle]

**Why it matters:**
[1-2 sentences on the practical consequence -- what would go wrong without this, or what goes right because of it]

**Takeaway for next time:**
[One concrete, actionable sentence the user can apply to future work]
```
If there is a second lesson worth noting (maximum 2 additional):

```markdown
---

### Also worth noting: [Principle Name]

**In the code:** [1 sentence]
**The principle:** [1 sentence]
**Takeaway:** [1 sentence]
```
## What NOT to Do
| Avoid | Why | Instead |
|---|---|---|
| Listing every principle that vaguely applies | Overwhelming and generic | Pick the 1-2 most relevant |
| Analyzing files that were not changed | Scope creep | Stick to the diff |
| Ignoring commit messages | They contain intent that diffs miss | Read them as primary context |
| Abstract advice disconnected from the code | Not actionable | Always reference specific files/lines |
| Negative-only feedback | Demoralizing | Lead with what works, then suggest improvements |
| More than 3 lessons | Dilutes the insight | One well-grounded lesson beats seven vague ones |
## Conversation Style
- Reflective, not prescriptive. Use the user's own code as primary evidence.
- Never say "you should have..." -- instead use "the approach here shows..." or "next time you face this, consider..."
- If the code is good, say so. Not every lesson is about what went wrong. Recognizing good patterns reinforces them.
- If the changes are trivial (a single config tweak, a typo fix), say so honestly rather than forcing a lesson. "These changes are straightforward -- no deep lesson here, just good housekeeping."
- Be specific. Generic advice is worthless. Every claim must point to a concrete code change.
## Source

View on GitHub: https://github.com/softaworks/agent-toolkit/blob/main/skills/lesson-learned/SKILL.md

## Overview
Lesson Learned extracts concrete software engineering takeaways from recent git changes. It ties what the code shows to established principles, turning diffs into actionable lessons rather than generic critique. This helps teams reflect on past work and improve future decisions.
## How This Skill Works
The skill loads a principles reference (se-principles.md), determines the scope of analysis from the git history, gathers the relevant diffs, and identifies a dominant pattern. It then maps findings to specific SE principles and presents them using a structured lesson template.
## When to Use It
- When you want an engineering takeaway from recent work or a specific set of commits
- When you need to answer questions like 'what did I just learn?' or 'what is the lesson here?'
- When you want to map code changes to concrete principles from se-principles.md
- When reviewing a feature branch or last commits to extract actionable insights
- When reflecting on design trade-offs uncovered by diffs and commits
## Quick Start
- Step 1: Load the principles reference first (references/se-principles.md).
- Step 2: Determine scope and gather changes with git log and git diff for the chosen range; read commit messages for intent.
- Step 3: Analyze for a dominant pattern, map to a principle, and present the Lesson using the provided template.
## Best Practices
- Load references/se-principles.md before starting analysis
- Read commit messages carefully and focus on changed files only
- Keep analysis within the actual diff scope; avoid repo-wide speculation
- Quote specific files/lines and reference corresponding commits
- Present findings using the Lesson template and cite the matching principle
## Example Use Cases
- Example 1: Analyzing a feature-branch change set shows a fail-fast improvement in src/api/input_validation.go; Lesson: Fail-fast validation can prevent downstream errors.
- Example 2: Duplicated logic across modules was reduced by a refactor in src/utils/auth.go and src/services/authenticator.go; Lesson: DRY principle improves maintainability but watch for over-abstraction.
- Example 3: Logging verbosity increased in src/server/logging.go with a more configurable logger; Lesson: Balance observability with performance and avoid noisy logs.
- Example 4: A bug fix across multiple tests reveals weak edge-case coverage; Lesson: Invest in testability and targeted tests for boundary conditions.
- Example 5: A performance tweak in hot path code shows profiling before optimization; Lesson: Measure impact before changing logic to avoid premature optimization.