
lesson-learned

npx machina-cli add skill softaworks/agent-toolkit/lesson-learned --openclaw

Lesson Learned

Extract specific, grounded software engineering lessons from actual code changes. Not a lecture -- a mirror. Show the user what their code already demonstrates.

Before You Begin

Load the principles reference first.

  1. Read references/se-principles.md to have the principle catalog available
  2. Optionally read references/anti-patterns.md if you suspect the changes include areas for improvement
  3. Determine the scope of analysis (see Phase 1)

Do not proceed until you've loaded at least se-principles.md.

Phase 1: Determine Scope

Ask the user or infer from context what to analyze.

| Scope | Git Commands | When to Use |
|---|---|---|
| Feature branch | `git log main..HEAD --oneline` + `git diff main...HEAD` | User is on a non-main branch (default) |
| Last N commits | `git log --oneline -N` + `git diff HEAD~N..HEAD` | User specifies a range, or on main (default N=5) |
| Specific commit | `git show <sha>` | User references a specific commit |
| Working changes | `git diff` + `git diff --cached` | User says "what about these changes?" before committing |

Default behavior:

  • If on a feature branch: analyze branch commits vs main
  • If on main: analyze the last 5 commits
  • If the user provides a different scope, use that
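The default behavior above can be sketched as a small decision function. This is an illustrative sketch only -- `pick_scope` is a hypothetical name, not part of the skill -- assuming the current branch name has already been read (e.g. via `git rev-parse --abbrev-ref HEAD`):

```python
def pick_scope(branch, n=5):
    """Return (log_cmd, diff_cmd) following the default scope rules.

    On main/master, analyze the last n commits; otherwise compare
    the feature branch against main.
    """
    if branch in ("main", "master"):
        # On main: last N commits (default N=5)
        return (f"git log --oneline -{n}", f"git diff HEAD~{n}..HEAD")
    # On a feature branch: analyze branch commits vs main
    return ("git log main..HEAD --oneline", "git diff main...HEAD")
```

A user-provided scope would simply bypass this function entirely.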

Phase 2: Gather Changes

  1. Run git log with the determined scope to get the commit list and messages
  2. Run git diff for the full diff of the scope
  3. If the diff is large (>500 lines), use git diff --stat first, then selectively read the top 3-5 most-changed files
  4. Read commit messages carefully -- they contain intent that raw diffs miss
  5. Only read changed files. Do not read the entire repo.
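The large-diff fallback in step 3 can be expressed as a small helper. A minimal sketch, assuming `stat` is a list of `(filename, changed_lines)` pairs parsed from `git diff --stat`; the name `files_to_read` is hypothetical:

```python
def files_to_read(stat, total_lines, limit=500, top=5):
    """Decide which changed files to read.

    If the whole diff fits under `limit` lines, read every changed file.
    Otherwise, read only the `top` most-changed files.
    """
    if total_lines <= limit:
        return [name for name, _ in stat]
    # Large diff: rank by churn and keep the most-changed files
    ranked = sorted(stat, key=lambda pair: pair[1], reverse=True)
    return [name for name, _ in ranked[:top]]
```

Either way, the analysis stays within the diff -- the helper never returns a file that was not changed.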

Phase 3: Analyze

Identify the dominant pattern -- the single most instructive thing about these changes.

Look for:

  • Structural decisions -- How was the code organized? Why those boundaries?
  • Trade-offs made -- What was gained vs. sacrificed? (readability vs. performance, DRY vs. clarity, speed vs. correctness)
  • Problems solved -- What was the before/after? What made the "after" better?
  • Missed opportunities -- Where could the code improve? (present gently as "next time, consider...")

Map findings to specific principles from references/se-principles.md. Be specific -- quote actual code, reference actual file names and line changes.
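As a crude illustration of mapping changes to principles, a keyword-scoring heuristic might look like the following. This is only a sketch -- the real mapping is a judgment call made while reading the diff, and `rank_principles` and its keyword map are invented for this example:

```python
def rank_principles(diff_text, keyword_map):
    """Score each principle by keyword hits in the diff; return the best match.

    keyword_map: {principle_name: [keywords...]}
    Returns None when nothing matches, rather than forcing a lesson.
    """
    scores = {
        principle: sum(diff_text.count(kw) for kw in keywords)
        for principle, keywords in keyword_map.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None
```

Note the `None` case: when no principle clearly applies, the honest answer is that the changes carry no deep lesson.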

Phase 4: Present the Lesson

Use this template:

## Lesson: [Principle Name]

**What happened in the code:**
[2-3 sentences describing the specific change, referencing files and commits]

**The principle at work:**
[1-2 sentences explaining the SE principle]

**Why it matters:**
[1-2 sentences on the practical consequence -- what would go wrong without this, or what goes right because of it]

**Takeaway for next time:**
[One concrete, actionable sentence the user can apply to future work]

If there is a second lesson worth noting (maximum 2 additional):

---

### Also worth noting: [Principle Name]

**In the code:** [1 sentence]
**The principle:** [1 sentence]
**Takeaway:** [1 sentence]
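Filling in the primary template is plain string assembly. A minimal sketch (the function name and parameters are illustrative, not part of the skill):

```python
def render_lesson(principle, happened, principle_text, why_it_matters, takeaway):
    """Render the primary lesson template as markdown."""
    return (
        f"## Lesson: {principle}\n\n"
        f"**What happened in the code:**\n{happened}\n\n"
        f"**The principle at work:**\n{principle_text}\n\n"
        f"**Why it matters:**\n{why_it_matters}\n\n"
        f"**Takeaway for next time:**\n{takeaway}\n"
    )
```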

What NOT to Do

| Avoid | Why | Instead |
|---|---|---|
| Listing every principle that vaguely applies | Overwhelming and generic | Pick the 1-2 most relevant |
| Analyzing files that were not changed | Scope creep | Stick to the diff |
| Ignoring commit messages | They contain intent that diffs miss | Read them as primary context |
| Abstract advice disconnected from the code | Not actionable | Always reference specific files/lines |
| Negative-only feedback | Demoralizing | Lead with what works, then suggest improvements |
| More than 3 lessons | Dilutes the insight | One well-grounded lesson beats seven vague ones |

Conversation Style

  • Reflective, not prescriptive. Use the user's own code as primary evidence.
  • Never say "you should have..." -- instead use "the approach here shows..." or "next time you face this, consider..."
  • If the code is good, say so. Not every lesson is about what went wrong. Recognizing good patterns reinforces them.
  • If the changes are trivial (a single config tweak, a typo fix), say so honestly rather than forcing a lesson. "These changes are straightforward -- no deep lesson here, just good housekeeping."
  • Be specific. Generic advice is worthless. Every claim must point to a concrete code change.

Source

git clone https://github.com/softaworks/agent-toolkit

The skill file lives at skills/lesson-learned/SKILL.md in that repository (View on GitHub).

Overview

Lesson Learned extracts concrete software engineering takeaways from recent git changes. It ties what the code shows to established principles, turning diffs into actionable lessons rather than generic critique. This helps teams reflect on past work and improve future decisions.

How This Skill Works

The skill loads a principles reference (se-principles.md), determines the scope of analysis from the git history, gathers the relevant diffs, and identifies a dominant pattern. It then maps findings to specific SE principles and presents them using a structured lesson template.

When to Use It

  • When you want an engineering takeaway from recent work or a specific set of commits
  • When you need to answer questions like "what did I just learn?" or "what is the lesson here?"
  • When you want to map code changes to concrete principles from se-principles.md
  • When reviewing a feature branch or last commits to extract actionable insights
  • When reflecting on design trade-offs uncovered by diffs and commits

Quick Start

  1. Load the principles reference first (references/se-principles.md).
  2. Determine scope and gather changes with git log and git diff for the chosen range; read commit messages for intent.
  3. Analyze for a dominant pattern, map it to a principle, and present the lesson using the provided template.

Best Practices

  • Load references/se-principles.md before starting analysis
  • Read commit messages carefully and focus on changed files only
  • Keep analysis within the actual diff scope; avoid repo-wide speculation
  • Quote specific files/lines and reference corresponding commits
  • Present findings using the Lesson template and cite the matching principle

Example Use Cases

  • Example 1: Analyzing a feature-branch change set shows a fail-fast improvement in src/api/input_validation.go; Lesson: Fail-fast validation can prevent downstream errors.
  • Example 2: Duplicated logic across modules was reduced by a refactor in src/utils/auth.go and src/services/authenticator.go; Lesson: DRY principle improves maintainability but watch for over-abstraction.
  • Example 3: Logging verbosity increased in src/server/logging.go with a more configurable logger; Lesson: Balance observability with performance and avoid noisy logs.
  • Example 4: A bug fix across multiple tests reveals weak edge-case coverage; Lesson: Invest in testability and targeted tests for boundary conditions.
  • Example 5: A performance tweak in hot path code shows profiling before optimization; Lesson: Measure impact before changing logic to avoid premature optimization.
