
drift-detect

npx machina-cli add skill rana/yogananda-skills/drift-detect --openclaw

Read all project markdown documents to understand the stated architecture.

Architectural Drift Detection

Compare intent vs. reality:

  1. Stated patterns — What does the documentation say the architecture should look like? Design patterns, module boundaries, data flow, naming conventions, layering.
  2. Actual patterns — Sample the codebase. What patterns are actually in use? Are there emergent conventions that nobody decided on?
  3. Divergence vectors — Where has the code drifted from the stated architecture? Categorize each drift:
    • Accidental — Small PRs that individually made sense but collectively changed the architecture
    • Intentional but undocumented — Deliberate improvements that were never reflected in docs
    • Contradictory — Multiple conflicting patterns coexisting in the same codebase
  4. Emerging patterns — Are there new patterns forming that should be codified? Conventions that most code follows but no document specifies?
  5. Erosion points — Where is the architecture actively eroding? Module boundaries being crossed? Layers being bypassed?
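
The boundary-crossing check in step 5 can be sketched mechanically. Below is a minimal, hedged example of one such pass: flagging imports that reach "upward" through a declared layering. The layer names and the no-upward-imports rule are hypothetical illustrations, not something the skill prescribes.

```python
import ast

# Stated architecture (normally extracted from the project docs).
# These layer names are an assumption for illustration only.
LAYER_ORDER = ["domain", "services", "api"]  # lowest to highest

def layer_of(module: str):
    """Map a dotted module path to its layer, if any."""
    top = module.split(".")[0]
    return top if top in LAYER_ORDER else None

def find_layer_violations(source: str, filename: str) -> list:
    """Return findings for imports that reach upward from a lower layer."""
    findings = []
    file_layer = layer_of(filename.replace("/", "."))
    if file_layer is None:
        return findings
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom) and node.module:
            names = [node.module]
        else:
            continue
        for name in names:
            target = layer_of(name)
            if target and LAYER_ORDER.index(target) > LAYER_ORDER.index(file_layer):
                findings.append(
                    f"{filename}: layer '{file_layer}' imports upward "
                    f"from '{target}' ({name})"
                )
    return findings
```

A file in `domain/` that imports from `api` or `services` would produce one finding per offending import; files outside the declared layers are skipped.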

Focus area: $ARGUMENTS

For every drift detected:

  1. What the stated pattern is (with doc location)
  2. What the actual pattern is (with code examples)
  3. Whether to update the code or update the docs
  4. Priority (structural risk vs. cosmetic inconsistency)

Present as an action list. No changes to files — document only.
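
The four-field entry above can be modeled as a small record. A hedged sketch follows; the field names and example values are illustrative, not mandated by the skill.

```python
from dataclasses import dataclass

@dataclass
class DriftFinding:
    stated: str      # stated pattern, with doc location
    actual: str      # actual pattern, with code example reference
    resolution: str  # "update code" or "update docs"
    priority: str    # "structural" or "cosmetic"

# Hypothetical example entry
finding = DriftFinding(
    stated="docs/arch.md:12 - services call repositories, never the ORM",
    actual="services/billing.py:40 - queries the ORM session directly",
    resolution="update code",
    priority="structural",
)
```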

Output Management

Hard constraints:

  • Segment output into groups of up to 8 drift findings, prioritizing structural impact over cosmetic inconsistency.
  • Write each segment incrementally. Do not accumulate a single large response.
  • After completing each segment, continue immediately to the next. Do not wait for user input.
  • Continue until ALL drift findings are reported. State the total count when complete.
  • If the analysis surface is too large to complete in one session, state what was covered and what remains.
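
Taken together, the constraints above amount to a simple chunking rule. A minimal sketch, assuming each finding carries a boolean `structural` flag (an illustrative field, not part of the skill's contract):

```python
def segment_findings(findings, size=8):
    """Yield lists of at most `size` findings, structural issues first."""
    # sorted() is stable, so relative order within each priority is kept
    ordered = sorted(findings, key=lambda f: f["structural"], reverse=True)
    for i in range(0, len(ordered), size):
        yield ordered[i:i + size]
```

Ten findings would yield two segments of 8 and 2, with every structural finding appearing before any cosmetic one.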

What questions would I benefit from asking?

What am I not asking?

Source

git clone https://github.com/rana/yogananda-skills

View on GitHub: https://github.com/rana/yogananda-skills/blob/main/skills/drift-detect/SKILL.md

Overview

Drift-detect scans the project to surface unconscious architectural evolution. It compares the stated architecture in documentation against emergent patterns in the code, surfaces drift, and guides whether to update the docs or the implementation. This helps keep architecture coherent and reduces erosion over time.

How This Skill Works

The skill reads all project markdown documents to extract stated patterns, then samples the codebase to identify the patterns actually in use. For each drift it records the doc location, the actual pattern with code examples, and a recommended action plus priority. Results are emitted as incrementally grouped action lists ordered by structural impact.
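
The first pass, extracting stated patterns from the docs, can be sketched as a scan for normative sentences. This is a hedged illustration: the keyword list is an assumption, and a real pass would be far more discerning.

```python
import re

# Words that often signal a stated rule. Illustrative, not exhaustive.
NORMATIVE = re.compile(r"\b(must|should|never|always)\b", re.IGNORECASE)

def stated_patterns(markdown: str, doc_path: str):
    """Return (doc location, sentence) pairs for lines that state a rule."""
    results = []
    for lineno, line in enumerate(markdown.splitlines(), start=1):
        if NORMATIVE.search(line):
            results.append((f"{doc_path}:{lineno}", line.strip()))
    return results
```

Each pair gives the "doc location" that a later drift finding cites alongside the conflicting code example.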

When to Use It

  • Periodic architecture audits to verify alignment between docs and code
  • When the codebase feels inconsistent or divergent from the documented design
  • After major refactors or migrations that could drift patterns
  • Before introducing new cross-cutting concerns or layers to ensure consistent architecture
  • During onboarding or cross-team collaboration to align subsystems

Quick Start

  1. Run drift-detect with an optional focus area ($ARGUMENTS) to scope the analysis
  2. Review the action list, noting the doc location and code examples for each drift
  3. Decide and document whether to update the docs or the code, prioritizing structural risk

Best Practices

  • Scope the analysis with the optional focus area ($ARGUMENTS) to stay precise
  • Document every drift with a clear doc location and a corresponding code example
  • Classify drift as Accidental, Intentional but undocumented, or Contradictory
  • Prioritize findings by structural risk so architectural issues are surfaced first
  • For each drift, decide whether to update docs or update the code and capture the rationale

Example Use Cases

  • Documentation says clean module boundaries but code crosses boundaries in refactors
  • Data flow diagrams show a pattern not reflected in actual controllers
  • Naming conventions in the data layer diverge from the docs
  • Two conflicting patterns exist in the same subsystem
  • Emerging patterns in service interactions not yet codified
