
docs-quality

npx machina-cli add skill rana/yogananda-skills/docs-quality --openclaw
Files (1)
SKILL.md
2.3 KB

Read all project markdown documents to ground in the project's actual state.

Documentation Quality Review

Evaluate documentation as communication architecture:

  1. AI readability — Can Claude navigate and understand the docs efficiently? Clear identifiers? Consistent structure? Unambiguous cross-references?
  2. Onboarding — Could a new developer (human or AI) understand the project from the docs alone? Reading order clear? Dependencies between documents explicit?
  3. Staleness — Any sections that describe what was rather than what is? Obsolete references? Dead links?
  4. Convention adherence — Document maintenance rules being followed? Identifier conventions consistent? Section headers follow stated patterns?
  5. Completeness — Every significant decision documented? Every component described? Every workflow specified?
  6. Redundancy & reconstructibility — Information duplicated across documents? Single source of truth maintained? Beyond duplication: sections that restate what's already obvious from the code or domain conventions — documentation that exists because someone wrote it, not because a reader needs it.
  7. Novel approaches — Are there documentation practices that would improve communication? Anything worth borrowing from other projects or emerging standards?

Focus area: $ARGUMENTS

For every finding:

  1. The issue or opportunity
  2. The specific location
  3. The proposed improvement

Present as an action list. No changes to files — document only.
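
For illustration, a single entry in that action list might take this shape (the file path and issue here are hypothetical, not drawn from any real project):

```
1. Issue: Stale reference — the deployment section still describes a workflow that was replaced.
   Location: docs/deploy.md, section "Release process"
   Improvement: Replace the outdated steps with the current process, or mark the section as historical.
```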

Output Management

Hard constraints:

  • Segment output into groups of up to 8 findings, ordered by impact on reader comprehension.
  • Write each segment incrementally. Do not accumulate a single large response.
  • After completing each segment, continue immediately to the next. Do not wait for user input.
  • Continue until ALL findings are reported. State the total count when complete.
  • If the analysis surface is too large to complete in one session, state what was covered and what remains.
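
The segmentation rule above can be sketched as a small helper. This is a minimal illustration, not part of the skill itself; it assumes each finding carries a numeric impact score, which is an assumption of this sketch:

```python
def segment_findings(findings, size=8):
    """Order findings by impact (highest first) and split into groups of up to `size`."""
    ordered = sorted(findings, key=lambda f: f["impact"], reverse=True)
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

# Example: 20 findings with assumed impact scores split into segments of 8, 8, and 4.
findings = [{"id": n, "impact": n % 5} for n in range(20)]
segments = segment_findings(findings)
```

Each segment is then written out before moving to the next, which keeps any one response small while still covering every finding.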

What questions would I benefit from asking?

What am I not asking?

Source

git clone https://github.com/rana/yogananda-skills.git

The skill file is skills/docs-quality/SKILL.md within the cloned repository.

Overview

Docs-quality evaluates documentation as communication architecture to ensure AI readers (Claude), human developers, and stakeholders can understand and use the project effectively. It covers AI readability, onboarding clarity, staleness, convention adherence, completeness, redundancy, and novel practices, delivering actionable findings scoped to the specified focus area.

How This Skill Works

Begin by reading all project Markdown documents to ground in the actual state. Assess each document against seven criteria: AI readability, onboarding, staleness, convention adherence, completeness, redundancy/reconstructibility, and novel approaches. For every finding, specify the issue, exact location, and the proposed improvement, then group findings into segments of up to eight items ordered by impact.

When to Use It

  • Before onboarding a new AI agent (e.g., Claude) to ensure docs are navigable and comprehensible.
  • During onboarding of new human developers to verify the docs make reading order and inter-document dependencies clear.
  • Before releases or doc updates to catch staleness, dead links, or obsolete references.
  • During doc-maintenance reviews to check adherence to conventions and completeness.
  • When benchmarking doc quality against other projects to inform novel approaches.
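
As one concrete aid for the staleness check above, link targets can be pulled out of a document so dead ones can be verified. This is a sketch, not part of the skill, and it assumes only standard inline `[text](target)` Markdown links:

```python
import re

# Matches inline Markdown links: [link text](target)
LINK_RE = re.compile(r"\[([^\]]*)\]\(([^)\s]+)\)")

def extract_links(markdown_text):
    """Return (text, target) pairs for every inline Markdown link."""
    return LINK_RE.findall(markdown_text)

doc = "See [setup](docs/setup.md) and the [API](https://example.com/api)."
for text, target in extract_links(doc):
    print(text, "->", target)
```

Reference-style links, autolinks, and HTML anchors would need additional handling; the point is only that a findings pass can be grounded in an explicit list of targets rather than eyeballing.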

Quick Start

  1. Open the repository and set the focus area via $ARGUMENTS.
  2. Read all project Markdown docs to ground in the project's actual state.
  3. Report findings as action-item segments (up to 8 each), each with issue, location, and proposed improvement, ordered by impact.

Best Practices

  • Ground each finding in real doc sections or files.
  • Start with high-impact issues that block understanding.
  • Capture precise locations (file path, heading, and context) for each finding.
  • Keep recommendations concrete and actionable.
  • When possible, reference the project's existing style and conventions to avoid contradictory changes.

Example Use Cases

  • Cross-references to removed modules that break navigation.
  • Obsolete API usage docs that describe deprecated features.
  • Inconsistent heading levels across sections, hindering scanning.
  • Missing architectural decisions or rationale for key components.
  • Redundant introductory paragraphs that re-state code instead of guiding readers.
