sdd-clarify
`npx machina-cli add skill rubenzarroca/sdd-plugin/sdd-clarify`
`openclaw/sdd:clarify` — Refine a feature specification
You are refining an existing feature specification by identifying gaps and asking clarification questions. This is the refinement step after /sdd:specify. Follow these steps exactly, in order. Do NOT skip steps. Do NOT read source code — only spec.md, constitution.md, state.json, and (if present) specs/prd.md and .sdd/learnings.md.
Coaching Layer
Clarify is one of the most important teaching moments in SDD — the user learns to think rigorously about their own spec. But without framing, it can feel like an interrogation. Claude must set the right tone.
Rules:
- Normalize gaps — calibrated. Read `completed_features`. If `0`, orient the user before the first question: "All first-draft specs have gaps — finding them is exactly what this step is for. I'll ask you a few questions about things that weren't clear or scenarios that weren't covered. For each one, I'll explain why it matters." If `1+`, skip the normalization — the user knows the drill.
- Name the gap type. Before each question, state the gap type (ambiguity, assumption, edge case, or structural gap). Check the `.sdd/state.json` field `milestones.gap_taxonomy_explained`. If `false`, explain each type briefly the first time it appears and set the milestone to `true` after the session. On subsequent sessions, name the type without re-explaining — the user already knows the taxonomy.
- Teach prevention, not just detection. After the user answers, add a brief pattern annotation as a coaching note. Example: "Tip: whenever your spec references external data, define where it lives and what happens when it's unavailable — this prevents guesswork during implementation."
- Acknowledge that finding gaps is positive. The message "your spec is incomplete" can feel like failure. Reframe: finding gaps now prevents bugs later.
- Track coaching for fade. Read the `.sdd/state.json` field `coaching_profile` at the start of the session. For each gap found during analysis, increment `scaffolded` for the relevant category (e.g., `edge_cases` if you found a missing edge case, `data_models` if you found undefined entities). For each area where the spec was already solid and needed no clarification, increment `unscaffolded` for that category — but cap `unscaffolded` increments at one per category per session to prevent a single well-written spec from disproportionately boosting competence scores. Update `coaching_profile` in state.json when updating state in Step 7.
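The bookkeeping above can be sketched in Python. The field names (`scaffolded`, `unscaffolded`) follow this section; the per-entry dict shape is an assumption, since the document does not show the `coaching_profile` schema directly:

```python
def update_coaching_profile(profile: dict, gaps_found: list[str], solid_areas: list[str]) -> dict:
    """Apply the scaffolded/unscaffolded bookkeeping for one clarify session.

    profile maps category -> {"scaffolded": int, "unscaffolded": int} (assumed shape).
    gaps_found lists the category of each gap surfaced during analysis;
    solid_areas lists categories that needed no clarification.
    """
    for category in gaps_found:
        entry = profile.setdefault(category, {"scaffolded": 0, "unscaffolded": 0})
        entry["scaffolded"] += 1  # one increment per gap found

    # Cap unscaffolded at one increment per category per session,
    # so one well-written spec cannot inflate competence scores.
    for category in set(solid_areas):
        entry = profile.setdefault(category, {"scaffolded": 0, "unscaffolded": 0})
        entry["unscaffolded"] += 1
    return profile
```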
Step 1: Read and validate state
Read .sdd/state.json. Parse the feature name from $ARGUMENTS. If no feature name is provided, use active_feature from state.json. If there is no active feature either, list all available features with their current states and ask the user which one to clarify.
Verify the feature exists and is in state specified or clarified. If the feature is in a different state:
- If in `drafting`: tell the user the feature is still in drafting state and they need to run `/sdd:specify` first.
- If in `planned`, `tasked`, `implementing`, `validating`, or `completed`: tell the user the feature has already progressed beyond clarification (current state: `{state}`) and clarifying at this point is not allowed. They would need to re-specify the feature with `/sdd:specify` to start over.
Do NOT proceed unless the feature is in specified or clarified state.
Step 2: Read the spec
Read specs/{feature-name}/spec.md. This is the specification generated by /sdd:specify that you will be refining.
Step 3: Read constitution
Read constitution.md for project context. The constitution's principles should inform what questions you ask — for example, if the constitution mandates specific security patterns, check whether the spec addresses them.
Do NOT read any other files. Do NOT read source code.
Step 3b: Read PRD (if available)
Read specs/prd.md if it exists. Use it as additional context to validate that the spec aligns with the product vision and module boundaries defined in the PRD. Do NOT fail if no PRD exists.
Step 3c: Read project learnings
Read .sdd/learnings.md if it exists. Past retro insights inform what to look for during gap analysis. If previous retros flagged recurring patterns (e.g., "specs keep missing error handling for external APIs"), prioritize checking those areas. Do NOT fail if the file doesn't exist.
Step 4: Analyze the spec for gaps
Analyze the spec across these dimensions:
4a. Structural completeness (11-section check)
Verify the spec covers all required sections. For each, check:
- Data Models (§8): Are all referenced entities defined with fields, types, and relationships? Are there entities mentioned in requirements that don't appear in the data model?
- API Contracts (§9): Are endpoints defined with request/response payloads and error codes? Are there interactions described in requirements that lack a contract?
- Edge Cases (§10): Are there at least 3 edge cases? Do they cover dirty data, external failures, and boundary values? Each must have an explicit expected behavior.
- Open Questions (§11): Are there unresolved questions that would block implementation? Do they have owners and deadlines?
4b. Requirement quality
- Functional Requirements: Does each FR have an ID (FR-001, FR-002...)? Is each specific enough to write a test for?
- Non-Functional Requirements: Does each NFR have an ID (NFR-001, NFR-002...)? Is each quantified with a measurable threshold (not "must be fast" but "P95 < 200ms")?
- Edge Cases: Does each EC have an ID (EC-001, EC-002...)?
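The ID checks above can be sketched with simple pattern matching — the three-digit ID formats follow the examples in this section (FR-001, NFR-001, EC-001):

```python
import re

ID_PATTERNS = {
    "FR": re.compile(r"\bFR-\d{3}\b"),    # functional requirements
    "NFR": re.compile(r"\bNFR-\d{3}\b"),  # non-functional requirements
    "EC": re.compile(r"\bEC-\d{3}\b"),    # edge cases
}

def find_ids(spec_text: str) -> dict[str, list[str]]:
    """Collect all requirement IDs present in the spec, grouped by kind."""
    return {kind: pat.findall(spec_text) for kind, pat in ID_PATTERNS.items()}
```

Note the `\b` before `FR` keeps the FR pattern from matching inside an NFR ID. This only detects missing IDs; judging whether each requirement is testable or quantified still requires reading the prose.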
4c. Traditional gap analysis
- Ambiguities: terms with multiple possible interpretations, unclear behavior descriptions, vague requirements that could be implemented in conflicting ways.
- Unvalidated assumptions: implicit decisions baked into the spec that were never explicitly confirmed by the user. For example, the spec might assume a specific data format, a particular user flow, or a technology choice without stating why.
- Edge cases: what happens when input is empty, exceeds limits, fails validation, conflicts with existing data, or encounters network/service failures. Look for missing error handling paths and boundary conditions.
4d. Coaching notes
If the spec was generated with the coaching layer, some sections might contain inline notes like [coaching: user accepted P95 < 500ms threshold]. These are informational and should be preserved — they help trace why decisions were made. Do NOT flag them as gaps.
Prioritize questions by impact: ask about gaps that could lead to the biggest implementation problems first.
Step 5: Ask questions one at a time
Present questions one at a time, sequentially. Do NOT batch multiple questions together. Do NOT present a numbered list of all questions at once.
Use AskUserQuestionTool when the clarification has identifiable resolution options. Most ambiguities and edge cases have 2-4 natural resolutions (e.g., "What happens when the service is down?" → "Serve stale data" / "Queue for retry" / "Return error"). Present these as selectable options — the user can always pick "Other" for a custom answer. For truly open-ended gaps where you cannot propose meaningful options, ask conversationally instead.
After the user answers each question, immediately append the clarification to the Clarifications section of specs/{feature-name}/spec.md using the Edit tool. Use this format:
### C-{N}: {Question summary}
**Type:** {ambiguity | assumption | edge case | structural gap}
**Q:** {The question you asked}
**A:** {The user's answer}
**Pattern tip:** {One sentence teaching the user how to avoid this type of gap in future specs. Example: "When referencing external data sources, always define fallback behavior."}
Pattern tip fade: Read coaching_profile for the gap's category. If the category has unscaffolded >= 3, omit the **Pattern tip:** line. Pattern tips teach prevention — a distinct skill from generating good content (which specify's coaching measures). Use a higher threshold (3 instead of 2) because competence in writing edge cases does not automatically mean competence in preventing edge case gaps.
Where {N} is the sequential clarification number. If the spec already has clarifications from a previous run (e.g., C-1 through C-3 already exist), continue numbering from where it left off (e.g., start at C-4).
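The numbering continuation and the pattern-tip fade can be sketched together — the `### C-{N}:` heading format and the `unscaffolded >= 3` threshold are both taken from this section:

```python
import re

def next_clarification_number(spec_text: str) -> int:
    """Continue numbering from existing '### C-{N}:' headings, or start at 1."""
    existing = [int(n) for n in re.findall(r"^### C-(\d+):", spec_text, flags=re.MULTILINE)]
    return max(existing, default=0) + 1

def show_pattern_tip(coaching_profile: dict, category: str) -> bool:
    """Omit the pattern tip once the category's unscaffolded count reaches 3."""
    return coaching_profile.get(category, {}).get("unscaffolded", 0) < 3
```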
Then ask the next question, wait for the answer, append it, and repeat until no more gaps remain.
Step 6: Wrap up
When you have no more gaps to ask about, ask the user:
"Are there any other aspects you'd like to clarify?"
If the user has additional concerns, address them one at a time following the same pattern from Step 5 (ask, wait, append).
If the user says no (or equivalent), present a brief coaching summary that aggregates the pattern tips from this session:
"We resolved [N] clarifications in this spec. Here are the key patterns to keep in mind for future specs:
- [Aggregated pattern tip 1]
- [Aggregated pattern tip 2]"
If this is the user's second or later clarify session, compare with previous sessions and make the growth visible: "Last time we found [X] gaps — this time [Y]. [If fewer: Your specs are getting tighter, especially around [specific area].] [If similar but different types: These are different kinds of gaps than last time, which means the patterns from before stuck.]" This makes the user feel the system tracking their progress personally.
Then proceed to Step 7.
Step 7: Update state.json
Read .sdd/state.json again (to avoid stale data). Update it:
- Validate the transition: check `allowed_transitions` in state.json to confirm that the feature's current state allows transitioning to `"clarified"`. If the transition is not listed, warn the user and do not proceed.
- Set the feature's state to `clarified`.
- Append a transition record to the feature's `transitions` array:

```json
{
  "from": "{previous state — 'specified' or 'clarified'}",
  "to": "clarified",
  "at": "{ISO 8601 timestamp}",
  "command": "sdd-clarify"
}
```

- Ensure `active_feature` is set to the feature name.
Write the updated state.json.
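Step 7 can be sketched as an in-memory update — the `features` map and an `allowed_transitions` dict keyed by current state are assumed shapes for illustration, since the document does not show the full state.json schema:

```python
from datetime import datetime, timezone

def mark_clarified(state: dict, feature_name: str) -> dict:
    """Apply the Step 7 updates in memory; the caller writes state.json back."""
    feature = state["features"][feature_name]
    current = feature["state"]

    # Validate the transition against allowed_transitions before mutating anything.
    if "clarified" not in state.get("allowed_transitions", {}).get(current, []):
        raise ValueError(f"Transition {current} -> clarified is not allowed")

    feature["state"] = "clarified"
    feature.setdefault("transitions", []).append({
        "from": current,
        "to": "clarified",
        "at": datetime.now(timezone.utc).isoformat(),  # ISO 8601 timestamp
        "command": "sdd-clarify",
    })
    state["active_feature"] = feature_name
    return state
```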
Confirm to the user that clarification is complete and the feature is now in clarified state.
Restrictions
- Do NOT suggest `/sdd:plan`. Confirm clarification is complete and stop.
- Questions must be asked one at a time, NEVER batched. Ask one question, wait for the answer, append it, then ask the next.
- Context budget: Read ONLY `specs/{feature-name}/spec.md`, `constitution.md`, `.sdd/state.json`, and (if present) `specs/prd.md` and `.sdd/learnings.md`. Do NOT read source code, package manifests, or any other project files.
- Modify spec.md in-place: append to the Clarifications section only. Do NOT change any other sections of the spec.
- If the spec already has clarifications from a previous run, continue numbering from where it left off.
- Do NOT invent answers to your own questions. Every clarification must come from the user.
$ARGUMENTS
Source
https://github.com/rubenzarroca/sdd-plugin/blob/main/skills/sdd-clarify/SKILL.md
Overview
SDD-clarify refines an existing feature spec by identifying ambiguities, unvalidated assumptions, missing edge cases, and structural gaps. It is the refinement step that follows /sdd:specify and uses a structured questioning process to tighten the spec before planning.
How This Skill Works
It validates the feature is in specified or clarified state and reads specs/{feature-name}/spec.md and constitution.md. It then labels each missing or unclear area by gap type (ambiguity, assumption, edge case, structural), asks targeted clarifying questions, adds coaching notes after responses, and updates the state.json coaching_profile with scaffolded/unscaffolded counts.
When to Use It
- User asks to clarify spec or identify ambiguity
- You need to surface what's missing in a feature spec
- Edge cases or failure modes are unclear
- You want to review the spec for structural gaps (data flows, entities, interfaces)
- You’re tightening a spec before planning and implementation
Quick Start
- Step 1: Ensure the feature is in specified or clarified state in state.json and open specs/{feature-name}/spec.md
- Step 2: Read the spec and constitution, identify gaps, categorize by type, and decide if normalization is needed
- Step 3: Ask targeted questions, add coaching notes after responses, and update state.json coaching_profile
Best Practices
- Normalize gaps before asking questions, especially at the first run
- Name the gap type before each question and explain it if it’s the first time
- Ask precise, answerable questions and avoid vague yes/no prompts
- Provide brief coaching notes (pattern annotations) after responses to teach prevention
- Document decisions in the spec and update state coaching metrics
Example Use Cases
- Clarifying a login feature to specify MFA behavior
- Finding ambiguity in an API rate-limiter specification
- Revealing missing data model definitions in a dashboard feature
- Identifying edge cases for a payment checkout flow
- Reviewing a mobile notification feature for platform-specific delivery guarantees