Validate Architecture Model
Run the validation script and interpret results for the user.
Workflow
- Run: python scripts/validate.py
- Parse the output
- Summarize for the user:
  - Total elements in registry
  - Validation errors (elements in diagrams but not registered)
  - Orphan elements (registered but not in any diagram)
  - Domain maturity scores
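The workflow above can be sketched in Python. Note that the result key names used here (registry_count, errors, orphans, maturity) are illustrative assumptions, not the script's documented schema; check the validator's actual output before relying on them.

```python
# Sketch of the validate workflow. The JSON key names below
# (registry_count, errors, orphans, maturity) are assumptions for
# illustration, not the script's documented schema.
import json
import subprocess

def format_summary(results: dict) -> str:
    """Render the per-user summary described in the workflow."""
    lines = [
        f"Registry entries: {results.get('registry_count', 0)}",
        f"Validation errors: {len(results.get('errors', []))}",
        f"Orphan elements: {len(results.get('orphans', []))}",
    ]
    for domain, score in results.get("maturity", {}).items():
        lines.append(f"Maturity [{domain}]: {score}")
    return "\n".join(lines)

if __name__ == "__main__":
    # Run the validator in JSON mode and print the summary.
    proc = subprocess.run(
        ["python", "scripts/validate.py", "--format", "json"],
        capture_output=True, text=True, check=True,
    )
    print(format_summary(json.loads(proc.stdout)))
```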
Response Format
**Validation Results**
| Metric | Count |
|--------|-------|
| Registry entries | X |
| Validation errors | X |
| Orphan elements | X |
**Errors (if any):**
[List each error with file location]
**Orphans by layer:**
[Group orphans by ArchiMate layer]
**Recommendations:**
[Suggest fixes for errors, note that orphans are expected for incomplete models]
Notes
- Orphan elements are normal for incomplete models; don't alarm the user
- Validation errors are more serious: they indicate that diagrams reference unregistered elements
- If the user asks for JSON output, run: python scripts/validate.py --format json
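A small wrapper for invoking the script could look like the sketch below. Only the command and the --format json flag come from this skill; the helper names and the assumption that the script prints valid JSON to stdout are mine.

```python
# Hedged wrapper for the validator's JSON output mode. Only the command
# and the --format json flag come from the skill; the rest is a sketch.
import json
import subprocess

def validate_command(json_output: bool = False) -> list[str]:
    """Build the validator command line described in the notes above."""
    cmd = ["python", "scripts/validate.py"]
    if json_output:
        cmd += ["--format", "json"]
    return cmd

def run_validation(json_output: bool = True) -> dict:
    """Run the validator and decode its report.

    Assumes the script writes a JSON document to stdout in JSON mode.
    """
    proc = subprocess.run(validate_command(json_output),
                          capture_output=True, text=True, check=True)
    return json.loads(proc.stdout)
```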
Source
https://github.com/ea-toolkit/architecture-catalog/blob/main/.claude/skills/validate/SKILL.md
Overview
Runs the architecture validation script to compare registry entries against diagrams, reporting orphans, errors, and domain maturity scores. This keeps the registry and diagrams consistent and highlights gaps for remediation.
How This Skill Works
Execute python scripts/validate.py to perform the checks, then parse the output to extract metrics such as total registry entries, validation errors, orphan elements, and domain maturity scores. For automation, use python scripts/validate.py --format json to obtain machine-readable results.
When to Use It
- Before sharing or auditing your architecture model to confirm consistency.
- When diagrams reference elements not present in the registry (validation errors).
- When identifying elements registered but not included in any diagram (orphans).
- For ongoing health checks and to track domain maturity scores over time.
- When you need a machine-readable report for dashboards or CI pipelines.
Quick Start
- Step 1: Run python scripts/validate.py to generate results.
- Step 2: (Optional) Run python scripts/validate.py --format json to obtain machine-readable output.
- Step 3: Review the Errors and Orphans sections and update the registry or diagrams accordingly.
Best Practices
- Run validation after every major diagram update.
- Review 'Errors' with file locations to quickly locate source issues.
- Group 'Orphans by layer' to prioritize remediation.
- Keep the registry synchronized with diagrams; resolve orphans that are unintended.
- Use domain maturity scores to drive modeling improvements and gating.
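The gating practice above could be sketched as a simple pass/fail check over the parsed report. The key names ("errors", "maturity") and the 0.5 maturity floor are assumptions chosen for illustration, not values defined by the validation script.

```python
# Sketch of a CI gate over the validation report. The key names
# ("errors", "maturity") and the 0.5 maturity threshold are
# assumptions, not values defined by the validation script.
def gate(results: dict, min_maturity: float = 0.5) -> bool:
    """Fail on any validation error or any domain below the maturity floor.

    Orphans are deliberately ignored: per the notes above, they are
    normal for incomplete models.
    """
    if results.get("errors"):
        return False
    return all(score >= min_maturity
               for score in results.get("maturity", {}).values())
```

In a pipeline, a falsy result would translate to a nonzero exit code so the build fails before inconsistent diagrams are shared.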
Example Use Cases
- Total elements in registry: 120; Validation errors: 6; Orphan elements: 14; Domain maturity scores indicate gaps in the 'Implementation' layer.
- Orphans grouped by layer: 9 in Technology, 5 in Business, 0 in Motivation; remediation planned by priority.
- CI integration: Running validate in CI outputs JSON payload that feeds a dashboard.
- Validation errors point to diagrams referencing unregistered elements; after registration updates, errors drop to zero.
- Ongoing health check reduces orphan counts from 20 to 8 over 3 sprints with registry-diagram alignment.