
prioritization-matrix

npx machina-cli add skill aroyburman-codes/pm-skills/prioritization-matrix --openclaw

Prioritization Matrix Skill

Score, rank, and prioritize a set of features, initiatives, or ideas using structured scoring frameworks.

When to Use

  • User has a list of features and needs to decide what to build first
  • User needs to justify prioritization to stakeholders
  • User says /prioritization-matrix followed by a list of items
  • Any time competing initiatives need to be ranked

Supported Frameworks

RICE (Reach, Impact, Confidence, Effort)

Best for: Growth-focused teams with measurable reach data.

| Factor | How to Score |
| --- | --- |
| Reach | # of users/customers affected per quarter |
| Impact | 0.25 (minimal) / 0.5 (low) / 1 (medium) / 2 (high) / 3 (massive) |
| Confidence | 100% (high) / 80% (medium) / 50% (low) |
| Effort | Person-months of work |

Score = (Reach x Impact x Confidence) / Effort
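The RICE formula can be sketched as a small helper. The dataclass, field names, and example items below are illustrative, not part of the skill itself:

```python
from dataclasses import dataclass

@dataclass
class RiceItem:
    name: str
    reach: float        # users/customers affected per quarter
    impact: float       # 0.25 / 0.5 / 1 / 2 / 3
    confidence: float   # 1.0 (high) / 0.8 (medium) / 0.5 (low)
    effort: float       # person-months of work

    def score(self) -> float:
        # Score = (Reach x Impact x Confidence) / Effort
        return (self.reach * self.impact * self.confidence) / self.effort

items = [
    RiceItem("Onboarding revamp", reach=8000, impact=2, confidence=0.8, effort=4),
    RiceItem("Dark mode", reach=3000, impact=0.5, confidence=1.0, effort=2),
]
for item in sorted(items, key=lambda i: i.score(), reverse=True):
    print(f"{item.name}: {item.score():.0f}")
```

Note that a low confidence value (0.5) halves the score, which is exactly the penalty RICE intends for guesswork.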

ICE (Impact, Confidence, Ease)

Best for: Fast estimation when detailed reach data isn't available.

| Factor | How to Score (1-10) |
| --- | --- |
| Impact | How much will this move the needle? |
| Confidence | How sure are we about impact and effort? |
| Ease | How easy is this to implement? (10 = trivial) |

Score = Impact x Confidence x Ease
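A minimal sketch of the ICE calculation (the example ratings are made up):

```python
def ice_score(impact: int, confidence: int, ease: int) -> int:
    """ICE: each factor is rated 1-10; higher is better."""
    for factor in (impact, confidence, ease):
        assert 1 <= factor <= 10, "ICE factors are scored 1-10"
    return impact * confidence * ease

# Example: a growth experiment rated 7 impact, 6 confidence, 8 ease
print(ice_score(7, 6, 8))  # 336
```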

Weighted Scoring

Best for: Custom criteria that matter to your team.

Define 4-6 criteria with weights (must sum to 100%):

| Criterion | Weight | Description |
| --- | --- | --- |
| Strategic alignment | 25% | How well does this support company goals? |
| User impact | 25% | How much does this improve user experience? |
| Revenue potential | 20% | Direct or indirect revenue impact |
| Technical feasibility | 15% | How complex is implementation? |
| Time sensitivity | 15% | Is there a window of opportunity? |

Score each item 1-5 on each criterion. Weighted score = sum of (score x weight).
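Weighted scoring can be sketched as follows, using the example criteria and weights from the table above (the feature ratings are illustrative):

```python
# Criterion weights as fractions; they must sum to 1.0 (i.e. 100%).
WEIGHTS = {
    "strategic_alignment": 0.25,
    "user_impact": 0.25,
    "revenue_potential": 0.20,
    "technical_feasibility": 0.15,
    "time_sensitivity": 0.15,
}

def weighted_score(scores: dict) -> float:
    """scores maps each criterion to a 1-5 rating; returns sum of (score x weight)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(scores[criterion] * weight for criterion, weight in WEIGHTS.items())

feature = {
    "strategic_alignment": 5,
    "user_impact": 4,
    "revenue_potential": 3,
    "technical_feasibility": 2,
    "time_sensitivity": 4,
}
print(f"{weighted_score(feature):.2f}")  # 3.75, on the same 1-5 scale
```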

Value vs. Effort (2x2)

Best for: Quick visual communication to stakeholders.

Plot items on a 2x2 matrix:

  • Quick Wins (High value, Low effort) → Do first
  • Big Bets (High value, High effort) → Plan carefully
  • Fill-ins (Low value, Low effort) → Do if capacity allows
  • Money Pits (Low value, High effort) → Deprioritize
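The quadrant assignment above can be sketched as a simple classifier. The 1-5 scale and the midpoint cutoffs are assumptions; adjust them to your own scoring scale:

```python
def quadrant(value: float, effort: float,
             value_cutoff: float = 3, effort_cutoff: float = 3) -> str:
    """Classify an item on a value/effort 2x2 (assumes a 1-5 scale; cutoffs are illustrative)."""
    high_value = value >= value_cutoff
    high_effort = effort >= effort_cutoff
    if high_value and not high_effort:
        return "Quick Win"    # do first
    if high_value and high_effort:
        return "Big Bet"      # plan carefully
    if not high_value and not high_effort:
        return "Fill-in"      # do if capacity allows
    return "Money Pit"        # deprioritize

print(quadrant(value=5, effort=1))  # Quick Win
print(quadrant(value=2, effort=5))  # Money Pit
```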

Workflow

Step 1: Clarify

  • What are we prioritizing? (features, bugs, initiatives, experiments)
  • What timeframe? (this sprint, this quarter, this year)
  • What constraints? (team size, dependencies, deadlines)
  • What's the primary goal? (growth, retention, revenue, quality)

Step 2: Choose Framework

Based on the context, recommend the most appropriate framework. If the user doesn't specify, default to RICE for product features and Weighted Scoring for strategic initiatives.

Step 3: Score

For each item:

  • Score on each dimension with reasoning (not just numbers)
  • Flag assumptions and confidence level
  • Note dependencies between items

Step 4: Rank & Recommend

  • Sort by composite score
  • Group into tiers: Must Do / Should Do / Could Do / Won't Do
  • Highlight any items where the score conflicts with your intuition (and explain why)
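One way to mechanize the sort-and-tier part of this step is to split the ranking into rough quartiles. The quartile cutoffs are an assumption (the skill leaves tier boundaries to judgment), and the example scores are made up:

```python
def rank_into_tiers(scored: dict) -> list:
    """Sort items by composite score (descending) and assign MoSCoW-style tiers."""
    ranked = sorted(scored.items(), key=lambda kv: kv[1], reverse=True)
    tiers = ["Must Do", "Should Do", "Could Do", "Won't Do"]
    n = len(ranked)
    result = []
    for position, (name, score) in enumerate(ranked):
        tier = tiers[min(position * 4 // n, 3)]  # rough quartile split
        result.append((name, score, tier))
    return result

for name, score, tier in rank_into_tiers({"A": 320, "B": 75, "C": 180, "D": 40}):
    print(f"{tier}: {name} ({score})")
```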

Step 5: Communicate

Generate a stakeholder-ready summary:

  • Top 3 priorities with one-sentence rationale each
  • What we're NOT doing and why
  • Key assumptions that could change the ranking

Output Format

Generate a clean markdown table with scores, plus a summary paragraph. Include the rationale for the top and bottom items. Flag any close calls where small changes in assumptions would flip the ranking.

AI/ML-Specific Considerations

When prioritizing AI features, also consider:

  • Model readiness: Is the underlying model capable enough?
  • Eval coverage: Do we have evals to measure success?
  • Safety review: Does this need safety/red-team review?
  • Data requirements: Do we have the training/eval data?
  • Cost per query: What's the inference cost impact?

Source

https://github.com/aroyburman-codes/pm-skills/blob/main/skills/prioritization-matrix/SKILL.md

Overview

Score, rank, and prioritize features or initiatives using RICE, ICE, weighted scoring, or custom frameworks. This yields a prioritized backlog with clear rationale, suitable for sprint planning, roadmap decisions, and balancing trade-offs between initiatives.

How This Skill Works

Choose a framework based on context (RICE for growth with measurable reach, ICE for quick estimates, or Weighted Scoring for custom criteria). Score each item on the relevant dimensions, compute a composite score, and rank items into tiers; finally, craft a stakeholder-ready summary with top priorities and assumptions.

When to Use It

  • You have a list of features and need to decide what to build first
  • You need to justify prioritization to stakeholders
  • You say /prioritization-matrix followed by a list of items
  • Competing initiatives need to be ranked
  • You’re planning sprints or roadmaps and balancing trade-offs

Quick Start

  1. Step 1: Clarify priorities, timeframe, constraints
  2. Step 2: Choose a framework (default to RICE for features; Weighted Scoring for initiatives)
  3. Step 3: Score, rank, and prepare a stakeholder-ready summary

Best Practices

  • Clarify scope, timeframe, and constraints before scoring
  • Choose the most appropriate framework for the context
  • Score with reasoning, note assumptions, and mark confidence
  • Flag dependencies and close-call items that could flip rankings
  • Present a stakeholder-ready summary with the top 3 priorities

Example Use Cases

  • Prioritize a product backlog for the next sprint using RICE
  • Justify a roadmap decision with weighted scoring across criteria
  • Rank growth experiments with ICE for quick estimation
  • Balance tech debt versus new features using Weighted Scoring
  • Communicate value vs effort with a 2x2 matrix for quick wins
