Prioritization Frameworks
Why Prioritization Matters
Every product team has more ideas than capacity. Prioritization is the discipline of deciding what to work on and—equally important—what NOT to work on.
“Strategy is about making choices, trade-offs; it’s about deliberately choosing to be different.” — Michael Porter
Common Prioritization Frameworks
RICE Scoring
RICE = (Reach × Impact × Confidence) / Effort
| Factor | Definition | Scale |
|---|---|---|
| Reach | How many users will this affect in a time period? | Number of users/quarter |
| Impact | How much will it affect each user? | 3=Massive, 2=High, 1=Medium, 0.5=Low, 0.25=Minimal |
| Confidence | How confident are we in our estimates? | 100%=High, 80%=Medium, 50%=Low |
| Effort | How many person-months to complete? | Person-months |
Formula:
RICE Score = (Reach × Impact × Confidence) / Effort
Example:
| Initiative | Reach | Impact | Confidence | Effort | Score |
|---|---|---|---|---|---|
| New dashboard | 500 | 2 | 80% | 3 | 267 |
| API improvement | 100 | 3 | 100% | 1 | 300 |
| Email feature | 2000 | 0.5 | 50% | 2 | 250 |
Best for: Comparing initiatives when you need a quantitative approach
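The example table above can be reproduced with a short helper. This is an illustrative sketch; the function name `rice_score` and the tuple layout are assumptions, not part of any existing tooling:

```python
def rice_score(reach, impact, confidence, effort):
    """RICE = (Reach × Impact × Confidence) / Effort."""
    return reach * impact * confidence / effort

# The three initiatives from the example table (confidence as a fraction).
initiatives = [
    ("New dashboard", 500, 2, 0.80, 3),
    ("API improvement", 100, 3, 1.00, 1),
    ("Email feature", 2000, 0.5, 0.50, 2),
]

# Rank highest score first:
# API improvement (300), New dashboard (267), Email feature (250).
for name, reach, impact, conf, effort in sorted(
    initiatives, key=lambda row: rice_score(*row[1:]), reverse=True
):
    print(f"{name}: {rice_score(reach, impact, conf, effort):.0f}")
```

Note how a large reach (Email feature) does not win on its own: low impact and low confidence pull the score back down.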
MoSCoW Method
Categorize features into four buckets:
| Category | Definition | Rule of Thumb |
|---|---|---|
| Must Have | Critical for launch/success | Without this, the release fails |
| Should Have | Important but not critical | Significant value, can work around |
| Could Have | Nice to have | Only if time/resources allow |
| Won’t Have | Explicitly out of scope | Not this time, maybe later |
Best for: Scope negotiations, release planning, MVP definition
Tips:
- Limit Must Haves to ~60% of capacity
- Be honest about Won’t Haves—clarity helps
- Revisit categorization as you learn more
Kano Model
Categorize features by how they affect customer satisfaction:
Satisfaction
     ▲
     │                    Delighters
     │                  ╱ (Excitement)
     │                ╱
     │              ╱        Performance
     │            ╱        ╱ (Linear)
─────┼──────────╱────────╱──────────────▶ Feature
     │        ╱        ╱                  Fulfillment
     │      ╱        ╱
     │    ╱        ╱
     │  ╱        ╱
     │╱        ╱
     │    Must-Haves
     │    (Basic)
     ▼
| Category | Description | Example |
|---|---|---|
| Must-Haves (Basic) | Expected; absence causes dissatisfaction | Data accuracy, system uptime |
| Performance (Linear) | More is better, linear relationship | Speed, match rates, volume |
| Delighters (Excitement) | Unexpected; presence creates delight | AI-powered insights, proactive alerts |
| Indifferent | No impact on satisfaction | Features users don’t care about |
| Reverse | Presence causes dissatisfaction | Unwanted complexity |
Best for: Understanding customer perception, balancing feature types
Value vs. Effort Matrix
Simple 2x2 prioritization:
             High Value
                 │
        ┌────────┼────────┐
        │  Do    │  Plan  │
        │  First │  Next  │
        │        │        │
    ────┼────────┼────────┼────▶ High
    Low │        │        │     Effort
        │  Fill  │  Avoid │
        │  In    │  These │
        │        │        │
        └────────┴────────┘
                 │
             Low Value
| Quadrant | Strategy |
|---|---|
| High Value, Low Effort | Do first (quick wins) |
| High Value, High Effort | Plan and execute strategically |
| Low Value, Low Effort | Fill in when capacity allows |
| Low Value, High Effort | Avoid or deprioritize |
Best for: Quick triage, visual communication to stakeholders
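The quadrant table above reduces to two comparisons. A minimal sketch, assuming 1-5 scores with 3 as the illustrative cutoff for "high" (both the scale and thresholds are assumptions you should tune):

```python
def quadrant(value, effort, value_threshold=3, effort_threshold=3):
    """Classify an initiative on the 2x2. Scores and thresholds use an
    assumed 1-5 scale; adjust to whatever scale your team scores on."""
    high_value = value >= value_threshold
    high_effort = effort >= effort_threshold
    if high_value and not high_effort:
        return "Do First"   # quick wins
    if high_value and high_effort:
        return "Plan Next"  # strategic bets
    if not high_value and not high_effort:
        return "Fill In"    # when capacity allows
    return "Avoid"          # low value, high effort

print(quadrant(value=5, effort=2))  # Do First
print(quadrant(value=2, effort=5))  # Avoid
```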
Weighted Scoring
Create a custom scoring model based on your priorities:
Step 1: Define criteria that matter
| Criterion | Weight |
|---|---|
| Revenue impact | 30% |
| Strategic alignment | 25% |
| Customer demand | 20% |
| Technical feasibility | 15% |
| Competitive necessity | 10% |
Step 2: Score each initiative (1-5 scale)
Step 3: Calculate weighted score
Score = Σ (Criterion Score × Weight)
Best for: When you have specific strategic priorities to optimize
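The three steps above can be sketched in a few lines. The weights mirror the example table; the dictionary keys and one scored initiative are hypothetical:

```python
# Step 1: criteria and weights from the example table (must sum to 100%).
WEIGHTS = {
    "revenue_impact": 0.30,
    "strategic_alignment": 0.25,
    "customer_demand": 0.20,
    "technical_feasibility": 0.15,
    "competitive_necessity": 0.10,
}

def weighted_score(scores, weights=WEIGHTS):
    """Step 3: Score = Σ (criterion score × weight), scores on a 1-5 scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(scores[criterion] * weight for criterion, weight in weights.items())

# Step 2: scores for one hypothetical initiative.
scores = {
    "revenue_impact": 4,
    "strategic_alignment": 5,
    "customer_demand": 3,
    "technical_feasibility": 4,
    "competitive_necessity": 2,
}
print(f"{weighted_score(scores):.2f}")  # 3.85
```

The assertion on the weights catches a common failure mode: criteria added over time until the weights quietly stop summing to 100%.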
ICE Scoring
Simpler alternative to RICE:
| Factor | Definition | Scale |
|---|---|---|
| Impact | How much will this improve the metric? | 1-10 |
| Confidence | How sure are we? | 1-10 |
| Ease | How easy to implement? | 1-10 |
Formula:
ICE Score = (Impact + Confidence + Ease) / 3
Note: some teams multiply the three factors (Impact × Confidence × Ease) instead of averaging; either works, but pick one convention and apply it consistently so scores stay comparable.
Best for: Quick prioritization, growth experiments
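As a sketch of the averaging convention above (the function name is illustrative):

```python
def ice_score(impact, confidence, ease):
    """Average of three 1-10 scores: (Impact + Confidence + Ease) / 3."""
    return (impact + confidence + ease) / 3

print(ice_score(impact=8, confidence=6, ease=4))  # 6.0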
Opportunity Scoring
Based on customer importance and current satisfaction:
Opportunity Score = Importance + (Importance − Satisfaction), which simplifies to 2 × Importance − Satisfaction
Where:
- Importance: How important is this to users? (1-10)
- Satisfaction: How satisfied are they with current solution? (1-10)
Interpretation:
- High importance + Low satisfaction = Big opportunity
- High importance + High satisfaction = Maintain (table stakes)
- Low importance = Deprioritize
Best for: Customer-centric prioritization, identifying gaps
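The interpretation table above follows directly from the formula. A minimal sketch using the document's unclamped version (some variants floor the Importance − Satisfaction gap at zero; names are illustrative):

```python
def opportunity_score(importance, satisfaction):
    """Importance + (Importance - Satisfaction), both on a 1-10 scale."""
    return importance + (importance - satisfaction)

print(opportunity_score(importance=9, satisfaction=3))  # 15 -> big opportunity
print(opportunity_score(importance=9, satisfaction=9))  # 9  -> maintain (table stakes)
print(opportunity_score(importance=2, satisfaction=5))  # -1 -> deprioritize
```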
Prioritization in Practice
Combining Frameworks
No single framework is perfect. Many teams combine approaches:
- Strategic filter first: Does it align with company/product strategy?
- Qualitative triage: MoSCoW or Value/Effort matrix for initial sorting
- Quantitative scoring: RICE or weighted scoring for detailed comparison
- Stakeholder input: Incorporate business constraints and context
Avoiding Common Mistakes
| Mistake | Problem | Solution |
|---|---|---|
| HiPPO (Highest Paid Person’s Opinion) | Loudest voice wins | Use data and frameworks |
| Recency bias | Latest request gets priority | Maintain backlog discipline |
| Squeaky wheel | Whoever complains most wins | Balance all customer input |
| Scope creep | Everything becomes a Must Have | Enforce the Won’t Have category |
| Analysis paralysis | Too much scoring, no action | Timebox prioritization |
| Ignoring tech debt | Short-term thinking | Reserve capacity for maintenance |
The Prioritization Meeting
Preparation:
- Pre-score items individually
- Gather supporting data
- Know your capacity constraints
Meeting flow:
- Review criteria and goals
- Discuss outliers (high/low scores)
- Debate disagreements
- Align on final priority order
- Commit to what’s in/out
Outputs:
- Prioritized backlog
- Clear rationale for decisions
- Stakeholder alignment
Path2Response Context
Prioritization Criteria for P2R
| Criterion | Why It Matters |
|---|---|
| Revenue impact | Direct business value |
| Client retention | 97% retention is a key metric |
| Competitive necessity | Stay ahead of alternatives |
| Data differentiation | Core value proposition |
| Operational efficiency | Internal tools matter too |
| Compliance/security | Non-negotiable requirements |
Balancing Stakeholders
P2R serves multiple customer types with different needs:
| Stakeholder | Priority Lens |
|---|---|
| Nonprofits | Donor acquisition, cost per donor |
| Agencies | Client results, margins, ease of use |
| Brands | Customer acquisition, integration |
| Internal | Operational efficiency, data quality |
Technical Debt Allocation
Reserve capacity for non-feature work:
- Platform reliability
- Data quality improvements
- Security and compliance
- Performance optimization
Recommendation: 20-30% of capacity for technical health