Metrics are how we know if we’re succeeding. Without measurement, product decisions are opinions. With measurement, they’re informed choices.
“What gets measured gets managed.” — Peter Drucker
Metrics fall into two broad types:

| Type | Definition | Example |
|------|------------|---------|
| Input (Leading) | Activities we control | Features shipped, experiments run |
| Output (Lagging) | Results we want | Revenue, retention, satisfaction |

Prioritize output (lagging) metrics as goals, but track input (leading) metrics to understand what drives them. Metrics cascade from a company-level North Star down to product and feature metrics:
```
           ┌─────────────────┐
           │   North Star    │   ← Company-level
           │     Metric      │
           └────────┬────────┘
                    │
     ┌──────────────┼──────────────┐
     ▼              ▼              ▼
┌──────────┐   ┌──────────┐   ┌──────────┐
│ Product  │   │ Product  │   │ Product  │   ← Product-level
│   KPIs   │   │   KPIs   │   │   KPIs   │
└────┬─────┘   └────┬─────┘   └────┬─────┘
     │              │              │
┌────┴────┐    ┌────┴────┐    ┌────┴────┐
│ Feature │    │ Feature │    │ Feature │    ← Feature-level
│ Metrics │    │ Metrics │    │ Metrics │
└─────────┘    └─────────┘    └─────────┘
```
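
As a sketch, this hierarchy can be modeled as a simple tree that rolls feature metrics up into product KPIs and a single North Star. The metric and product names below are hypothetical placeholders, not prescriptions from this guide.

```python
# Minimal sketch: the metric hierarchy as a tree (names are illustrative placeholders).
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    children: list = field(default_factory=list)

north_star = Metric("Revenue from recurring customers", children=[
    Metric("Product A KPIs", children=[Metric("Activation rate"), Metric("90-day retention")]),
    Metric("Product B KPIs", children=[Metric("API adoption"), Metric("Query volume")]),
])

def walk(metric: Metric, depth: int = 0) -> None:
    """Print the tree: North Star -> product KPIs -> feature metrics."""
    print("  " * depth + metric.name)
    for child in metric.children:
        walk(child, depth + 1)

walk(north_star)
```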

A classic framework for measuring the customer journey is AARRR, the “pirate metrics” funnel: Acquisition, Activation, Retention, Revenue, Referral. Each stage answers one question.
Acquisition: How do users find us?
| Metric | Definition |
|--------|------------|
| Traffic sources | Where visitors come from |
| Cost per acquisition (CPA) | Marketing spend / new users |
| Conversion to signup | Visitors → Signups |
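
A minimal sketch of the acquisition math above; every input value and variable name here is a hypothetical placeholder.

```python
# Hypothetical acquisition inputs; replace with real funnel data.
visitors = 50_000           # site visitors in the period
signups = 2_500             # visitors who signed up
new_users = 1_000           # new users attributed to paid marketing
marketing_spend = 20_000.0  # marketing spend in the same period

cpa = marketing_spend / new_users       # Cost per acquisition
signup_conversion = signups / visitors  # Visitors -> Signups
print(f"CPA ${cpa:.2f}, signup conversion {signup_conversion:.1%}")
# CPA $20.00, signup conversion 5.0%
```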
Activation: Do users have a good first experience?
| Metric | Definition |
|--------|------------|
| Time to value | How long until first success |
| Onboarding completion | % completing key steps |
| Activation rate | % reaching “aha moment” |
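
Activation metrics fall out of per-user event timestamps. A sketch under assumed data, where the aha timestamp marks a user's first success:

```python
from datetime import datetime

# Hypothetical per-user timestamps: signup and first "aha moment" (None if never reached).
users = {
    "u1": {"signed_up": datetime(2024, 5, 1), "aha": datetime(2024, 5, 1, 2)},
    "u2": {"signed_up": datetime(2024, 5, 1), "aha": None},
    "u3": {"signed_up": datetime(2024, 5, 2), "aha": datetime(2024, 5, 4)},
}

activated = [u for u in users.values() if u["aha"] is not None]
activation_rate = len(activated) / len(users)  # % reaching the aha moment
time_to_value_hours = [(u["aha"] - u["signed_up"]).total_seconds() / 3600 for u in activated]
print(f"activation {activation_rate:.0%}, time to value (hours): {time_to_value_hours}")
# activation 67%, time to value (hours): [2.0, 48.0]
```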
Retention: Do users come back?
| Metric | Definition |
|--------|------------|
| Retention rate | % still active after N days/months |
| Churn rate | % leaving per period |
| DAU/MAU ratio | DAU divided by MAU; a proxy for stickiness |
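
The retention metrics above are simple ratios over cohort and activity counts; a sketch with invented numbers:

```python
# Hypothetical counts; replace with real cohort and activity data.
cohort_size = 1_000        # users who signed up in the cohort
active_day_30 = 380        # of those, still active 30 days later
customers_start = 500      # customers at the start of the month
customers_lost = 15        # customers who churned during the month
dau, mau = 12_000, 40_000  # daily and monthly active users

retention_30d = active_day_30 / cohort_size       # 30-day retention
monthly_churn = customers_lost / customers_start  # churn per period
stickiness = dau / mau                            # DAU/MAU ratio
print(f"retention {retention_30d:.0%}, churn {monthly_churn:.1%}, DAU/MAU {stickiness:.0%}")
# retention 38%, churn 3.0%, DAU/MAU 30%
```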
Revenue: Are we making money?
| Metric | Definition |
|--------|------------|
| MRR/ARR | Monthly/Annual recurring revenue |
| ARPU | Average revenue per user |
| LTV | Lifetime value of a customer |
| Expansion revenue | Revenue from existing customers |
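
A sketch of the revenue metrics with made-up subscription data. The LTV line uses the common shortcut LTV ≈ ARPU / monthly churn, which is one of several possible definitions and is not mandated here.

```python
# Hypothetical subscriptions: customer -> monthly recurring revenue.
mrr_by_customer = {"acme": 1_200.0, "globex": 450.0, "initech": 900.0}
assumed_monthly_churn = 0.03

mrr = sum(mrr_by_customer.values())  # Monthly recurring revenue
arr = mrr * 12                       # Annualized
arpu = mrr / len(mrr_by_customer)    # Average revenue per user (monthly)
ltv = arpu / assumed_monthly_churn   # Simple LTV approximation: ARPU / churn
print(f"MRR ${mrr:,.0f}, ARR ${arr:,.0f}, ARPU ${arpu:,.0f}, LTV ${ltv:,.0f}")
# MRR $2,550, ARR $30,600, ARPU $850, LTV $28,333
```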
Referral: Do users tell others?
| Metric | Definition |
|--------|------------|
| NPS | Net Promoter Score (% promoters minus % detractors) |
| Referral rate | % of users who refer others |
| Viral coefficient | New users generated per existing user |
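
Referral metrics with hypothetical funnel numbers; a viral coefficient above 1 means each existing user brings in more than one new user, so growth compounds.

```python
# Hypothetical referral funnel for one period.
existing_users = 10_000
users_who_referred = 1_800
new_users_from_referrals = 1_200

referral_rate = users_who_referred / existing_users            # % of users who refer others
viral_coefficient = new_users_from_referrals / existing_users  # new users per existing user
print(f"referral rate {referral_rate:.0%}, viral coefficient {viral_coefficient:.2f}")
# referral rate 18%, viral coefficient 0.12
```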
Beyond the funnel, the core SaaS revenue metrics:

| Metric | Formula | Why It Matters |
|--------|---------|----------------|
| ARR | Sum of annual contract values | Overall business health |
| MRR | Sum of monthly contract values | Monthly health |
| Net Revenue Retention (NRR) | (Start MRR + Expansion - Churn - Contraction) / Start MRR | Growth from existing customers |
| Gross Revenue Retention (GRR) | (Start MRR - Churn - Contraction) / Start MRR | Customer retention health |
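
The NRR and GRR formulas above, applied to hypothetical MRR movements:

```python
# Hypothetical MRR movements over one period, in dollars.
start_mrr = 100_000.0
expansion = 8_000.0    # upsells and cross-sells to existing customers
contraction = 3_000.0  # downgrades
churned = 5_000.0      # MRR lost to cancellations

nrr = (start_mrr + expansion - churned - contraction) / start_mrr  # Net Revenue Retention
grr = (start_mrr - churned - contraction) / start_mrr              # Gross Revenue Retention
print(f"NRR {nrr:.0%}, GRR {grr:.0%}")
# NRR 100%, GRR 92%
```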
Unit economics:

| Metric | Formula | Target |
|--------|---------|--------|
| Customer Acquisition Cost (CAC) | Sales & Marketing spend / New customers | Lower is better |
| LTV:CAC Ratio | Customer LTV / CAC | >3:1 is healthy |
| Payback Period | CAC / Monthly revenue per customer | <12 months ideal |
| Logo Retention | Customers retained / Total customers | >90% for B2B |
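
The unit-economics formulas with invented inputs; the payback line follows the table's simplification of dividing CAC by monthly revenue per customer.

```python
# Hypothetical quarter of acquisition data.
sales_marketing_spend = 300_000.0
new_customers = 120
ltv_per_customer = 12_000.0
monthly_revenue_per_customer = 500.0

cac = sales_marketing_spend / new_customers          # Customer Acquisition Cost
ltv_cac_ratio = ltv_per_customer / cac               # target > 3:1
payback_months = cac / monthly_revenue_per_customer  # target < 12 months
print(f"CAC ${cac:,.0f}, LTV:CAC {ltv_cac_ratio:.1f}:1, payback {payback_months:.1f} months")
# CAC $2,500, LTV:CAC 4.8:1, payback 5.0 months
```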
Product engagement metrics:

| Metric | Definition |
|--------|------------|
| DAU/WAU/MAU | Active users by period |
| Feature adoption | % of users using specific features |
| Session frequency | How often users return |
| Time in product | Engagement depth |
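
Feature adoption and session frequency are straightforward arithmetic over activity data; a sketch with hypothetical sets:

```python
# Hypothetical activity data for one month.
active_users = {"u1", "u2", "u3", "u4", "u5"}
used_new_feature = {"u1", "u4"}
sessions_per_user = {"u1": 22, "u2": 4, "u3": 9, "u4": 15, "u5": 2}

feature_adoption = len(used_new_feature & active_users) / len(active_users)
avg_session_frequency = sum(sessions_per_user.values()) / len(sessions_per_user)
print(f"feature adoption {feature_adoption:.0%}, {avg_session_frequency:.1f} sessions/user/month")
# feature adoption 40%, 10.4 sessions/user/month
```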
For a data product, quality and usage metrics are core:

| Metric | Definition | Why It Matters |
|--------|------------|----------------|
| Match rate | Records matched / Records submitted | Data quality indicator |
| Data freshness | Age of most recent update | Competitive differentiator |
| Audience performance | Response rates, ROI | Customer success |
| Query volume | API calls, exports | Usage and engagement |
| Coverage | % of addressable market in database | Scale indicator |
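
A sketch of the data-quality calculations with placeholder figures; the database and market sizes are purely illustrative.

```python
from datetime import date

# Hypothetical inputs for a data product's quality metrics.
records_submitted = 250_000
records_matched = 190_000
last_update = date(2024, 6, 3)
as_of = date(2024, 6, 7)
addressable_market = 30_000_000
records_in_database = 24_000_000

match_rate = records_matched / records_submitted     # data quality indicator
freshness_days = (as_of - last_update).days          # age of most recent update
coverage = records_in_database / addressable_market  # share of addressable market covered
print(f"match rate {match_rate:.0%}, freshness {freshness_days} days, coverage {coverage:.0%}")
# match rate 76%, freshness 4 days, coverage 80%
```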
For API and platform offerings:

| Metric | Definition |
|--------|------------|
| API uptime | Availability percentage |
| API latency | Response time (p50, p95, p99) |
| Error rate | Failed requests / Total requests |
| Developer adoption | # of integrations, active developers |
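
Latency percentiles and error rate come straight from a request log. This sketch uses a nearest-rank percentile over a tiny hypothetical sample; a production system would aggregate much larger windows.

```python
import math

# Hypothetical request log: (latency_ms, succeeded).
requests = [(35, True), (42, True), (51, True), (48, False), (300, True),
            (38, True), (45, True), (60, True), (1200, False), (55, True)]

def percentile(sorted_values, pct):
    """Nearest-rank percentile: the value at ceil(pct/100 * N), 1-indexed."""
    k = max(1, math.ceil(pct / 100 * len(sorted_values)))
    return sorted_values[k - 1]

latencies = sorted(ms for ms, _ in requests)
p50, p95, p99 = (percentile(latencies, p) for p in (50, 95, 99))
error_rate = sum(1 for _, ok in requests if not ok) / len(requests)
print(f"p50={p50}ms p95={p95}ms p99={p99}ms, error rate {error_rate:.0%}")
# p50=48ms p95=1200ms p99=1200ms, error rate 20%
```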
Metric targets should be SMART:

| Letter | Meaning | Example |
|--------|---------|---------|
| S | Specific | “Increase retention” → “Increase 90-day retention” |
| M | Measurable | Can track with data |
| A | Achievable | Realistic given resources |
| R | Relevant | Connected to business goals |
| T | Time-bound | “by end of Q2” |

Favor actionable metrics over vanity metrics:

| Good Metric | Vanity Metric |
|-------------|---------------|
| Activation rate | Total signups |
| Monthly active users | Registered users |
| Retention rate | Total accounts |
| Revenue per customer | Press mentions |
Test: Does this metric drive decisions? Can we act on it?

Common measurement anti-patterns:

| Anti-Pattern | Problem | Solution |
|--------------|---------|----------|
| Too many metrics | Analysis paralysis | Focus on 3-5 key metrics |
| Vanity metrics | Feels good, no insight | Tie to business outcomes |
| Gaming | Optimizing metric, not goal | Use balanced scorecard |
| Short-term focus | Sacrifice long-term health | Include leading indicators |
| No baseline | Can’t measure improvement | Establish baselines first |

North Star metric: revenue from recurring customers (ARR × retention).

Key metrics by area:

| Area | Key Metrics |
|------|-------------|
| Acquisition | New customers, pipeline generated |
| Activation | First campaign delivered, time to first order |
| Retention | 97% client retention (company goal) |
| Revenue | ARR, expansion revenue, ARPU |
| Data Quality | Match rates, data freshness, coverage |

Feature-level success metrics:

| Feature | Success Metrics |
|---------|-----------------|
| Digital Audiences | Segment adoption, campaign performance lift |
| Path2Contact | Append match rate, client satisfaction |
| Path2Ignite | Campaigns run, response rate improvement |
| Path2Linkage | Partner integrations, query volume |

Operational targets:

| Metric | Target |
|--------|--------|
| Data update frequency | Weekly (transactions), Daily (digital) |
| API uptime | 99.9% |
| Support response time | <4 hours |
| Onboarding time | <2 weeks |

Principles for a metrics dashboard:
- Hierarchy: Start with top-level metrics, then drill down
- Context: Include trends, comparisons, targets
- Actionability: Show metrics someone can act on
- Freshness: Update frequency matches decision cadence
```
┌─────────────────────────────────────────────────────┐
│ NORTH STAR: [Metric]   [Trend]   [vs Target]        │
├─────────────────────┬───────────────────────────────┤
│ Revenue             │ Retention                     │
│ • ARR: $X           │ • Logo retention: X%          │
│ • Growth: +X%       │ • NRR: X%                     │
│ • Pipeline: $X      │ • Churn: X customers          │
├─────────────────────┼───────────────────────────────┤
│ Acquisition         │ Engagement                    │
│ • New customers: X  │ • Active customers: X         │
│ • Win rate: X%      │ • Feature adoption: X%        │
│ • CAC: $X           │ • Campaigns run: X            │
├─────────────────────┴───────────────────────────────┤
│ Data Quality                                        │
│ • Match rate: X%   • Freshness: X days   • Coverage │
└─────────────────────────────────────────────────────┘
```
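
As a sketch, the same layout can live in a plain data structure that a reporting job fills in. The section and metric names mirror the mockup; the values are placeholders.

```python
# Dashboard sections mirroring the mockup above; None means "not yet populated".
dashboard = {
    "north_star":   {"metric": "Revenue from recurring customers", "trend": None, "vs_target": None},
    "revenue":      {"arr": None, "growth_pct": None, "pipeline": None},
    "retention":    {"logo_retention_pct": None, "nrr_pct": None, "churned_customers": None},
    "acquisition":  {"new_customers": None, "win_rate_pct": None, "cac": None},
    "engagement":   {"active_customers": None, "feature_adoption_pct": None, "campaigns_run": None},
    "data_quality": {"match_rate_pct": None, "freshness_days": None, "coverage_pct": None},
}

def render(board: dict) -> None:
    """Print each section with its metrics, using 'X' for unpopulated values."""
    for section, metrics in board.items():
        print(section.upper())
        for name, value in metrics.items():
            print(f"  {name}: {'X' if value is None else value}")

render(dashboard)
```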