Metrics Interview Questions for Product Managers
Metrics questions test your ability to define success, diagnose problems, and make data-driven decisions. These questions appear in virtually every PM interview and are critical for roles at data-driven companies. This guide covers the frameworks and practice you need.
Types of Metrics Questions
Define Success Metrics
"How would you measure success for Instagram Reels?"
Diagnose a Problem
"Daily active users dropped 10% last week. How would you investigate?"
Design an Experiment
"How would you test whether a new feature is successful?"
Evaluate Tradeoffs
"Feature A increases engagement but decreases revenue. What do you do?"
The Metrics Framework
North Star Metric
The single metric that best captures the value users get from your product. It should be:
- Actionable: Teams can influence it
- Meaningful: Connected to business success
- User-centric: Reflects value delivered to users
Examples:
- Netflix: Hours watched
- Uber: Trips completed
- Slack: Daily messages sent
- Airbnb: Nights booked
Supporting Metrics
Additional metrics that provide a fuller picture of product health:
Engagement metrics:
- Daily/Monthly Active Users (DAU/MAU)
- Sessions per user
- Time spent
- Feature adoption rate
Retention metrics (see the sketch after these lists):
- D1/D7/D30 retention
- Churn rate
- Cohort retention curves
Quality metrics:
- Task completion rate
- Error rate
- Customer satisfaction (CSAT/NPS)
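To make the retention metrics above concrete, here is a minimal sketch of computing D1/D7/D30 retention by signup cohort. It assumes a hypothetical event log with user_id, signup_date, and event_date columns; the schema and values are illustrative, not any particular product's data.

```python
import pandas as pd

# Hypothetical activity log: one row per (user, active day). Values are illustrative.
events = pd.DataFrame({
    "user_id":     [1, 1, 1, 2, 2, 3],
    "signup_date": pd.to_datetime(["2024-01-01"] * 5 + ["2024-01-02"]),
    "event_date":  pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-08",
                                   "2024-01-01", "2024-01-31", "2024-01-02"]),
})

# Days since signup for each activity event.
events["day_n"] = (events["event_date"] - events["signup_date"]).dt.days

cohort_size = events.groupby("signup_date")["user_id"].nunique()

def retention(day: int) -> pd.Series:
    """Share of each signup cohort active exactly `day` days after signup."""
    active = (events[events["day_n"] == day]
              .groupby("signup_date")["user_id"].nunique())
    return (active / cohort_size).fillna(0.0)

print(pd.DataFrame({f"D{d}": retention(d) for d in (1, 7, 30)}))
```

In an interview the definition matters more than the code: Day-N retention is the share of a cohort active on day N, the cohort curve is that share plotted over N, and churn over a window is simply one minus the retention rate for that window.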
Guardrail Metrics
Metrics that ensure you are not causing harm while optimizing for your North Star metric. They prevent unintended negative consequences.
Examples:
- If optimizing for time spent, guardrail: user satisfaction
- If optimizing for signups, guardrail: quality of signups (retention)
- If optimizing for revenue, guardrail: user trust/satisfaction
Input vs Output Metrics
Input (leading) metrics: Activities teams can directly influence that drive outcomes (feature usage, content created)
Output (lagging) metrics: The outcomes those activities produce (revenue, retention)
Problem Diagnosis Framework
When asked "X metric dropped. What happened?"
Step 1: Clarify the Problem
- How much did it drop? (10% vs 50% require different responses)
- When did it start?
- Is it still declining or has it stabilized?
Step 2: Segment the Data
Break down by (a segmentation sketch follows this list):
- Platform (iOS, Android, web)
- Geography
- User cohort (new vs returning)
- User segment (free vs paid)
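As a hedged sketch of Step 2, the snippet below compares this week's DAU to last week's across each dimension to see where a drop concentrates. The table, column names, and numbers are illustrative assumptions, not a real warehouse schema.

```python
import pandas as pd

# Hypothetical per-segment DAU extract; schema and values are illustrative only.
dau = pd.DataFrame({
    "week":      ["prev"] * 4 + ["curr"] * 4,
    "platform":  ["iOS", "Android", "iOS", "Android"] * 2,
    "user_type": ["new", "new", "returning", "returning"] * 2,
    "dau":       [100, 120, 400, 380, 98, 85, 395, 375],
})

# Week-over-week change per dimension shows where the decline is concentrated.
for dim in ("platform", "user_type"):
    pivot = dau.pivot_table(index=dim, columns="week", values="dau", aggfunc="sum")
    pivot["pct_change"] = (pivot["curr"] - pivot["prev"]) / pivot["prev"]
    print(pivot["pct_change"].map("{:+.1%}".format), "\n")
```

In this toy data the decline concentrates among new Android users, which would point the investigation toward the latest Android release or the acquisition funnel rather than a product-wide issue.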
Step 3: Generate Hypotheses
Categorize possible causes:
- External factors: Seasonality, competition, market changes
- Internal changes: Product updates, bugs, experiments
- Data issues: Tracking bugs, definition changes
- User behavior: Shift in how users engage
Step 4: Investigate and Validate
Prioritize hypotheses by likelihood and ease of validation. Check logs, run queries, talk to users.
Step 5: Recommend Action
Based on root cause, propose solutions and next steps.
A/B Testing Questions
Common A/B testing topics:
Experiment Design
- What is your hypothesis?
- What are you measuring?
- How large is your sample? (a sizing sketch follows this list)
- How long will you run the test?
- What is your success threshold?
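As a sketch of the sample-size question, assume a conversion-style success metric with a 10% baseline and a one-percentage-point minimum detectable effect; both numbers are illustrative, not benchmarks.

```python
# Rough sample-size estimate for a two-proportion A/B test (illustrative numbers).
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.10   # assumed current conversion rate
mde = 0.01        # minimum detectable effect: +1 percentage point
effect = proportion_effectsize(baseline + mde, baseline)  # Cohen's h

n_per_arm = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.8, alternative="two-sided"
)
print(f"~{n_per_arm:,.0f} users per arm")

# Duration follows from traffic: divide n_per_arm by eligible users per day per arm,
# then round up to whole weeks so the test covers weekly usage cycles.
```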
Interpreting Results
- Statistical significance vs practical significance
- Confidence intervals
- Multiple comparison problems
- Novelty effects
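A minimal sketch of interpreting a result, assuming two arms with hypothetical conversion counts; the z-test and interval here are the standard two-proportion approximations, not any specific experimentation platform's method.

```python
# Evaluate an A/B result: statistical vs practical significance (illustrative counts).
from math import sqrt
from statsmodels.stats.proportion import proportions_ztest

conv = [1_050, 980]        # conversions in treatment, control (hypothetical)
n    = [10_000, 10_000]    # users per arm (hypothetical)

stat, p_value = proportions_ztest(conv, n)

p_t, p_c = conv[0] / n[0], conv[1] / n[1]
diff = p_t - p_c
se = sqrt(p_t * (1 - p_t) / n[0] + p_c * (1 - p_c) / n[1])
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se

print(f"lift = {diff:+.2%}, p = {p_value:.3f}, 95% CI = [{ci_low:+.2%}, {ci_high:+.2%}]")
# Practical significance: compare the interval to the minimum lift that justifies shipping,
# not just to zero. A statistically significant but tiny lift may not be worth the complexity.
```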
Making Decisions
- When to ship vs not ship
- Tradeoffs between metrics
- Segmented results
Example: Measure Success for YouTube Shorts
North Star Metric: Short-form video watch time. This captures user value (entertainment) and is actionable by teams.
Supporting Metrics:
- Daily active Shorts viewers
- Videos watched per session
- Shorts creation rate
- Creator retention
- Viewer-to-creator conversion
Guardrail Metrics:
- Long-form video watch time (ensure Shorts is not cannibalizing long-form viewing)
- User satisfaction scores
- Ad revenue per session
Input Metrics:
- Shorts uploaded per day
- Average Shorts quality score
- Discoverability (e.g., share of Shorts impressions coming from recommendations)
Example: DAU Dropped 10%
Clarify: "When did it start? Is the drop 10% relative to last week or to a forecast? Is it still declining or has it stabilized?"
Segment: "Let me break down by platform, geography, and user type. Where is the drop concentrated?"
Hypotheses:
- Bug in recent release
- Tracking issue
- Seasonal pattern
- Competitive action
- Failed experiment
Investigation Plan (a query sketch follows this list):
- Check if drop correlates with recent releases
- Verify tracking is working correctly
- Compare to same period last year
- Review active experiments
- Check competitor launches
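A hedged sketch of the first two checks, assuming a hypothetical daily DAU-by-platform extract and a known release date; the names, dates, and numbers are placeholders.

```python
import pandas as pd

# Hypothetical daily DAU by platform around a release; all values are placeholders.
dau = pd.DataFrame({
    "date":     pd.to_datetime(["2024-06-01", "2024-06-02", "2024-06-03", "2024-06-04"] * 2),
    "platform": ["Android"] * 4 + ["iOS"] * 4,
    "dau":      [1_000, 1_010, 880, 870, 1_200, 1_190, 1_195, 1_205],
})
release_date = pd.Timestamp("2024-06-03")  # assumed Android release date

before = dau[dau["date"] < release_date].groupby("platform")["dau"].mean()
after  = dau[dau["date"] >= release_date].groupby("platform")["dau"].mean()
print(((after - before) / before).map("{:+.1%}".format))
# A drop concentrated on Android after the release points to the new build;
# a uniform drop across platforms points instead to tracking, seasonality, or competition.
```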
Likely Scenario: If the drop is isolated to Android after a recent update, it is most likely a bug in that release. Roll back and investigate further.
Common Metrics Interview Questions
- "How would you measure success for LinkedIn Learning?"
- "Uber ride completions dropped. What happened?"
- "How would you decide if a new feature is successful?"
- "Design metrics for a productivity app"
- "Time spent on our app increased but revenue decreased. What do you do?"
- "How would you measure the health of a two-sided marketplace?"
Tips for Metrics Questions
Start with the user: What does success look like for users?
Connect to business: How does user success translate to company goals?
Be comprehensive: Cover engagement, retention, quality, and guardrails.
Think about gaming: How could people manipulate this metric?
Consider tradeoffs: What are you sacrificing by optimizing this?
For comprehensive PM interview preparation, see our main PM interview guide. Ensure your resume highlights data-driven achievements with specific metrics that demonstrate your analytical abilities.