In 2007, every startup pitch deck showed a chart of total registered users going up and to the right. The number always grew. It revealed nothing. Teams optimising this number could run Facebook spam campaigns, offer free pizza for signups, or buy keyword traffic at a loss and all receive congratulations from their investors. The curve did not distinguish between a user who built their business on the product and a user who signed up, clicked nothing, and never returned.
Dave McClure built AARRR because there was no standardised framework for thinking about the complete customer lifecycle. His stated goal: "I wanted to get startups concentrating on stuff that really matters, not just the number of eyeballs or the number of people coming but whether you're activating them, retaining them, having them pay you money, and having them bring their friends."
A vanity metric is a number that always increases and looks good in a pitch deck but does not measure whether the business is actually working. Total registered users, total downloads, page views, and cumulative revenue without cohort breakdowns are classic examples. Eric Ries named the pattern "vanity metrics" in 2009, two years after AARRR provided the alternative.
Dave McClure presented "Startup Metrics for Pirates: AARRR!" at Ignite Seattle 4 on August 8, 2007. McClure's PayPal experience (Director of Marketing, 2001–2004) directly informed the framework. PayPal's $10 referral bonus drove 7–10% daily user growth, demonstrating that a single funnel stage — Referral — could transform a growth trajectory. The framework formalised the lesson that all five stages required equal measurement discipline.
AARRR measures five sequential stages in the customer lifecycle. Each stage has a distinct definition, specific metrics, and a measurement methodology. The stages are named in the order a user experiences them, which is the opposite of the order a team should optimise them.
AARRR describes the customer journey in acquisition-first order. The optimisation order is the reverse: Retention first, then Activation, then Referral, then Revenue, then Acquisition. Optimising Acquisition before fixing Retention is like pouring water into a bucket with a hole in it.
"Users come to the site from various channels."
The moment a user finds and arrives at the product. They have not yet engaged meaningfully. Acquisition is the top of the funnel, not the most important stage.
Web analytics (GA4), UTM parameters, app store analytics, ad dashboards.
Acquisition is the stage most teams over-invest in before fixing the downstream stages that convert acquired users into retained, paying customers.
RARRA (Retention, Activation, Referral, Revenue, Acquisition) was created in 2017 for mobile apps where acquisition costs had made top-of-funnel spending irrational. Both are the same framework applied at different maturity levels. This page uses AARRR as the primary structure because it is the universal reference.
Confirm three conditions before starting. A product is in market with real users. Analytics is instrumented. A specific business decision exists that this measurement will inform.
Define precisely what each of the five stages means for your specific product.
Identify the specific in-product action that signals a user has experienced core value (e.g., Slack's 2,000 messages).
Choose no more than two key metrics per stage. Track ratios and rates, not raw numbers.
Set up product analytics, implement event tracking for each KPI, configure acquisition channel attribution.
Create a single-view dashboard covering all five stages with conversion rates between stages.
Confirm the activation metric predicts retention by comparing cohorts of activated vs. non-activated users.
Find the stage transition with the largest drop in users. This is the One Metric That Matters (OMTM).
Assign a team or person responsible for each stage metric. Unowned metrics do not improve.
A/B test at the bottleneck stage. Focus roughly 80% of effort on improving existing features rather than shipping new ones.
Review the dashboard weekly. Update KPIs and targets quarterly. Shift focus as bottlenecks change.
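The funnel-dashboard and bottleneck steps above can be sketched as a short script. The stage counts below are hypothetical, and "largest drop" is interpreted here as the lowest adjacent-stage conversion rate:

```python
# Hypothetical per-stage user counts for one monthly cohort.
funnel = [
    ("Acquisition", 10_000),
    ("Activation", 2_400),
    ("Retention", 1_100),
    ("Referral", 300),
    ("Revenue", 180),
]

# Conversion rate between each pair of adjacent stages.
conversions = []
for (stage_a, count_a), (stage_b, count_b) in zip(funnel, funnel[1:]):
    conversions.append((f"{stage_a} -> {stage_b}", count_b / count_a))

# The One Metric That Matters candidate is the transition losing the most users.
bottleneck = min(conversions, key=lambda pair: pair[1])

for name, rate in conversions:
    print(f"{name}: {rate:.0%}")
print("Bottleneck (OMTM candidate):", bottleneck[0])
```

With these made-up numbers the weakest transition is Acquisition to Activation at 24%, which is where the A/B-testing effort from the steps above would go first.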
Your top-line user numbers grow every month but revenue is flat, meaning you attract users who do not activate, retain, or pay. You have run A/B tests that moved a conversion metric but did not produce measurable revenue impact — the classic symptom of optimising a disconnected stage.
Early Startup (S3) - Useful (Activation and Retention only): The goal is product-market fit, not growth. Tracking Acquisition, Referral, and Revenue before Activation is fixed optimises the wrong thing.
Startup (S4) - Critical: Full funnel tracking begins. Start with Retention, move backward through the funnel.
Growth Stage (S5) - Critical: Referral programmes become the highest-leverage investment at this stage because the user base is large enough to generate meaningful viral coefficients.
Scale Stage and Enterprise (S6–S7) - Useful (adapted): Augment with North Star Metrics, growth loops, or segment-specific funnels.
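The viral coefficient mentioned for the growth stage has a standard definition: k equals average invites sent per user multiplied by the fraction of invites that convert. A minimal sketch with made-up figures:

```python
def viral_coefficient(invites_per_user: float, invite_conversion: float) -> float:
    """k = average invites sent per user x fraction of invites that convert."""
    return invites_per_user * invite_conversion

# Hypothetical figures: each user sends 4 invites, 20% of invites sign up.
k = viral_coefficient(invites_per_user=4.0, invite_conversion=0.20)

# k > 1 means each cohort recruits a larger one and growth compounds on its own;
# k < 1 means referral amplifies paid acquisition but cannot sustain growth alone.
print(f"k = {k:.2f}")
```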
In September 2008, Dropbox had roughly 100,000 registered users and a fundamental unit economics failure. Google AdWords cost $233 to $388 per acquisition for a product priced at $99 per year.
How They Applied It
Drew Houston explicitly credits the framework: "We used Dave McClure's AARRR framework to run the company in the early days." Acquisition was solved with a viral demo video. Activation revealed an 80% failure rate for file uploads, which they fixed. Referral became the primary growth lever, driving 35% of daily signups.
After pivoting from a failed game in 2012, Slack's challenge was convincing teams to adopt software in a category most users did not recognise as distinct from email. The growth thesis depended on finding the activation threshold.
How They Applied It
Activation was critical. Stewart Butterfield identified 2,000 messages sent per team as the activation threshold. Every aspect of onboarding was engineered to push teams toward that threshold. Acquisition was organic. Revenue followed activation.
In 2008, Airbnb was a struggling startup. Growth in New York City was stalling despite an adequate supply of listings. The problem was not Acquisition. It was Activation.
How They Applied It
Acquisition used a Craigslist integration. Activation broke through when listings got professional photos (doubling revenue). Referral was redesigned in 2014 using double-sided incentives placed at moments of highest satisfaction.
Teams build comprehensive dashboards covering all five stages at launch. Nobody owns the bottleneck.
The framework's clean five-stage structure invites teams to treat it as five parallel workstreams.
Focus on one to two stages at a time, starting with the biggest bottleneck.
Teams count website visits or signups as success metrics without measuring whether users experienced core value.
Acquisition metrics are the easiest to measure and surface in board reports.
Define activation explicitly before running any acquisition campaign. Kill spend on channels producing high acquisition but low activation.
Companies rush to monetise before users find value, creating churn machines that burn acquisition spend.
Investor pressure for revenue metrics.
Apply Sean Ellis's Startup Growth Pyramid: achieve product-market fit at the base before pursuing monetisation.
Teams report total users or cumulative revenue — numbers that always go up.
Aggregate numbers are psychologically satisfying and easy to present.
Build cohort-based dashboards from day one. Group users by acquisition date and track retention.
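A cohort view replaces the ever-rising aggregate with per-cohort retention. A minimal sketch, assuming a simple activity log of (user_id, signup_date, date_seen_active) tuples; all data here is made up:

```python
from collections import defaultdict
from datetime import date

# Hypothetical activity log: (user_id, signup_date, date_seen_active).
events = [
    (1, date(2024, 1, 1), date(2024, 1, 1)),
    (1, date(2024, 1, 1), date(2024, 2, 3)),
    (2, date(2024, 1, 5), date(2024, 1, 5)),
    (3, date(2024, 2, 2), date(2024, 2, 2)),
    (3, date(2024, 2, 2), date(2024, 3, 1)),
]

# Group users by acquisition month, then record which users were
# seen active N months after signup.
cohorts = defaultdict(lambda: defaultdict(set))
for user, signup, seen in events:
    cohort = (signup.year, signup.month)
    months_since = (seen.year - signup.year) * 12 + (seen.month - signup.month)
    cohorts[cohort][months_since].add(user)

for cohort, activity in sorted(cohorts.items()):
    size = len(activity[0])  # every user is active in month 0 (signup)
    for month in sorted(activity):
        retention = len(activity[month]) / size
        print(f"cohort {cohort}: month {month} retention {retention:.0%}")
```

A dashboard built on this grouping shows the January cohort retaining 50% at month one while the aggregate user count still rises, which is exactly the signal the vanity metric hides.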
Teams deprioritise Referral until the 'core funnel is fixed'.
Referral feels less urgent and is harder to measure than Acquisition.
Build referral prompts into the product at the moment of highest user satisfaction.
Startups present high acquisition numbers without demonstrating activation or retention.
Vanity metrics are easy to present. Boards celebrate top-line growth.
Restructure investor reporting around per-cohort retention, activation rates, and LTV/CAC.
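The LTV/CAC figure in that reporting structure is a simple computation. A sketch using the common simplification that LTV is average monthly revenue per user divided by monthly churn; all figures below are hypothetical:

```python
def ltv(arpu_per_month: float, monthly_churn: float) -> float:
    """Simplified lifetime value: ARPU / churn = expected lifetime revenue."""
    return arpu_per_month / monthly_churn

def cac(acquisition_spend: float, new_customers: int) -> float:
    """Customer acquisition cost: total spend / customers acquired."""
    return acquisition_spend / new_customers

# Hypothetical inputs.
lifetime_value = ltv(arpu_per_month=30.0, monthly_churn=0.05)
acquisition_cost = cac(acquisition_spend=50_000.0, new_customers=250)

ratio = lifetime_value / acquisition_cost
print(f"LTV/CAC = {ratio:.1f}")
```

Unlike cumulative revenue, this ratio can fall while top-line numbers rise, which is why it belongs in the investor report.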