How To Define Cohorts? (First Install Vs First Purchase Guide)


Lakshith Dinesh
Updated on: Dec 2, 2025
You're spending ₹10 lakh a month on app installs, but half your users disappear within 48 hours while the other half generate 80% of your revenue six months later. The difference between those two groups isn't luck; it's which cohort they belong to and how you're measuring success.
Cohort analysis groups users by a shared starting point, like their first install date or first purchase date, then tracks how those groups behave over time. This guide walks you through choosing between install cohorts and purchase cohorts, building your first cohort report, avoiding common analysis mistakes, and connecting cohort insights to actual ROAS decisions.
What Is Cohort Analysis And Why It Matters For Mobile Apps
Cohort analysis groups users who share a common characteristic or experience within a defined time frame. Think of everyone who installed your app in January, or everyone who made their first purchase during Diwali: that's a cohort.
For mobile apps, cohorts track how groups of users behave over days, weeks, or months after their first interaction with your product. You're watching patterns unfold over time rather than looking at a single snapshot.
Here's what makes cohorts different from vanity metrics:
Total downloads tell you volume; cohorts tell you quality: 100,000 installs sounds great until you realize 80% uninstalled within 48 hours
Monthly active users hide churn: Your MAU might stay flat while you're constantly replacing churned users with expensive new installs
Campaign-level cohorts show true ROI: Users from your Meta campaign in Week 1 might stick around twice as long as users from Google in Week 2
Install Cohorts Vs Purchase Cohorts: Picking The Right North Star
Install cohorts group users by their first app download date. Purchase cohorts group users by their first transaction date. The difference changes everything about how you measure success.
| Cohort Type | Best For | Key Metric | When To Use |
| --- | --- | --- | --- |
| Install Cohorts | User acquisition campaigns | Day 1, 7, 30 retention | Early growth, freemium apps |
| Purchase Cohorts | Revenue optimization | Customer LTV, repeat purchase rate | Established apps, paid products |
Install cohorts answer "Are people sticking around?" Purchase cohorts answer "Are customers worth acquiring?" If you're running campaigns optimized for installs, you'll track install cohorts to see if users open your app the next day. If you're running campaigns optimized for purchases, you'll track purchase cohorts to see if buyers come back for a second order.
Most apps start with install cohorts during their growth phase, then layer in purchase cohorts once monetization becomes the priority. You don't have to pick one forever.
When To Use First Install Cohorts For Growth Decisions
Install cohorts work best when you're testing new acquisition channels or scaling user base quickly. They tell you which traffic sources deliver engaged users, not just downloads that immediately uninstall.
You launch TikTok ads for the first time: install cohorts show whether users from TikTok have better Day 7 retention than your existing Meta traffic
You update your app store screenshots: install cohorts reveal whether the new creative attracts higher-quality users who stick around
You run three ad variations: install cohorts show which creative brings users who actually use your app
Install cohorts also work better for apps with long consideration cycles. If your users typically browse for weeks before making their first purchase, grouping by install date captures the full user journey from the start.
When To Use First Purchase Cohorts For Profitability Tracking
Purchase cohorts become essential once you're spending serious money on acquisition and need to prove ROI to your finance team or investors. They cut through the noise of free users and show you the economics of paying customers.
Here's where purchase cohorts give you clearer answers:
Revenue-focused campaigns: When you're running "Shop Now" campaigns on Meta or Google optimized for purchases, purchase cohorts show true customer quality by acquisition source
LTV analysis: Purchase cohorts reveal how much revenue customers generate in months 1, 3, 6, and 12 after their first transaction
Unit economics: Calculate your actual payback period by dividing customer acquisition cost by the cumulative revenue each purchase cohort generates over time
Purchase cohorts also expose the difference between one-time buyers and repeat customers. If your March purchase cohort has a 40% repeat purchase rate by Month 3 while April's cohort sits at 15%, you know something changed in your acquisition strategy or product experience.
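Here's a quick sketch of the payback-period math described above. All the numbers (CAC, the cumulative revenue curve) are invented for illustration, not benchmarks:

```python
# Hypothetical payback-period calculation for one purchase cohort.
cac = 400  # customer acquisition cost in ₹ (made-up figure)

# Cumulative revenue per customer at the end of months 1, 2, 3, ...
cumulative_revenue = [150, 280, 390, 470, 540, 600]

def payback_month(cac, cumulative_revenue):
    """Return the first month where cumulative revenue covers CAC, or None."""
    for month, revenue in enumerate(cumulative_revenue, start=1):
        if revenue >= cac:
            return month
    return None

print(payback_month(cac, cumulative_revenue))  # month 4: ₹470 >= ₹400
```

Run this per cohort and you can watch payback periods drift month over month, which is usually the earliest warning that acquisition quality is slipping.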
How Cohort Modelling Shapes Retention, LTV, And ROAS
Cohort analysis feeds directly into the three metrics that determine whether your app grows profitably or burns cash. Every retention curve, LTV projection, and ROAS calculation starts with cohort data.
Retention patterns emerge clearly when you stack cohorts side by side. You'll spot that users from organic search have 45% Day 30 retention while paid social users drop to 22%, which tells you you're either targeting the wrong audience or your onboarding experience doesn't match the promise in your ads.
LTV forecasting relies entirely on cohort behavior over time. If your January purchase cohort generated ₹500 per customer by Month 6, you can model future cohorts and set acquisition bids accordingly. ROAS optimization happens when you connect cohort performance back to the campaigns that drove those users—if iOS install cohorts from Campaign A deliver 3.2x ROAS by Day 30 while Campaign B delivers 1.8x, you know exactly where to shift budget.
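The ROAS comparison above is simple division once cohort revenue is attributed back to campaigns. A minimal sketch with invented spend and revenue figures:

```python
# Hypothetical Day-30 ROAS comparison across two campaigns' install cohorts.
# Spend and revenue numbers are made up to mirror the example in the text.
campaigns = {
    "Campaign A": {"spend": 100_000, "day30_revenue": 320_000},
    "Campaign B": {"spend": 100_000, "day30_revenue": 180_000},
}

for name, c in campaigns.items():
    roas = c["day30_revenue"] / c["spend"]  # revenue generated per ₹ spent
    print(f"{name}: {roas:.1f}x ROAS by Day 30")
```

The hard part isn't the arithmetic; it's making sure `day30_revenue` is genuinely attributed to the cohort that campaign acquired, which is why clean attribution data matters in the steps below.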
Steps To Build A Cohort Report In Under 30 Minutes
Building your first cohort report feels intimidating, but the framework is simpler than most analytics projects. You're essentially creating a table where rows are user groups and columns are time periods.
1. Define The Entry Event And Time Bucket
Pick the moment when users enter a cohort—first install, first purchase, first subscription, or any milestone that matters for your business. Then decide how to group entry events: daily cohorts for high-volume apps, weekly cohorts for moderate traffic, monthly cohorts for lower-volume apps.
Daily cohorts give you the fastest feedback but create noisy data if you only get a few hundred installs per day. Weekly cohorts smooth out day-to-day variance while still letting you spot trends within a month.
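The bucketing itself is a one-liner per granularity. A minimal Python sketch, where the weekly bucket maps each install date to the Monday of its week:

```python
from datetime import date

def weekly_cohort(install_date: date) -> date:
    """Bucket a date to the Monday of its week (weekday() is 0 for Monday)."""
    return date.fromordinal(install_date.toordinal() - install_date.weekday())

def monthly_cohort(install_date: date) -> date:
    """Bucket a date to the first day of its month."""
    return install_date.replace(day=1)

print(weekly_cohort(date(2024, 3, 14)))   # a Thursday maps to 2024-03-11
print(monthly_cohort(date(2024, 3, 14)))  # 2024-03-01
```

Daily cohorts need no bucketing at all; the install date is the cohort key.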
2. Pull Clean Attribution Data
Gather user-level data from your mobile measurement partner or analytics platform, making sure each user has an accurate install date or first purchase date. Messy attribution breaks cohort analysis—if 30% of your installs don't have proper campaign tags, your cohorts will be incomplete.
You'll also want to confirm that your attribution window settings match your business model. E-commerce apps might use a 7-day click window, while high-consideration fintech apps might extend to 30 days.
3. Build The Cohort Table Or SQL View
Structure your data with cohorts as rows and time periods as columns. For an install cohort retention report, each cell shows the percentage of users from that cohort who were active on Day 1, Day 7, and Day 30.
For a purchase cohort revenue report, each cell shows cumulative revenue per customer from that cohort after 1, 3, and 6 months. The math is straightforward: divide total cohort revenue by cohort size.
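The same pivot works in SQL or in a few lines of Python. A toy sketch, assuming an event log of `(user_id, install_date, activity_date)` rows where install dates are already bucketed to cohort weeks (all sample data is hypothetical):

```python
from collections import defaultdict
from datetime import date

# Hypothetical activity log: (user_id, cohort_week, activity_date).
events = [
    ("u1", date(2024, 3, 4), date(2024, 3, 4)),
    ("u1", date(2024, 3, 4), date(2024, 3, 11)),   # u1 returned on day 7
    ("u2", date(2024, 3, 4), date(2024, 3, 4)),
    ("u3", date(2024, 3, 11), date(2024, 3, 12)),
]

cohort_users = defaultdict(set)   # cohort week -> distinct users
active = defaultdict(set)         # (cohort week, day offset) -> distinct users

for user, cohort, seen in events:
    cohort_users[cohort].add(user)
    active[(cohort, (seen - cohort).days)].add(user)

def retention(cohort: date, day: int) -> float:
    """Retention cell: % of the cohort active exactly `day` days after entry."""
    return 100 * len(active[(cohort, day)]) / len(cohort_users[cohort])

print(retention(date(2024, 3, 4), 7))  # 1 of 2 users returned -> 50.0
```

Swap the activity set for a running revenue sum per `(cohort, offset)` cell and the identical structure produces the purchase cohort revenue table.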
4. Visualise With A Cohort Chart Heatmap
Color-code your cohort table so higher retention or revenue shows as darker colors. Heatmap format makes patterns obvious at a glance—you'll immediately see which cohorts perform better without scanning through rows of numbers.
Most teams use green gradients for positive metrics like retention and red gradients for negative metrics like churn. The goal is to make good cohorts and bad cohorts visually distinct in under 5 seconds.
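Any spreadsheet or BI tool handles the color scale for you, but the idea is just threshold bucketing. A toy text version in Python, with arbitrary cutoffs standing in for the color gradient:

```python
# Toy text "heatmap": a fuller block means better retention.
# The 40/25/10 thresholds are arbitrary placeholders, not benchmarks.
def shade(pct: float) -> str:
    if pct >= 40:
        return "███"  # strong cohort
    if pct >= 25:
        return "██·"
    if pct >= 10:
        return "█··"
    return "···"      # weak cohort

row = [100.0, 42.0, 27.0, 12.0, 8.0]  # Day 0, 1, 7, 14, 30 retention
print(" ".join(shade(p) for p in row))
```

Stack one such row per cohort and the decay pattern across cohorts jumps out the same way a color heatmap does.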
5. Share The Cohort Report With Stakeholders
Format your insights for different audiences. Your founder cares about payback period and overall unit economics. Your performance marketing manager cares about which campaigns drive the best cohorts. Your finance team cares about LTV projections for budget planning.
Focus on one clear takeaway per cohort chart rather than overwhelming stakeholders with every possible cut of the data.
Reading A Cohort Chart Without Fooling Yourself
Cohort charts look scientific, but they're easy to misread if you're not careful. The patterns you think you're seeing might be data artifacts, seasonal noise, or survivorship bias disguised as insight.
Beware Of Reinstalls And Reattribution
Users who uninstall and reinstall your app can show up as new cohort members if your attribution logic isn't set up correctly. This inflates your install cohort sizes and makes retention look worse than reality since the "new" install is actually a returning user.
Reattribution windows also matter—if a user clicks a new ad campaign 45 days after their original install, some MMPs will reassign them to a new cohort. Make sure you're tracking first-touch attribution for cohort entry, not last-touch.
Normalise For Marketing Spend And Seasonality
Comparing a cohort from a week when you spent ₹5 lakh on ads to a cohort from a week when you spent ₹50 lakh creates a false comparison. Higher spend often brings lower-quality users at the margin, so retention naturally drops.
Seasonal patterns distort cohort comparisons too. Your Diwali cohort might look amazing because purchase intent was already high across the market, not because your campaigns suddenly got better.
Watch Sample Size And Survivorship Bias
A cohort of 50 users tells you almost nothing—random variance will dominate the signal. You generally want at least several hundred users per cohort before drawing conclusions, though the exact threshold depends on your baseline retention and conversion rates.
Survivorship bias creeps in when you only analyze cohorts that made it past a certain threshold. If you only look at purchase cohorts and ignore the 80% of installs who never bought, you'll overestimate your true customer LTV.
Common Mistakes In Cohort Retention Analysis And Fixes
Even experienced growth teams make cohort analysis errors that lead to wrong conclusions and wasted budget. You'll probably recognize at least one of the following.
Mixing Calendar Dates With Cohort Age
Calendar-based reporting shows what happened on specific dates—like total revenue on March 15. Cohort-based reporting shows what happened at specific lifecycle stages—like revenue 30 days after first purchase. Mixing both perspectives in one chart creates confusion.
Always label your axes clearly: is the column "March 2024" or "Month 3 after purchase"? They're fundamentally different questions.
Comparing Different Funnel Steps In One Chart
Putting install cohorts and purchase cohorts on the same heatmap makes no sense because the denominators are different. Your March install cohort might have 10,000 users while your March purchase cohort has 1,500 customers.
Keep install cohort analysis separate from purchase cohort analysis, then connect insights across both reports in your summary.
Ignoring SKAN And Offline Events
iOS privacy changes mean a growing percentage of your installs come through Apple's SKAdNetwork with delayed, aggregated attribution. If you're only analyzing deterministic user-level cohorts, you're missing 30-50% of your iOS traffic.
Offline conversions—like users who see your ad, install your app, then purchase in a physical store—also won't appear in standard cohort reports unless you're stitching online and offline data together.
Tooling Checklist: From Spreadsheets To AI-Driven Cohort Analytics
You don't need expensive enterprise software to start analyzing cohorts, but the right tools make the difference between monthly manual reports and always-on insights.
Spreadsheet Template Basics
Start with a simple Google Sheets or Excel template if you're just learning cohort analysis or working with a small user base. Export raw data from your analytics platform, pivot it by cohort and time period, then calculate retention or revenue metrics manually.
This approach works fine for early-stage apps with under 50,000 monthly installs, though it breaks down quickly as data volume grows.
Product Analytics Platforms
Tools like Amplitude, Mixpanel, and CleverTap offer built-in cohort reporting with automatic data refresh. You define the cohort entry event once, and the platform continuously updates retention curves as new data arrives.
The limitation: product analytics platforms typically don't include marketing attribution data. You're analyzing user behavior without knowing which campaigns drove which cohorts.
Mobile Measurement Partners Like Linkrunner
MMPs unify attribution data with cohort analysis, connecting every install and in-app event back to the specific campaign, creative, and channel that drove it. Your cohort reports automatically show campaign-level performance without manual data stitching.
Linkrunner surfaces underperforming cohorts automatically through AI-powered insights—instead of building cohort reports manually each week, you get alerts when a new campaign's Day 7 retention drops below your baseline. Growth teams reallocate budget in days instead of weeks, turning cohort analysis from a reporting task into a real-time optimization tool.
Scale Winning Campaigns Faster With Linkrunner
Cohort analysis only creates value when it changes your decisions. The gap between insight and action is where most teams lose momentum—you spot that Campaign A delivers better cohorts than Campaign B, but by the time you've pulled the data, built the report, and gotten approval to shift budget, two more weeks have passed.
Linkrunner unifies attribution data across Meta, Google, TikTok, and other channels into cohort reports that update automatically. You see which campaigns drive high-retention cohorts and which burn budget on users who churn immediately, without reconciling screenshots and spreadsheets across five different dashboards.
The AI layer surfaces patterns you'd miss in manual analysis—like iOS cohorts from a specific ad set that looked fine at Day 3 but collapsed at Day 14, or Android purchase cohorts from influencer traffic that took longer to convert but delivered 2.3x higher LTV. Request a demo to see how Linkrunner turns cohort data into growth decisions.
FAQs About Defining Cohorts
How large does a cohort need to be for reliable retention insights?
Cohorts typically need at least several hundred users to produce statistically meaningful patterns, though the exact size depends on your app's baseline retention rates and conversion goals. Apps with very low retention might need larger cohorts to separate signal from noise.
What happens to users who reinstall my app after uninstalling?
Reinstalling users can be treated as new cohort members or tracked separately depending on your attribution settings and business objectives. Most teams assign reinstalls to their original cohort to maintain clean first-touch attribution and avoid inflating cohort sizes.
Can I switch from install-based to purchase-based cohorts without losing historical data?
You can maintain both cohort types simultaneously and gradually shift focus to purchase cohorts as your app matures and revenue becomes the primary growth metric. Historical install cohort data remains valuable for understanding early user behavior even after you start prioritizing purchase cohorts.