7 Critical Events Every Mobile Game Should Track from Day One

Lakshith Dinesh
Updated on: Jan 30, 2026
Your mobile game drove 200,000 installs last month. Your dashboard shows healthy DAU figures, average session times of 12 minutes, and players completing the tutorial. Then your UA lead asks: "What percentage of players who installed actually made an in-app purchase?" You dig into the data and find that only 1,800 players (0.9%) converted to paying users. The other 198,200 are playing for free, churning after a few sessions, or stuck at early game stages without progressing.
This is the measurement gap that separates profitable mobile games from those that burn UA budgets chasing install volume. Generic event tracking (installs, sessions, level completions) tells you people are playing. Monetisation-focused tracking tells you whether they're spending, which is the metric that determines whether your game survives.
Mobile games require specialised event taxonomies because the monetisation funnel is fundamentally different from ecommerce, fintech, or subscription apps. Players must experience core gameplay, validate enjoyment, reach engagement depth, encounter monetisation opportunities, and choose to spend real money on virtual goods. Each step requires specific measurement to identify where players churn and which acquisition sources drive spenders rather than freeloaders.
Why Generic Event Tracking Fails for Gaming (The ARPU Blindspot)
Most gaming analytics guides recommend tracking standard events: level_completed, session_started, tutorial_finished. These metrics create a dangerous blindspot around average revenue per user (ARPU), the metric that determines profitability.
Generic tracking might show 50,000 daily active users playing 3 sessions each. But 50,000 DAU generating zero revenue is a failing game. Meanwhile, 15,000 DAU with 3% payer conversion and ₹1,500 ARPPU (average revenue per paying user) generates ₹6.75 lakh daily. DAU without monetisation context is a vanity metric that misleads teams into thinking engagement equals success.
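To make the arithmetic concrete, here is a minimal sketch in Kotlin of daily revenue as DAU multiplied by payer conversion multiplied by ARPPU, using the illustrative numbers above. The function and figures are examples, not targets.

```kotlin
// Daily revenue = DAU x payer conversion rate x average revenue per paying user (ARPPU).
fun dailyRevenue(dau: Int, payerConversion: Double, arppu: Double): Double =
    dau * payerConversion * arppu

fun main() {
    // 50,000 DAU with zero monetisation earns nothing, however healthy engagement looks.
    println(dailyRevenue(dau = 50_000, payerConversion = 0.0, arppu = 0.0))       // 0.0

    // 15,000 DAU, 3% payer conversion, ₹1,500 ARPPU: roughly ₹6.75 lakh per day.
    println(dailyRevenue(dau = 15_000, payerConversion = 0.03, arppu = 1_500.0))  // 675000.0
}
```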
The ARPU blindspot emerges because gaming monetisation is highly concentrated. In most free-to-play games, 1-3% of players generate 50-80% of revenue. These "whales" and "dolphins" behave differently from free players, but generic tracking treats all players equally. A user who completes 50 levels without spending looks identical to a future whale who's about to spend ₹5,000 in the next week.
The 7 events below represent the minimum viable event taxonomy for any mobile game. They track player progression from onboarding through monetisation activation and long-term retention, revealing which acquisition sources drive players who actually spend money rather than players who simply churn after consuming free content.
Event #1: Tutorial Completed (Onboarding Success)
Event name: tutorial_completed
When to fire: When player finishes all required tutorial steps and enters main gameplay (not just skipping tutorial, but actually completing it)
Why it matters: Tutorial completion is gaming's activation event. Players who complete tutorials understand core mechanics and have committed initial time investment. This event enables measurement of onboarding effectiveness and early drop-off rates. Players who skip or abandon tutorials rarely return and almost never monetise.
Properties to track (see the instrumentation sketch after this list):
Tutorial duration (time to complete in seconds)
Steps completed (if multi-step tutorial)
Skipped any steps (yes/no)
Device performance (fps during tutorial, if measurable)
Install source (attribution data)
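Putting the list above together, a minimal instrumentation sketch might look like the following. The Analytics wrapper and property names are placeholders rather than any specific SDK's API; adapt them to whatever MMP or analytics tool you use. The same pattern, one function per event with properties as a map, carries through the six events that follow.

```kotlin
// Hypothetical analytics wrapper; substitute your MMP or product analytics SDK's track call.
object Analytics {
    fun track(event: String, properties: Map<String, Any?>) {
        println("track: $event $properties") // stand-in for the real SDK call
    }
}

// Fire once, when the player finishes the last required tutorial step and enters main gameplay.
fun onTutorialCompleted(
    durationSeconds: Int,
    stepsCompleted: Int,
    skippedAnySteps: Boolean,
    averageFps: Double?,     // null if device performance isn't measurable
    installSource: String?   // attribution data from your MMP, if available
) {
    Analytics.track(
        "tutorial_completed",
        mapOf(
            "tutorial_duration_seconds" to durationSeconds,
            "steps_completed" to stepsCompleted,
            "skipped_any_steps" to skippedAnySteps,
            "average_fps" to averageFps,
            "install_source" to installSource
        )
    )
}
```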
Benchmark targets: 65-80% of installs should complete the tutorial within the first session for well-designed games. Lower rates indicate the tutorial is too long or confusing, or that players are hitting technical issues. Higher rates (above 85%) suggest the tutorial might be too short to teach core mechanics, risking later confusion.
Common issues detected:
Drop-off at specific tutorial steps reveals UX problems or confusing mechanics. Long tutorial completion times (above 10 minutes for simple games) indicate friction. If certain acquisition channels show dramatically lower tutorial completion rates, those channels are attracting players who aren't genuinely interested in your game genre.
Optimisation opportunities:
Analyse exactly where players abandon tutorials to identify confusing steps. A/B test tutorial length to find optimal balance between education and engagement. Send re-engagement push notifications to players who installed but didn't complete tutorial within 24 hours.
Event #2: Level 5 Reached (Core Loop Validation)
Event name: level_5_reached
When to fire: When player reaches level 5, stage 5, or equivalent early milestone (adjust number based on your game's progression system)
Why it matters: Reaching level 5 validates that players understand and enjoy your core gameplay loop. Tutorial teaches mechanics; level 5 confirms players want to continue using those mechanics. This is typically the point where players have enough context to make informed decisions about whether your game is worth their continued time and potential money.
Properties to track:
Time to reach level 5 (hours/days since install)
Sessions played to reach level 5
Total playtime to reach level 5
Difficulty encountered (lives lost, retries, hints used)
In-app purchase exposed (was player shown IAP offer before level 5?)
Benchmark targets: 40-55% of players who complete tutorial should reach level 5 within 7 days. This represents core loop validation. Lower rates indicate core gameplay isn't compelling enough to drive continued engagement. Much lower rates from specific UA channels suggest those channels attract players who don't actually enjoy your game genre.
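As a rough sketch of how this benchmark is computed, the function below takes per-player milestone timestamps (which would come from your analytics warehouse) and returns the share of tutorial completers who reached level 5 within 7 days. The data shapes and names are assumptions for illustration.

```kotlin
import java.time.Duration
import java.time.Instant

// Per-player milestone timestamps; in practice these come from your analytics warehouse.
data class PlayerMilestones(
    val tutorialCompletedAt: Instant?,
    val level5ReachedAt: Instant?
)

// Share of tutorial completers who reached level 5 within 7 days of finishing the tutorial.
// Compare the result against the 40-55% benchmark above.
fun coreLoopValidationRate(players: List<PlayerMilestones>): Double {
    val completers = players.filter { it.tutorialCompletedAt != null }
    if (completers.isEmpty()) return 0.0
    val validated = completers.count { p ->
        val tutorial = p.tutorialCompletedAt ?: return@count false
        val level5 = p.level5ReachedAt ?: return@count false
        Duration.between(tutorial, level5) <= Duration.ofDays(7)
    }
    return validated.toDouble() / completers.size
}
```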
Common issues detected:
Long time-to-level-5 (over 5 days) indicates low engagement frequency even among retained players. High difficulty metrics (many retries) at specific levels before level 5 indicate progression barriers. If level 5 rate varies significantly by device type, performance issues may be causing drop-off on lower-end devices.
Optimisation opportunities:
Analyse which levels between tutorial and level 5 have highest drop-off to identify difficulty spikes or boring content. Create milestone rewards at level 5 to encourage continued play. Use level 5 completion as a trigger for first soft monetisation prompts (special offers, limited-time packs).
Event #3: First Session Over 10 Minutes (Engagement Depth)
Event name: first_session_10min
When to fire: When player completes their first single session exceeding 10 minutes of active gameplay
Why it matters: Session length indicates engagement depth. Many players install, play 2-3 minutes, and churn. Players who voluntarily spend 10+ minutes in a single session are genuinely engaged with your content. This event strongly correlates with long-term retention and monetisation likelihood.
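One way to fire this event reliably is a small accumulator that counts active play time and triggers once when the 10-minute threshold is crossed, as in the sketch below. The class and callback names are assumptions; persisting the "already fired" flag and wiring the track callback are up to your implementation.

```kotlin
// Accumulates active gameplay time within a session and fires first_session_10min exactly once.
// Persist the "already fired" flag per player so the event is never sent twice across sessions.
class SessionEngagementTracker(
    alreadyFired: Boolean,                                    // loaded from persistent storage
    private val track: (String, Map<String, Any?>) -> Unit    // your analytics SDK's track call
) {
    private var activeMillis = 0L
    private var fired = alreadyFired

    // Call from the game loop or on resume/pause boundaries with elapsed *active* play time.
    fun addActivePlayTime(deltaMillis: Long, sessionNumber: Int) {
        activeMillis += deltaMillis
        if (!fired && activeMillis >= 10 * 60 * 1000L) {
            fired = true                                      // remember to persist this as well
            track("first_session_10min", mapOf(
                "session_duration_minutes" to activeMillis / 60_000,
                "session_number" to sessionNumber
            ))
        }
    }
}
```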
Properties to track:
Exact session duration (minutes)
Content consumed (levels played, features used)
Session number (first ever vs later sessions)
Time of day (morning, afternoon, evening, night)
Day of week
Benchmark targets: 30-45% of players who complete tutorial should have at least one 10+ minute session within their first 7 days. Higher rates indicate highly engaging core loop. Lower rates suggest your game is designed for quick sessions or lacks depth to sustain longer engagement.
Common issues detected:
If long sessions happen only in first 1-2 days then disappear, content exhaustion or difficulty walls are causing disengagement. If long sessions concentrate on weekends but weekday sessions are short, your game may be too demanding for casual mobile play.
Optimisation opportunities:
Identify which game features drive longest sessions to expand those areas. Create session length-based rewards to encourage longer play ("Play 15 minutes for bonus reward"). Analyse session length patterns by acquisition source to identify channels driving deeply engaged players.
Event #4: First IAP (Monetisation Activation)
Event name: first_iap_completed
When to fire: When player makes their first-ever in-app purchase (any amount, any item)
Why it matters: First IAP is gaming's most critical monetisation event. It marks the transition from free player to paying customer. Players who make first IAP are 10-15x more likely to make subsequent purchases than players who've never paid. This event enables calculation of install-to-payer conversion rate and ARPU by acquisition channel.
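A minimal sketch of the firing logic, assuming a hypothetical track callback and a hasEverPaid flag read from server-side or persisted state: every purchase sends a regular purchase event, and the very first one additionally sends first_iap_completed.

```kotlin
// Fires first_iap_completed exactly once per player, alongside the regular purchase event,
// so install-to-payer conversion can be measured. `hasEverPaid` must come from server-side
// or persisted state, not from the current session alone.
fun onPurchaseCompleted(
    hasEverPaid: Boolean,
    amountInr: Double,
    itemId: String,
    playerLevel: Int,
    daysSinceInstall: Int,
    triggerContext: String,                              // e.g. "special_offer", "paywall", "organic_browse"
    track: (String, Map<String, Any?>) -> Unit
) {
    val properties = mapOf(
        "purchase_amount_inr" to amountInr,
        "item_purchased" to itemId,
        "player_level" to playerLevel,
        "days_since_install" to daysSinceInstall,
        "trigger_context" to triggerContext
    )
    track("iap_completed", properties)                   // every purchase
    if (!hasEverPaid) {
        track("first_iap_completed", properties)         // monetisation activation, once per player
    }
}
```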
Properties to track:
Purchase amount (₹)
Item purchased (starter pack, currency, cosmetic, power-up, subscription)
Player level at purchase
Sessions played before purchase
Days since install
Trigger context (what prompted purchase: special offer, paywall, organic browse)
Benchmark targets: 1-5% of players should make first IAP within 30 days for healthy free-to-play games. This varies significantly by genre. Casual games see 2-5% conversion, while mid-core games see 1-3%. Hardcore games can see 3-8% but with higher ARPU. Focus on payer conversion rate and ARPU, not just install volume.
Common issues detected:
Very early first IAPs (within 24 hours) might indicate predatory design that converts impulse spending but damages long-term value. Very late first IAPs (after 14+ days) suggest monetisation touchpoints aren't reaching players at optimal moments. Purchases concentrated on a single low-value item indicate limited monetisation depth.
Optimisation opportunities:
Send first_iap_completed events as postbacks to ad platforms (Meta, Google, TikTok) to optimise campaigns toward payers rather than installers. Create starter pack offers optimised for first purchase conversion. Analyse which content moments correlate with first purchase to place monetisation prompts strategically.
Event #5: D7 Active (Retention Milestone)
Event name: d7_retention_achieved
When to fire: When player returns to play on the 7th day after install (not 7 consecutive days, but active on Day 7 specifically)
Why it matters: D7 retention is gaming's standard retention benchmark. Players who return on Day 7 have formed habit around your game and are significantly more likely to become long-term players. This event enables calculation of D7 retention rate by cohort and acquisition source, which is essential for forecasting LTV and optimising UA.
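A sketch of the day-7 check, assuming you persist the install date and an "already fired" flag; the counting uses calendar days in the player's local timezone, and the names are illustrative.

```kotlin
import java.time.LocalDate
import java.time.temporal.ChronoUnit

// Fires d7_retention_achieved when the player opens the game exactly 7 calendar days after install.
// Use the player's local timezone consistently when deriving these dates.
fun checkD7Retention(
    installDate: LocalDate,
    today: LocalDate,
    alreadyFired: Boolean,                               // persist so the event fires at most once
    track: (String, Map<String, Any?>) -> Unit
) {
    val daysSinceInstall = ChronoUnit.DAYS.between(installDate, today)
    if (daysSinceInstall == 7L && !alreadyFired) {
        track("d7_retention_achieved", mapOf("days_since_install" to daysSinceInstall))
    }
}
```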
Properties to track:
Player level at D7
Total sessions played (D0-D7)
Total playtime (D0-D7)
IAP made (yes/no, total spent)
Days active in D0-D7 period (1-7)
Benchmark targets: 15-25% D7 retention is healthy for most mobile games. Casual games typically see 18-25%, while mid-core and hardcore games see 12-20% but with higher LTV per retained player. D7 retention below 10% indicates fundamental engagement problems that UA optimisation cannot fix.
Common issues detected:
D7 retention significantly below D1 retention (more than 50% decay) indicates poor content depth or progression design. If D7 retention varies dramatically by acquisition channel (some channels 25%, others 8%), you're paying equally for very different quality players.
Optimisation opportunities:
Create D7 return incentives (login bonuses that peak on Day 7, special events starting Day 7). Use D7 retention as a key metric for campaign optimisation alongside IAP events. Analyse what separates D7 retained players from churned players to identify engagement patterns worth encouraging.
Event #6: Social Feature Used (Viral Coefficient Indicator)
Event name: social_feature_used
When to fire: When player uses any social feature: friend invite, guild join, leaderboard share, multiplayer match, gift sent
Why it matters: Social feature usage indicates engagement depth and viral potential. Players who connect socially with your game are significantly more likely to retain long-term and become advocates who bring new players organically. This event measures viral coefficient contribution and social retention multipliers.
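If you want a single number for viral contribution, the usual k-factor approximation is invites sent per player multiplied by the invite acceptance rate. The sketch below is that formula and nothing more; most games sit well below k = 1, so treat it as a relative signal across cohorts rather than an absolute target.

```kotlin
// k-factor approximation: invites sent per active player x acceptance rate of those invites.
fun viralCoefficient(invitesSent: Int, invitesAccepted: Int, activePlayers: Int): Double {
    if (activePlayers == 0 || invitesSent == 0) return 0.0
    val invitesPerPlayer = invitesSent.toDouble() / activePlayers
    val acceptanceRate = invitesAccepted.toDouble() / invitesSent
    return invitesPerPlayer * acceptanceRate
}
```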
Properties to track:
Feature type (friend_invite, guild_join, leaderboard_share, gift_send, multiplayer_match)
Player level at usage
Days since install
Previous social feature uses (first time vs repeat)
Outcome (invite accepted, guild found, match completed)
Benchmark targets: 15-30% of D7 retained players should use at least one social feature within 14 days. Higher rates indicate strong social mechanics and engaged community. Lower rates suggest social features are hidden, unappealing, or the game doesn't benefit from social play.
Common issues detected:
High friend invite sends but low acceptance rates indicate your invite messaging or incentives aren't compelling to recipients. Guild features with low join rates suggest matchmaking or discovery problems. Social features used only by high-level players indicate premature exposure to new players.
Optimisation opportunities:
Prompt social features at natural moments (after achievement, during waiting time). Create two-sided incentives for friend invites (both inviter and invitee receive rewards). Track which social features correlate with highest retention and monetisation to prioritise those in UX.
Event #7: Ad Watched (Hybrid Monetisation Signal)
Event name: rewarded_ad_watched
When to fire: When player voluntarily watches a rewarded video ad to completion
Why it matters: For games using hybrid monetisation (IAP plus ads), rewarded ad views generate revenue from non-paying players and serve as a leading indicator of potential IAP conversion. Players willing to watch ads for rewards are engaged and value your in-game currency, making them candidates for future IAP conversion.
Properties to track:
Ad network (AdMob, Unity, ironSource, etc.)
Reward type received (currency, lives, power-up, skip)
Player level
IAP status (never paid, has paid before)
Session ad watch count
Daily ad watch count
Benchmark targets: 40-60% of daily active players should watch at least one rewarded ad in games with well-implemented ad monetisation. Ad views per player per day typically range from 2-6. Higher engagement indicates ad placements and rewards are well-designed. Very low ad engagement suggests rewards aren't valuable or placements are poorly positioned.
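These benchmarks reduce to two numbers you can compute from daily event counts, sketched below; the data shapes are assumptions for illustration.

```kotlin
// Daily rewarded-ad engagement: share of DAU with at least one completed rewarded ad,
// and average completed views per engaged player.
data class AdEngagement(val engagedShare: Double, val viewsPerEngagedPlayer: Double)

fun rewardedAdEngagement(dau: Int, adViewsByPlayer: Map<String, Int>): AdEngagement {
    val engaged = adViewsByPlayer.count { it.value > 0 }
    val totalViews = adViewsByPlayer.values.sum()
    return AdEngagement(
        engagedShare = if (dau > 0) engaged.toDouble() / dau else 0.0,                     // benchmark: 40-60%
        viewsPerEngagedPlayer = if (engaged > 0) totalViews.toDouble() / engaged else 0.0  // typically 2-6
    )
}
```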
Common issues detected:
Players watching the maximum allowed ads daily but not converting to IAP may be "ad whales" whom you should monetise differently (ad-focused experiences). Ad watches concentrated at specific game moments indicate those moments create a strong need for rewards. Very high ad views from low-retention players might indicate your free experience is too ad-dependent.
Optimisation opportunities:
Position rewarded ads at friction points where players need help (after level failure, before difficult content). Create ad-to-IAP conversion paths ("Watch 5 ads for X, or buy X for ₹50"). Segment players by ad engagement to identify candidates for IAP conversion campaigns.
Attribution Windows for Gaming: Why D1/D7/D30 Cohorts Matter
Gaming has the shortest consideration period of any mobile app category. Players decide within minutes whether they'll continue playing. However, monetisation decisions take longer. Attribution windows for gaming should balance fast install-to-session conversion with longer monetisation journeys.
Recommended gaming attribution configuration:
Click-through attribution: 7 days
View-through attribution: 1 day
Primary optimisation event: first_iap_completed (not install)
The 7-day click window captures players who see your ad, research or forget, then install days later. The 1-day view window avoids over-crediting awareness impressions for organic installs.
Cohort analysis timeframes:
D1: Tutorial completion, first session length, early engagement quality
D7: Retention milestone, level progression, monetisation readiness
D14: First IAP timing, social feature adoption
D30: LTV stabilisation, repeat purchase patterns, whale identification
Track these cohort metrics by acquisition source. A campaign driving 50,000 installs with 15% D7 retention and a 1.5% D30 payer rate performs very differently from a campaign driving 20,000 installs with 25% D7 retention and a 3% D30 payer rate, even if both have similar CPI.
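A small sketch of that comparison, assuming the same ₹30 CPI for both campaigns (the CPI is invented for illustration): it converts installs and payer rates into D30 payers and effective cost per payer.

```kotlin
// Compare acquisition campaigns on D30 payers and cost per payer rather than raw installs.
data class CampaignCohort(val name: String, val installs: Int, val d7Retention: Double,
                          val d30PayerRate: Double, val cpiInr: Double)

fun summarise(c: CampaignCohort): String {
    val payers = c.installs * c.d30PayerRate
    val costPerPayer = (c.installs * c.cpiInr) / payers
    return "${c.name}: ${c.installs} installs, ${(c.d7Retention * 100).toInt()}% D7, " +
        "${payers.toInt()} D30 payers, ₹${"%.0f".format(costPerPayer)} per payer"
}

fun main() {
    // The two campaigns from the paragraph above, both at an assumed ₹30 CPI.
    listOf(
        CampaignCohort("Campaign A", 50_000, 0.15, 0.015, 30.0),
        CampaignCohort("Campaign B", 20_000, 0.25, 0.030, 30.0)
    ).forEach { println(summarise(it)) }
    // Campaign A: 750 payers at ₹2,000 per payer; Campaign B: 600 payers at ₹1,000 per payer
    // despite driving 2.5x fewer installs.
}
```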
Most critically, send IAP events back to ad platforms via your MMP. Platforms like Linkrunner enable postback configuration for first_iap_completed and subsequent purchase events, allowing Meta, Google, and TikTok to optimise campaigns toward paying players rather than just installers.
Frequently Asked Questions
How do I track progression in games without traditional "levels"?
Define equivalent milestones based on your game structure. For endless runners, use score thresholds (first 1,000 points, first 10,000 points). For sandbox games, use content milestones (first building constructed, first quest completed). For multiplayer games, use rank or MMR thresholds. The key is identifying early milestones that correlate with long-term retention.
Should I track every level completion or just milestones?
Track milestone levels (1, 5, 10, 20, 50, 100) as distinct events for funnel analysis. Track all level completions as a generic event with level_number property for detailed progression analysis. This balances funnel clarity with analytical depth without creating hundreds of separate events.
What's the best way to measure whale potential early in player lifecycle?
Create a predictive scoring model based on early behaviours that correlate with future whale status: tutorial completion speed, session length, progression pace, first IAP timing and amount, ad engagement patterns. Players who exhibit 3+ whale-correlated behaviours in first 7 days deserve special attention (VIP treatment, premium support, targeted offers).
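A deliberately simple version of such a score just counts whale-correlated behaviours and flags players who cross a threshold. The behaviours and cut-offs below are illustrative assumptions and should be calibrated against your own payer data.

```kotlin
// Early-lifecycle behaviours observed in a player's first 7 days (illustrative fields).
data class EarlyBehaviour(
    val tutorialDurationSeconds: Int,
    val longestSessionMinutes: Int,
    val levelReachedByDay7: Int,
    val madeIapInFirst7Days: Boolean,
    val rewardedAdsWatched: Int
)

// Count whale-correlated signals; thresholds are placeholders, not tuned values.
fun whaleSignalCount(b: EarlyBehaviour): Int = listOf(
    b.tutorialDurationSeconds < 300,     // finished the tutorial quickly
    b.longestSessionMinutes >= 10,       // at least one deep session
    b.levelReachedByDay7 >= 5,           // fast progression
    b.madeIapInFirst7Days,               // early monetisation
    b.rewardedAdsWatched >= 3            // values in-game currency
).count { it }

// Players with 3+ signals are candidates for VIP treatment, premium support, and targeted offers.
fun isWhaleCandidate(b: EarlyBehaviour): Boolean = whaleSignalCount(b) >= 3
```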
How do I attribute revenue when players make purchases across multiple sessions/days?
Each purchase event should carry the original acquisition attribution from install. Your MMP handles this automatically. For LTV analysis, aggregate all purchase events per user and associate total LTV with the campaign that drove their install. This reveals which campaigns drive high-LTV players, not just which campaigns drive first purchases.
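As a sketch of that aggregation, assuming each purchase record already carries the install-time attribution your MMP stamped on the player:

```kotlin
// Roll all purchase revenue up per player, then up to the campaign that drove the install.
data class Purchase(val playerId: String, val installCampaign: String, val amountInr: Double)

// Average LTV per paying player by install campaign. Divide campaign revenue totals by
// installs instead if you want LTV per install rather than per payer.
fun ltvByCampaign(purchases: List<Purchase>): Map<String, Double> =
    purchases
        .groupBy { it.playerId }                                   // total spend per player
        .map { (_, rows) -> rows.first().installCampaign to rows.sumOf { it.amountInr } }
        .groupBy({ it.first }, { it.second })                      // group player LTVs by campaign
        .mapValues { (_, ltvs) -> ltvs.average() }
```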
Should I optimise ad campaigns for installs or for first IAP?
Optimise for first_iap_completed when possible. This tells ad platforms to find users similar to your payers, not just users who install games. However, first IAP requires minimum conversion volume (typically 50-100 conversions per week per campaign) for algorithm optimisation to work. For new games, start with install optimisation and switch to IAP optimisation once you have sufficient conversion volume.
How do I measure the LTV impact of different in-game events?
Create cohort comparisons. Compare D30 and D60 LTV between players who did vs didn't achieve specific milestones (reached level 5 in first 3 days vs took longer, watched first rewarded ad vs never watched, joined guild vs solo player). This reveals which early behaviours predict higher lifetime value, informing game design and UA targeting decisions.
If you're building a mobile game and want to connect player behaviour events to acquisition sources for true ROAS measurement, request a demo from Linkrunner to see how unified attribution and monetisation tracking works in practice.




