Attribution for Product-Led Growth: Measuring Virality, Referrals, and Organic Loops

Lakshith Dinesh


Updated on: Feb 18, 2026

A community app hits 500,000 monthly installs. The marketing team celebrates. Then someone asks: "How much of this is from paid campaigns?" The answer: 35%. The remaining 65% shows up as "organic" in the MMP dashboard, a single grey bar with no breakdown, no channel detail, and no actionable insight.

That 65% includes user referrals, share-driven viral loops, word-of-mouth recommendations, search installs from content, and network effects pulling in adjacent users. But because none of these touchpoints involve a paid click, traditional attribution treats them identically: unmeasured.

For product-led growth (PLG) apps, this is not a minor reporting gap. It's a fundamental measurement failure. When 60-80% of your growth comes from organic and viral channels, optimising exclusively on the 20-40% you can attribute through paid campaigns means you're steering the business using a fraction of the picture.

This guide provides a framework for measuring viral loops, referral attribution, organic growth channels, and network effects so PLG teams can finally understand what's actually driving their growth and invest accordingly.

Why Paid Attribution Misses 80% of PLG Growth (The Organic Blind Spot)

Traditional mobile attribution was built for a specific model: marketer buys ad, user sees ad, user clicks ad, user installs app, MMP attributes install to ad. This linear, paid-click model works well for apps where the majority of growth comes from paid user acquisition.

PLG apps break this model because their primary growth engine isn't advertising. It's the product itself.

Consider how a messaging app grows. User A invites User B through the app's share feature. User B installs and invites Users C and D. Users C and D each invite two more friends. Within a week, one organic user has generated seven additional installs, none of which involved a paid ad click.

In a standard MMP dashboard, all seven installs appear as "organic" with identical attribution. You can't distinguish referral-driven installs from App Store browse installs from direct-URL installs. You can't measure which users are generating the most referrals, which share mechanics are most effective, or whether your viral coefficient is improving over time.

The financial consequence is severe. If a PLG app allocates ₹40 lakh monthly to paid UA based on install-level ROAS but doesn't account for the organic multiplier effect, it might be dramatically overvaluing some channels and undervaluing others. A Meta campaign driving 10,000 installs might generate 3,000 additional organic installs through referrals from those paid users, a multiplier that transforms a 1.5× ROAS into an effective 2.0× when accounting for downstream virality.

Without organic attribution, that multiplier stays invisible.

The PLG Attribution Challenge: Measuring What You Don't Pay For

PLG attribution requires measuring five distinct organic growth drivers that traditional MMPs either ignore or collapse into a single "organic" category.

Each driver has different measurement mechanics, different time horizons, and different optimisation levers. Treating them identically is like treating Meta, Google, and TikTok as one channel because they all involve "digital ads."

The five drivers are: direct referrals (user-to-user invites), viral loops (share features and social mechanics), network effects (value that grows with user base), word-of-mouth (untracked social recommendations), and content/SEO (search-driven installs). Each requires specific attribution approaches, and the most effective PLG teams measure all five independently.

Organic Growth Driver #1: Direct Referrals (User-to-User Invites)

Direct referrals are the most measurable organic growth channel. A user explicitly invites another user through an in-app referral mechanism, typically involving a unique link, a referral code, or a share action.

How to measure it:

Generate unique referral links for each user containing a referral identifier. When the invited user installs and opens the app, the referral link resolves through your deep linking infrastructure, connecting the new install to the referring user.

The attribution chain looks like this: Referring User (ID: 12345) → Generates referral link → Invited User clicks link → Installs app → Opens app → Deep link resolves → Install attributed to User 12345.

This requires dynamic deep links that preserve referral context through the install process. Standard app store links lose all referral parameters during installation. Deferred deep links maintain context even when the app isn't installed at click time, which is essential for referral tracking.
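
As a sketch, the attribution chain above looks like this in code. The domain, parameter names, and resolver here are hypothetical, not any particular SDK's API; in production, the deferred resolution step is handled by your deep linking provider rather than by parsing the URL yourself.

```python
from urllib.parse import urlencode, urlparse, parse_qs

BASE_LINK = "https://yourapp.example/r"  # hypothetical deep link domain

def build_referral_link(referrer_id: str, context: str) -> str:
    """Create a unique referral link carrying the referrer's ID and context."""
    params = {"ref": referrer_id, "ctx": context}
    return f"{BASE_LINK}?{urlencode(params)}"

def resolve_referral(link: str) -> dict:
    """On first app open, resolve the (deferred) link back to its referrer.
    Parsing the URL directly here just illustrates the data flow."""
    qs = parse_qs(urlparse(link).query)
    return {"referrer_id": qs["ref"][0], "context": qs["ctx"][0]}

link = build_referral_link("12345", "invite_screen")
attribution = resolve_referral(link)
# attribution["referrer_id"] == "12345"
```

The essential property is that the same identifiers survive the round trip from share action to first app open, even when an App Store install sits in between.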

Key metrics to track:

  • Referral send rate: Percentage of active users who send at least one referral per month

  • Referral conversion rate: Percentage of referral link clicks that result in installs

  • Referral-to-activation rate: Percentage of referred users who complete onboarding

  • Referral chain depth: How many "generations" of referral do you see? (User A refers B, B refers C, C refers D)

Benchmarks:

  • Healthy referral send rates range from 8-25% of MAU depending on product category

  • Referral click-to-install conversion: 15-35% (significantly higher than paid ad conversion)

  • Referred users typically show 25-40% higher D7 retention than paid-acquired users

Organic Growth Driver #2: Viral Loops (Share Features and Social Mechanics)

Viral loops differ from direct referrals in one critical way: the sharing action isn't explicitly "invite a friend." Instead, users share content, results, or experiences from the app, and those shares indirectly drive new installs.

Examples include a fitness app user sharing their workout results on Instagram Stories, a gaming app generating shareable score cards, a financial app letting users share savings milestones, or an edtech app creating shareable quiz results.

How to measure it:

Each share action should generate a tracked link containing the sharing user's ID and the content context. When someone clicks the shared content and installs the app, attribute the install to the viral loop, the specific content type, and the sharing channel (WhatsApp, Instagram, Twitter, etc.).

Key metrics to track:

  • Share rate: Percentage of active sessions that result in a share action

  • Share-to-install rate: Percentage of share link interactions that result in installs

  • Viral content effectiveness: Which content types (scores, achievements, results, offers) drive the most installs per share

  • Channel effectiveness: Which social platforms generate the highest share-to-install conversion

Benchmarks:

Share rates vary dramatically by product category. Gaming apps with social mechanics see 5-15% session share rates. Content apps see 2-8%. Utility apps see 1-3%.

Share-to-install conversion is typically lower than referral conversion (5-15% vs 15-35%) because the intent is weaker. Someone seeing a friend's workout stats is less motivated to install than someone receiving a direct personal invitation.

Organic Growth Driver #3: Network Effects (Value That Grows with User Base)

Network effects create organic growth that's difficult to attribute to specific actions because the growth driver is the product's increasing utility as more users join.

A messaging app becomes more useful as more of your contacts join. A marketplace app becomes more valuable as more sellers list products. A community app becomes more engaging as more members participate.

How to measure it:

Network effect attribution is indirect. You can't attribute a specific install to "network effects" the way you attribute to a referral link. Instead, measure the proxy signals.

  • Organic install velocity: Track organic installs per week. If organic installs accelerate while paid spend holds constant, network effects are kicking in.

  • Contact overlap: For communication apps, measure what percentage of new users already have contacts on the platform. Higher overlap correlates with network-effect-driven installs.

  • Marketplace density: For marketplace apps, track organic installs relative to seller or listing growth. More supply drives more demand-side installs organically.

  • Geographic clustering: Network effects often manifest geographically. Track organic install density by city or region. Rapid acceleration in specific areas indicates network effects.

Key metrics to track:

  • Organic multiplier: Ratio of organic installs to paid installs per month (increasing ratio signals growing network effects)

  • Contact graph density: Percentage of new users with 3+ existing contacts on platform

  • Organic install acceleration: Week-over-week organic install growth rate at constant paid spend

Organic Growth Driver #4: Word-of-Mouth (Untracked Social Recommendations)

Word-of-mouth is the organic growth channel that resists direct attribution. Someone mentions your app in conversation, a friend searches for it in the App Store, installs it, and your MMP reports it as an organic install. No link was clicked. No referral code was used. The recommendation happened in the physical world or through untracked digital conversations.

How to measure it (indirectly):

You cannot directly attribute word-of-mouth installs, but you can estimate their volume and track proxy signals.

Method 1: Residual analysis. Total organic installs minus referral-attributed installs minus viral-loop-attributed installs minus search/SEO-attributed installs equals estimated word-of-mouth. This is imprecise but directionally useful.
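
As a sketch, Method 1 is a subtraction with a guard for double-counting; the install counts below are hypothetical:

```python
def estimate_word_of_mouth(total_organic: int, referral: int,
                           viral_share: int, search_seo: int) -> int:
    """Residual analysis: organic installs left over after every tracked
    organic channel is subtracted are a rough word-of-mouth proxy."""
    residual = total_organic - referral - viral_share - search_seo
    # A negative residual means channels are being double-counted upstream.
    return max(residual, 0)

# Hypothetical month: 65,000 total organic installs
wom = estimate_word_of_mouth(65_000, referral=22_000,
                             viral_share=15_000, search_seo=20_000)
# wom == 8_000, directionally useful but imprecise
```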

Method 2: Post-install surveys. Ask new users "How did you hear about us?" within the onboarding flow. Options: "Friend recommended," "Saw on social media," "Searched in App Store," "Saw an ad." Survey data combined with attribution data reveals word-of-mouth volume.

Method 3: Branded search correlation. Track branded keyword search volume (App Store and Google) alongside your marketing activity. Increases in branded searches during periods of constant paid spend indicate word-of-mouth driving awareness.

Organic Growth Driver #5: Content and SEO (Search-Driven Installs)

Content-driven installs come from users who discover your app through blog posts, social media content, app store search (ASO), or web search results.

How to measure it:

Web-to-app attribution tracks users who land on your website through SEO and then install your app. This requires web SDK implementation that connects the web session to the app install through deferred deep linking.

App Store Optimisation (ASO) attribution is partially available through App Store Connect and Google Play Console analytics, which show install sources including search terms. However, connecting specific search terms to downstream in-app behaviour requires MMP integration.

Key metrics to track:

  • Web-to-app conversion rate: Percentage of mobile web visitors who install the app

  • ASO-driven installs: Installs from App Store/Play Store search (branded vs non-branded)

  • Content-to-install path: Which blog posts or web pages drive the most app installs

For teams implementing this tracking, understanding how deep linking drives conversion from web to app is essential. Without deep links preserving context from web content to app experience, web-to-app conversion rates typically drop 40-60%.

Attribution Taxonomy for PLG: Separating Paid, Organic, and Viral

Building a PLG attribution taxonomy requires expanding beyond the standard paid/organic binary. Here's a practical taxonomy that separates meaningful organic channels.

Level 1: Paid vs Organic

  • Paid: Any install with a tracked paid media touchpoint (Meta, Google, TikTok, etc.)

  • Organic: Any install without a paid media touchpoint

Level 2: Organic breakdown

  • Referral: Install attributed to a unique referral link from an existing user

  • Viral Share: Install attributed to a content share link (not explicit invite)

  • Search (Branded): Install from App Store/Play Store search on your brand name

  • Search (Non-Branded): Install from App Store/Play Store search on category terms

  • Web-to-App: Install from a user who visited your website first

  • Direct: Install with no identifiable touchpoint (includes word-of-mouth, untracked recommendations)

Level 3: Source detail

  • Referral → by referring user tier (power referrers vs casual referrers)

  • Viral Share → by content type (score cards, achievements, offers, results)

  • Viral Share → by platform (WhatsApp, Instagram, Twitter, SMS)

  • Web-to-App → by landing page or content piece

This taxonomy transforms "65% organic" from a useless grey bar into actionable intelligence: "22% referral, 15% viral share, 12% branded search, 8% web-to-app, 8% direct."
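
A classifier for this taxonomy can be a short priority chain. The field names below are illustrative rather than any particular MMP's schema, and the branded-search check assumes a hypothetical brand string:

```python
def classify_install(touchpoint: dict) -> str:
    """Map an install's known touchpoint data to the Level 1/2 taxonomy."""
    if touchpoint.get("paid_network"):        # Meta, Google, TikTok, ...
        return "paid"
    if touchpoint.get("referral_link"):       # unique invite link resolved
        return "referral"
    if touchpoint.get("share_link"):          # content share, not explicit invite
        return "viral_share"
    if touchpoint.get("store_search_term"):
        term = touchpoint["store_search_term"].lower()
        # "yourapp" stands in for your actual brand name
        return "search_branded" if "yourapp" in term else "search_nonbranded"
    if touchpoint.get("web_session"):         # visited your site first
        return "web_to_app"
    return "direct"  # residual bucket: word-of-mouth, untracked recommendations
```

The order of the checks encodes the source priority: a paid touchpoint outranks everything, and "direct" is only ever assigned by elimination.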

Measuring Viral Coefficient: How Many New Users Does Each User Bring?

The viral coefficient (K-factor) is the fundamental metric for PLG growth. It answers: for every user who joins, how many additional users do they bring?

K-factor formula:

K = (invitations sent per user) × (conversion rate per invitation)

If the average user sends 5 invitations and 20% of those invitations result in installs: K = 5 × 0.20 = 1.0

A K-factor of 1.0 means each user generates exactly one new user, creating self-sustaining growth (zero marginal acquisition cost for organic growth). A K-factor above 1.0 means exponential growth. Below 1.0 means organic growth decays without continued paid acquisition.
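
Computed from three raw counters, using the worked example's numbers (5 invitations per user, 20% conversion), the formula is a couple of lines; note that it algebraically reduces to installs-from-invites divided by active users:

```python
def k_factor(invites_sent: int, active_users: int,
             installs_from_invites: int) -> float:
    """K = (invitations per user) x (conversion rate per invitation)."""
    invites_per_user = invites_sent / active_users          # e.g. 5.0
    conversion_rate = installs_from_invites / invites_sent  # e.g. 0.20
    return invites_per_user * conversion_rate

# 50,000 users send 250,000 invites; 50,000 of those invites convert
k = k_factor(invites_sent=250_000, active_users=50_000,
             installs_from_invites=50_000)
# k == 1.0: each user generates exactly one new user
```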

Realistic K-factor ranges:

  • Most consumer apps: 0.15-0.40

  • Strong viral products (messaging, social): 0.40-0.80

  • Exceptional viral products: 0.80-1.5

  • Sustained K > 1.0 is rare and typically temporary (launch phase, feature virality)

How to improve K-factor:

Increase invitations per user by reducing friction in sharing flows, adding contextual share prompts, and incentivising referrals. Increase conversion per invitation by improving referral landing experiences, using deferred deep links to route invited users to relevant content, and ensuring the first-session experience matches the referral promise.

Referral Attribution: Tracking Invite Sender to New User Install

Referral attribution connects three data points: who sent the invite, who accepted the invite, and what happened after acceptance.

Implementation requirements:

  1. Unique link generation: Each referral action creates a unique link containing the referring user's ID and optional context (which feature prompted the share, what content was shared).

  2. Deferred deep linking: The referral link must survive the app install process. User clicks link → directed to App Store → installs app → opens app → link context resolves, attributing the install to the referring user.

  3. Event chaining: After referral attribution, track the referred user's journey: onboarding completion, activation, first purchase, retention at D7/D30. This reveals whether referrals drive quality users or just installs.

  4. Referrer scoring: Identify your most effective referrers. Typically, 10-15% of users drive 60-80% of referral installs. These "super referrers" deserve different treatment (higher rewards, exclusive features, early access).
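
Referrer scoring (point 4) can be sketched as a concentration query: sort users by referred installs and keep the smallest head of the list that covers a target share. The user IDs and the 60% threshold below are illustrative:

```python
def super_referrers(referral_counts: dict, coverage: float = 0.6) -> list:
    """Return the smallest set of users accounting for `coverage`
    of all referral installs. referral_counts: user_id -> installs."""
    total = sum(referral_counts.values())
    running, top = 0, []
    for user, n in sorted(referral_counts.items(), key=lambda kv: -kv[1]):
        top.append(user)
        running += n
        if running >= coverage * total:
            break
    return top

counts = {"u1": 60, "u2": 25, "u3": 10, "u4": 5}
top = super_referrers(counts)
# top == ["u1"]: one user covers 60% of the 100 referral installs
```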

Referral quality benchmarks:

Referred users typically outperform paid users across lifecycle metrics. Expect 25-40% higher D7 retention, 15-30% higher activation rates, and 20-35% higher first-purchase conversion. These differences compound over time, making referral the highest-quality acquisition channel for most PLG apps.

For teams building cohort analysis around acquisition channels, separating referral cohorts from organic and paid cohorts reveals the true value of viral mechanics.

Channel Mix Reality: Understanding True Paid vs Organic vs Viral Split

Most PLG apps misunderstand their channel mix because they measure only direct attribution (paid clicks → installs) and ignore indirect attribution (paid users → referrals → organic installs).

The attribution cascade:

Paid campaign drives 10,000 installs in January. Of those 10,000 users, 2,500 send referrals in their first 30 days. Those referrals generate 1,200 additional installs. Of those 1,200 referred users, 300 send their own referrals, generating 120 more installs.

Direct attribution says: 10,000 paid installs.

True attribution says: 10,000 paid + 1,320 referral cascade = 11,320 total installs driven by the paid campaign.

The effective CPI drops by 12% when you account for the referral multiplier. The effective ROAS increases proportionally.

How to calculate your organic multiplier:

  1. Track referral installs attributed to users acquired through paid campaigns

  2. Track viral share installs attributed to content shared by paid-acquired users

  3. Sum paid installs + referral cascade + viral cascade

  4. Divide total by paid installs = organic multiplier

A healthy PLG app might show an organic multiplier of 1.15-1.40×, meaning every 1,000 paid installs generate 150-400 additional organic installs through referral and viral mechanics.
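
The four steps reduce to a few lines, reusing the January cascade figures from above:

```python
def organic_multiplier(paid_installs: int, cascade_installs: list) -> float:
    """Steps 3-4: (paid + referral/viral cascade) divided by paid installs."""
    return (paid_installs + sum(cascade_installs)) / paid_installs

# January: 10,000 paid installs, then 1,200 + 120 referred downstream
m = organic_multiplier(10_000, [1_200, 120])
# m == 1.132; effective CPI falls by 1 - 1/1.132, roughly 11.7%
```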

K-Factor and Viral Loop Cycle Time: Quantifying Exponential Growth Potential

K-factor alone doesn't tell the full story. Cycle time, how long it takes for one generation of referrals to produce the next, determines whether viral growth is fast enough to compound.

Viral cycle time formula:

Time from User A's install → User A sends referral → User B installs → User B sends referral

Short cycle times (1-3 days) create rapid compounding. Long cycle times (14-30 days) mean organic growth accumulates slowly even with high K-factors.

Example:

  • App A: K = 0.5, cycle time = 2 days. By day 30, roughly fifteen referral generations have landed, so 1,000 users become approximately 2,000, close to the compounding ceiling of 1/(1 − 0.5) = 2×.

  • App B: K = 0.7, cycle time = 21 days. By day 30, only one generation has landed, so 1,000 users become approximately 1,700, far below its eventual ceiling of roughly 3.3×.

App A grows faster over the first month despite a lower K-factor because its shorter cycle time packs in many more compounding iterations.
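
The effect of cycle time can be checked with a toy discrete-generation model, assuming generation i of referred installs lands exactly at i × cycle-time days. Real cohorts are messier, so treat the totals as illustrative rather than a forecast.

```python
def viral_total(seed_users: int, k: float, cycle_days: int,
                horizon_days: int) -> float:
    """Total users after horizon_days under an idealised model where
    each referral generation lands one full cycle after the previous."""
    generations = horizon_days // cycle_days
    cohort, total = float(seed_users), 0.0
    for _ in range(generations + 1):  # include the seed cohort itself
        total += cohort
        cohort *= k                   # each generation is K times the last
    return total

# Same K-factor, different cycle times, 30-day horizon:
fast = viral_total(1_000, k=0.5, cycle_days=2, horizon_days=30)   # ~2,000
slow = viral_total(1_000, k=0.5, cycle_days=10, horizon_days=30)  # 1,875
```

With identical K-factors, the 2-day loop completes fifteen generations in a month while the 10-day loop completes three, which is the whole argument for optimising cycle time and not just K.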

Optimising cycle time:

  • Prompt referrals during high-engagement moments (post-achievement, post-purchase, onboarding)

  • Reduce friction in the referral flow (one-tap sharing, pre-populated messages)

  • Ensure invited users activate quickly (deferred deep links to relevant content, streamlined onboarding)

Implementation Playbook: Setting Up PLG Attribution in Week One

Day 1-2: Audit Current Organic Visibility

Start by quantifying your organic blind spot. Pull your current channel mix from your MMP and calculate what percentage shows as unattributed "organic." If it exceeds 50%, you have significant measurement gaps.

Document all existing share and referral mechanics in your app. Map which ones generate tracked links and which ones don't.

Day 3-4: Implement Referral Link Tracking

Generate unique deep links for every referral action. Each link should contain: referring user ID, referral context (which feature/screen), and timestamp.

Configure deferred deep linking so referral context survives the install process. Test the complete flow: share → click → install → open → attribution resolves.

Day 5-6: Implement Viral Share Tracking

Add tracked deep links to all share features (score sharing, content sharing, achievement sharing). Include content type and sharing channel in link parameters.

Set up event tracking for: share_initiated, share_completed, share_link_clicked, share_install_completed.
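
With those four events in place, step-to-step conversion through the share funnel falls out of simple ratios (the counts below are hypothetical):

```python
events = {"share_initiated": 4_000, "share_completed": 3_200,
          "share_link_clicked": 1_600, "share_install_completed": 240}

def funnel_rates(events: dict, order: list) -> dict:
    """Conversion rate between each consecutive pair of funnel steps."""
    return {f"{a}->{b}": events[b] / events[a]
            for a, b in zip(order, order[1:])}

rates = funnel_rates(events, ["share_initiated", "share_completed",
                              "share_link_clicked", "share_install_completed"])
# click-to-install here is 0.15, at the top of the 5-15% share benchmark
```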

Day 7: Build PLG Attribution Dashboard

Create a dashboard showing your expanded attribution taxonomy:

  • Paid installs by channel

  • Referral installs (with referring user data)

  • Viral share installs (by content type and platform)

  • Branded search installs

  • Web-to-app installs

  • Unattributed organic (residual)

Track K-factor and cycle time weekly. Set alerts for significant changes in organic multiplier.

FAQ: Common PLG Attribution Questions Answered

How do you attribute installs from WhatsApp shares?

WhatsApp shares require tracked deep links embedded in the shared message. When the recipient clicks the link, standard deep link attribution applies. Without tracked links, WhatsApp shares appear as unattributed organic installs.

What attribution window should PLG apps use for referral tracking?

Referral attribution windows should be longer than paid attribution windows because referral links may sit in message threads for days or weeks before being clicked. A 30-day referral attribution window captures late conversions without over-attributing.

Can you measure K-factor for specific features?

Yes. Track referral and share actions by triggering feature. A feature-level K-factor analysis reveals which product mechanics drive the most organic growth, informing product development priorities.

How do you avoid double-counting when a user sees both a paid ad and a referral link?

Layer source-priority rules on top of last-touch attribution. If a user clicks a referral link and later sees a retargeting ad before installing, pure last-touch would credit the ad; a priority rule that ranks a deliberate referral click above a passive retargeting impression gives the referral the credit instead. Define clear priority rules for mixed-source scenarios and apply them consistently.
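
One way to encode such rules is a priority map consulted before recency. The ranking below is an example policy, not a standard; your own rules may differ.

```python
# Higher number = higher priority; ties are broken by recency (last-touch).
SOURCE_PRIORITY = {"referral": 3, "viral_share": 2, "paid": 1, "organic": 0}

def resolve_attribution(touchpoints: list) -> dict:
    """Pick the winning touchpoint: highest source priority first,
    most recent timestamp second."""
    return max(touchpoints,
               key=lambda t: (SOURCE_PRIORITY[t["source"]], t["ts"]))

journey = [
    {"source": "referral", "ts": 1},  # friend's invite link, clicked first
    {"source": "paid", "ts": 2},      # retargeting ad seen later
]
winner = resolve_attribution(journey)
# winner["source"] == "referral": the invite outranks the later ad view
```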

What's the minimum user base needed before viral metrics become meaningful?

K-factor calculations become statistically reliable at approximately 5,000-10,000 MAU. Below that, small sample sizes create noisy K-factor estimates that fluctuate week to week.

How Linkrunner Supports PLG Attribution

Platforms like Linkrunner provide PLG apps with the deep linking and attribution infrastructure needed to measure organic growth channels. Every link generated through Linkrunner is dynamic and deferred by default, meaning referral and share links preserve context through the install process without requiring separate deep linking tools.

For PLG teams specifically, the ability to generate unique tracked links at scale (for referral programs, share features, and web-to-app flows) while attributing downstream installs and in-app events back to the originating user or content creates the visibility needed to optimise viral mechanics.

At ₹0.80 per attributed install, PLG apps can track referral-driven and viral-driven installs alongside paid installs without the per-seat or per-link pricing that makes organic attribution prohibitively expensive on legacy platforms.

Key Takeaways

PLG attribution requires expanding beyond the paid/organic binary to measure five distinct organic growth drivers: direct referrals, viral shares, network effects, word-of-mouth, and content/SEO.

K-factor and viral cycle time are the two metrics that determine whether organic growth compounds fast enough to reduce paid acquisition dependency.

Referred users consistently outperform paid-acquired users on retention, activation, and revenue metrics. Measuring referral quality by source justifies continued investment in product-led growth mechanics.

The organic multiplier (the additional installs generated through referral cascades from paid users) reveals the true ROI of paid campaigns in PLG apps. Without measuring it, you're systematically undervaluing your best acquisition channels.

For PLG teams ready to measure virality, referrals, and organic loops alongside paid attribution, request a demo from Linkrunner to see how unified deep linking and attribution can reveal the growth drivers hiding behind your "organic" bar.

Empowering marketing teams to make better data-driven decisions to accelerate app growth!

For support, email us at

Address: HustleHub Tech Park, sector 2, HSR Layout,
Bangalore, Karnataka 560102, India
