Why Your Meta ROAS Doesn't Match Your MMP Data (And How to Fix It)


Lakshith Dinesh


Updated on: Jan 7, 2026

Your Monday morning growth meeting starts the same way it has for the past month. Your Meta Ads Manager shows a 3.2x ROAS on your top-performing campaign. Your MMP dashboard reports 1.8x for the same campaign, same time period, same conversion event. Your CFO asks which number is real. Your performance lead says "it's complicated." And another week of budget decisions gets delayed because no one trusts the data.

This isn't a minor reporting quirk. When Meta ROAS and MMP attribution data diverge by 40-70%, it creates paralysis. Marketing teams can't confidently scale winners. Finance teams question whether paid acquisition is profitable at all. And everyone wastes hours in spreadsheets trying to reconcile numbers that should already match.

The frustrating part? Most ROAS discrepancies stem from five specific, fixable configuration issues. Not vague "attribution is hard" problems, but concrete setup gaps that create systematic reporting drift. This guide walks through each root cause, shows you how to diagnose which one applies to your setup, and provides the exact validation steps to align your reporting within days.

The ROAS Mismatch Problem: Why CFOs and CMOs Lose Trust in Data

ROAS discrepancies destroy decision confidence faster than any other measurement issue. When your ad platform and attribution system report different economics for the same campaign, three things happen immediately.

First, budget allocation stalls. If Meta says a campaign is profitable at 3x ROAS but your MMP shows breakeven at 1.5x, do you scale or pause? Without a clear answer, most teams default to caution, leaving profitable campaigns underfunded whilst unprofitable ones continue burning budget.

Second, cross-functional trust erodes. Performance teams defend Meta's numbers because that's what the algorithm optimises toward. Finance teams trust the MMP because it's supposed to be the "source of truth." Product teams reference their own analytics platform showing yet another ROAS figure. This isn't collaboration, it's attribution warfare.

Third, strategic decisions get made on flawed assumptions. We've seen growth teams reallocate six-figure budgets based on ROAS rankings that were systematically wrong due to attribution window mismatches. One fintech app paused their best-performing creative variant because MMP data showed negative ROAS, only to discover weeks later that conversion events weren't being sent to Meta properly, meaning Meta's algorithm never learned which users converted.

The stakes are particularly high for performance-driven apps spending ₹50 lakh to ₹2 crore monthly on Meta. A 40% ROAS reporting gap on ₹1 crore in spend means ₹40 lakh in budget could be misallocated monthly. Across a quarter, that compounds into genuinely material profit impact.

But here's what most diagnostic guides miss: ROAS discrepancies aren't random. They follow predictable patterns tied to specific configuration gaps. Fix the root cause, and the numbers converge within 48-72 hours.

Root Cause 1: Attribution Window Differences (Meta 7-Day Click vs MMP Custom Windows)

The single most common reason Meta ROAS differs from MMP ROAS is attribution window mismatch. Meta and your MMP are counting conversions from different time periods, creating systematic over-reporting on one side or under-reporting on the other.

Meta's default attribution window is 7-day click and 1-day view. This means if a user clicks your ad on Monday and converts on Wednesday, Meta attributes that conversion. If they click on Monday and convert on the following Tuesday (day 8), Meta does not attribute it. Meanwhile, many MMPs default to different windows: 30-day click, 24-hour click, or fully customisable windows that vary by campaign type.

Here's how this creates ROAS divergence in practice. Imagine you're running a fintech app selling investment products. Your typical user journey involves clicking a Meta ad, downloading the app, exploring for a week, then making their first deposit (your revenue event). With Meta's 7-day click window, roughly 60-70% of conversions get attributed. With your MMP's 30-day click window, 90-95% get attributed to the correct campaign. Meta reports ROAS of 2.1x. Your MMP shows 3.4x for the identical campaign.

The inverse happens with shorter MMP windows. If your MMP uses a 24-hour click window (common in gaming and ecommerce for faster signal), but Meta uses 7-day click, Meta will show higher ROAS because it's capturing conversions your MMP excludes.

How to validate this is your issue:

Pull the same campaign's data from Meta Ads Manager and your MMP for the past 7 days. Note the attribution window each platform is using (this is usually visible in reporting settings or documentation). If Meta is using 7-day click and your MMP is using 30-day click, and your MMP ROAS is consistently 40-80% higher, attribution window mismatch is likely the primary cause.

The fix:

Align attribution windows across platforms. Most MMPs let you configure custom attribution windows per campaign or globally, so set your MMP to match Meta's 7-day click, 1-day view window for campaigns where you need direct comparison. Going the other way is harder: Meta deprecated its 28-day click window, so you generally cannot stretch Meta's reporting out to match a 30-day MMP window. You can, however, view conversions broken out by the windows Meta does support (in Ads Manager under Columns > Compare Attribution Settings) to understand how much of the gap is window-driven, and shorten the MMP window to match Meta's where direct comparison matters most.

The goal isn't to pick the "right" window universally. It's to ensure both platforms count conversions from the same time range so ROAS calculations use comparable numerators and denominators. For most consumer apps, 7-day click provides sufficient signal whilst maintaining reasonable comparison accuracy.
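
To see how window choice alone moves the number, here's a minimal Python sketch that counts the same raw click-to-conversion pairs under a 7-day and a 30-day click window. The event data and field layout are hypothetical placeholders for whatever your MMP export actually provides.

```python
from datetime import datetime, timedelta

# Minimal sketch: the same raw (click, conversion, revenue) pairs counted
# under two different click-attribution windows. All data is illustrative.
touchpoints = [
    # (click_time, conversion_time, revenue_in_inr)
    (datetime(2026, 1, 1, 10), datetime(2026, 1, 3, 9), 1200.0),   # day 2: both windows
    (datetime(2026, 1, 1, 10), datetime(2026, 1, 10, 9), 1500.0),  # day 9: 30-day only
    (datetime(2026, 1, 2, 14), datetime(2026, 1, 25, 8), 900.0),   # day 23: 30-day only
]

def attributed_revenue(pairs, window_days):
    """Sum revenue for conversions that fall inside the click window."""
    window = timedelta(days=window_days)
    return sum(rev for click, conv, rev in pairs if conv - click <= window)

spend = 1000.0  # campaign spend over the same period (hypothetical)
for days in (7, 30):
    roas = attributed_revenue(touchpoints, days) / spend
    print(f"{days}-day click window: ROAS {roas:.2f}x")

# Same raw events, different windows, different ROAS. Aligning the window on
# both platforms is what makes the two numerators comparable.
```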

Root Cause 2: Conversion Event Mapping Errors (Purchase Event Not Sent to Meta)

The second major culprit is conversion event misconfiguration. Your MMP is tracking revenue events correctly, but those events aren't being sent back to Meta via postback, meaning Meta optimises toward a proxy metric (install, signup) instead of actual purchases.

This manifests as Meta showing strong ROAS based on cheap cost-per-install, whilst your MMP shows poor ROAS because those installs aren't converting to paying users. The numbers can diverge by 100-300% when event mapping breaks.

Here's the technical detail that matters: Meta's algorithm learns to find valuable users only when it receives purchase or revenue postbacks. If your MMP is configured to send "install" events to Meta but not "purchase" events, Meta's system treats every install as equally valuable. It optimises toward cheap installs, not quality users who generate revenue. Your MMP, meanwhile, correctly tracks which campaigns drove actual purchases and calculates ROAS based on real revenue.

We've audited setups where marketing teams assumed postbacks were working because they saw conversions in Meta Ads Manager. But those "conversions" were app installs, not purchases. The purchase event existed in the MMP, but the postback configuration never sent it to Meta. The result: Meta reported 4.2x ROAS (based on cost-per-install efficiency) whilst the MMP showed 1.1x ROAS (based on actual revenue per user).

How to validate this is your issue:

Go to your MMP's postback configuration (usually under Integrations > Meta > Postback Settings). Check which events are being sent to Meta. You should see both "install" and your key revenue event (purchase, subscription, first_deposit, etc.) listed as active postbacks. Then verify in Meta Events Manager that purchase events are actually being received. Navigate to Events Manager > Data Sources > App Events > Activity, and filter for purchase events over the past 7 days. If you see installs but no purchase events, your postback mapping is broken.

The fix:

Configure your MMP to send purchase or revenue events to Meta as conversion postbacks. Most modern MMPs (including Linkrunner) provide automated postback setup where you select which in-app events should trigger Meta conversion signals. Ensure your primary monetisation event (purchase, subscription_start, first_transaction) is enabled. Also verify the revenue value is being passed correctly, not just the event name. Meta needs both the event occurrence and the revenue amount to calculate accurate ROAS.
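
For context on what "passing the revenue value" means in practice, here's a rough sketch in the shape of Meta's Conversions API. Your MMP normally sends this postback for you once the event is enabled; the point is the payload shape, where Meta needs the event name plus a value and currency. The IDs, token, and device identifier below are placeholders, and app events may require additional fields (such as app_data) per Meta's current documentation, so treat this as illustrative rather than a drop-in integration.

```python
import json
import time

import requests  # pip install requests

# Illustrative sketch of a server-side Purchase postback to Meta's
# Conversions API. PIXEL_ID, ACCESS_TOKEN and the mobile ad ID are
# placeholders; verify required fields against Meta's current docs.
PIXEL_ID = "YOUR_DATASET_ID"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

event = {
    "event_name": "Purchase",
    "event_time": int(time.time()),
    "action_source": "app",
    "user_data": {
        "madid": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",  # mobile ad ID (placeholder)
    },
    "custom_data": {
        "currency": "INR",  # send the value and currency, not just the event name
        "value": 1499.00,
    },
}

resp = requests.post(
    f"https://graph.facebook.com/v19.0/{PIXEL_ID}/events",
    data={"data": json.dumps([event]), "access_token": ACCESS_TOKEN},
    timeout=10,
)
print(resp.status_code, resp.json())
```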

After enabling purchase postbacks, expect a 48-72 hour learning period where Meta's algorithm adjusts. Your Meta-reported ROAS may initially drop (as it starts counting fewer, higher-value conversions instead of all installs), but within a week it should converge much closer to your MMP's numbers. More importantly, Meta will start optimising toward users who actually generate revenue, improving campaign performance over time.

Root Cause 3: View-Through Attribution Inclusion (Meta Counts, MMP Doesn't by Default)

View-through attribution (VTA) is one of the most underappreciated sources of ROAS discrepancy. Meta includes view-through conversions in its ROAS calculation by default. Many MMPs either don't track view-throughs or exclude them from attribution reports unless explicitly configured.

A view-through conversion happens when a user sees your ad (impression) but doesn't click, then later installs your app or completes a purchase through another route (organic search, direct navigation, different campaign). Meta attributes this conversion to the original ad impression within a 1-day view window. Your MMP, unless specifically tracking impression-based attribution, attributes it to the last click or labels it organic.

For top-of-funnel awareness campaigns with high impression volume and low click-through rates, view-through conversions can represent 20-40% of Meta's total attributed conversions. If your MMP isn't counting these, Meta's ROAS will appear significantly higher.

Here's a realistic scenario: You're running a broad targeting Meta campaign for a D2C app. It generates 2 million impressions, 10,000 clicks, and Meta reports 800 conversions (600 from clicks, 200 from view-throughs). Your MMP, tracking only click-based attribution, reports 580 conversions (some click conversions are lost to organic overlap, discussed in Root Cause 4). Meta calculates ROAS at 2.8x including view-throughs. Your MMP shows 2.0x excluding them. The 40% gap is entirely view-through attribution methodology.

How to validate this is your issue:

Check your MMP's attribution settings for view-through tracking. Most MMPs require impression tracking to be explicitly enabled, often through an additional SDK configuration or server-to-server integration for impression data. If impression tracking isn't enabled, your MMP cannot attribute view-through conversions by definition.

Then compare Meta's conversion breakdown. In Meta Ads Manager, add columns for "1-day view" conversions separately from "7-day click" conversions (Columns > Compare Attribution Settings, which breaks results out by attribution window). If view-through conversions represent more than 15-20% of total attributed conversions, and your MMP doesn't track view-throughs, this is a significant portion of your discrepancy.

The fix:

Enable impression tracking in your MMP if view-through attribution matters for your evaluation. Note that impression-based attribution requires additional SDK configuration in most MMPs to capture ad impression events. This isn't always worth the implementation effort, particularly if your campaigns are heavily click-focused (search, retargeting) where view-throughs are minimal.

Alternatively, adjust your comparison methodology. Pull Meta's ROAS excluding view-through conversions by filtering reports to show only click-based attribution. This provides a cleaner apples-to-apples comparison with click-only MMP data. For most performance campaigns, click-based ROAS is the more actionable metric anyway, as it reflects user intent (they actively clicked) rather than passive exposure.
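
A quick way to make that comparison concrete: take the click and view breakdowns from your Ads Manager export and compute both ROAS figures side by side. The numbers below are illustrative, roughly mirroring the scenario above.

```python
# Minimal sketch: quantify how much of Meta's reported ROAS comes from
# view-through conversions, using the attribution-window breakdown
# exported from Ads Manager. All numbers are illustrative.
spend = 500_000.0                                       # INR
click_conversions, click_revenue = 600, 1_050_000.0     # 7-day click
view_conversions, view_revenue = 200, 350_000.0         # 1-day view

vta_share = view_conversions / (click_conversions + view_conversions)
roas_total = (click_revenue + view_revenue) / spend
roas_click_only = click_revenue / spend

print(f"View-through share of conversions: {vta_share:.0%}")
print(f"Meta ROAS incl. view-through: {roas_total:.2f}x")
print(f"Click-only ROAS (comparable to a click-based MMP): {roas_click_only:.2f}x")
```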

Root Cause 4: Organic Traffic Misattribution and Double-Counting

Organic traffic creates one of the trickiest attribution challenges: the same conversion being counted by Meta and labelled as organic by your MMP, leading to ROAS over-reporting on the Meta side.

This happens because Meta is a self-attributing network: it claims any conversion that falls within its own click or view windows, regardless of the user's final path to install. If a user clicked a Meta ad 6 days ago, then searches for your app by name and installs organically (direct App Store search, no click ID or referrer data), Meta still attributes that install within its 7-day click window. Your MMP, seeing no attribution data, labels it organic. Meta's ROAS includes this conversion; your MMP's paid ROAS does not. The same install is counted once by Meta and zero times in your MMP's paid attribution, creating systematic ROAS inflation on Meta's side.

This is particularly pronounced for apps with strong brand presence or viral loops. If 30-40% of your installs would happen organically regardless of paid acquisition (because users heard about you from friends, press, or content marketing), but a portion of those users previously saw a Meta ad, Meta's attribution claims credit whilst your MMP correctly labels them organic. Your Meta ROAS looks exceptional. Your MMP ROAS reflects the true incremental impact of paid spend.

How to validate this is your issue:

Compare total attributed install volume between Meta and your MMP over a 7-day period. Meta should report total attributed installs (claimed by Meta). Your MMP should report paid attributed installs (with clear campaign source) plus organic installs (no source). If Meta's attributed volume is 20-50% higher than your MMP's paid attributed volume, and your MMP shows significant organic volume, you likely have organic misattribution creating ROAS divergence.

Also check for audience overlap. Apps with high brand search volume, strong PR cycles, or referral programmes will see more organic installs from users who previously saw ads. This isn't a bug, it's a fundamental attribution philosophy difference: Meta uses post-view/post-click windows aggressively; MMPs require deterministic click data for paid attribution.

The fix:

There's no perfect fix for organic overlap because it reflects a genuine methodological difference. However, you can quantify the impact to build a "discrepancy budget" that makes reporting gaps predictable.

Run an incrementality test by pausing Meta campaigns for a control period (7-14 days) whilst tracking organic install volume. If organic installs remain stable or increase during the pause, those installs were genuinely organic and Meta's attribution was over-claiming. If organic installs drop significantly, some of those "organic" users were incrementally driven by Meta ads, even without direct clicks.
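
Evaluating the pause test can be as simple as comparing average daily organic installs across the two periods. A minimal sketch follows, with illustrative numbers and a rough 10% threshold that you should tune to your own day-to-day organic variance.

```python
# Minimal sketch: evaluate a Meta pause test for organic overlap.
# Assumes daily organic-install counts exported from your MMP for a
# baseline period and the pause period; all numbers are illustrative.
baseline_daily_organics = [410, 395, 430, 405, 420, 415, 400]   # campaigns live
pause_daily_organics = [300, 310, 290, 305, 295, 315, 298]      # Meta paused

baseline_avg = sum(baseline_daily_organics) / len(baseline_daily_organics)
pause_avg = sum(pause_daily_organics) / len(pause_daily_organics)

drop = (baseline_avg - pause_avg) / baseline_avg
print(f"Organic installs fell {drop:.0%} during the pause")

# A meaningful drop suggests a share of "organic" installs was actually
# incremental to Meta; a flat line suggests Meta was over-claiming them.
if drop > 0.10:  # hypothetical threshold; tune to your normal daily variance
    print("Part of organic volume looks Meta-incremental.")
else:
    print("Organic volume held steady; Meta attribution likely over-claims.")
```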

For ongoing reporting, maintain two ROAS views: Meta's claimed ROAS (useful for campaign optimisation within Meta's ecosystem) and MMP incrementality-adjusted ROAS (useful for true profit calculation and cross-channel comparison). Accept that these will differ by 15-35% in most consumer apps, with Meta's number running higher. The key is understanding why the gap exists, not eliminating it entirely.

Linkrunner's campaign intelligence dashboard surfaces this overlap by pulling both Meta's native reporting and MMP attributed data into a unified view, highlighting where claimed conversions exceed attributed conversions so teams can assess how much of the gap is organic overlap versus other configuration issues.

Root Cause 5: Currency and Revenue Calculation Differences

Currency handling and revenue calculation methodology create surprisingly large ROAS discrepancies that often get overlooked because they're "just a settings issue." But when your app operates across multiple markets or uses dynamic pricing, currency configuration errors can cause 20-40% ROAS reporting gaps.

The most common issue: Meta reports spend and revenue in the ad account's currency (often USD), whilst your MMP calculates ROAS in INR (or another local currency), and the exchange rate used differs between platforms or fluctuates daily. If you're running campaigns in India spending in INR, but Meta is reporting revenue in USD using a stale exchange rate, the ROAS calculation will be systematically off.

Here's a real example: A mobility app spends ₹10 lakh on Meta campaigns in February and generates ₹28 lakh in attributed revenue that month. Actual ROAS is 2.8x. But the ad account is set to USD, so Meta converts spend and revenue at different exchange rates (spend converted when billed, revenue converted when each event occurs). Meta's dashboard shows a ROAS of 2.4x due to unfavourable conversion timing during a month when the INR weakened against the USD. The MMP, calculating everything in INR with no currency conversion, correctly reports 2.8x. The 14% gap is entirely currency methodology.

Another variant: revenue calculation differences when handling refunds, discounts, or subscription renewals. Meta counts gross revenue (purchase event value as sent). Some MMPs allow you to configure net revenue (purchase value minus refunds, cancellations, discounts applied post-purchase). If your MMP is reporting net revenue and Meta is reporting gross revenue, Meta's ROAS will appear 10-25% higher even with perfect attribution.

How to validate this is your issue:

Check both platforms' currency settings. In Meta Ads Manager, go to Account Settings > Currency. Note whether it's set to USD, INR, or another currency. In your MMP, check revenue reporting settings to see which currency revenue events are logged in and whether currency conversion is applied before ROAS calculation.

Pull revenue totals for a specific campaign over 7 days from both platforms. If the revenue amounts differ by more than 5-10% and attribution windows are already aligned, currency or revenue calculation differences are likely the cause. Also check whether your MMP applies refund adjustments or net revenue calculations that Meta doesn't account for.

The fix:

Standardise currency reporting across platforms. If you operate primarily in India, report in INR on both sides with no automatic currency conversion, so spend and revenue are calculated in the same currency using the same exchange rates (or none at all). One caveat: Meta fixes an ad account's currency when the account is created, so an existing USD account can't simply be switched to INR. In that case, it's usually easier to set your MMP to report in the ad account's currency, or to convert both sides with a shared rate table before comparing.

If you must use multiple currencies due to multi-market operations, ensure both platforms use the same exchange rate source and update frequency. Most MMPs allow you to set custom exchange rates or pull from standardised sources (ECB, RBI) to match Meta's conversion logic.

For revenue calculation methodology, decide whether gross or net revenue is your north star for ROAS, then configure both platforms identically. If you track refunds and cancellations in your MMP, either include them in Meta's revenue events via updated postback values, or exclude refund adjustments from MMP ROAS calculations when comparing against Meta. Consistency in revenue definition matters more than which definition you choose.
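
If you can't change either platform's settings, you can still normalise after export. Here's a minimal sketch that converts Meta's USD revenue with a shared daily rate table and applies one revenue definition (net of refunds) to both sides; all rates and amounts are illustrative placeholders.

```python
# Minimal sketch: recompute ROAS with both platforms normalised to one
# currency and one revenue definition. Rates and amounts are illustrative.
daily_rate_usd_to_inr = {"2026-02-01": 86.2, "2026-02-02": 86.5}  # shared source

meta_revenue_usd = [("2026-02-01", 16_000.0), ("2026-02-02", 16_400.0)]
mmp_revenue_inr = 2_800_000.0          # already in INR, gross
refunds_inr = 140_000.0                # tracked only in the MMP (hypothetical)
spend_inr = 1_000_000.0

# Convert Meta's USD revenue day by day with the SAME rate table the MMP uses.
meta_revenue_inr = sum(amt * daily_rate_usd_to_inr[day] for day, amt in meta_revenue_usd)

# Pick one revenue definition (gross or net) and apply it to both sides.
print(f"Meta ROAS, INR gross: {meta_revenue_inr / spend_inr:.2f}x")
print(f"MMP ROAS, INR gross:  {mmp_revenue_inr / spend_inr:.2f}x")
print(f"MMP ROAS, INR net:    {(mmp_revenue_inr - refunds_inr) / spend_inr:.2f}x")
```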

The Validation Workflow: 15-Minute Check to Identify Which Cause Applies

You now know the five root causes of Meta ROAS discrepancy. Here's a systematic 15-minute validation workflow to diagnose which one (or which combination) is affecting your setup.

Step 1: Pull comparison data (2 minutes)

Export campaign-level data from Meta Ads Manager and your MMP for the past 7 days. Include: campaign name, spend, attributed conversions, attributed revenue, ROAS. Ensure both reports use the same date range and campaign grouping.

Step 2: Check attribution windows (3 minutes)

In Meta Ads Manager, verify attribution window settings (usually 7-day click, 1-day view). In your MMP, check global or campaign-specific attribution window configuration. If they differ, this is your primary cause. Note the specific windows each platform uses.

Step 3: Verify event postback configuration (4 minutes)

Log into your MMP's integrations panel. Navigate to Meta postback settings. Confirm which events are being sent to Meta (should include install + purchase or revenue events). Then check Meta Events Manager to verify purchase events are being received in the past 7 days. If purchase events are missing, this is a critical gap.

Step 4: Compare conversion volume (2 minutes)

Look at total attributed install or conversion volume from Meta versus your MMP. If Meta's volume is 20-50% higher and your MMP shows significant organic volume, organic misattribution is part of your discrepancy. If volumes match closely but ROAS differs, focus on currency and calculation methodology.

Step 5: Check for view-through attribution (2 minutes)

In Meta, add view-through conversion columns (1-day view) to your report. If view-through conversions represent more than 15% of total conversions, and your MMP doesn't track impressions, this is a meaningful contributor. Note the percentage of VTA conversions.

Step 6: Validate currency settings (2 minutes)

Confirm both Meta and MMP are reporting in the same currency (INR, USD, etc.). Pull a specific campaign's revenue total from both platforms. If revenue amounts differ by more than 5-10% despite matching attribution windows, currency or gross/net revenue calculation differences are the cause.

At the end of this 15-minute workflow, you'll have identified which 1-3 root causes are driving your ROAS discrepancy. Most setups have a primary cause (usually attribution windows or event mapping) and 1-2 secondary contributors (organic overlap, currency settings). Fix the primary cause first, validate the improvement, then address secondary issues if the remaining gap justifies the effort.
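
If you run this check weekly, it's worth scripting. The sketch below encodes the six steps' outputs as a simple dictionary and flags which root causes the thresholds above point to; the field names and cut-offs are illustrative heuristics, not hard rules.

```python
# Minimal sketch: turn the 15-minute workflow into a repeatable check.
# Inputs are the numbers pulled in Steps 1-6; all values are illustrative.
checks = {
    "meta_window": "7d_click_1d_view",
    "mmp_window": "30d_click",
    "purchase_postback_active": False,
    "meta_conversions": 800,
    "mmp_paid_conversions": 580,
    "meta_vta_share": 0.25,            # view-through share of Meta conversions
    "meta_revenue": 2_800_000.0,       # INR
    "mmp_revenue": 2_450_000.0,        # INR
}

findings = []
if checks["meta_window"] != checks["mmp_window"]:
    findings.append("Root Cause 1: attribution windows differ")
if not checks["purchase_postback_active"]:
    findings.append("Root Cause 2: purchase postback missing (critical)")
if checks["meta_vta_share"] > 0.15:
    findings.append("Root Cause 3: view-through share above 15%")
volume_gap = checks["meta_conversions"] / checks["mmp_paid_conversions"] - 1
if volume_gap > 0.20:
    findings.append(f"Root Cause 4: Meta claims {volume_gap:.0%} more conversions")
revenue_gap = abs(checks["meta_revenue"] - checks["mmp_revenue"]) / checks["mmp_revenue"]
if revenue_gap > 0.10:
    findings.append(f"Root Cause 5: revenue totals differ by {revenue_gap:.0%}")

print("\n".join(findings) or "No obvious configuration gap; expect 10-20% residual drift.")
```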

Frequently Asked Questions

Why does my Meta ROAS change when I check it days later for the same campaign?

Meta's attribution windows mean conversions can be attributed retroactively. If you check ROAS on Thursday for Monday's campaign, you're seeing conversions that happened Monday-Thursday. If you check the same Monday campaign the following Tuesday, you're now seeing conversions that happened Monday-Monday (7 days later), including delayed conversions within the attribution window. Your MMP typically reports on conversion date, whilst Meta reports on click/impression date, creating temporal differences that look like "changing" numbers.
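
To make the date-basis difference concrete, here's a tiny sketch attributing the same conversions by click date (Meta's basis) versus conversion date (the typical MMP basis); the events are illustrative.

```python
from collections import defaultdict
from datetime import date

# The same three conversions, reported on two different date bases.
# Meta books each conversion back to the click date; most MMPs book it
# to the date the conversion actually happened. Events are illustrative.
events = [
    # (click_date, conversion_date, revenue_inr)
    (date(2026, 1, 5), date(2026, 1, 5), 1000.0),
    (date(2026, 1, 5), date(2026, 1, 8), 1500.0),   # delayed conversion
    (date(2026, 1, 5), date(2026, 1, 11), 900.0),   # near the window edge
]

by_click_date, by_conv_date = defaultdict(float), defaultdict(float)
for click_d, conv_d, rev in events:
    by_click_date[click_d] += rev    # Meta's view: Jan 5 keeps growing
    by_conv_date[conv_d] += rev      # MMP's view: revenue lands day by day

print("Meta basis (click date):", dict(by_click_date))
print("MMP basis (conversion date):", dict(by_conv_date))
# Checking "Jan 5" in Meta on Jan 6 versus Jan 12 gives different totals,
# because late conversions are attributed retroactively to the click date.
```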

Should I trust Meta's ROAS or my MMP's ROAS for scaling decisions?

Use MMP ROAS for strategic scaling decisions (allocating budget between channels, determining overall campaign profitability). Use Meta ROAS for tactical optimisation within Meta (which ad sets to scale, which creatives to test). Your MMP provides cross-channel truth; Meta provides platform-specific signal.

How much ROAS discrepancy is normal versus concerning?

Expect 10-20% discrepancy even with perfect configuration due to organic overlap and view-through attribution methodology differences. Gaps larger than 30% indicate configuration issues (attribution windows, event mapping, currency) that should be investigated and fixed. Gaps exceeding 50% almost always point to broken event postbacks or fundamentally misaligned attribution windows.

Can I make Meta and MMP ROAS match exactly?

Exact matching is unlikely and unnecessary. The goal is predictable, explainable discrepancy where you know why numbers differ and can reconcile them for decision-making. Attempting perfect alignment often means over-configuring systems in ways that reduce their individual utility (Meta's optimisation signal quality, MMP's cross-channel view).

What if my MMP doesn't support view-through attribution?

This is common and acceptable for most performance campaigns. Simply acknowledge that Meta's ROAS will run 10-25% higher due to view-through inclusion, and use click-only ROAS as your comparison basis. You can request Meta reporting filtered to click-only conversions for cleaner comparison without changing your MMP setup.

How often should I reconcile Meta vs MMP data?

Weekly reconciliation is sufficient for most teams. Set a recurring Monday morning process: pull prior week's ROAS from both platforms, validate the gap hasn't widened unexpectedly, document any new discrepancies in your tracker. Monthly deep-dives can investigate specific campaigns with unusual gaps, but weekly spot-checks catch configuration drift early.

Taking Action: Build Measurement Confidence This Week

ROAS discrepancies aren't mysterious forces. They're specific, diagnosable configuration gaps that create systematic reporting drift. The difference between teams that struggle with conflicting data and teams that make confident budget decisions isn't access to better tools, it's understanding which levers create alignment and knowing how to validate that alignment holds over time.

Start with the 15-minute validation workflow outlined above. Identify your primary root cause (attribution windows, event mapping, view-through methodology, organic overlap, or currency settings). Fix that one issue and measure the impact over 7 days. Then address secondary causes if remaining discrepancies justify the effort.

For most mobile apps spending ₹50 lakh or more monthly on Meta, eliminating ROAS confusion unlocks immediate value: faster scaling of winners, clearer creative insights, stronger finance partnerships, and ultimately better unit economics as budgets flow to genuinely profitable campaigns instead of ones that merely look profitable in one platform's reporting.

If your team needs measurement infrastructure that makes ROAS reconciliation automatic rather than manual, request a demo from Linkrunner. Linkrunner's campaign intelligence pulls Meta's native reporting alongside cross-channel MMP attribution, highlights discrepancies automatically, and gives you unified ROAS visibility from creative level to channel level without spreadsheet dependency. Implementation takes days, not weeks, and pricing starts at ₹0.80 per install with no seat limits or hidden export fees.

The choice isn't between trusting Meta or trusting your MMP. It's between understanding why they differ and operating blind. Build that understanding this week, and next Monday's growth meeting starts with decisions instead of reconciliation debates.
