
2026-04-21 / 11 MIN READ

The Meta attribution audit that found 31% more conversions

A reconstructed Q&A with a DTC operator who ran a Meta attribution audit and recovered 31% of conversions their Pixel never reported. What we found, and why.

An operator asked me last year why they should pay for a Meta attribution audit when they already had CAPI running. I told them to check their match quality score first.

It was 5.8.

What follows is a reconstructed version of the conversation that came after. R. is a DTC founder I've known for two years. She had a functional Shopify store, a healthy ad budget, and CAPI enabled through Shopify's native integration. Everything looked fine from the outside.

  • Match quality before: 5.8 / 10
  • Match quality after: 9.1 / 10
  • Conversions recovered: 31% in 30 days

Six months of conversion visibility (Oct–Apr): browser Pixel only versus actual conversions. Match quality moved from 5.8 to 9.1 after full CAPI implementation.

What a Meta attribution audit actually looks at

R.: "I'm confused about what an attribution audit even is. We have CAPI on. Our agency confirmed it's sending events. What else is there to check?"

Me: The presence of CAPI is not the same as CAPI working well. There are four things an audit actually looks at: match quality score, event_id deduplication, external_id hashing on authenticated sessions, and upper-funnel event coverage. Shopify's native integration handles some of those by default, and ignores others entirely.

Match quality is the one to start with because it summarizes how accurately Meta can match your server events to real user profiles. A score of 5.8 means roughly half your events are matching to the right person. A score of 9+ means almost all of them are.

The difference is not cosmetic. When an event doesn't match, Meta can't attribute the conversion. It doesn't count. From your Ads Manager perspective, that sale never happened.

R.: "So we had CAPI running and still had events that weren't matching?"

Me: Correct. CAPI being on is a prerequisite, not a guarantee. The quality of what you're sending matters as much as whether you're sending anything.

Where the 31% gap was hiding

The audit found three things contributing to the gap. They're common and they compound.

Upper-funnel events were missing hashed email. R.'s store was sending ViewContent and AddToCart events from the browser Pixel, but the server-side container wasn't enriching those events with hashed email from the session data. Meta needs personally identifiable data - hashed, but present - to match an event to a profile. Without it, upper-funnel events were landing with a match quality near zero.

Me: The browser Pixel fires ViewContent when someone looks at a product page. If they're logged in, you have their email in the session. The server container should be pulling that and hashing it before sending to Meta. Yours wasn't.

R.: "Our Shopify integration handles that automatically though, doesn't it?"

Me: For Purchase events, yes. Shopify's native CAPI connector sends hashed customer data with the order. But it doesn't go back upstream and enrich ViewContent or AddToCart with session data. Those events fire without the matching signals that make them useful for attribution.
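The enrichment step the native connector skips can be sketched like this. This is an illustrative Python sketch, not GTM sandbox code; the `event` and `session` dict shapes are hypothetical stand-ins, but the normalization (trim, lowercase, then SHA-256) and the `em` field name follow Meta's CAPI customer-information conventions.

```python
import hashlib

def normalize_and_hash_email(raw_email):
    """Meta expects email trimmed and lowercased before SHA-256 hashing."""
    normalized = raw_email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def enrich_upper_funnel_event(event, session):
    """Add hashed email from the logged-in session to a ViewContent/AddToCart event.

    `event` and `session` are plain dicts standing in for the server
    container's event and session objects (hypothetical shapes).
    """
    email = session.get("customer_email")
    if email:
        user_data = event.setdefault("user_data", {})
        user_data["em"] = [normalize_and_hash_email(email)]
    return event

event = {"event_name": "ViewContent", "user_data": {}}
session = {"customer_email": "  Jane.Doe@Example.com "}
enriched = enrich_upper_funnel_event(event, session)
# The hash of "jane.doe@example.com" now rides along as user_data.em,
# giving Meta a matching signal the bare browser Pixel event lacked.
```

The point of the normalization step is that two different captures of the same address ("Jane@x.com" at checkout, " jane@x.com " in the session) must hash to the same value, or they won't match to the same profile.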

The second issue was event_id mismatch. When both the browser Pixel and the server container fire the same event, Meta needs an identical event_id on both to know they're duplicates. Without that deduplication signal, Meta counts the same conversion twice - or in some configurations, rejects one of the pair entirely.
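Deduplication hinges on both sides emitting an identical event_id, so a common approach is to derive it deterministically from something both the browser and the server can see, such as the order number. A minimal sketch of that idea, with a toy simulation of the pairing (the id format is a hypothetical convention, not a Meta requirement):

```python
def shared_event_id(order_id, event_name):
    # Both the browser Pixel and the server container derive the same id
    # from the same inputs, so Meta can pair the two events as duplicates.
    return f"{order_id}:{event_name.lower()}"

def deduplicate(events):
    """Simulate Meta's pairing: keep one event per (event_name, event_id)."""
    seen = set()
    kept = []
    for ev in events:
        key = (ev["event_name"], ev["event_id"])
        if key not in seen:
            seen.add(key)
            kept.append(ev)
    return kept

browser = {"event_name": "Purchase", "event_id": shared_event_id(1042, "Purchase"), "source": "browser"}
server = {"event_name": "Purchase", "event_id": shared_event_id(1042, "Purchase"), "source": "server"}
print(len(deduplicate([browser, server])))  # 1 — counted once, not twice
```

If the two sides generate ids independently (say, a random UUID each), the keys never collide and the same purchase survives deduplication twice.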

The third: no external_id on authenticated sessions. External ID is your own customer identifier, hashed. It's the strongest matching signal you can send and the easiest to implement once someone is logged in. R.'s implementation wasn't sending it at all.
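Putting the signals together, the user_data block for an authenticated session might look like the sketch below. The `external_id` and `em` keys match Meta's CAPI parameter names; the `customer` dict shape and the helper are hypothetical.

```python
import hashlib

def sha256_normalized(value):
    """Trim, lowercase, and SHA-256 hash a value, per Meta's hashing conventions."""
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

def build_user_data(customer):
    """Assemble matching signals for a logged-in customer.

    external_id is your own customer identifier, hashed — per the audit,
    the strongest signal and the one R.'s setup wasn't sending at all.
    """
    return {
        "external_id": [sha256_normalized(str(customer["id"]))],
        "em": [sha256_normalized(customer["email"])],
    }

user_data = build_user_data({"id": 77, "email": "Jane@Example.com"})
```

Because external_id is an identifier you already hold for every logged-in session, it costs almost nothing to send once the server container has access to the session.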

R.: "So we were basically sending half-complete data and Meta was doing its best with what it had."

Me: That's a fair way to put it. And "its best" in that case was attributing about 69% of what actually happened.

What the numbers looked like before and after

Before the fix: match quality score of 5.8. Ads Manager was showing roughly 69% of actual purchase conversions. Upper-funnel attribution was close to useless for optimization.

After: match quality moved to 9.1. Ads Manager showed 31% more attributable conversions in the first 30 days post-implementation. ROAS calculations shifted accordingly.

R.: "So our ROAS was wrong?"

Me: It was pessimistic. Which sounds like a minor problem until you realize that your agency was making bid and budget decisions based on ROAS numbers that were 31% lower than reality. If a campaign looks like it's breaking even, and it's actually returning 1.31x, those are different decisions.

R.: "We almost shut off a campaign last month that looked like it wasn't working."


The fix wasn't complicated. It took about 48 hours from diagnosis to production on a standard Shopify + GTM server container setup: external_id added to all authenticated sessions, hashed email pulled into upper-funnel events, and event_id generation shared across the browser and server containers for proper deduplication. You can read the full details on how Shopify Pixel and CAPI deduplication actually works if you want to understand the event_id mechanics specifically.
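The three fixes converge in the event payload the server container ultimately sends. Here is a hedged Python sketch of a fully enriched Purchase event on the wire; the dict shapes and the event_id convention are illustrative, and the Graph API version and pixel id are placeholders, so consult Meta's CAPI documentation for the current endpoint details.

```python
import hashlib
import json
import time

def sha256_normalized(value):
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

def build_capi_event(order):
    """One Purchase event carrying all three matching signals at once."""
    return {
        "event_name": "Purchase",
        "event_time": int(time.time()),
        "event_id": f"{order['id']}:purchase",  # same id the browser Pixel emits
        "action_source": "website",
        "user_data": {
            "external_id": [sha256_normalized(str(order["customer_id"]))],
            "em": [sha256_normalized(order["email"])],
        },
        "custom_data": {"currency": "USD", "value": order["total"]},
    }

payload = {"data": [build_capi_event(
    {"id": 1042, "customer_id": 77, "email": "jane@example.com", "total": 89.00}
)]}
# This payload is POSTed to https://graph.facebook.com/v<N>/<PIXEL_ID>/events
# with an access token; version and pixel id are placeholders here.
print(json.dumps(payload, indent=2))
```

Compare this with what the gap looked like before: the same event with an empty user_data block and a random event_id is technically valid and shows up green in Events Manager, but matches poorly and can't be deduplicated.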

Why Shopify's native CAPI integration isn't enough

The native connector does one thing well: it sends Purchase events with good customer data after an order completes. That's useful. It's also a narrow slice of the conversion path that Meta's algorithm wants to optimize against.

Meta's current recommendation is to send at least four event types: ViewContent, AddToCart, InitiateCheckout, and Purchase. Each one tells the algorithm something about who is moving through your funnel. Purchase-only data is accurate but sparse. It teaches the algorithm about your buyers after the fact, but gives it very little signal for optimizing who to show ads to before the purchase decision.

R.: "So we were paying for optimization that only had access to a quarter of the data it needed."

Me: The algorithm had full purchase data and almost no upper-funnel signal. It was optimizing a journey it could only see the end of.

Beyond event coverage, the native integration has two structural gaps: no external_id on authenticated sessions, and no shared event_id generation with the browser Pixel. Both of those require a custom server container implementation to fix. Shopify won't patch them because they're not technically broken from Shopify's perspective - the Pixel still fires, the server still sends, the events still show up in Events Manager. The match quality gap is downstream.

This is why a proper CAPI audit looks at match quality score before anything else. Events Manager being green means events are arriving. It doesn't mean they're matching.

The question operators ask most after audits like this

R.: "Was the revenue always there? Like, were we actually making those sales and just not seeing them?"

Me: Yes. The sales happened. The revenue hit your bank account. The tracking wasn't capturing it for Meta, so the ad platform couldn't take credit for it. From Meta's perspective, those conversions didn't exist. From yours, they absolutely did.

That framing matters because it means the fix is not about generating new revenue. It's about making your current revenue visible to the platform that's supposed to be driving it. When the platform can see what it's actually doing, it can optimize more accurately, which over time tends to improve the return itself, not just the reporting of it.

R.: "And how do we make sure this doesn't drift back? The match quality score was 5.8 for a while before anyone noticed."

Me: That's the right question, and most teams don't ask it. Match quality can degrade quietly - a library update changes how events fire, a Shopify app conflicts with your GTM container, a new checkout flow skips a hashing step. None of those trigger an alert.

The answer is a scheduled audit cadence. Not weekly, but at least quarterly. Check match quality score, spot-check event_id deduplication in Events Manager's Test Events tool, and verify that external_id is still appearing on authenticated sessions. Fifteen minutes every three months catches most of the drift before it compounds.
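Part of that quarterly spot-check can be scripted. A sketch that takes a sample of recent server-side events and reports signal coverage; the event dict shape is hypothetical, and in practice the sample would come from your GTM preview logs or Events Manager's Test Events output rather than a hardcoded list:

```python
def audit_events(events):
    """Report what share of sampled events carry each matching signal."""
    total = len(events)
    with_external_id = sum(1 for e in events if e.get("user_data", {}).get("external_id"))
    with_em = sum(1 for e in events if e.get("user_data", {}).get("em"))
    with_event_id = sum(1 for e in events if e.get("event_id"))
    return {
        "external_id_pct": round(100 * with_external_id / total),
        "em_pct": round(100 * with_em / total),
        "event_id_pct": round(100 * with_event_id / total),
    }

sample = [
    {"event_name": "Purchase", "event_id": "1:purchase",
     "user_data": {"em": ["<hash>"], "external_id": ["<hash>"]}},
    {"event_name": "ViewContent", "user_data": {}},  # missing everything — the drift you're looking for
]
print(audit_events(sample))  # {'external_id_pct': 50, 'em_pct': 50, 'event_id_pct': 50}
```

Any percentage that drops between quarters is the quiet degradation described above, caught before it compounds into another 5.8.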

What this taught me about how DTC brands interpret ad data

R.'s situation is common. Most DTC operators spending $5K or more a month on Meta ads have some version of a tracking gap. The specific causes vary, but the pattern is consistent: CAPI appears to be running, match quality is mediocre, and the discrepancy between actual revenue and attributed conversions is invisible unless someone goes looking.

If you've ever turned off a campaign that looked like it wasn't working, it's worth checking the match quality score from that period. You might find that it wasn't the campaign.

That's the reason I put this audit methodology at the center of the CAPI Stack Audit product: match quality first, event coverage second, deduplication third, and external_id last - in that order, because that's the order in which problems tend to compound. You can also see the same approach applied across the broader product suite if you want to understand how it fits into a full tracking and attribution review.

FAQ

What is a Meta attribution audit and what does it check?

A Meta attribution audit looks at how accurately your CAPI implementation is matching server-side events to Meta user profiles. It covers four areas: match quality score, event_id deduplication between browser Pixel and server container, external_id hashing on authenticated sessions, and whether upper-funnel events (ViewContent, AddToCart, InitiateCheckout) are being sent with the customer signals Meta needs to match them.

What's a good match quality score in Meta Events Manager?

Meta's own threshold for "good" is 6.0 or above. In practice, a score below 8 usually indicates a fixable gap. A score of 9 or above is achievable for most Shopify stores with a properly configured server-side container and external_id implementation. The case this article is based on moved from 5.8 to 9.1 after implementation.

Does Shopify's native CAPI integration cover all of this?

Not fully. The native Shopify CAPI connector handles Purchase events with customer data reasonably well. It doesn't send external_id on authenticated sessions, doesn't share event_id generation with the browser Pixel for deduplication, and doesn't enrich upper-funnel events (ViewContent, AddToCart) with hashed customer data. All three gaps require a custom GTM server container implementation to address. A breakdown of the most common CAPI failure modes covers these patterns in more detail if you want the full landscape.

How do I know if I have a tracking gap right now?

Check your match quality score in Meta Events Manager under "Overview." If it's below 8, you likely have a gap. If it's below 6, the gap is significant. You can also compare your server-side event count against your browser Pixel event count in Events Manager's deduplication report - a large discrepancy in either direction suggests a dedup problem.

How long does it take to fix?

For a standard Shopify store using GTM, the full implementation - external_id, hashed email on upper-funnel events, and event_id deduplication across browser and server containers - typically runs 24-48 hours from diagnosis to production. The audit itself (understanding what's broken and why) usually takes a few hours. The time-to-fix depends on how complex your current GTM setup is.

Will fixing this change my ROAS?

It will make your ROAS accurate. If you had a tracking gap, your previous ROAS was lower than your actual return because conversions were going unattributed. After the fix, the number reflects what's actually happening. Whether that's higher or lower than your benchmark depends on how large the gap was. The operator in this article found that campaigns she had considered turning off were outperforming her benchmarks once attribution was accurate.

Sources and specifics

  • Match quality score before implementation (5.8) and after (9.1) are from a real Q2 2024 Shopify DTC engagement on file. Client is NDA, anonymized as "R." throughout.
  • The 31% figure represents the increase in Meta-attributed conversions in the first 30 days after full CAPI implementation. The underlying case study documents a 30-40% recovery band across multiple measurement approaches. 31% is the representative number from 30-day post-implementation Ads Manager data.
  • "~35% more conversions tracked" and "Accurate ROAS for the first time in 18 months" are from the case study record. The 31% figure is the 30-day attribution delta from Ads Manager specifically; the broader 35% figure includes GA4 and direct tracking comparisons.
  • event_id deduplication is required by Meta's CAPI documentation to prevent double-counting when both browser and server events fire for the same user action.
  • Fixing external_id alone typically moves match quality 1-2 points; adding hashed email to upper-funnel events typically adds another 1-2 points, per observed patterns across multiple implementations.

// related

DTC Stack Audit

If this resonated, the audit covers your tracking layer end-to-end. Server-side CAPI, dedup logic, and attribution gaps - all mapped to your stack.

→ See what is covered