

Attribution and Data Quality

How Catalyst verifies conversion data in three layers: tracking validation, source attribution, and attribution intelligence.


Ad platforms tell you how many conversions they drove. Your store tells you how many orders actually came in. A Catalyst Audit℠ checks whether those numbers agree, determines which platform most likely drove each sale, and surfaces the gaps that platform dashboards quietly paper over.

The Short Version

  • Catalyst verifies conversion counts in three layers, each answering a different question about your ad data.
  • Layer 1 — Tracking Validation: Are the ad platforms counting purchases correctly? The audit compares platform-reported conversions against your actual order records.
  • Layer 2 — Source Attribution: Which ad platform actually drove each sale? When someone buys, the audit traces the click back to its source instead of trusting the platform’s self-reported credit.
  • Layer 3 — Attribution Intelligence: What are the ad platforms not telling you? Platforms blend paid and organic results, count non-purchase events as conversions, or use fractional credit models that don’t match real revenue. Layer 3 uncovers these discrepancies.
  • If tracking accuracy falls below 80%, Catalyst pauses the performance analysis entirely. Recommendations built on unreliable conversion data do more harm than good.

The Three Layers

Layer | Name                      | The Question It Answers
L1    | Tracking Validation       | “Can I trust the numbers?”
L2    | Source Attribution        | “Who actually gets the credit?”
L3    | Attribution Intelligence  | “What is the platform hiding from me?”

Each layer builds on the one before it. L1 is the foundation — if the conversion counts don’t match, nothing else matters. L2 assigns credit. L3 reveals what the platforms leave out.

Layer 1 — Tracking Validation

Every audit starts here. Catalyst compares the number of conversions your ad platform reports against the number of orders your store actually processed during the same period. If Google Ads says it drove 50 purchases and your order records confirm 48, tracking is healthy. If the platform says 50 but your records show 30, something is broken.

This is a pass/fail gate with three outcomes:

Status      | Accuracy      | What Happens
Validated   | 80% or higher | Tracking is reliable. The audit proceeds with full analysis.
Warning     | 60–80%        | Tracking has gaps. The audit flags the issue and adjusts its confidence.
Unvalidated | Below 60%     | Tracking is unreliable. The audit focuses on diagnosing the tracking problem before making any performance recommendations.
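
As a rough sketch of how this gate might work, the ratio of the smaller count to the larger maps onto the three statuses in the table above. The accuracy formula and names below are illustrative assumptions, not Catalyst's implementation.

```python
# Minimal sketch of the Layer 1 gate (illustrative names, not Catalyst's API).
def tracking_accuracy(platform_conversions: int, store_orders: int) -> float:
    """Ratio of the smaller count to the larger, so over- and under-counting
    both reduce accuracy. This formula is an assumption for illustration."""
    if max(platform_conversions, store_orders) == 0:
        return 1.0  # nothing to compare
    return min(platform_conversions, store_orders) / max(platform_conversions, store_orders)

def tracking_status(accuracy: float) -> str:
    if accuracy >= 0.80:
        return "Validated"    # proceed with full analysis
    if accuracy >= 0.60:
        return "Warning"      # flag gaps, lower confidence
    return "Unvalidated"      # diagnose tracking before recommending anything

# Google Ads reports 50 purchases, the store processed 48 orders -> ~0.96, "Validated"
print(tracking_status(tracking_accuracy(50, 48)))
# Platform says 50 but records show 30 -> exactly 0.60, the Warning floor
print(tracking_status(tracking_accuracy(50, 30)))
```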

L1 is a prerequisite, not a metric to celebrate. Passing it means “your tracking works.” It doesn’t mean “your attribution is perfect” — that’s what Layers 2 and 3 address.

Layer 2 — Source Attribution

Once the counts are validated, the next question is: which platform and campaign actually drove each order?

Ad platforms assign themselves credit for conversions. That’s their job — they’re incentivized to show results. But when you run ads on multiple platforms, each one may claim credit for the same sale. A customer who clicks a Google ad on Monday and a Microsoft ad on Wednesday before buying on Thursday might show up as a conversion in both dashboards.

Layer 2 uses your actual order data — specifically the tracking parameters attached to each click — to determine which platform delivered the final click before the purchase. Instead of trusting each platform’s self-reported number, the audit traces each order back to its source.
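
A minimal sketch of last-click source attribution, assuming each order carries the UTM parameters from its final click. The field names and structure are illustrative, not Catalyst's schema.

```python
from collections import Counter

# Hypothetical order records with the UTM parameters captured at checkout.
orders = [
    {"order_id": "1001", "utm_source": "google",    "utm_medium": "cpc"},
    {"order_id": "1002", "utm_source": "microsoft", "utm_medium": "cpc"},
    {"order_id": "1003", "utm_source": None,        "utm_medium": None},  # no tracking trail
]

def last_click_source(order: dict) -> str:
    """Credit the platform behind the final tracked click, if any."""
    return order["utm_source"] or "unattributed"

verified_credit = Counter(last_click_source(o) for o in orders)
print(dict(verified_credit))  # counts per source: google 1, microsoft 1, unattributed 1
```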

The result is a ground truth comparison: what each platform claims versus what your order data confirms. This is where you find out whether your real cost per acquisition and return on ad spend match the numbers in the dashboard.
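
For instance, recomputing cost per acquisition and return on ad spend from verified rather than claimed conversions can shift the picture considerably. The figures below are invented for illustration.

```python
# Hypothetical figures: what the platform claims vs. what order data confirms.
spend = 1_000.00
claimed_conversions, verified_conversions = 50, 38
claimed_revenue,     verified_revenue     = 5_000.00, 3_800.00

print(f"Claimed  CPA {spend / claimed_conversions:.2f}  ROAS {claimed_revenue / spend:.2f}")
print(f"Verified CPA {spend / verified_conversions:.2f}  ROAS {verified_revenue / spend:.2f}")
# Claimed  CPA 20.00  ROAS 5.00
# Verified CPA 26.32  ROAS 3.80
```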

Layer 3 — Attribution Intelligence

This is where Catalyst earns its keep. Layer 3 surfaces the insights that platforms don't display prominently in their own dashboards.

Paid vs. organic blending. Some platforms count both paid ad clicks and free organic clicks as ad-driven conversions when the attribution window overlaps. Your dashboard shows a healthy conversion count, but a portion of those sales would have happened without any ad spend. Layer 3 separates the two so you know what your ads actually produced.
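
A simplified sketch of that separation, assuming the medium of the converting click (for example a utm_medium value) is recorded on each order; the names and values are illustrative.

```python
# Hypothetical converting clicks within the same attribution window.
clicks = [
    {"order_id": "1001", "utm_medium": "cpc"},      # paid ad click
    {"order_id": "1002", "utm_medium": "organic"},  # free organic click
]

paid    = [c for c in clicks if c["utm_medium"] == "cpc"]
organic = [c for c in clicks if c["utm_medium"] != "cpc"]
print(len(paid), "ad-driven,", len(organic), "organic")
```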

Conversion goal inflation. Not all “conversions” are purchases. Platforms can be configured to count page views, add-to-cart events, or newsletter signups as conversions alongside actual sales. If non-purchase goals are included, your reported conversion count may overstate actual sales volume by several times. Layer 3 identifies when this is happening and isolates the purchase-only numbers.
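
As a hypothetical illustration of how much non-purchase goals can inflate the headline number (the conversion actions and counts below are made up):

```python
# Hypothetical platform conversion rows, broken out by conversion action.
conversions = [
    {"action": "purchase",    "count": 120},
    {"action": "add_to_cart", "count": 340},
    {"action": "page_view",   "count": 2100},
    {"action": "newsletter",  "count": 95},
]

total     = sum(c["count"] for c in conversions)
purchases = sum(c["count"] for c in conversions if c["action"] == "purchase")
print(f"reported {total}, actual purchases {purchases} ({total / purchases:.1f}x inflation)")
# reported 2655, actual purchases 120 (22.1x inflation)
```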

Revenue model discrepancies. Some platforms use fractional credit models that distribute revenue across multiple touchpoints, producing a reported revenue figure that doesn’t match what your store actually collected. Layer 3 compares platform-reported revenue against your order records and quantifies the gap.
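
A toy example of quantifying that gap, with invented figures:

```python
# Hypothetical totals for the same date range.
platform_reported_revenue = 48_500.00   # fractional-credit model across touchpoints
store_collected_revenue   = 41_200.00   # what the order records actually show

gap = platform_reported_revenue - store_collected_revenue
print(f"over-reported by {gap:.2f} ({gap / store_collected_revenue:.1%})")
# over-reported by 7300.00 (17.7%)
```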

Cross-platform deduplication. When you advertise on multiple platforms, the same order can appear as a conversion in more than one dashboard. Layer 3 checks for overlap so your total conversion count reflects actual unique orders.
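
A minimal sketch of deduplicating by order ID across platforms, with hypothetical claims:

```python
# Hypothetical order IDs each platform claims as its own conversions.
claims = {
    "google":    {"1001", "1002", "1003"},
    "microsoft": {"1002", "1004"},
}

total_claimed = sum(len(ids) for ids in claims.values())  # 5 claimed conversions
unique_orders = set().union(*claims.values())             # 4 real orders
print(f"{total_claimed} claimed conversions, {len(unique_orders)} unique orders")
```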

How Order Data Gets In

Attribution verification depends on one thing: access to your actual order records. Catalyst supports two ways to connect this data.

Direct integration. If your e-commerce platform supports a direct connection, Trellis can pull order data automatically. This is the simplest setup — orders flow in on a recurring basis with no manual work.

CSV upload. For stores without a direct integration, or as supplementary ground truth, you can upload order data as a CSV file. The upload template captures the fields needed for attribution: order IDs, dates, revenue, product details, and the UTM tracking parameters attached to each sale. Uploaded data is processed immediately and used in your next audit.
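
A hypothetical upload might look like the snippet below; the exact column names in the Trellis template may differ.

```python
import csv
import io

# Hypothetical CSV upload; the actual Trellis template column names may differ.
sample = """order_id,order_date,revenue,product,utm_source,utm_medium,utm_campaign
1001,2024-05-03,89.00,Trail Shoes,google,cpc,spring_sale
1002,2024-05-04,42.50,Water Bottle,microsoft,cpc,brand_core
1003,2024-05-04,129.99,Rain Jacket,,,
"""

rows = list(csv.DictReader(io.StringIO(sample)))
print(rows[0]["utm_source"], rows[0]["revenue"])  # google 89.00
```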

Both methods produce the same attribution analysis. The audit doesn’t care how the data arrived — it cares that the data is there.

When Attribution Falls Short

Attribution accuracy depends on proper tracking configuration. If UTM parameters are missing, malformed, or stripped during checkout, the audit can’t trace orders back to their source. Common causes of low accuracy:

  • Tracking templates not set up. No UTM parameters means no attribution trail. The audit will flag this as a tracking issue, not a performance issue.
  • Checkout flow stripping parameters. Some checkout configurations drop URL parameters between the landing page and the order confirmation. The click happened, but the tracking evidence is lost.
  • Attribution window mismatches. Platforms and stores may use different lookback windows for crediting conversions. A 30-day window on the platform and a 7-day window on the store will produce different counts by design.
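
To make the last point concrete, the same three conversions can be credited very differently under a 30-day and a 7-day lookback; the dates below are invented.

```python
from datetime import date

# Hypothetical conversions: (click_date, purchase_date) pairs.
conversions = [
    (date(2024, 5, 1),  date(2024, 5, 3)),   # 2 days click-to-purchase
    (date(2024, 5, 1),  date(2024, 5, 10)),  # 9 days
    (date(2024, 4, 10), date(2024, 5, 5)),   # 25 days
]

def credited(window_days: int) -> int:
    """Count conversions whose click falls inside the lookback window."""
    return sum((buy - click).days <= window_days for click, buy in conversions)

print(credited(30), "credited with a 30-day window")  # 3
print(credited(7),  "credited with a 7-day window")   # 1
```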

When tracking accuracy drops below 80%, Catalyst prioritizes the tracking diagnosis over campaign analysis. The reasoning is straightforward: if you can’t trust the conversion data, you can’t trust any recommendation built on it.

What Catalyst Doesn’t Do

  • Catalyst does not configure tracking in your ad platforms. It verifies that your existing tracking is producing accurate data. Setup changes happen in the platform UI.
  • Catalyst does not modify attribution windows, conversion goals, or tracking templates. It tells you when these settings are causing data quality issues.
  • Catalyst does not guarantee 100% attribution accuracy. Some order types (phone orders, in-store purchases driven by online ads, repeat customers with cleared cookies) fall outside what click-based attribution can capture. The audit accounts for this by using accuracy thresholds rather than expecting a perfect match.

What’s Next