The Future of Marketing Measurement


Chloe from RescueMetrics sat down with Ground Control’s Phil Zohrab and his former Dentsu colleague from the US, Alex Langshur (now SVP of RescueMetrics), to riff about the future of marketing measurement.

Alex is the founder and former Co-CEO of Cardinal Path, a Merkle company, and once led Dentsu’s global Google Practice. He met Phil while in Australia delivering Google Analytics training for local companies. Alex now leads the North American sales division for RescueMetrics.

Watch the video or read the summary below.

During the call they reflect on measurement models of the past and the limitations of Marketing Mix Modelling (MMM). Fast-forwarding to digital attribution’s single-touch and multi-touch models, they turn to the state of signal loss today.

Combine cookie deprecation with the rapid uptake of ad blockers and aggressive new browsers, and you get the perfect storm for discrepant analytics. With consented data facing the same erosion, what does this mean for the models and algorithms that digital marketers trust with their budgets? Put frankly, it’s bleak. But Phil and Alex were enthusiastic about the new evolutions spurred by generative AI and consented data protection.

Alex captures the genesis of measurement models’ undoing perfectly: we have been blessed with an explosion of new media opportunities, meaning the data sources and channels competing for media budget have become enormously broad.

We once relied on MMM and econometrics for budget allocation decisions, until the models could no longer reflect the degree of variation possible in a customer journey.

Multi-touch attribution stepped up to give marketers visibility over the impact of various touchpoints in the journey. However, as Phil points out, this model fails to consider offline and brand marketing impacts. Without these elements, models end up with misleading, over-inflated reporting on direct traffic, which can bias predictive insights.

What’s more, multi-touch models are notoriously reliant on cookie data to map and analyse the customer journey. Shortening cookie windows now limit how many touchpoints the models can attribute to one user, so returning visitors are reported as new users, breaking attribution and journey insights. Algorithms starved of robust, comprehensive data fail to credit the impact that higher-funnel activity has on driving conversions.
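The fragmentation effect of a shortened cookie window can be sketched in a few lines. This is a minimal illustration with made-up numbers (the visit days and cookie lifetimes are hypothetical, not from the interview): one real customer, whose browser discards the analytics cookie after a fixed number of days, is counted as several "new users" once their visit gaps exceed the cookie lifetime.

```python
def count_reported_users(visit_days, cookie_ttl_days):
    """Count how many 'new users' analytics would report for ONE real
    person, given the days they visited and a cookie lifetime."""
    reported_users = 0
    cookie_expires = None
    for day in visit_days:
        if cookie_expires is None or day > cookie_expires:
            reported_users += 1  # cookie gone: the visitor looks brand new
        cookie_expires = day + cookie_ttl_days  # cookie refreshed on each visit
    return reported_users

# One real customer visiting on days 0, 10, 25 and 40 of their journey:
visits = [0, 10, 25, 40]
print(count_reported_users(visits, cookie_ttl_days=30))  # 1 user: journey intact
print(count_reported_users(visits, cookie_ttl_days=7))   # 4 "users": journey fragmented
```

With a 30-day cookie the journey is stitched into one user; with a 7-day cookie the same person fragments into four, and every touchpoint before the last fragment loses its attribution credit.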

The necessity for higher-quality input signal has never been greater.

Here’s where Alex completes the picture on how today’s models and other algorithms are wasting media budget and neglecting key audience profiles during targeting and optimisation.

The problem is signal loss. Attribution models and the broader AdTech stack need a quality data signal to deliver value. If the data set is skewed in any way, then over-indexing is a given.

The guys reflect on Google’s migration from Universal Analytics (UA) to GA4 as evidence of the scale of impact that quality signal mass has on effective modelling. The GA4 migration pains have been felt globally, yet Google still risked industry uproar in the pursuit of better conversion modelling.

So why is quality data so elusive and why are the trusted models of today at risk?

Alex points to popular new browsers like Brave, along with VPNs and ad-blocking software, as primary sources of signal loss. These third-party players block AdTech pixels and tracking tags from firing, even within the browser of a consented user. The widespread use of this software, along with iOS interventions, is disrupting over 50% of consented data in some cases.

Not only is this disruption eroding the signal mass serving measurement models, it is also breeding bias. Research finds that specific audience profiles are more inclined to use the software causing signal loss: young, male, tech-savvy and affluent users are more likely to use ad blockers, iOS devices, VPNs and privacy-centric browsers. This is why algorithms are under-indexing this audience.

Audiences impacted may have consented to tracking onsite, or be active in walled gardens like Facebook and Google, but their data is still getting blocked. Alex explains how this audience is excluded from A/B tests, and how algorithms are skewed away from optimising for these high-value segments.

Alex gets statistical with us and explains the concept of a bi-modal data set. Because Android and iOS user data is managed very differently, regardless of user consent preferences, measurement models are being fed a bi-modal distribution. Research tells us that young, affluent, highly educated users will be over-represented in the iOS side of that distribution.

The danger here, he explains, is that the average of the combined Android and iOS data pool will fall in the trough between the two modes. And if your AdTech is optimising for a trough, then it’s targeting the profiles least likely to convert and increase ROI.

Regardless of the model used, a meagre data input lacking representation of the audiences inclined to ad-blocking will result in models amplifying bias.

The Future of Measurement

It’s during this part of the chat that we see Phil get excited because they start talking about generative marketing attribution as the future of measurement.

Generative marketing attribution leverages people-based data and demographics by working back from captured conversions. The engine draws on consumer decision theory, factoring in the predisposition and propensity for conversion based on demographics. Multiple models drive the data analysis, delivering reports and predictive insights.

Phil speaks to one of the greatest strengths of generative attribution: that it doesn’t rely on historical data. As he points out, the vast variability in the people-based data feeding the algorithm means that we don’t need to go back in time to get signal mass. 

COVID-19 showed us how quickly online and offline consumer behaviour can change. Suddenly the habits of our past no longer reflect today’s customer journeys. Phil coined this the ‘Covid taint’, relaying marketers’ frustration that today’s AdTech is optimising on data representing customer journeys that are no longer prevalent. Generative attribution solves this problem by omitting redundant data of the past.

This future of measurement is holistic and omni-channel. It boasts intra-campaign optimization and the ability to test and learn, live. Plus with only a few weeks required to adopt the tool, no wonder Phil is excited.

But as Alex points out, even the smartest measurement model cannot deliver accurate insights and optimisation without a quality input signal, free of bias.

The guys agree on another point: marketers should be auditing their tags as part of good data hygiene practices (side note: Alex’s Cardinal Path business was an early client of DataTrue).

Final words of wisdom if you are spending on media

From Alex: In a budget-tightening climate, it’s crucial to be curious about your data quality.

From Phil: We are seeing increased industry investment in CX, UX and A/B testing tools, but marketers need to be aware that signal loss is skewing data in these tools. An estimated 1 in 5 users will be excluded from tests because of signal loss.

“You are leaving money on the table” by allowing this audience to be excluded, says Phil.

Alex agrees, highlighting that Boston Consulting Group research found that companies investing in these tools will see better returns because accurate targeting and optimisation is impactful.
