
Digital Content Next


InContext / An inside look at the business of digital content

Preparing measurement for the post-cookie data divide

October 11, 2021 | By Mark Bausinger, Director, Publisher Business Development, DoubleVerify (@dvpubsuite)

The post-cookie world is facing a measurement problem. There is plenty of talk surrounding FLoCs, Unified IDs, first-party data, contextual advertising and more. However, there is not enough conversation about how buyers and sellers will evaluate whether these solutions actually deliver.

There will likely be no silver bullet replacement for third-party cookies for some time, so teams need the ability to analyze the effectiveness of new technology and data sets. This is particularly true for publishers, who will suddenly be attempting to meet new advertiser KPIs based on a much broader range of data types and technologies.

More data, more discrepancies

Publishers have typically been in a reactive position relative to advertiser expectations. But what happens when those expectations evolve due to the industry-wide push to replicate the performance of behavioral targeting through various post-cookie solutions? Complexity will increase, and with it, there is the potential for increased friction. This will inevitably impact publisher revenues.

Discrepancy management was historically limited to the differences between first-party and third-party campaign pacing. But as new data is used and advertiser KPIs change to incorporate different technologies for the same campaigns, discrepancies will become more of a three-dimensional problem.
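To make the shift concrete, here is a minimal Python sketch of what multi-dimensional discrepancy tracking could look like. All field names, metrics, and the 5% threshold are hypothetical illustrations, not any vendor's actual reporting schema: the point is that a single campaign now produces discrepancy checks across several data dimensions, not just impression pacing.

```python
# Hypothetical sketch: flagging discrepancies between first-party and
# third-party reported metrics across multiple dimensions of one campaign.

def discrepancy_pct(first_party: float, third_party: float) -> float:
    """Relative difference, as a percentage of the first-party figure."""
    if first_party == 0:
        return 0.0 if third_party == 0 else 100.0
    return abs(first_party - third_party) / first_party * 100


def flag_discrepancies(report: dict, threshold_pct: float = 5.0) -> list:
    """Return (dimension, pct) pairs whose discrepancy exceeds the threshold.

    `report` maps a dimension name (impressions, viewability, audience
    match, etc.) to a (first_party_value, third_party_value) pair.
    """
    flagged = []
    for dimension, (fp, tp) in report.items():
        pct = discrepancy_pct(fp, tp)
        if pct > threshold_pct:
            flagged.append((dimension, round(pct, 1)))
    return flagged


campaign_report = {
    "impressions": (1_000_000, 940_000),  # the classic pacing discrepancy
    "viewability_pct": (72.0, 70.5),      # media quality metric
    "audience_match_pct": (60.0, 48.0),   # post-cookie audience validation
}
print(flag_discrepancies(campaign_report))
# [('impressions', 6.0), ('audience_match_pct', 20.0)]
```

In this sketch the pacing gap and the audience-match gap both exceed the threshold, while viewability does not: the same campaign now triggers reconciliation work along two unrelated axes, which is the "three-dimensional" problem described above.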

Cookie-based targeting is an example of where this could get more complicated. Third-party cookies made it very simple for a sports brand to serve ads to an audience of soccer fans in the 18-36 age range. Without access to third-party cookies, advertisers will want a way to analyze the audience data that determined the age group and interests, or the context of the page, or a number of other potential factors in order to understand whether the ad was served to the right audience. This makes campaign analysis and compensation inevitably more complicated. 

Another example can be found in contextual segmentation and semantic differences in classification. An alcoholic drink recipe that one advertiser classifies as “cooking” may be classified as “alcohol” by a different advertiser. Since publishers work with many partners, they need the ability to recognize and normalize these nuanced differences, which can have a large impact on campaign discrepancies. In short, the new world we’re entering is even more fragmented, and existing publisher workflows must adapt.
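The normalization step described above can be sketched as a simple mapping layer. The partner names, category labels, and internal taxonomy below are all invented for illustration; a real implementation would sit on top of each partner's actual contextual segments:

```python
# Hypothetical sketch: normalizing contextual classifications that differ
# across advertising partners into one internal publisher taxonomy.

# Each partner may label the same page differently; this mapping is
# illustrative only, not a real industry taxonomy.
PARTNER_TAXONOMY_MAP = {
    "partner_a": {"cooking": "food_and_drink", "alcohol": "alcohol"},
    "partner_b": {"recipes": "food_and_drink", "adult_beverages": "alcohol"},
}


def normalize_category(partner: str, category: str) -> str:
    """Map a partner-specific category onto the internal taxonomy.

    Unknown partners or categories fall through to 'unclassified' so
    they can be surfaced for review rather than silently dropped.
    """
    return PARTNER_TAXONOMY_MAP.get(partner, {}).get(category, "unclassified")


# The same cocktail-recipe page, classified differently by two partners,
# resolves to comparable internal categories:
print(normalize_category("partner_a", "cooking"))          # food_and_drink
print(normalize_category("partner_b", "adult_beverages"))  # alcohol
```

Once partner labels resolve to a shared internal vocabulary, a publisher can compare campaign performance across partners on equal terms instead of reconciling classification disputes one deal at a time.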

The need for measurement alignment

In a soon-to-be-released survey conducted by DoubleVerify, 47% of publishers cited measurement standards as one of the biggest challenges with relying on first-party and contextual data. The same will likely be true for advertisers, as more complex data sets will be closely tied to KPIs. This highlights the need for neutral measurement that improves collaboration and reduces friction between buyers and sellers.

Even within individual solutions such as FLoCs or Unified IDs, there have been delays and setbacks that highlight the complexity of these issues. The industry will need different types of data to fill the gap left by third-party cookies. Ensuring this data is trusted, verifiable and actionable will be a key challenge for the industry.

Ensuring post-cookie success with holistic data strategies

The looming industry changes will make it more important for publishers to address the data silos and manual reporting that slow down decision making and reduce transparency. This can be made possible with a holistic data strategy and a tech stack that can manage the rising complexity. Publishers that focus on aggregating their data and optimizing workflows can better position themselves to be proactive and take control of how targeting and measurement impact their business.

In this age of experimentation, where new variables lead to more complicated deals, it’s even more important for publishers to invest in tools that can ingest a variety of both first- and third-party metrics holistically. Successful campaigns must make first-party audience data actionable for advertisers while accounting for key media quality metrics and any third-party data that buyers rely on. Many industry stakeholders expect first-party data to be key, but that is only the case if the data can effectively replace third-party targeting.

What you need for a holistic data strategy

Holistic data strategies will help manage complexity in discrepancy management and ad targeting, but what does this look like in practice? To get into a holistic mindset around data and workflows, publishers should ask themselves the following questions:

  • Which data sources does your organization rely on now? How is it collected and aggregated?
  • Which data is valued by your top advertising partners? How will this change in the future due to cookie deprecation?
  • Does the organization have a system to connect these resources in a meaningful way?
  • Which teams stand to benefit from data that they don’t collect themselves?

Once publishers decide to move forward with a holistic approach, they must determine whether they will build a technology solution in-house or purchase one that can adapt as the industry evolves. One key point to consider moving forward is the potential for automated, algorithmic optimization based on the intersection of these data sets. For publishers, it’s important to think about the long-term implications of this choice, as workflows and tech stacks are often difficult to transition out of once they are established.

No one can fully predict what’s coming next for post-cookie targeting and how those results will trickle down to publishers. What publishers can prepare for is the fact that complexity will only increase from here. With that in mind, it’s vital for media companies to prepare by evaluating their tech stack and data foundation in order to aggregate and analyze more complex data sets.
