

InContext / An inside look at the business of digital content

Why adtech loves fragmentation (and why publishers shouldn’t)

Data asset consolidation and data collaboration are vital to creating a more dynamic, effective, and privacy-safe ecosystem. Yet publishers face obstacles, from fragmented systems to cumbersome workflows.

September 23, 2024 | By Chloe Grutchfield, VP of Products, Permutive

The advertising landscape is fracturing due to third-party signal loss, causing targeting scarcity for publishers and advertisers. Half of users have “disappeared” from digital advertising by using browsers and devices that limit tracking and preserve privacy. 

While Chrome still supports third-party cookies, 40% of its users modify privacy settings or browse incognito to block ad tracking. We’ve seen this reflected in our publishers’ audiences, where 70% of the internet is now invisible to adtech, leaving only 30% addressable. Chrome’s ‘reject all’ buttons in Europe and the proposed American Privacy Rights Act of 2024 (APRA) will further reduce data visibility and compound the signal loss we see today.

As businesses seek ways to address signal loss, the market has been flooded with short-term workarounds that lack interoperability. The result is a fragmented, chaotic adtech ecosystem.

Publishers, by contrast, have not experienced signal loss, because they own first-party signals. This means publishers can append a signal to 100% of impressions, whether an ID for authenticated users or a cohort based on known attributes and behaviors collected on the user. The challenge for publishers is to build signals that the buy side will want to buy. This requires navigating fragmentation and having the tools to connect their data and collaborate with the ecosystem on their signals.
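To make the idea concrete, here is a minimal sketch of what "appending a signal to 100% of impressions" might look like in practice. All names and attributes are hypothetical, not a real Permutive API: an authenticated user gets a durable ID, and everyone else falls back to cohorts built from the publisher's own first-party data.

```python
# Hypothetical sketch: first-party signal enrichment.
# Authenticated users get an ID; anonymous users get cohorts derived
# from attributes the publisher itself observed (no third-party tracking).

def build_cohorts(user):
    """Derive cohort labels from first-party attributes."""
    cohorts = []
    if "finance" in user.get("sections_read", []):
        cohorts.append("finance-readers")
    if user.get("visits_last_30d", 0) >= 10:
        cohorts.append("loyal-visitors")
    return cohorts

def enrich_impression(impression, user):
    """Append a first-party signal to every impression, authenticated or not."""
    if user.get("hashed_email"):  # authenticated: attach a durable ID
        impression["signal"] = {"type": "id", "value": user["hashed_email"]}
    else:                          # anonymous: attach cohorts instead
        impression["signal"] = {"type": "cohorts", "value": build_cohorts(user)}
    return impression

imp = enrich_impression({"slot": "leaderboard"},
                        {"sections_read": ["finance"], "visits_last_30d": 12})
```

Because the fallback path always produces some signal, every impression carries addressable information, which is the 100% coverage the paragraph describes.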

Connecting disparate data

According to a recent AdMonsters Publisher Pulse survey, 50% of publishers see monetizing audience data as a significant growth opportunity, and 33% plan to leverage audience data to create new revenue streams.

Consolidating their data asset is critical for publishers, as it offers an opportunity to leverage many more dimensions to build the audiences that advertisers want to buy. Publishers can augment the first-party signals they collect across their digital properties with additional attributes such as demographic, preference, and intent data. However, today’s fragmented data ecosystem makes it challenging to harness the full potential of this information. Data often sits in silos across different environments and teams, and consolidation requires data engineers, complex workflows, and significant time investment.
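The consolidation step itself can be sketched simply once the silos share a key. The example below is illustrative only (silo contents and field names are invented): behavioral and CRM records keyed on the same user ID are merged into a single audience profile that carries intent, demographic, and preference dimensions together.

```python
# Illustrative sketch: merging siloed first-party data into one audience
# profile per user. Real consolidation adds identity resolution, consent
# checks, and engineering pipelines; this shows only the core join.

behavioral = {"u1": {"intent": ["auto"], "pageviews": 40}}       # web analytics silo
crm        = {"u1": {"age_band": "25-34", "newsletter": True}}   # CRM silo

def consolidate(*silos):
    """Union attributes from every silo, keyed on a shared user ID."""
    profiles = {}
    for silo in silos:
        for uid, attrs in silo.items():
            profiles.setdefault(uid, {}).update(attrs)
    return profiles

audience = consolidate(behavioral, crm)
# audience["u1"] now combines intent, demographic, and preference data
```

The point of the sketch is the payoff described above: once attributes from different silos live on one profile, audiences can be defined across all dimensions at once rather than within a single system.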

Greater connectivity creates seamless, integrated systems that efficiently combine all data assets. This empowers publishers to activate their data without the burden of engineering-heavy processes.

Greater connectivity not only simplifies data management but also enhances its effectiveness. By consolidating various data sources, publishers can build a richer, more comprehensive profile of their audiences, allowing for better targeting and more personalized advertising experiences. This, in turn, creates new revenue opportunities and strengthens relationships with advertisers.

Collaboration reimagined

The challenges don’t end with data consolidation. Retargeting, once a cornerstone of performance marketing, is losing its effectiveness as signal loss impacts the scalability of traditional tactics. Advertisers are now struggling to achieve scale and performance, with all buyers competing for the same shrinking 30% of the internet’s inventory. This lack of scale and poor performance leaves many legacy retargeting strategies falling flat.

Data collaboration holds the potential to solve these challenges, but the current state of the Data Clean Room (DCR) space presents two significant hurdles:

1. Fragmentation and operational complexity

The DCR space is highly fragmented, making data collaboration cumbersome and resource-intensive. Moving data between systems requires complex operations, lengthy data processing agreements (DPAs), and heavy legal and operational overhead. This complexity not only delays campaign execution but also reduces return on ad spend (ROAS). Advertisers and publishers are left with inefficient processes that yield minimal performance gains.

2. The wrong signal to activate against

For companies that manage to navigate these hurdles, the results often don’t justify the effort when the activated signal is simply the matched records themselves, because match rates between data sets are typically low. However, a seed of matched records is a valuable asset: it helps a publisher identify which of their signals has the highest propensity to hit advertiser KPIs, or the highest affinity towards the brand.
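The seed-as-signal idea can be sketched as a simple over-representation score. This is a hedged illustration, not a described product feature: instead of activating the small matched seed directly, the publisher ranks its cohorts by how concentrated seed users are within each one, then activates the highest-affinity cohorts at full first-party scale.

```python
# Hypothetical sketch: score publisher cohorts against an advertiser's
# matched seed. A cohort's affinity index is its seed-membership rate
# divided by the overall seed rate; an index above 1 means the cohort
# over-indexes for the brand.

def cohort_affinity(cohorts, seed_ids, total_users):
    """Return {cohort_name: affinity_index} against a matched seed."""
    base_rate = len(seed_ids) / total_users
    scores = {}
    for name, members in cohorts.items():
        seed_rate = len(members & seed_ids) / len(members)
        scores[name] = seed_rate / base_rate
    return scores

cohorts = {"finance-readers": {1, 2, 3, 4}, "sports-fans": {5, 6, 7, 8}}
seed = {1, 2, 9}  # small matched seed from the advertiser's data set
scores = cohort_affinity(cohorts, seed, total_users=100)
```

Here "finance-readers" over-indexes against the seed while "sports-fans" does not, so the publisher would activate the former, reaching far more users than the three matched records alone.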

The industry must move towards streamlined connectivity and more efficient data collaboration to overcome these challenges. By creating unified environments where data can flow seamlessly between publishers, advertisers, and platforms, we can reduce the operational burden. This would allow for more effective use of first-party data, greater scale in retargeting efforts, and better performance across the board.

Embrace collaboration & connectivity 

This reimagined approach to data collaboration and connectivity allows publishers to unlock the full potential of their data, creating better consumer experiences, more opportunities for growth, and a stronger, more transparent ecosystem where data works for everyone.
