For years, publishers have relied on a patchwork of point solutions to manage subscriptions, consent, adblock recovery, newsletters, and more. Each solved a problem, but together they created inefficiencies, siloed data, and disjointed visitor experiences. The result is sites weighed down by multiple tags, overwhelmed teams, and insights lost between systems.
On publisher websites, this shows up as tag bloat. Scripts and widgets accumulate, dragging down performance and complicating analytics. An analysis of the top 200 marketing websites found an average of 12 tags per site. Some publishers reported 60–80 tags as typical, and in extreme cases the number of tags ran into the hundreds. Each additional tag adds latency, increases vendor management overhead, and compounds compliance risk.
The Reuters Institute’s Journalism, Media, and Technology Trends 2025 report warns that technical debt is a growing concern. Site quality and personalization are “difficult, expensive, and time-consuming” when stacks are bloated. For publishers on lean margins, these inefficiencies are unsustainable. The solution, increasingly, is consolidation—streamlining vendors and functions into unified platforms that bring data, workflows, and visitor journeys together.
Why consolidation is gaining momentum
Economic pressures
The downturn in digital advertising and ongoing cost-cutting in media have forced publishers to scrutinize every contract. Across industries, SaaS spending shrank by 11% in 2023, with companies cutting their tool counts by an average of 8%. Redundant vendors are something few can afford.
Integration and compliance
Marketers report that after cost, integration is the top reason they replace tools—cited by 51% of respondents. Fragmented data not only limits insights but also complicates compliance with GDPR and CCPA. Larger, integrated platforms are often better equipped with consent management and privacy features built in.
Industry consolidation
The adtech ecosystem itself is shrinking. Publishers use an average of 19 SSPs. However, the ANA recommends cutting that number to about seven for efficiency. At the same time, standalone ad exchanges have been absorbed into broader supply-side platforms, leaving publishers to navigate fewer, but larger, partners.
As Admiral CEO Dan Rua observes, “We’re seeing top publishers actively moving from six separate tags and vendors to a single consolidated stack, not just to cut costs, but to speed page loads, simplify vendor management, and optimize the entire visitor journey.”
Benefits of a consolidated ad tech stack
Cost savings
Eliminating overlapping licenses and contracts delivers immediate savings. HubSpot customer Liquidity Services cut software costs by 50% after replacing eight different tools with one platform, while also improving pipeline visibility by 80% and boosting email deliverability by 70%. Salesforce’s Total Economic Impact study found companies that consolidated saw more than $13M in net benefits, from lower tech costs to productivity gains.
Revenue growth
Ultimately, cohesion drives monetization. HubSpot clients reported 36% more deals closed after moving to the platform. In publishing, integrated content packages across consolidated media groups generate 30% higher audience engagement than siloed campaigns.
Admiral’s CEO emphasizes the performance angle: “Turning five or six disparate tags into one doesn’t just simplify operations—it directly improves site performance. That speed boost translates into higher conversions and more revenue opportunities for publishers.”
Unified data and analytics
Consolidation creates a single source of truth for audience interactions. Publishers can connect data from consent, engagement, and monetization touchpoints to see the full visitor journey. That kind of holistic view is impossible when data lives across half a dozen systems.
User experience
Fragmented stacks create jarring, redundant prompts—a consent banner from one vendor, a newsletter pop-up from another, a separate subscription wall later. Operating in silos, these tools often trigger targeting conflicts, overlapping pop-ups, and inconsistent styling—borders, fonts, and formats that don’t match, resulting in a disjointed, unprofessional user experience.
A consolidated platform coordinates these touchpoints into a seamless journey, where consent flows into registration and then subscription offers, all timed and styled consistently. The result is smoother interactions that reduce annoyance, build trust, and lift engagement and conversion.
Compliance, security, and vendor management
Fewer vendors mean fewer vulnerabilities and simpler consent recordkeeping, reducing the compliance burden for publishers under constant regulatory pressure. Just as important, consolidation streamlines vendor management: instead of juggling multiple contracts, account managers, dashboards, and data exchanges, publishers work with a single point of contact and unified support team. That efficiency saves time, cuts administrative overhead, and frees lean teams to shift focus from maintenance to innovation.
Challenges to consider
However, consolidation isn’t without challenges. Publishers must weigh:
Vendor lock-in: Fewer vendors can mean greater reliance on one. Choosing platforms with open APIs and strong data portability is key.
Transition costs: Migrating off legacy systems takes planning. A phased approach, starting with a tech stack audit, helps minimize disruption.
Team adoption: Departments may resist losing familiar tools. Leaders need to frame consolidation as a win for everyone. Focus on the advantages of faster work, better insights, and less manual effort.
Publisher focus: Not every vendor understands the nuances of digital media. Some platforms are built for ecommerce or generic websites, but publishers need partners who know the demands of ad operations, audience engagement, subscriptions, and consent.
Customer service: A consolidated partner takes on more responsibility. Great customer support is essential when one vendor is handling multiple critical parts of the stack.
As one adtech executive told Campaign Asia: “Layered technology, by all means, presents a sustainability issue, and publishers are under pressure to streamline.”
Cohesion as the future
Consolidation has evolved beyond a cost-cutting exercise. It’s how publishers combat digital debt, improve agility, and build stronger relationships with audiences.
Catherine Beattie, then Director of Digital Ad Operations at Encyclopaedia Britannica, summed up the operational need for consolidation:
“My dev team’s involvement has practically become a critical KPI these days. If they can do something once and be done, then I can iterate, work with product teams, and make adjustments without going back to that queue to beg for resources. That makes everything so much more efficient.”
In a competitive environment where margins are thin and complexity is rising, publishers that simplify and consolidate today will be better equipped to innovate tomorrow.
Do you have more than one remote to operate various connected and streaming devices?
Among your family members, are there as many different streaming habits and preferences as there are people? And do these disparate behaviors lead, for instance, to heated arguments over whether captions should always be on and who’s responsible for turning them on and off?
Have you rented someone else’s home for a short stay recently? If so, could you log in and start using your streaming services on the TV? Within five minutes? After multiple emails with the host?
For all the joy that streaming “Stranger Things” and “Abbott Elementary” has brought us, the user experience of Smart TVs and connected devices has delivered far too much frustration. The mere act of turning on the TV to find something to watch is an inexcusably disastrous viewer experience—a complete fail on the part of the television industry.
Unfortunately, the proliferation of streaming services has not improved viewers’ TV usability experience. In fact, it seems exponentially worse. This is despite 81% of consumers citing “ease of use” as the second-most important attribute for video streaming, close behind “cost” at 84%, according to a 2020 Nielsen study. If streaming service providers and device manufacturers spent as much time improving the viewer usability experience as they spend building pricing models, perhaps they wouldn’t be forced to continuously raise prices.
While I think we’d all agree that there are much bigger problems in this world, viewer frustration with streaming service features and connected devices illustrates how much the TV industry takes its customers for granted. Sure, some products are arguably better than others. And right now, there’s probably someone making the argument somewhere that their supposedly excellent usability is a competitive advantage. But even with some excellent products out there, there is considerable room for companies to improve the viewer usability experience and to gain loyalty and attention for the entire sector.
Let’s examine a few areas in need of improvement:
The multiple remote e-waste factor
First, the proliferation of different and incompatible remotes is rampant. The issue isn’t new to the streaming era, but it is now even harder to solve. Universal remotes notoriously challenge even the most tech-savvy and usually end up taking up space in a drawer. In fact, many people have drawers full of extra, old, and broken remotes, exemplifying the wastefulness of this sector of consumer devices.
Perhaps we need a new way of looking at this particular frustration, beyond the fact that it’s hard for consumers to navigate. So, let’s look at a similar issue in which an entity made a decision intended to minimize e-waste long-term: In June, the EU mandated the use by 2026 of a common charger for all new portable devices such as smartphones, earbuds, wireless keyboards and laptops. The argument could be made that mandating even a baseline of standardization for TV and streaming device remotes would minimize e-waste. Which government wants to take it on?
The personalized feature set
Second, personalizing one’s TV experience, within the operating system navigation, is ridiculously complex. Because this topic is vast, let’s look at just one example – turning captions on and off. Nothing exemplifies the lack of care for the TV viewer more than the inaccessibility of captions, especially given that the original purpose of the caption feature is … accessibility!
In one family I surveyed, one person likes to always have captions on. This preference is not due to a hearing issue. Rather, this person believes captions or subtitles augment their understanding of a show, especially while they simultaneously are playing a video game on their laptop and scrolling TikTok on their phone.
This sort of use case is hardly uncommon. In fact, four out of five viewers ages 18-25 say they use subtitles all or part of the time according to one study. Meanwhile, within the family surveyed, other members complain that captions ruin their experience, often revealing a joke or a plot twist before it happens on screen.
The difference in preferences means that family members sharing the same TV and streaming services must frequently turn captions on and off. If you’ve tried to do this at all, you know that every service and device is different. If you haven’t tried yet, please do just to see how needlessly complex it is.
As we know, the lack of standardization that plagues caption functions extends to everything from playback and fast-forwarding to volume controls. It’s all unnecessarily complex and frustrating for the consumer.
The unfamiliar TV panic
Third, the issue of standardization goes beyond consumers’ own homes. Based on my recent unscientific, informal survey of a half dozen short-term rental property hosts and their guests, struggles with unfamiliar remotes and/or TV operating systems are one of the top sources of friction between hosts and guests. This applies to those young and old, tech-savvy or not.
One guest called his host frequently, complaining that the TV was broken. In fact, the TV worked just fine, but the guest was flummoxed by the Apple TV remote, despite several lessons. The guest is still complaining about the host’s Apple TV, although he has moved on to another short-term rental property (and despite the fact that this TV-challenged property was a beautifully designed home in an idyllic natural setting).
Another host described a guest who emailed a half dozen times in advance of his arrival, concerned about how he would watch TV. He asked for photos of both the front and back of the TV and a detailed assessment of whether he’d be able to plug in an antenna to watch local TV. After his arrival, this guest complained that he found the remote “awkward.” The host described losing sleep over whether this guest would write a bad Airbnb review solely based on his TV experience.
The bottom line is that both hosts and guests clearly would appreciate a TV experience that just works, easily, without instructions. And they really don’t want to have to communicate with one another about it. By setting a new baseline common standard for TVs and streaming services, the industry could perhaps improve the rental home industry’s host-guest relations – and help the rest of us at the same time.
Let’s fix the consumer experience
Forget for a moment some of the common complaints about today’s streaming industry, such as too many mediocre shows or ideas that should have been 90-minute movies but are needlessly stretched to seven-part series. Given all the investment in Smart TV technology, and all of the money to be made from distributing content via that technology, the industry would benefit from providing viewers with a much better experience. For what we are paying, we certainly deserve it.
We should have more standardized devices, accessible navigation interfaces, excellent search capabilities across devices and services, and more. Achieving this would require some cooperation among the major players. But primarily, it would require that the players start to care about their customers. It’s evident that they don’t.
The marriage of convenience between programmatic advertising and digital media is not over. Publishers must remain vigilant as deceptive, irrelevant ad experiences easily damage audience acquisition, retention, and engagement.
Credibility is the new currency
Trust is a crucial element in the user-publisher relationship. Daily run-ins with bad ads devalue publishers’ brands in the eyes of users. Online audiences have become increasingly wary of off-brand and unsafe ads, which makes them lose faith in publishers’ ability to ensure valuable experiences.
For incremental revenue, some publishers give up white space for a function that adds little value for audiences—and may remove value. More than three-quarters of publishers face ad quality challenges that impact user experience on their sites. Six out of 10 publishers reported that ad quality challenges have negatively impacted revenue. These ads frustrate users and reduce time spent reading content by 45%.
Clickbait is one of the most frustrating forms of advertising, and 56% of publishers have identified clickbait ads on their sites. We identified the five most prevalent clickbait scams served through publishers’ websites: financial scams, misleading product offers, brand infringement, tech support scams, and forced browser notifications. It’s no surprise that ads like these drive audiences away.
If you’re not listening to your audience, someone else will
Not only are many ad experiences unsafe, they’ve also become less relevant because they’re based on guesswork. Publishers set their ad quality policies based on assumed audience preferences. However, assumptions don’t make for adequate quality control. Unfortunately, this means that publishers often serve ads and brands that are inappropriate for their audience, which leads to user churn and a tarnished reputation.
In the era of fake news and crumbling cookies, publishers must continually work to better understand audiences and maintain ongoing two-way communication. Since there’s no universal guide for what makes a “good” or “bad” ad, publishers must define what a bad ad looks like for their unique audience.
It’s time to implement new ways to listen to your audience. The idea of driving audience feedback isn’t new. It makes sense that in order to better engage your audiences you have to listen to what they want. To do so, publishers require a feedback loop and custom tech controls to maintain their standards.
Resist the impulse to paint audiences with a broad brush. A user-first approach lets you get specific about your audience’s ad quality preferences and close the feedback loop by allowing audiences to respond to bad ads directly. Publishers need audience feedback directly, without additional platforms intermediating; AdChoices, for example, reports bad ads to Google and any other platform that served them. Audience input creates a framework for consistently relevant and engaging ad experiences. This helps foster a sustainable, profitable ecosystem.
The sustainable feedback loop: Adapt to constant change
User-first publishers know what their audiences expect and take immediate action to eliminate unwanted ads. Allowing users to report poor experiences doesn’t just increase users’ trust in publishers’ brands. It also differentiates publishers from the competition, setting them apart as truly audience-centric.
Publishers should follow these principles:
Eliminate guesswork by closing the feedback loop and learning which ads your users are unhappy with.
Fine-tune ad quality policies based on user feedback. Customize and adapt ads to suit your audience as their preferences change.
Guide your calls to action to encourage audience interaction in real time. As a rule, the more instantaneous and simple, the better.
Create engaging ad experiences for each site. Each of your channels attracts a different audience and you should cater specifically to them.
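The reporting-and-blocking half of such a feedback loop can be sketched minimally. Everything here is invented for illustration, including the threshold and class names; it is not any vendor’s actual API:

```python
from collections import defaultdict

class AdFeedbackLoop:
    """Toy sketch: count user reports per ad creative, block past a threshold."""

    def __init__(self, block_threshold: int = 3):
        self.block_threshold = block_threshold  # invented policy value
        self.reports = defaultdict(int)
        self.blocked = set()

    def report_bad_ad(self, creative_id: str) -> None:
        """Record one user complaint; auto-block once reports cross the threshold."""
        self.reports[creative_id] += 1
        if self.reports[creative_id] >= self.block_threshold:
            self.blocked.add(creative_id)

    def is_blocked(self, creative_id: str) -> bool:
        return creative_id in self.blocked

loop = AdFeedbackLoop()
for _ in range(3):
    loop.report_bad_ad("creative-123")
print(loop.is_blocked("creative-123"))  # → True
```

In a real system the "report" event would come from an on-page control and the block decision would feed back into the ad server, but the loop itself is this simple: listen, tally, act.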
Stop screaming, start listening
It’s nearly impossible to regain a user’s trust once the user has been shown an ad that contradicts the publisher’s tone, looks suspicious, or is malicious. This is especially true if the user clicks through to a misleading ad and gets slapped with an offensive site or deceptive scam. Listening to your audience is vital to the success of your long-term monetization strategy. But listening takes time. Publishers don’t have to start from scratch: real-time ad quality tools eliminate the heavy lifting. When you start listening to users, what you learn will have immense benefits for long-term monetization.
Remember: Effective communication is a reciprocal process. You never learn anything by talking.
Bugs Bunny and Michael Jordan co-starred in Space Jam. Bill Clinton was re-elected to serve a second term as President of the United States. Tiger Woods became a professional golfer. The Summer Olympics were hosted in Atlanta. And washingtonpost.com went live. What do all of these events have in common? They all took place in 1996.
It has been 25 years since those first readers could get their news from The Washington Post online. Back then, Post articles couldn’t be “googled,” since Google — as a company — would be founded two years later. And sharing a news article with friends couldn’t involve Facebook or Twitter, as these networks wouldn’t come to market for eight and 10 more years, respectively. TikTok was only the sound an analog clock made, and early social-media adopters were closer to Tom being their first friend on MySpace than to influencers going viral and becoming millionaires from creating content on Instagram, Snapchat, and/or YouTube.
News consumption was a one-size-fits-all paradigm: heard or seen via broadcast news on TV or radio, read from printed ink on paper, and skimmed from websites that were effectively static brochureware representations of their print big brothers (with some supplemental content online). There was no personalization. The model was one-to-many: here are the top things readers X, Y, and Z need to know to stay informed. That model has changed, and it is still changing. The Post has shown success in personalizing the news to readers’ interests through My Post, newsletter subscriptions, and much, much more.
Stay tuned, Washington Post readers are about to see more personalization in 2022!
Creating and distributing the news: then vs. now
An “Apple-to-Apple” Comparison of Reading The Washington Post on December 20 in 1996 and 2021 through then-Modern Apple Technology:
The December 20, 1996, homepage of washingtonpost.com on an Apple Macintosh
The December 20, 2021, home screen of the WashPost iOS app on an Apple iPhone
Change is good. But change needs to be managed. A lot has changed in this last quarter century at the intersection of media and technology. The Post has responded to change by building new systems that manage how content is created, distributed and amplified. But one thing has remained constant — great reporters and editors create great journalism.
Another constant is that quality journalism will be seen or heard by consumers looking to stay informed. And it can shine through the cloudy haze of mis-and-disinformation maliciously shared online.
Although good journalism from trusted institutional brands and other media players remains a constant, how consumers get their news has certainly changed with the times. In today’s digital new-media landscape, according to The Pew Research Center, “more than eight-in-ten U.S. adults (86%) say they get news from a smartphone, computer, or tablet ‘often’ or ‘sometimes.’”
As a media AND technology company, The Washington Post has not just followed consumers to their preferred destinations, it has been a leader in creating content and bringing it to readers — readers who may have an interest in politics can get their Daily 202 newsletter emailed to them; food enthusiasts can cook with confidence with Voraciously recipes and guides; podcast listeners can subscribe to Post Reports, Please Go On, Can He Do That, and other audio format news; and over 1.2 million fans of @washingtonpost on TikTok can be informed and entertained by short, witty, videos by a creative team of content creators.
All of this work needs to live somewhere. Platforms, tools, and services power this news before it reaches readers’ smartphones, computers, or tablets. The Washington Post has had to understand not just the scalable infrastructure needs of today to deliver this news where and how readers want it; technology leadership has also had to set the organization up for future success with new and expanding infrastructure and Infrastructure-as-a-Service (IaaS) resources. It’s like the old sports adage: Wayne Gretzky wasn’t the fastest skater in the NHL, and he wasn’t the biggest professional hockey player. He was the greatest because he played not to where the puck was, but to where it was going.
The Post’s aspiration and north star is not just to continue delivering excellence in journalism, but also to deliver equal excellence in engineering and innovation. The Post is playing to where the innovation puck is going by, as Deloitte Insights suggests, “designing systems in which humans and machines work together to improve the speed and quality of decision-making.” The Post is doing this to improve the reader experience through personalization and to allow company leaders to turn more data into actionable intelligence at scale.
“I’ve always understood and appreciated the work that The Post contributes to the journalistic space, but interviewing [for my role at The Post] quickly made me realize the sophistication behind the engineering effort supporting that mission.”
— Washington Post Data Engineer Jack Miller, who joined The Post in 2021.
Data, data, everywhere. Data, data, time to share.
Moore’s Law observes that computing power essentially doubles every two years. That’s become more of a rule than a law over time. Another rule that has held steady is that the total amount of data created or copied doubles roughly every two years. The Post has therefore seen a whole lot of redoubling of total data since 1996, and Post engineering leadership expects that trend to continue in the coming years.
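As a back-of-the-envelope illustration of that doubling rule (the two-year period is the rule of thumb from above, not a measured figure):

```python
# Back-of-the-envelope: if total data doubles every two years,
# how much more data exists in 2021 than in 1996?
def growth_factor(start_year: int, end_year: int, doubling_years: float = 2.0) -> float:
    """Return the multiplicative growth under a fixed doubling period."""
    doublings = (end_year - start_year) / doubling_years
    return 2.0 ** doublings

factor = growth_factor(1996, 2021)
print(f"{(2021 - 1996) / 2:.1f} doublings -> roughly {factor:,.0f}x the data")
# → 12.5 doublings -> roughly 5,793x the data
```

Even if the true doubling period is off by a year or two, the conclusion is the same: the data a 1996-era newsroom managed is a rounding error against today’s volumes.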
Inside-and-outside of the newsroom, The Post — as a business — relies heavily on data-informed decision making at strategic and operational levels. Over the years, in addition to the increased need to approach data management in a holistic way, The Post has experienced a significant increase in subscriptions and traffic across various platforms and channels. This increased data volume and velocity coupled with new sources and complexities has created new challenges (and opportunities) to turn raw, siloed and unstructured data into business intelligence.
To address these challenges/opportunities and gain maximum journalistic and business benefits from reader interests, The Post began to develop a more integrated approach to data management in 2021 under the leadership of Beth Diaz (Vice President of Audience Development & Analytics), Venkatesh Varalu (Head of Data and Analytics), and in collaboration with leaders across Subscriptions, Advertising, Newsroom, Marketing, Finance, Product and Engineering.
This data was available and accessible prior to 2021, but The Post began to manage it in a more innovative, agile and programmatic way. Under this new approach, customer data is being positioned to power various marketing and reader personalization efforts through enhanced workflows, automations and data activations via homegrown tools or services and vendor platforms. The Post is calling this macro-initiative WaPo 360.
“I’ve always been a huge fan of data. Working as a newsletter analyst, I got the opportunity to explore The Post’s various data sets to answer interesting questions about how our readers behave, and to find evidence of what works best for keeping them engaged,” said WaPo 360 Senior Data Engineer Patrick Haney. “It was a fantastic experience. However, while working with these data sets, it became almost immediately clear that they weren’t arranged in an optimal format for analysis. Answering simple business questions could take hours instead of minutes due to the siloed nature of each data set, along with the business logic that needed to be applied in a consistent fashion and often it required reaching out to a subject matter expert for validation.”
“I was ecstatic when I learned about this new data integration initiative because it would solve all these aforementioned issues and enable analysts and non-analysts to quickly and efficiently use our data to answer vital business questions,” said Haney regarding his choice to transfer from one Post team to another.
According to a recent Deloitte study, “most executives do not believe their companies are insight-driven. Fewer than four in 10 (37 percent) place their companies in the top two categories of the Insight-Driven Organization (IDO) Maturity Scale, and of those, only 10 percent fall into the highest category. The remaining 63 percent are aware of analytics but lack infrastructure, are still working in silos, or are expanding ad hoc analytics capabilities beyond silos.”
WaPo 360 will improve the turn-around time for The Post to turn data and signals into insight-driven business decisions.
WaPo 360 and the engineering experience
When he applied to work at The Post, Jack Miller said his “interviewers stressed the importance of the WaPo 360 project across many different verticals within the organization. Being able to join a growing team supporting that project was a huge reason why I decided to pursue the position and so far it has been a great experience.”
Fellow team Data Engineer Zach Lynn agrees, saying, “the WaPo 360 project struck me as an excellent opportunity to learn and also support The Washington Post’s core mission.” Lynn’s interest included working in several business areas and collaborating with other software teams.
The first step of WaPo 360 has focused on stitching together data signals from various sources. Data that previously was unstructured and accessible only to data analysts is becoming democratized for Washington Post engineers and technical business users. This first pillar of work is essentially warming up the oven and organizing all of the ingredients to make it easier for business stakeholders, in different departments, to bake their own pies. Data from site and app traffic, newsletter engagements, ads, subscriptions, and other sources is becoming more structured in WaPo 360 through Customer 360, the first pillar of the initiative.
A Washington Post data analyst recently presented how his work has been impacted by WaPo 360. He outlined how he achieved a nearly 96% improvement in a SQL query’s run time by switching from the siloed, unprocessed data source to the same data signals, structured and pre-processed in WaPo 360. As noted earlier, these data sets were accessible before 2021, but with WaPo 360, The Post is turning data into intelligence and making it easier for staff to do their jobs. WaPo 360 is essentially replacing their hand tools with power tools.
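The shape of that speedup can be shown with a toy sketch: on the siloed path, every query re-derives an aggregate from raw events (re-applying business logic and paying the scan cost each time), while on the warehouse path a result computed once upstream is read directly. All names and data here are invented for illustration:

```python
from collections import Counter

# Hypothetical raw, siloed event rows (reader IDs and channels invented).
raw_events = [
    ("reader-1", "site"), ("reader-1", "newsletter"),
    ("reader-2", "site"), ("reader-2", "site"), ("reader-2", "app"),
    ("reader-3", "newsletter"),
]

# Siloed path: each analyst re-derives per-reader engagement on every query.
def events_per_reader(events):
    return Counter(reader for reader, _channel in events)

# Warehouse path: the aggregation runs once, upstream, and downstream
# queries read the already-structured result directly.
customer_360 = dict(events_per_reader(raw_events))

print(customer_360["reader-2"])  # → 3
```

The 96% figure comes from exactly this trade at warehouse scale: precomputing once moves the expensive work out of every analyst’s query path.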
WaPo 360 and the business-user experience
The data that is becoming structured and pre-processed in Customer 360 isn’t just going to live on an island to be visited by data analysts and data engineers. The second pillar of WaPo 360 is to make that data accessible to those with a business need to access it, in anonymized ways, through improved self-service tools.
Joshua Zieve, Senior Data Analyst, joined the WaPo 360 team to “help catalyze The Washington Post’s data sources to better understand and serve our current and prospective readership.” Zieve has been active in coordinating with business and technical users on many fronts. “Working across the Analytics & Engineering teams, I’m grateful for the opportunity to develop systems that facilitate, deepen and expedite analytics for use-cases throughout the organization,” Zieve said.
Good data is the foundation for WaPo 360, and that leads to personalization benefits. Following the team’s work in delivering structured data in Customer 360, WaPo 360 feeds the relevant data powering the business use-cases Zieve references into a new Customer Data Platform (CDP). The CDP then works as an engine that lets business users perform exploratory data analysis, build audience segments, create marketing and reader-engagement campaigns, and analyze their success. The result is an improved, personalized experience delivered to readers through integrations with Washington Post-built tools and the popular offsite services The Post utilizes to reach potential readers.
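A CDP-style segment build is, at its core, a filter over unified reader profiles. Every field, threshold, and ID below is hypothetical, a sketch of the workflow rather than any actual Post schema:

```python
# Hypothetical unified reader profiles (fields and values invented).
profiles = [
    {"reader_id": "r1", "newsletter_opens_30d": 12, "topics": {"politics"}},
    {"reader_id": "r2", "newsletter_opens_30d": 0,  "topics": {"food"}},
    {"reader_id": "r3", "newsletter_opens_30d": 5,  "topics": {"food", "politics"}},
]

def build_segment(profiles, topic, min_opens):
    """Return reader IDs engaged with a topic above an activity threshold."""
    return [p["reader_id"] for p in profiles
            if topic in p["topics"] and p["newsletter_opens_30d"] >= min_opens]

# e.g. active food readers, for a hypothetical recipes campaign:
print(build_segment(profiles, "food", 1))  # → ['r3']
```

The value of the CDP is that this filter runs against one consistent, pre-stitched profile store, so the same segment definition can feed campaign tools, analytics, and onsite personalization without re-deriving the data each time.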
“[I’m] most excited about the self-service potential for The Post’s newsroom and business teams … with data in one place, which is aggregated and ready to be queried, users can get their data without waiting for The Post’s Analytics team to prepare the data. For the Analytics team, this will also reduce time spent for serving ad hoc requests from the newsroom/business side.”
— Sam Han, Director of Zeus Technology and Artificial Intelligence (AI) / Machine Learning at The Washington Post
WaPo 360 and the reader experience
The Post will be doubling down on personalization in 2022, both within and adjacent to the work being conducted by the WaPo 360 team.
Early work is underway to improve the onboarding experience for new subscribers. And the team plans to unlock significant opportunities to retool, rethink and reshape how articles are suggested to readers — such as through improved content insights and an updated Content Taxonomy System with new article subjects/topics metadata powering future innovation.
Members of the WaPo 360 team recently presented the team’s work at a company-wide virtual forum. Washington Post Organizational Development Consultant Cameron Booher said, “Planning for any What’s Next event involves talking with many project teams about their ongoing and upcoming initiatives. And the usual format of What’s Next is to highlight three projects from different areas of the business. But it very quickly became evident through conversations just how significant of an undertaking WaPo 360 is. It’s extremely collaborative, and has been built upon expertise from almost every department at The Post. It will be rolled out in various phases, which speaks to the iterative process of develop-test-improve.”
“Some of the insights we’ll gain will help us improve reader and user experiences in spades,” Booher said.
This article originally appeared on Washington Post Engineering and is re-published with permission.
Publishers have long understood that user experience and monetization go hand in hand. So it only makes sense to examine the impact of ad quality on experience and, by extension, on monetization.
According to our research, more than three-quarters of publishers (76%) say user experience on their sites has been impacted by ad quality challenges. What’s more, 66% say poor ad quality has affected their revenue. It’s clear that user experience and ad experiences are one and the same in the eyes of the audience. So why should digital publishers see it any differently?
Maintaining user trust in a post-trust era
Few publishers are spared the burden of poor ad quality, which hits them where it hurts most: audience trust. Digital advertising is often characterized by clickbait, as well as deceptive and offensive advertising. These days, the audience sits in the most vulnerable position in advertising as digital scams and fraud run rampant. Yet audiences are also the most powerful element in the digital ad ecosystem, determining the success or failure of a publisher’s business.
Along with sensitive ad categories, publishers should be wary of ads touting misleading product offers, financial scams, and brand infringement. It may seem that programmatic monetization automatically translates into poor ad quality, but publishers should understand that there is a better alternative. Quality advertising can reinforce a positive user experience, benefit a publisher’s relationship with the audience, and serve as a differentiator within a crowded market.
The first step is to acknowledge the need to shift your approach to one that puts the user first.
What it means to be a user-first publisher
Implementing a user-first approach is vital to ensuring a positive experience and safeguarding programmatic as a strategic revenue channel. But not every publisher who says they take a user-first approach actually does.
In 2022, publishers that want to differentiate themselves must offer a user experience that surpasses the standard—and user expectations—because audiences see bad ads as more than a red flag. Beyond being simply annoying, bad ads send a message that a publisher doesn’t value its users. In fact, nearly half of publishers claim that poor ad quality impacts the lifetime value of visitors to their sites. Our researchers found fewer than 10% of users who experience bad ads even let publishers know. The vast majority churn silently. They leave the site, end their subscription, and opt for the competition, leaving publishers wondering what went wrong.
To uphold truth in advertising, bad ads cannot appear on the page in the first place. This is the only way to exceed audience expectations, ensure user protection, and maintain user trust. Our research revealed that fewer than a quarter of publishers who serve programmatic ads claim total transparency into the ads that come through those channels. Unless programmatic platforms can rein in bad ads, scams will continue to plague publishers and their audiences.
Making technology work for you and your audience
Publishers are the obvious choice to determine brand suitability. They set the standard for ads that meet their site’s content and audience expectations. After all, publishers know their audiences best. To ensure a high-quality user experience, technology must be used to provide complete transparency and control for publishers.
We found that a staggering 80% of publishers say brand suitability enhances audience trust and confidence. Real-time technology should give publishers maximum granularity, enabling internal teams to define the standards for what is and isn’t allowed on their sites.
Publishers require dynamic, real-time tools that proactively enforce brand suitability standards across the programmatic channels most susceptible to bad ads. It is important to remember, though, that while technology can maintain high standards for brand suitability, the technology itself must be reliable.
Thinking forward (not one step back)
A user protection approach can’t simply involve putting out fires. A positive and trustworthy user experience is at the heart of audience loyalty—and it requires a proactive approach.
Trust in advertising requires truth in advertising. To put users first, publishers cannot compromise on user protection. Publishers must synchronize their user protection strategy with their business objectives by using a technology-based ad quality strategy.
Online news organizations have had a rough go of swaying audience behavior. First you had to get them to click. Then you had to convince them to stay. Now you have to keep them coming back. And with any luck, they’ll tell you who they are while they hang out. But the question that needs to be answered to get them past any of these thresholds is: Why should they?
The biggest barrier in converting anonymous users to known community members isn’t a reluctance to hand over data. Despite waning trust in Facebook following the Cambridge Analytica scandal of 2018, the platform still counted 1.93 billion daily active users in the third quarter of 2021. Elsewhere, a study conducted by data privacy company Entrust shows that nearly two-thirds (64%) of consumers are quite willing to let companies know who they are if they feel they’re going to get “relevant, personalized, and convenient services” out of it.
Experience matters
There’s the rub: relevance, personalization, and convenience. Relevance can cut both ways. It can be about the content you produce, but it’s also about the interactive environment you create and how essential it is for your user base to get the most out of your platform. Personalization can be a provision for registration. (If you sign up, we’ll make sure you only see the things that interest you.) Convenience is about having fewer hoops to jump through to get to the things you’re interested in. All three are experiential conditions.
So, what does it look like to provide an experience that leads to registrations? First, tackle convenience with a simple sign-up process. Then, start hosting engaging, community-led features on your owned platform that make sense not just for the content you produce, but for the types of interactions people want to have around that content.
A perfect example is The Independent, the national U.K. publication that went fully digital in 2016. It implemented different engagement solutions, including live blogs for breaking news, live AMA series on current events, and automated comment moderation. One popular AMA series is hosted by travel correspondent Simon Calder, who answers questions about ever-changing travel restrictions due to Covid-19. Part of a recent strategy was to make the UI of their engagement solutions simpler for users.
To participate in any discussion on The Independent’s platform, users have to sign up, and sign up they have. Over 12 months, The Independent reported a 100% increase in registrations, with over 2,000 monthly registrations driven by the comments section alone. The newspaper attributes 1 million article views to AMA content and reports that users spend 15 times more time on the site after they register.
Gametime
Meanwhile, the Philadelphia Inquirer ushered in Gameday Central in September 2021. Its goal is to create excitement around Philadelphia Eagles games. From the Gameday Central hub on the Inquirer’s website, viewers can watch a live pre-game video that’s also streamed on social media. During a game, users can interact on the Inquirer’s platform with a live blog, polls, comments and pinned comments. Managing editor of sports Michael Huang recently told Digiday that Gameday Central has attracted thousands of users.
It’s not surprising that a major sporting event would attract this much attention. But what’s interesting is that the Inquirer combined several experiences at once to keep users hooked.
Engaging solutions
Our own research shows that it only takes a few engagement solutions to make your users spend more time on your website, and when they do that, they’re 25.4 times more likely to register. Features like sharebars, email notifications, trending carousels, and personalized newsfeeds generate more page views and considerably increase dwell times. In fact, a user that clicks on a personalized newsfeed profile spends on average 42 minutes more per month on a news website than an anonymous user.
It also can’t be overstated just how significant a well-moderated comments section can be, because when it’s safe and generates civil discussion, even users who don’t necessarily contribute will spend time reading the comments. When we pulled the data from five of our media clients, we found that registered and unregistered users spent an average of over 1.6 million minutes in the comments.
So, you can try to convince your audience verbally that registering is a good idea, or you can incentivize them with a rich experience they won’t want to miss out on. A live blog during a major sporting event, an AMA about a hugely unpopular policy: these are things people are talking about on social media already. What if they were talking about it on your troll-free platform instead?
People working in CTV advertising often toss around phrases like “the living room of the future” and “the connected consumer.” TV technology today is pretty darn impressive, and it’s easy to be optimistic. With content like choose-your-own-adventure programming and sophisticated hardware, today’s TV tech puts the “home theaters” of even a decade ago to shame.
However, there’s also a central dilemma: The tension between wanting to innovate and wanting to play it safe in order to avoid annoying or alienating consumers. How high-tech is too high-tech for certain audiences? There are a number of emerging technologies designed to enhance the viewing experience that advertisers could leverage. However, just because they can doesn’t mean they should.
For brands, designing a campaign for unproven territory can mean designing a campaign that doesn’t work because it fails to resonate with consumers. On the other hand, if you pass up the chance to be first-to-market, someone else will get there instead. The trick is to find the just-right spot in between.
Luckily, there are some lessons to be found in some of the past few decades of innovation in the living room. As with ad overload, history can teach us a few things about marketing on CTV.
Watch for consumer adoption
First, track consumer adoption. Just because a technology exists doesn’t mean that consumers are using it. That might be because it’s too expensive, too difficult to use, or just not useful. Consumer uptake is what really matters, especially with an emerging medium. And the reality is that plenty of TV technologies promoted by mainstream consumer electronics retailers encounter lukewarm reception from buyers.
Remember curved TVs, which were hyped up back in 2013? It turns out that you can’t replicate an IMAX experience in your living room, no matter how high your ceilings are. Meanwhile, take a look at voice control, something that may have seemed like a science fiction technology not long ago. You can thank the marketing behind Amazon’s Alexa and Apple’s Siri for convincing consumers that voice control is both useful and user-friendly. As a result, we’ve found that voice-controlled ads really do engage users.
Consumers crave convenience
Second, never forget that consumers crave convenience. Integrating new tech into your advertising often isn’t worth it if it makes the brand experience more complicated for the consumer. This is particularly important when translating digital campaigns from desktop and mobile to the lean-back CTV environment.
There’s perhaps no better cautionary tale here than 3D TV. Watching some entertainment in 3D – sports, for example – seemed thrilling at first. But needing to wear specialized glasses for the experience, in addition to the added cost of 3D TVs, posed a barrier to entry that consumers weren’t willing to surmount. Then ESPN backtracked on a plan to put live sports into 3D, which eliminated the one incentive many consumers had to adopt the new tech – a more immersive sports experience.
There’s an exception to this, though. You can offset added complexity with a value exchange. If you’re going to make an ad experience more complicated, be upfront about what consumers will get in return. That might be fewer commercials for the rest of the episode they’re watching. It might be a discount they can capture with their mobile phones straight off the TV screen (or both).
Prepare to adapt to change
And finally, be ready to adapt to change. Things change fast – hardware, software and consumer habits. Five-year plans for a new medium may need to be altered in a matter of weeks.
In addition, keep an eye on market signals even if you consider them to be externalities. Some media executives and technologists were singularly focused in the mid-2000s on the HD-DVD vs. Blu-ray race to supplant the standard DVD. However, that might have distracted them from the fact that before either could “win,” streaming media became the choice of consumers worldwide. Millions of dollars were invested in technologies that were obsolete before they even hit the market.
“We’ve screwed up in the past, but we won’t do it again” is a mindset that has pervaded the ad industry for decades. Typically (spoiler alert) we screw up again. But even as we look to the future, keeping an eye on the past can help us avoid a few missteps.
Ever opened an article and started reading, only to lose your spot because you were bumped down the page? This annoying occurrence is the result of layout shift, one of many performance issues that can crop up without consistent care and attention.
Improving the experience of our users is always a top priority at The Washington Post. With Google’s focus on user-centric performance metrics via new page experience signals called Core Web Vitals (CWV), we can better monitor and iterate on those experiences.
CWV signals are currently being rolled out globally. They consist of three new performance metrics: Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS) and First Input Delay (FID). We set a target to reach the highest threshold across all three metrics by August 2021, as Google’s rollout of the new signals is set to be completed by the end of the month.
For publications like The Post, which need to balance rich advertising, rich content and page speed, CWV poses an interesting challenge: How do we ensure a performant experience on all pages across our domain in the short term? And, looking ahead, how can we bake good performance and attention to UX into our long-term processes and culture?
In August 2020, aggregate CWV scores show room for improvement (Note: This data is for our entire domain on mobile devices only.)
Last summer, the engineering team at The Washington Post began exploring just that. Since then, we’ve seen significant lifts across all three scores.
In August 2021, aggregate CWV scores show significant lifts after performance investments.
To accomplish this positive shift, we put our users at the forefront of our engineering and design practices. Here are some key takeaways.
Understand the baseline
We began by determining a baseline for our CWV scores across our site using several different monitors to understand the complete picture. Luckily, there are a lot of awesome tools to help.
Step 1: Identify Patterns with GSC
The Google Search Console dashboard for CWV provided a good first pulse. It allowed us to drill down into our less-performant URLs and begin to identify patterns in templates or features.
Step 2: Get more granular segments with real-user data
Next, we collaborated with our Analytics team to set up dashboards that would show us an approximate distribution of scores for real users. This gives us more granular insight to triage issues on lower connection speeds, varying devices, etc.
Step 3: Automate and monitor release impact
We highly value continuous deployment and set up synthetic monitoring using Calibre against our common page templates, with automated alerts going to Slack when we fell below our threshold. This allows us to gauge the impact of each release so we can #alwaysbeshipping.
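As a rough sketch of the kind of budget check such a monitor performs (the thresholds below mirror Google’s published “good” CWV boundaries; the function name and the Slack wiring are illustrative, not The Post’s actual Calibre configuration):

```javascript
// Illustrative CWV budget check in the spirit of an automated monitor.
// A real pipeline would run this against synthetic measurements after
// each deploy and alert (e.g. via a Slack webhook) on any regression.

const BUDGETS = {
  lcp: 2500, // Largest Contentful Paint, ms ("good" <= 2.5s)
  cls: 0.1,  // Cumulative Layout Shift, unitless ("good" <= 0.1)
  fid: 100,  // First Input Delay, ms ("good" <= 100ms)
};

// Returns the names of the metrics that exceed their budget.
function findRegressions(measured, budgets = BUDGETS) {
  return Object.keys(budgets).filter(
    (metric) => measured[metric] > budgets[metric]
  );
}

// A non-empty result would trigger an alert to the team channel.
console.log(findRegressions({ lcp: 3100, cls: 0.05, fid: 80 }));
```

The useful property is that the check is a pure function of the measured scores, so the same logic can run in CI, in a scheduled monitor, or ad hoc against a one-off audit.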
Make it a priority and assemble the team
Communication and a shared commitment to deliver high-quality user experiences are the cornerstones of our success. From engineering to design to advertising and our newsroom, web vitals became a common language and a common goal through a series of communication and alignment methods:
Recurring (and actionable) checkpoints across departments and teams to check in on progress, flag concerns and share tips
Slack channel for sharing web vitals wins, resources and questions
Company-wide presentations on web vitals in town halls
Getting company buy-in for performance work is possible by illustrating to stakeholders how CWV will benefit customers, increase the bottom line and speed up engineering productivity to ship more features, faster! It’s a win-win-win.
Fix the things!
We took several strategic steps to improve CWV scores at The Post.
Optimize Rendering of Main Images for LCP
We reworked the way we rendered the main image in our standard article layout, and this change produced the biggest improvement we saw in LCP.
How it worked:
Preloaded the img element’s srcset
Used an LQIP (low-quality image placeholder), a blurred SVG set as the background-image of the image element, until the high-res request loaded
Assigned the attribute decoding="async" to the img element
These changes resulted in an improvement to our LCP score that was immediately noticeable.
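As an illustrative sketch of how those three techniques can fit together (the file names, sizes and inline style are hypothetical, not The Post’s actual markup):

```html
<!-- Preload the hero image's srcset so the browser fetches it at high priority -->
<link rel="preload" as="image"
      href="hero-800.jpg"
      imagesrcset="hero-400.jpg 400w, hero-800.jpg 800w"
      imagesizes="100vw">

<!-- LQIP: a tiny blurred SVG shows instantly as the background-image;
     the high-res image paints over it once its request completes -->
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w"
     sizes="100vw"
     width="800" height="450"
     decoding="async"
     style="background-image: url('hero-lqip.svg'); background-size: cover;"
     alt="Article hero image">
```

The explicit width and height also let the browser reserve the image’s space before it loads, which helps CLS as a side benefit.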
Reserved Space for Components to Minimize CLS
Anywhere we used lazy loading, we used a placeholder element with a min-height — and potentially min-width — to reserve the space so the element would not jump in and push content elsewhere on the page. We invested in skeleton states to improve the user experience for our readers. This is one of those techniques that became a pattern for us in our quest to improve our CLS scores.
One example of this execution is with our advertisement placeholder. Previously, ads would jump into the slot and push content down the page, which could be frustrating for users, particularly on mobile devices or on smaller displays.
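A minimal sketch of the placeholder pattern (the class name and dimensions are illustrative; a real slot would reserve the expected creative size):

```html
<!-- The placeholder reserves the ad slot's footprint up front, so the
     creative fills existing space instead of pushing content down -->
<div class="ad-slot" style="min-height: 250px; min-width: 300px;">
  <!-- skeleton state rendered here until the ad (or lazy component) loads -->
</div>
```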
Improving our CLS score required close collaboration with the advertising team. To evaluate impact, we developed and ran extensive A/B tests that served creative with predictable sizes into the slot for a split segment of users. By analyzing the CWV impact alongside the broader business impact, we were able to make data-driven decisions. In most cases — and to our surprise — serving predictable creative sizes increased revenue, and we were able to commit several units to a set size and eliminate CLS.
By opting to serve larger ad sizes (300×600 units only, instead of a combination of multiple sizes that caused a layout jump), we saw higher CPMs. We also eliminated several jumpy ad formats, which were traditionally poor performers with our users. We replaced those formats with units that loaded fully in view, resulting in higher ad viewability metrics and, ultimately, higher engagement and improved ad performance.
As the page loads, min-height is reserved with a skeleton state for inline ads:
Because of this, the ad can be fully loaded in without impacting CLS:
Code-splitting for Faster FID
While we have made continuous improvements to our FID, we found opportunities to further enhance the score by utilizing code-splitting. We were able to ensure that, on any given page, unnecessary code was excluded from our JavaScript or CSS bundles, making our files lighter and our pages faster.
FID is described as a phenomenon that “happens because the browser’s main thread is busy doing something else, so it can’t (yet) respond to the user. One common reason this might happen is the browser is busy parsing and executing a large JavaScript file loaded by your app.”
By leveraging Next.js dynamic imports, we can conditionally load the JavaScript for components that are needed by each page. We took a similar approach to code-splitting CSS with updates to our webpack build.
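As a framework-free sketch of the underlying idea (the `lazy` helper and the comments module are hypothetical stand-ins for what `next/dynamic` arranges for React components):

```javascript
// Sketch of the lazy loading behind code-splitting: defer loading a
// module until first use, then cache it so later calls reuse the result.
function lazy(loader) {
  let cached = null;
  return function load() {
    if (!cached) cached = loader(); // trigger the import only once
    return cached;                  // later calls reuse the same promise
  };
}

// Stand-in for `() => import('./comments.js')`; a real loader would
// return a module namespace object fetched as its own chunk.
let loads = 0;
const getComments = lazy(async () => {
  loads += 1;
  return { renderComments: (id) => `comments for article ${id}` };
});
```

With `next/dynamic`, the same idea is expressed as `dynamic(() => import('../components/Comments'))`: the bundler emits the component as a separate chunk, and only pages that actually render it download, parse and execute that JavaScript.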
Maintain, maintain, maintain
As with many things, improving CWV scores is not something you do once and then move on. We’re working at The Post to integrate CWV into our ongoing product, design and development processes, in addition to regular review of changes in our scores. It’s an ongoing effort.
We were able to introduce CWV as a new set of standards in cross-team collaboration — and dedicate resources to maintaining these standards. This will serve as a point of reference for other cross-team initiatives in the future. Now, we’re looking at structuring other projects the same way, for example, on web accessibility.
This article originally appeared on The Washington Post Engineering blog and is republished with permission.
Want to share CWV tips, interested in working with us, or just feel like saying hi? 👋
This project wouldn’t have been possible without the hard work of this outstanding team: Holden Foreman, Robyn Bortz, Anna Scalamogna, Taylor Scott, Christopher Kankel, Erika Johnson, Hannah Mahon, Amanda Bozzi, Arturo Silva, Dan Weaver, Joey Weed, Sarah House, Amanda Hicks, Ted Cook, Thomas Chan, Tommy Lisiak, Eric Lin, Hope Tambala, Ryan Luu, Leo Dominguez, Nathan Walker, Gregory Auld, Ryan Coughlin, Matt Callahan, Dave Merrell, Jeff Turner, Julie Bacon and Kat Styons.
When it comes to the future, increasing user registrations and building robust first-party datasets should be at the forefront of every publisher’s strategy. First-party data unlocks a publisher’s ability to increase user loyalty, activate their audience, monetize their content, and improve their editorial strategy.
However, according to a recent survey from Teads, 65.3% of publishers say they are not planning to increase their use of logins to gather first-party data. Why is that? According to the report, they’re worried about a “potential disruption of user experience.”
It’s a valid concern: 67% of customers say poor user experiences are a reason for churn.
It doesn’t have to be this way. Many leading publishers are leveling up user experiences on their websites by creating social experiences and using personalization to create a seamless registration and login process. These publishers stand to attract and grow an audience of loyal readers. Let’s take a look at how they’re doing it.
Three tactics to engage loyal audiences
1. Give users a reason to register with immersive social experiences
Stand-out content will always drive readers to a publisher’s website. Increasingly, though, readers are looking for other ways to interact and engage once they’re done reading an article.
Today, the publishers who are creating social experiences on their websites see more registrations and greater retention rates. At OpenWeb, we’ve found that publishers who create these experiences have users who spend 35% more time on-site. Those users are 450% more likely to return month after month and drive 4x the revenue of the average user.
There are a number of ways that publishers can create social experiences. Many are adopting engagement tools like one-click polls where readers can weigh in on the article they just read. Others are producing interactive ask-me-anything forums (AMAs) with authors or journalists. Another effective social experience is hosting a comments section. When publishers invite readers into quality conversations where they feel valued, readers are more likely to register and return again and again.
2. Use active personalization to provide value and keep users coming back
Users not only crave engaging social experiences, they’re loyal to publishers who “get” them.
That’s why more publishers are creating personalized content experiences to drive registrations and loyalty. In fact, 88% of marketers believe that audiences now expect personalization. One of the most effective ways that publishers can provide value to their users is through active personalization, or personalization that allows users to self-select their favorite authors, topics, and more.
Readers who receive targeted content tend to stay on-site longer, return more often, and view more content. They also provide publishers with more information about their content preferences, allowing publishers to take advantage of rich first-party data insights. One of the great things about personalized experiences is that, with the right tools, just about any publisher can create them—even those without a comments section.
3. Optimize your first-party data strategy
First-party datasets will play a pivotal role in every publisher’s future. In fact, they’re absolutely critical for any sustainable strategy, particularly in light of cookie deprecation. Through exceptional user experiences, publishers will drive more registrations. This will, in turn, allow them to collect more robust and accurate first-party data, which ultimately enables them to provide additional value to their audiences. In other words, all roads lead to a loyal, active community.
Time to interact
So if you’re still contemplating how to navigate the cookieless future, start by evaluating your user experience and ask these questions:
What opportunities are there to create social experiences and build a thriving, active community?
How can we leverage personalization to provide targeted content experiences for readers?
With this approach, publishers stand to see more registered, active users and higher retention rates, like Salon has. Thanks to immersive social experiences on their website (including a comments section), they built a loyal community of active users—and boosted their retention rate 4X.
The bottom line? Give users exceptional experiences and see exceptional results.
About the author
Joel Bejar is the Vice President of Business Development at OpenWeb.
When talking about conversion strategies within the digital media industry, we often focus our efforts on the same tried-and-tested methods: email marketing, social media campaigns, SEO, PPC, etc. These are all important and effective ways to drive readers along the path to registration. They make up the foundation for a complete digital strategy, including subscriptions.
But there’s one aspect of the online content offering that is too-often overlooked, despite being pivotal to a publisher’s bottom line: the on-site user experience.
By UX we don’t simply mean a seamless UI and attractive website design (although these are vital, too). We’re talking about data-driven, deeply personalized, unique experiences that make anonymous visitors far more likely to register, and repeat customers far less likely to churn.
This improvement can take many forms, such as differentiated trigger points for your most engaged readers, messaging dependent on on-site behaviors, and personalized offers. Any move to more closely align value with offering – in other words, every step taken into deeper personalization – makes your relationship with customers stronger and more likely to last.
Ultimately, optimizing the user experience is a commercial gamechanger. It significantly affects two major KPIs: lifetime value (LTV) and customer acquisition cost (CAC).
Optimize the user experience to drastically lower CAC
With more competition than ever and customer expectations at an all-time high, customer acquisition is tough. This is particularly true in the overcrowded market for digital content.
However, experience optimization makes it much easier for publishers to target and convert their audiences. When it comes to messaging, 72% of consumers won’t respond to marketing efforts unless they are personalized, according to SmarterHQ. And while personalized messaging alone may not lead directly to a sale (although, it very much could), it may be just what your business needs to stop visitors falling at the first hurdle. If you can’t get your readers to engage, there’s no chance they are going to convert.
But this is just the first step in a first-party data strategy. Progressively profiling readers by collecting new information at different stages of the subscriber lifecycle allows you to accurately target readers with registration walls at the right time, using user data such as on-site activity, device type and time of day. The incremental build up of your datasets gives your commercial decision makers a better understanding of the customer base. This, in turn, informs their ability to tweak messaging and experiences to nurture leads.
Further down the funnel, unique user experiences will also help get new customers over the line. The first-party data collected on registration can then inform tailored content, offers and timing. This accelerates conversion rates and keeps bounce rates low.
It takes a lot of effort (time and money) to get users from awareness to intent. So, turning them off with an ill-timed paywall or irrelevant content offer leads to an automatic loss. On the other hand, Marketing Insider Group found that tailored content offerings increase purchasing intent for 78% of Americans. This makes triggers such as a datawall or paywall a make-or-break point for digital publishers.
Focus on personalization to increase LTV
An optimized experience increases your chances of converting a user. But focusing on the customer experience immediately after purchase also sets the foundation for higher-value, longer-term subscribers.
Maintain personalized content to keep engaged readers happy; this paves the way for an upsell opportunity with your most loyal advocates. It is famously 10x easier to sell to a current customer than to a new one, yet this is often forgotten in the world of media. Increasing subscription value or lengthening subscriber lifetime directly boosts revenue and profitability. Doing both has a double-whammy effect on your commercial goals.
And personalization isn’t just for those at the top of the funnel. Use customer engagement and behaviors to identify churn risks — the subset of your customer base that seem disengaged and are therefore disproportionately likely to leave. Give them a tailored offer, potentially even a downsell to a lower plan, or nudge them with relevant content to maintain each and every relationship.
Conversion is nothing without retention. Personalized experiences at the beginning of the customer journey (such as getting the right users to convert with relevant messaging) help keep churn at a minimum just as much as last-ditch attempts.
The bottom line
Experience optimization is the best conversion strategy publishers aren’t using, but they should be, because it has a direct impact on commercial outcomes. Relevant, unique experiences drive purchases just as surely as irrelevant or ill-timed messaging drives bad impressions.
And at the end of the day, optimizing the user experiences at each point of the subscriber lifecycle is an improvement that every publisher can make without needing to build a new team, or create a new marketing channel from scratch. Getting it right can turbocharge revenues. Getting it wrong can ruin all your other commercial efforts.
For most, it’s a simple yet powerful step that you just can’t afford not to take in today’s media landscape.
Over the last decade, digital publishing has seen tremendous development and technological advances. However, one thing remains constant: content is king. A rapid, ever-shifting market has forced many publishers to change their marketing and publishing models. This has impacted both readership and retention, and has forced technical innovation to capture minds tuned for instant gratification.
Stagnating platforms and user experiences can be the difference between a reader looking for more content and returning, or never coming back. Fast, immersive, responsive, and non-intrusive experiences drive adoption and minimize churn. And they are best executed closest to the reader – at the network edge where the ever-important response time can be minimized.
The edge explained in 30 seconds
As mentioned above, the edge means bringing your articles as close to your reader as possible. Rather than deploying your applications solely on your own servers, you distribute them across the U.S. or even internationally. Using an edge cloud caching product (such as ours at Fastly) means caching copies of your content not at the origin, but close to your readers. This provides scale and speed that would otherwise be extremely difficult and costly to achieve using your origin servers alone.
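To make this concrete, here is a minimal sketch of how an origin can instruct an edge cache. The header names follow common CDN conventions; the specific TTL values and the `Surrogate-Key` tag are illustrative assumptions, not a particular vendor’s required configuration:

```python
def cache_headers(content_type, edge_ttl=300):
    """Build response headers so the edge, not the origin, serves repeat
    requests. 's-maxage' applies to shared (edge) caches, while the
    shorter 'max-age' limits how long browsers keep a copy."""
    return {
        "Content-Type": content_type,
        "Cache-Control": f"public, max-age=60, s-maxage={edge_ttl}",
        # Tag cached objects so a whole group can be invalidated at once
        # (the key names here are hypothetical examples).
        "Surrogate-Key": "articles front-page",
    }

headers = cache_headers("text/html", edge_ttl=600)
```

With headers like these, the origin renders a page once, and edge nodes around the world serve it from cache for the next ten minutes.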
As an additional benefit, this approach opens the door to serverless computing, which brings your application closer to the customer and frees it from the conventional costs and constraints of server space and infrastructure maintenance.
Power of the edge
If you’re still wondering how the edge can change your publishing platform, let’s talk specifics:
1. Faster response times
How quickly does your website respond to new users across the country or on the other side of the world? An optimized platform at the edge stores copies of your web pages on servers globally, reducing the time to first load and every subsequent content load after. This increases user satisfaction, and it can also improve SEO rankings, pushing your news toward the top of search results.
2. Publish – and update – instantly
Ever since the advent of caching, clearing it has been a problem. Stale or outdated content goes against everything a digital publisher stands for. With an optimized edge publishing platform, copies of your content are instantly stored around the globe, and when needed, you and your readers benefit from the ability to purge or update them in an instant.
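In practice, an instant purge is usually a single API call that invalidates every cached copy sharing a tag. The sketch below only builds the request description; the endpoint path, tag name, and auth header are illustrative assumptions rather than a specific vendor’s API:

```python
def build_purge_request(api_base, surrogate_key, token):
    """Describe the one HTTP call that invalidates every cached copy
    tagged with `surrogate_key` across all edge locations, so a
    corrected or updated article replaces stale copies immediately."""
    return {
        "method": "POST",
        "url": f"{api_base}/purge/{surrogate_key}",
        "headers": {"Authorization": f"Bearer {token}"},
    }

# e.g. purge everything tagged "front-page" after a breaking-news update
request = build_purge_request("https://api.example-edge.com", "front-page", "t0k3n")
```

The key design point is tagging at cache time: because objects were stored with a shared key, one request clears thousands of cached pages at once.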
3. Effective and rapid A/B testing
A/B testing is instrumental to rolling out new (web) features. While there are many ways to do this, the easiest and most effective is to execute rapid A/B tests right at the edge. Using a programmable caching layer, developers can easily route a small percentage of traffic to a test site, or add a test header to expose a fraction of readers to a new feature or headline before going live.
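Routing “a small percentage of traffic” at the edge typically comes down to a deterministic bucketing function like the one sketched below. It is a generic technique, not a specific platform’s API; the visitor id would come from a cookie or similar stable identifier:

```python
import hashlib

def ab_bucket(visitor_id, test_fraction=0.05):
    """Deterministically assign a visitor to 'test' or 'control'.
    Hashing the visitor id keeps the assignment stable across requests
    and edge nodes without any server-side state."""
    digest = hashlib.sha256(visitor_id.encode()).digest()
    # Map the first 4 bytes of the hash to a score in [0, 1).
    score = int.from_bytes(digest[:4], "big") / 2**32
    return "test" if score < test_fraction else "control"
```

Because the hash is stable, a reader who lands in the test group sees the new headline on every visit, which keeps the experiment’s measurements clean.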
4. Unobtrusive paywall
We live in a world of paywalls, but is yours slowing your visitors down? Is it dragging out load times, degrading the overall user experience, and therefore discouraging returning readers? By migrating your paywall to the edge, you can significantly speed up reader validation, confirming variables such as identity, location, and subscription status.
5. Ad blocker detection done right
Ads continue to be the primary source of income for many digital publishers, and the proliferation of ad blockers is a serious challenge to the industry. Do you have appropriate ad blocker protection in place? A powerful publishing platform at the edge can quickly detect ad blockers and deny traffic to readers using them.
6. Content targeting for increased engagement
One of the benefits of a digital workflow is the immense flexibility it gives you as a content delivery service. With the amount of programmatic power available at the edge today, why deliver the same content to everyone? By pairing powerful customer insights with a strong edge preflight system (to figure out who a user is when they request your website), you can tailor content to the reader and deliver it at high speed.
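At its simplest, edge content targeting is a small decision function over the signals a preflight lookup returns. The segment names and variant ids below are illustrative assumptions, meant only to show the shape of the routing decision:

```python
def choose_variant(geo, segments):
    """Pick a front-page variant from signals available at the edge:
    geo from the request itself, audience segments from a preflight
    lookup against first-party data (all names here are hypothetical)."""
    if "sports-fan" in segments:
        return "sports-lede"
    if geo == "GB":
        return "uk-front-page"
    return "default-front-page"

# A returning sports reader in the U.S. gets the sports-led variant.
variant = choose_variant("US", {"sports-fan"})
```

Because each variant is itself cacheable, personalization at this granularity does not sacrifice the speed benefits of the edge.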
7. Reap the benefit of targeted ads
It can seem like nobody wants ads. However, readers often pick “irrelevance” as the reason they ignore and dislike ads within content or on news outlets. As with content targeting above, ad revenue and engagement also benefit greatly from personalization. Making ads intelligent and localized at the edge brings benefits to publisher and reader alike.
8. Image optimization on-the-fly
A modern edge platform should be able to enhance your publishing workflow by letting you customize and optimize images. This may mean cropping, fine-tuning resolution, or other transformations, done close to your reader and on the fly as requests come in, for ultimate performance and cost savings.
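On-the-fly transforms are usually driven by URL parameters that the edge interprets when the request arrives. The parameter names below (`width`, `format`) follow a common convention but are illustrative, not a specific optimizer’s API:

```python
def image_url(base, width, fmt="webp"):
    """Append transform parameters an edge image optimizer could read
    to resize and re-encode a source image on the fly."""
    return f"{base}?width={width}&format={fmt}"

def responsive_srcset(base, widths=(320, 640, 1280)):
    """Build an HTML srcset so each browser fetches only the size it
    needs; the edge renders each variant once, then caches it."""
    return ", ".join(f"{image_url(base, w)} {w}w" for w in widths)
```

A single high-resolution master image in your CMS can then serve every device, with the edge doing the resizing and modern-format encoding.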
9. Deliver right-sized content with device detection
With the sheer number of devices in existence today, building responsive websites to serve your digital content is no easy task. A powerful edge platform, however, can automatically detect devices and steer your users to the correct application (mobile or web).
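The routing decision itself can be as simple as a coarse User-Agent check at the edge. Real edge platforms ship far richer device databases; this sketch, with hypothetical backend names, only illustrates the shape of the steering logic:

```python
MOBILE_MARKERS = ("Mobile", "Android", "iPhone", "iPad")

def backend_for(user_agent):
    """Route a request to the mobile or desktop application based on a
    coarse User-Agent check (backend names are illustrative)."""
    if any(marker in user_agent for marker in MOBILE_MARKERS):
        return "mobile-app"
    return "web-app"
```

Because the decision happens at the edge, the wrong-sized page is never rendered, shipped, and then discarded by a client-side redirect.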
Success on the edge
An industry-leading edge cloud platform should let you securely edit and publish at the edge of your network, right where your readers are, and do so at speed and scale. In light of the challenges digital publishing is facing, the key to customer retention and satisfaction is serving the most up-to-date, personalized content instantly. You must also ensure that online experiences are fast, safe, and secure.
Overall, technology continues to shift away from centralized infrastructure, and we hope to have demonstrated the significant power available at the network edge. Whether you have an established content platform or are planning to start from the ground up, take advantage of every bit of speed and power you can get to bring your publication into the next phase of digital delivery, and to delight and retain readers.
Around Halloween 2019, I was catching up on politics on a major news publisher’s site when an image in a 970x90 banner ad stopped me dead in my scrolling. I flashed back to anatomically accurate depictions of female genitalia shown in junior high sex education class. Except this genitalia appeared to be sitting on top of a person’s neck.
This mask was graphic. But I could not look away or resist the urge to click. Yes, I took the risk that the same ad might follow me across the web for weeks. (Beware: the retargeting curse…)
“Full-head clam shell mask” blared the landing page title. A side angle revealed a harmless seashell that someone was wearing atop their head. Front and center, however? Female anatomy, and definitely an intentional resemblance.
At the risk of being called a prude, I can’t imagine most people would enjoy being subjected to a “clam shell mask” ad while perusing the news. And I know publisher revenue folks would be horrified if this image appeared in front of their audiences. I can just imagine the screams from the editorial team.
Out of publisher control
Brand safety conversations too often focus on advertiser concerns, while it’s publishers who are trying to shield consumers from offensive or harmful creative. As this ad proved, even a premium publisher was serving me a tasteless ad. Yet my curiosity was piqued: how did this explicit mask end up in front of me? If that was targeted advertising, what did I do online to make an advertiser think I’d want to purchase that?
The ad had been served by a well-known retargeting firm, though I’d never visited the site in question. Investigation revealed the same site also sold a variety of guitar equipment, something for which I’m very much the target market. So I was most likely part of a lookalike pool. However, the advertiser or its DSP had been careless in selecting relevant creative.
Which brings up the conundrum—did the publisher have any control over the creative in that situation? Thanks to real-time bidding, no one on the revenue team had any clue that specific creative was going to give me nightmares.
Sure, advertisers can use the IAB Content Taxonomy to pre-declare ad content. But if you’re serving a huge amount of creative dynamically—like a clam shell head mask—are you really going to spell out what might offend the end user?
Publishers and supply-side platforms are forced to place a lot of faith in their upstream partners (DSPs and buying platforms), assuming they are looking out for their best interests and complying with their acceptable creative policies. All too often, that faith is unwarranted, and unwanted creative sails through the programmatic pipes. Publisher reputation and user experience are damaged, but the ultimate victim is the end user.
Brand safety goes both ways
I thought about the clam shell mask when I was reading the new brand safety report from the Global Alliance for Responsible Media (GARM), a World Federation of Advertisers initiative. Analyzing content across seven major social media platforms including Facebook, Instagram, Twitter, and YouTube, GARM discovered that more than 80% of the 3.3 billion removed posts fell into the categories of Spam, Adult & Explicit Content, and Hate Speech & Acts of Aggression.
That’s certainly alarming, but it’s only one side of the coin. When the ad industry talks about brand safety, it’s accepted that we’re referring to advertiser concerns about offensive web content.
However, publishers are brands too, and keeping publisher brands safe from offensive advertising also protects consumers. This is particularly important when publishers monetize through the open programmatic marketplace, where they have limited control over the kinds of creative that show up on their sites. (E.g., disturbing clam shell masks!)
Categorically bad ads
The Year in Sensitive Creative, a report from The Media Trust, analyzed objectionable ads served between March 2020 and February 2021. It reveals that nearly two-thirds of the more than 1 million creatives singled out were Adult (nudity, sex toys, adult entertainment) or Provocative (sexually suggestive content, profane language) ads. Other high-risk categories include Alcohol and Political (with a big spike in inflammatory ads in October 2020), as well as Weapons, which has been on a disturbing incline since the November 2020 elections.
Bad creative goes beyond serving someone an ad with nudity or inflammatory subject matter. According to the report, nearly half of the ads related to coronavirus over this period were outright scams. These ranged from price-gouging on PPE to false or unverifiable claims about masks, air filters, and sanitizers. That isn’t just a risk to user experience. It’s also a safety concern for consumers being preyed upon by bad actors. On top of that, publishers face legal liability in heavily regulated categories such as Medical or Pharmaceutical.
Overall, scams made up 47% of all coronavirus-related creative, but during summer 2020, that share was closer to two-thirds, putting consumers at high risk.
Publishers have long been the last line of defense in protecting consumers from bad ads—particularly when it comes to malware and malvertising. DSPs and buying partners have shrugged off their responsibility in keeping consumers safe from objectionable ads and outright scams, all while watching revenue roll in.
Embracing responsibility
The good news is that many upstream partners like DSPs are starting to recognize the importance of complying with publisher creative policies. They are also realizing their responsibility in protecting consumers. Ensuring a high-quality ad experience is increasingly seen as good for business for buy-side platforms. So, they’re using AI-powered categorization systems to identify potentially offensive creative at scale.
The number of creative violations detected in the second half of the period analyzed in “The Year in Sensitive Creative” was markedly lower than in the first half. In particular, the volume of Adult creative fell 73% after November 2020, suggesting that advertisers either switched tactics or moved on to less stringent platforms.
But that’s still a problem. Too many unsavory AdTech providers are willing to look the other way on publisher brand safety. Leaving consumers vulnerable to offensive and manipulative ads should be unacceptable, particularly given that the technology to identify sensitive creatives at scale is readily available. It’s time all upstream AdTech companies were held accountable for the creatives they let loose in the programmatic ecosystem, such as disturbing full-head clam shell masks.
I’m afraid I can’t find the screenshot I took of that graphic mask… But trust me, you don’t want to see it.