InContext / An inside look at the business of digital content

Service journalism and the web advertising problem

April 27, 2016 | By Don Marti (@dmarti)

There’s a toenail fungus photo in my morning news.

And it looks like it’s an ad for some questionable toenail-fungus-treating multi-level-marketing scheme.

Yeech. How did that get on there? Pass the ad blocker already.

Forget tracking protection, forget new standards for responsible advertising, forget all that. Gross infected body parts and MLM ads before I have even had my coffee? Burn all this stuff down.

Terrible ads are a big reason why tracking protection seems like an incomplete solution to the problems of web advertising. Web users don’t just block ads because people are good applied behavioral economists, seeking signal and filtering noise. A lot of web ads are just deceptive, annoying, gross, or all three. (Oh, right, some of them carry malware, too.)

Even if we could somehow combine the efficiency and depth of the web medium with the signaling power of print or TV, won’t web ads still be crap? And won’t people still block them?

It doesn’t have to be that way.

Publisher standards
Print ads are less crappy than web ads. Why can’t publishers enforce better standards on the web? How can a newspaper have memorable, well-designed ads in print, while the ads on the web site have users looking for the computer sanitizer?

It’s hard for publishers to enforce standards when an original content site is in direct competition with bottom-feeder and fraud sites that claim to reach the same audience. And that competition is enabled by third-party tracking. As Aram Zucker-Scharff mentions in an interview on the Poynter Institute site, the number of third-party trackers on a site grows as new advertising deals bring new trackers along with them. All those third-party pixels and scripts—and a news site might have 50 to 70 of them—cause slowness and obvious user experience problems. But the deeper problem, data leakage, is harder to pick out. Any of those third parties could be leaking audience data into the dark corners of the Lumascape until it re-emerges, attached to a low-value or fraudulent site that can claim to reach the same audience as the original publisher.

Publishers can try to pin down their third parties with contractual restrictions, but it’s prohibitively expensive for a publisher to figure out what any one tracker is up to. You know that sign at the corner store, “only two high school students in the store at a time”? If the storekeeper lets 50-70 kids in, he can’t see who shoplifted the Snickers bar. The news site is in the same situation on third parties. Because any one publisher has contact with so many intermediaries, only the perpetrators can see where data is leaking.

A security point of view
Information security is hard. When you have to maintain software, you fix a bug when you can see that there’s a bug. You don’t wait until someone starts exploiting it. The earlier you fix it, the less it costs.

News sites work this way for some issues. If you found a bug in your site’s content management system that would allow a remote user to log in as “editor” and change stories, you would fix it. Even if you had no evidence that random people were logging in, it’s not worth taking the chance. Because it’s so hard to catch data leakage in the act, it makes sense to apply the same bug-fixing principle. When there is an emergent bug in the combination of your site and the user’s browser that allows for data leakage, then it is more effective to proactively limit it than to try to follow audience data through multiple third parties.

That doesn’t mean just walking away from all third-party tracking. Henk Kox, Bas Straathof, and Gijsbert Zwart write, in Targeted advertising, platform competition and privacy:

We find that more targeting increases competition and reduces the websites’ profits, but yet in equilibrium websites choose maximum targeting as they cannot credibly commit to low targeting. [emphasis added] A privacy protection policy can be beneficial for both consumers and websites.

. . .

If websites could coordinate on targeting, proposition 1 suggests that they might want to agree to keep targeting to a minimum. However, we next show that individually, websites win by increasing the accuracy of targeting over that of their competitors, so that in the non-cooperative equilibrium, maximal targeting results.

When publishers lack market power, they have to play a game that’s rigged against them.
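The dynamic the paper describes is a classic prisoner's dilemma. Here is a minimal sketch of that game in JavaScript — the payoff numbers are invented for illustration, not taken from the paper:

```javascript
// Two competing sites each choose "low" or "max" targeting.
// payoff[mine][rivals] = my profit given my choice and my rival's choice.
// (Numbers are made up to illustrate the structure, not from the paper.)
const payoff = {
  low: { low: 10, max: 2 },
  max: { low: 12, max: 4 },
};

// My best response to a fixed rival choice: whichever option pays me more.
function bestResponse(rivalChoice) {
  return payoff.low[rivalChoice] > payoff.max[rivalChoice] ? "low" : "max";
}

// Whatever the rival does, "max" pays more individually...
console.log(bestResponse("low"));  // "max" (12 > 10)
console.log(bestResponse("max")); // "max" (4 > 2)

// ...so the non-cooperative equilibrium is (max, max), paying 4 each,
// even though a coordinated (low, low) would pay 10 each.
```

Since "max" is the best response to either rival choice, no site can unilaterally gain by committing to low targeting — which is exactly why the paper argues an outside privacy-protection policy can leave both sites better off.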

Changing the game
So how to turn web advertising from a race to the bottom into a sustainable revenue source, like print or TV ads? How can the web work better for high-reputation brands that depend on costly signaling?

C.H.E.D.D.A.R. is a basic set of technical choices that make web ads work in a signal-carrying way, and restore market power to news sites.

Some of the work has to happen on the user side, but tracking protection for users can start paying off for sites immediately. Every time a user gets protected from third-party tracking, a little bit of competing, problematic ad inventory goes away. For example, if a chain restaurant wants to advertise to people in your town, today they have a choice: support local content, or pay intermediaries who follow local users to low-value sites. When the users get protected from tracking, opportunities to reach them by tracking tend to go away, and market power returns to the local news site.

And users see a benefit when a site has market power, because the site can afford to enforce ad standards. (And pay copy editors, but that’s another story.)

Service journalism
Users are already concerned and confused about web ads. That’s an opportunity. The more that someone learns about how web advertising works, the more that he or she is motivated to get protected. A high-reputation publisher can win by getting users safely protected from tracking, and not caught up in publisher-hostile schemes such as paid whitelisting, ad injection, and fake ad blockers.

Here is a great start, on the New York Times site. Read the whole thing:

Free Tools to Keep Those Creepy Online Ads From Watching You

The next step is to make it more interactive. Use web analytics to pick out a reader who is

  • valuable as an audience member
  • vulnerable to third-party tracking
  • using a browser for which you know a good protection tool

and give that reader a nice “click here to get protected” button that goes to your tool of choice. There is JavaScript to do this.
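The selection logic described above might be sketched like this — every name, threshold, and URL here is hypothetical, and the analytics inputs would be site-specific:

```javascript
// Sketch of the reader-selection logic. All identifiers, thresholds,
// and URLs below are invented for illustration, not a real library.

// Which protection tool to suggest, per browser family for which
// a good tool is known (hypothetical URLs).
const toolFor = {
  firefox: "https://example.org/get-protected-firefox",
  safari: "https://example.org/get-protected-safari",
};

// Rough user-agent sniff; Chrome also sends "Safari/" so exclude it.
function browserFamily(userAgent) {
  if (/Firefox\//.test(userAgent)) return "firefox";
  if (/Safari\//.test(userAgent) && !/Chrome\//.test(userAgent)) return "safari";
  return "other";
}

// Decide whether this reader gets the "click here to get protected" button.
// `reader` would come from your analytics: e.g. visit frequency, and whether
// third-party trackers were observed loading in their browser.
function protectionOffer(reader) {
  const valuable = reader.visitsPerMonth >= 8;          // arbitrary threshold
  const vulnerable = reader.thirdPartyTrackersSeen > 0; // tracking got through
  const tool = toolFor[browserFamily(reader.userAgent)];
  return valuable && vulnerable && tool ? tool : null;  // URL, or no button
}
```

On a real page, something like `protectionOffer` would run at load time and, when it returns a URL, render the button linking the reader to the chosen tool.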

Tracking protection for users means fewer ad impressions available at bottom-feeder and fraud sites, which means more market power for news sites, which means sites gain the ability to enforce standards. Put it all together, and no more toenail fungus ads before breakfast.

Don Marti (@dmarti) is a contributor of code and documentation to the aloodo.org project, a low-friction way for sites and brands to reclaim the value of online advertising from fraud and ad blocking. He serves as a strategic adviser for Mozilla, and is the former editor of Linux Journal. Don is the subject of an out-of-date Wikipedia article which he will not edit himself, but wishes that someone would.
