Trying to compete in the advertising market with Google is a bit like trying to play poker with someone who not only knows all the cards in everyone’s hands but also gets to set the rules, deal the cards, call the bets, and take a significant cut of the winnings.
It’s hard to believe it’s been almost 10 years since we sounded the alarm at DCN. As you likely know, DCN was purpose-built to advance the future for premium publishers who trust us to understand and advocate for high-quality news and entertainment.
Back in 2014, as the digital advertising industry was booming – often growing by 15-20% per year – we noticed something disturbing. Almost all of the incremental revenue growth was being captured by just two companies: Google and Facebook. And it wasn’t because they were simply competing better in the advertising business – it was due to unprecedented access to data, what many now rightly call surveillance advertising.
And that brings us to why we’re here today: the adtech antitrust case, United States vs. Google. Why are we so focused on this case? Because it is monumental. This case is laser-focused on Google’s dominance over the entire advertising supply chain which funds most of its operations, particularly through the software and data that fuel this real-time, live auction marketplace…
September 9, 2024 marks the start of the antitrust trial United States v. Google. This follows closely on the heels of the landmark antitrust case in which US District Court Judge Amit Mehta ruled that Google “is a monopolist and it has acted as one to maintain its monopoly.”
In this antitrust case, Google is accused of illegally monopolizing the $700B+ digital advertising industry by virtue of owning the dominant service for advertisers buying ads, the dominant service for publishers selling ads, and the dominant exchange where bids are awarded. As the evidence will show, Google has systematically abused its dominant position through acquisitions and anticompetitive behavior to protect its monopoly, suppress competition, and maintain its unique ability to collect a treasure trove of data about consumers. As a result, Google exerts outsized control over where revenue flows, with dire impacts on journalism. The Department of Justice is rightly seeking to break up this monopoly in order to increase competition and innovation for advertisers, publishers, and consumers.
We’ve partnered with other industry organizations to create a site that offers a wide range of resources to provide background on the case itself as well as regular trial updates. We also created an informational video in an effort to provide clarity and transparency in the digital advertising marketplace (in support of the AMERICA Act), which illustrates and explains some of the complexities in the digital advertising ecosystem that will be central to this trial.
Hear insights about the trial, what’s at stake, and what possible outcomes look like from DCN CEO Jason Kint:
On August 5, US District Court Judge Amit Mehta ruled that Google “is a monopolist and it has acted as one to maintain its monopoly” in search and in text ads. The trial will now move to the “remedies” stage where the judge will hear arguments about what fixes should be imposed going forward including impact on the developing GAI market as captured here.
Google has already announced it will appeal the ruling, but this represents a significant milestone in our years-long effort to call attention to Google’s anticompetitive behavior. Indeed, DCN first met with officials at the Department of Justice in 2017 on this matter before ramping up in late 2018. It also comes at a sensitive time for Google, with the Department of Justice’s second trial (adtech) set for opening arguments on September 9th and many relevant findings from this decision carrying over to that case.
We will conduct an analysis of the ruling, which we plan to share with our membership in the coming days.
DCN stands with publishers around the globe in reaction to Facebook and Google’s efforts to undermine Canada’s new law to help address the imbalance in market power. Digital Content Next was one of 18 media organizations worldwide that issued a joint statement on July 5, 2023 in response to Google and Meta’s threat to take down news in Canada after Canada’s parliament passed the Online News Act (C-18) in June.
In the words of Canadian Prime Minister Justin Trudeau, “This is not just a dispute over advertising, it is also a dispute over democracy. It’s a question of recognizing the role internet giants—like Facebook Meta, Google and others—have in our lives and therefore the responsibility they also wield. …this goes to the core of a free and informed society that is able to take responsible decisions. In a democracy, citizens need to have access to quality journalism that is properly paid.
The fact that Facebook doesn’t want to recognize the hard work of professional journalists is something that undermines the very fabric of democracy. So, Canada—and allies around the world—are going to stand strong and demonstrate that we will not flinch in our defense of fundamental foundational principles of democracy like a free, quality, informed press.”
At the 2023 Collision Conference, held June 26-30 in Toronto, Canada, DCN’s editorial director Michelle Manafy sat down with three media executives to discuss the ethics of using generative AI in journalism. The conversation covered the evolution of AI and its usage in the media, up to today’s much-discussed generative AI tools. Panelists weighed in on a range of use cases and where they would – or would not – permit (or even encourage) the use of generative AI in their media organizations. They also discussed whether or not generative AI is an existential threat to journalism, journalists — and even humanity as a whole. Listen to the discussion here and/or read a few highlights below.
Navigating the ethical landscape of generative AI and journalism
Featuring:
Gideon Lichfield – Global Editorial Director, Wired
Harry McCracken – Global Technology Editor, Fast Company
Traci Mabrey – Head of News, Factiva
Michelle Manafy – Editorial Director, Digital Content Next
A few highlights from the panel discussion:
Traci Mabrey: We’ve been using [machine learning and AI] forever and that’s a really important component as we look at this. This new horizon is going to be something, and I don’t think any of us know exactly what that is yet. But we have been using the building blocks of it for quite some time…
Gideon Lichfield: I think what’s changed is that it now has the capability to produce something that looks like something humans would create from scratch. And I emphasize looks like because it’s very clear that what’s going on is imitation… the fact that it became available as an easy-to-use interface was really crucial… this technology was around already for a few years, but it wasn’t that easy to access. The big change last year was just that it became easy to access…
Michelle Manafy: We’ve heard of late that some big tech leaders, some really smart folks call generative AI an existential threat. Are we afraid? Should we be afraid? And I don’t just mean as the media. You guys all think about larger issues in society. Is this good? Is it bad? Should we all be scared?
Harry McCracken: I think the worrying about it blowing up the world or killing us all is a little overwrought, particularly because there’s a pretty long list of genuine concerns that are either an issue right now or pretty clearly will be over the next few years involving things like misinformation. There are huge privacy concerns with a handful of large companies grabbing all our data and synthesizing that for their own benefit. I’d say there are plenty of things to worry about with A.I… but destroying the world might be more like the way that social media has, in a lot of ways, degraded the human experience…
Gideon Lichfield: …the increasing volume of just sheer garbage that is out there that is going to be generated by AI: that’s a real worry. And the job displacement part is also a thing that I worry about. But I think there is a way to use it. There is a way to use A.I. that empowers people, gives them extra tools. But it’s also a great temptation for companies, for employers to simply look at it as a way to save costs…
Harry McCracken: …Journalism is unusual in that the writing is the product. Most of the writing that exists in the world is not the product, just the byproduct. There are a lot of cases where having a computer draft your internal memo or whatever makes a lot of sense and will free you up to do more important things…
Traci Mabrey: …I think if we look at our journalists and our editors around the world, there’s a very personal scope that goes into everything somebody is writing and somebody is speaking about. And I think that’s a really big component when you look at it. The technology, as Gideon was saying, it is bringing up a set of words. It’s able to make 500 words on X topic regarding this. But that is not the way that I would infuse that information into the world. And it’s not those types of things that make organic journalism and all of the real nuggets that we get from it… I think for the drafting process and the information gathering, it’s certainly saving a lot of time. But we’re certainly on the path of that still being a very personal end product.
Learn more about how media leaders are developing their policies around the usage of AI and generative AI in their organizations:
Media Voices co-host Peter Houston is tired of hearing the same old industry buzzwords. The publishing platitudes are starting to wear a bit thin, and he’s decided to see if he can shake the conversation up a bit by speaking to some of the biggest characters in the business.
The latest episode of Media Voices’ Big Noises podcast features Michelle Manafy, Editorial Director at Digital Content Next (DCN).
Michelle started out as a journalist. The rise of digital media saw her embrace the changes and, after working for a range of publications, from alt weeklies to B2B titles, she joined what was then the OPA to help premium publishers with their ongoing digital evolution. She now manages online content and events for the group, which is known as DCN.
More than a decade in, Michelle still has hope for the media, but is frustrated by many of the publishing practices she sees. “Now we’re in a world where two thirds of our job is to rise above the noise. ‘Listen to me. Look at me’ right? Are we providing a value exchange? When people give us that gift of their attention, do we provide them with value… was it worth their time?”
DCN is proud to support our members and the health of the digital media industry overall. Thus, we are pleased to share some of the incredible work our members and other media companies are doing to recognize Juneteenth and its cultural significance in the American experience.
1) Developers and deployers of GAI must respect creators’ rights to their content. Developers and deployers of Generative Artificial Intelligence (GAI) systems—as well as legislators, regulators and other parties involved in drafting laws and policies regulating GAI—must respect the value of creators’ proprietary content.
2) Publishers are entitled to negotiate for and receive fair compensation for use of their IP. Use of original content by GAI systems for training, surfacing or synthesizing is not authorized by most publishers’ terms and conditions, or contemplated by existing agreements (for example, for search). GAI system developers and deployers should not be crawling, ingesting or using publishers’ proprietary content for these three stages without express authorization.
3) Copyright laws protect content creators from the unlicensed use of their content. Like all other uses of copyrighted works, use of copyrighted works in AI systems is subject to analysis under copyright and fair use law. Most of the use of publishers’ original content by AI systems for both training and output purposes would likely be found to go far beyond the scope of fair use as set forth in the Copyright Act and established case law. Exceptions to copyright protections for text and data mining (TDM) should be narrowly tailored to not damage content publishers or become pathways for uses that would otherwise require permission.
TRANSPARENCY
4) GAI systems should be transparent to publishers and users. Strong regulations and policies imposing proportionate transparency requirements are needed so that publishers can enforce their IP rights where their copyright-protected content is included in training datasets. Generative outputs that use publisher content should include clear and prominent attributions in a way that identifies to users the original sources of the output (not third-party news aggregators) and encourages users to navigate to those sources. Users should also be provided with comprehensible information about how such systems operate so they can make judgments about system quality and trustworthiness.
ACCOUNTABILITY
5) Deployers of GAI systems should be held accountable for system outputs. GAI systems pose risks for competition and public trust in publishers’ content. This can be compounded by GAI systems generating content that improperly attributes false information to publishers. Deployers of GAI systems should be legally responsible for the output of their systems.
FAIRNESS
6) GAI systems should not create, or risk creating, unfair market or competition outcomes. Regulators should be attuned to ensuring GAI systems are designed, trained, deployed, and used in a way that is compliant with competition laws and principles.
SAFETY
7) GAI systems should be safe and address privacy risks. Collection and use of personal data in GAI system design, training and use should be minimal, disclosed to users in an easily understandable manner, and in line with Fair Information Practice Principles (FIPPs). Systems should not reinforce biases or facilitate discrimination.
NEW YORK – March 30, 2023 – In an effort to provide clarity and transparency in the digital advertising marketplace, Digital Content Next (DCN), the trade association that exclusively represents more than 60 high quality digital media companies and thousands of trusted media brands, endorses the bi-partisan bill, the AMERICA Act.
“The digital advertising marketplace operates like a financial marketplace with brokers representing buyers and sellers. But, unlike financial markets, there are no requirements for transparency and there are no rules to avoid conflicts of interest and insider trading,” said Jason Kint, CEO of Digital Content Next. “We support the AMERICA Act because it will reinforce the basic free market tenets of competition and transparency in digital advertising, which fuels the online content we all read and love and is a necessity for all sorts of businesses – small and large – to efficiently reach consumers.”
As alleged by the Department of Justice and a bipartisan group of state attorneys general, the digital advertising marketplace is plagued by shady middlemen charging exorbitant fees and engaging in questionable business practices, which ultimately translates into higher costs and reduced privacy for consumers. While dominant ad tech companies make huge profits, publishers and advertisers get squeezed. And high costs are passed on to American consumers.
“This lack of competition in digital advertising means that monopoly rents are being imposed upon every website that is ad-supported and every company—small, medium, or large—that relies on internet advertising to grow its business,” said U.S. Senator Mike Lee (R-UT). “It is essentially a tax on thousands of American businesses, and thus a tax on millions of American consumers.”