Algorithmic angst: News Corp Chief Executive Robert Thomson’s “Breaking The News” keynote
June 12, 2018 | By DCN

News Corp Chief Executive Robert Thomson delivered a keynote presentation at “Breaking the News: Free Speech and Democracy in the Age of Platform Monopoly,” an all-day conference hosted by the Open Markets Institute and the Tow Center at the Columbia University School of Journalism on June 12, 2018. Speakers explored how the power and business models of large online and telecom intermediaries affect the ability of reporters and editors to gather and distribute news in the 21st century.
Below is the full text of the talk he delivered:
FREE SPEECH & DEMOCRACY IN THE AGE OF PLATFORM MONOPOLY
I guess that I am the amuse-bouche ahead of your hearty lunch and Mark Thompson. We’re not related. The title of this ornery oratory, which for some will be a difficult-to-digest diatribe, is When Irresistibility Becomes Irresponsibility. It’s a theme familiar to Big Tobacco and, perhaps, in our time, to Big Digital. A possible subtitle is The Line of Least Compliance.
The subject of subtitles reminds me inevitably of film, and I do want to call out one film buff whose good taste in cinema is exceeded only by her zeal in trying to right Digital wrongs, and that would be European Commissioner Margrethe Vestager, herself reputedly the inspiration for a Danish political drama. Commissioner Vestager delivered a speech two weeks ago that is worthy of close scrutiny by all here, and certainly so by any company subject to her tender mercies. Most importantly, she referenced the original Blade Runner and its poignant portrayal of loss and love and meaning, particularly the telling line “like tears in rain”. I was intrigued, and so we had an exchange, obviously not about pending cases, but about the inherent irony in Blade Runner, the irony being that it is the automatons, the replicants, who most cherish the essential human values: family, friends, loyalty, shared experiences, shared memories.
If journalism were a film, the last decade is certainly the equivalent of a slasher movie – the Silicon Valley Chainsaw Massacre – and there have been many tears shed in the rain. So when we contemplate a robust commercial future for journalism and for professional media, we must first ponder an ecosystem that has punished the creators and disproportionately profited the distributors, who are indeed publishers but cannot quite pronounce the word “pu…pu…publisher.” It gets a little harder for YouTube not to admit to being a broadcaster when it spends many millions advertising its $40-a-month television package, and surely even Facebook will concede that it is a broadcaster and publisher now that the company has bought English football rights in Asia and surfing rights globally.
Unfortunately, the current content ecosystem is fertile territory for the fabricators of the fabulous and the furnishers of the fake and the faux (as you can tell, I like alliteration). I’ve been holding forth on, some would say ranting about, this vexed topic for a decade or more.
I was recently reminded by Nick Thompson of testimony I gave to the UK House of Lords in 2007, in which I warned that the internet was ripe for exploitation by bad actors who could take advantage of the large digital platforms to disseminate drivel. Being questioned by those august peers in 2007, I suggested:
“What you have is a lot of young people who are growing up surrounded by much more information but whose provenance is not clear. In the longer term critical judgment will not be as it should be. The rumors will be believed, the fiction will be thought of as fact and the political agendas, among other agendas, will be influenced by interest groups who are coming from some quite strange trajectory to issues based on collective understanding that is founded on falsity.”
Now that was not eerie prescience, so don’t come to me for advice on stock investments or racing tips or an NCAA bracket, or today’s Time Warner ruling, but it was really just the judgment of a journalist – at that time the editor of The Times of London – who had spent every working day since the ridiculously raw age of seventeen trying to assess the veracity and authenticity of information in a newsroom. It was clear back then that the internet had inherent vulnerabilities and that the digital platforms had a responsibility to protect professional journalism and cultivate a culture of compliance so that the real could be separated from the unreal and the surreal.
If your business model is to commodify content – which is an egregious mistake because there is a hierarchy of content – and then allow a search engine or social platform to be easily manipulated by bad actors, then you’re failing a basic test of compliance. Basic compliance should have been the modest price of admission for the big digital players. And these are very profitable players – Facebook’s operating margin last year was around 48 percent. And yet almost daily we hear of issues in which compliance seems to be an awkward afterthought – that is because we are indeed in a new era, an era in which the pervasiveness of the largest digital platforms makes Standard Oil look like a corner gas station. And so naturally there are comparisons already being made between Big Tobacco and Big Digital – if Big Digital wants to avoid the fate of Big Tobacco, it needs to take the initiative rather than being reactive.
So here we are in 2018, where sense and nonsense rub shoulders on platforms, where the artificial and the asinine thrive, and where the click-bait cultivators and search engine spivs reap bountiful harvests, and where professional journalism still faces an existential crisis in many countries. Now I could fulminate furiously on this subject ad infinitum and occasionally ad hominem. But I would like to highlight one issue, which is very relevant to today’s question about the future of new economic models for journalism – it is what I like to term algorithmic angst…
But, before the tirade, a moment of reflection. Media must also tend to its reputation. Our reporters have to create compelling content and be seen to have the objective of being objective. That is particularly the case in an age of bluster and bombast.
Our journalists have to be renaissance reporters – willing to traverse platforms to ensure that the story is told where people are reading or listening or watching. And journalists should be wary of being too self-referential or self-reverential…journalism is about society, not about self.
Enough about people and their foibles and their follies; let’s focus on something far more important: the machines and the software that is their soul. Algorithms are awesomely powerful, and they are destined to become far more so. The compounding impact of AI will mean that they will know much more about us and we will know much less about them. Their ability to create a growing audience of addicts will be enhanced, and the young will be the most targeted and the most vulnerable, as they already are. Social platforms obviously do much good, but they can also be antisocial, and idealism cannot be used as an excuse for deleterious behavior. Facebook has long believed that “connecting” people absolved it of other, “secondary” responsibilities. Google has long held that “informing” people is the greater good and that any negative consequences are a small price to pay. I make something of a distinction between Google, which is making an effort to reform, and YouTube, which must do far more to purge piracy and extreme extremism.
The modern metric that drives digital platforms is “engagement” – it is fair to say that Big Tobacco was also focused on “engagement”, but commercial priority can have a social cost. It is clear, for example, that young people are not just distracted by the acts of reading or responding to posts or tweets or photos, to likes and dislikes, to friending or unfriending, to status updates that undermine a person’s status. The truth is that they are deeply distracted by, and sometimes seriously distressed by, the thought and the anticipation of reading or responding. When the intention of a powerful algorithm is to increase so-called “engagement,” what is the potential for a vulnerable person to be disengaged from society?
It is well-documented that radical groups, whether Islamic extremists or fascists, successfully groom and radicalize young people on the web. These are the obvious outcomes of obsession. What about the less obvious damage done to the psyche and to the self-esteem of the young? And that would include damage done to the ability to concentrate and to tolerate.
And yet there has been no serious movement to hold these omnipotent algorithms to account, which is why I think there should be an Algorithm Review Board to get more accountability and transparency from the three overwhelmingly dominant algorithms of our day: Facebook, Google/YouTube and Amazon. If you buy a small bar of chocolate in the US, you’ll be told the precise ingredients on the pack and generally how many calories per serving. There will be stark health warnings on even a low alcohol bottle of beer. Clothing labels are often synthetic screeds, in multiple languages, to ensure compliance. And yet the powerful, mind-altering, behavior-shifting, mood-changing algorithms are allowed to work their invisible alchemy on our personalities, on our societies and on our young people.
So what should an Algorithm Review Board, an ARB look like? Call it an Algorithm Transparency Board, if you like or, if you must, an Algorithm Altruism Board. It’s obviously important that experts in the related fields preside, not politicians, and that which should be confidential is kept confidential, but that which should change is changed.
There are a few basic categories in which an expert industry panel could provide insight and protection for the vulnerable. These priorities would include invasions of privacy, which are already commonplace; IP piracy, which is rife; news censorship, which is clearly already underway; and commercial abuse by dominant players, which is both real and difficult to monitor. There would also be a role during election campaigns to ensure that the sanctity of the system is not violated by bad actors. A further crucial area is the socio-psychological impact, particularly on children – essentially, we have the world’s cleverest engineers and the world’s cleverest machines finding ways to make programs irresistible – the net as nicotine and technological tar.
In the midst of the modern morass, Facebook has made a contribution by highlighting the importance of “trusted publishers”, but who is to judge trustworthiness? The very citing of “trusted publishers” reinforces the rightness of Facebook paying premium publishers and premium journalists for the reputational and experiential services they provide.
And Google, which has actually become more responsive under Sundar Pichai, has been tweaking its algorithm in ways that seem to be a mystery even to the company itself. After the end of the prejudicial “first click free” policy – an end we certainly applaud, as the policy punished professional journalism – there were extremely odd traffic patterns across our sites.
One week in February, we saw a sudden and remarkable surge in Google referrals to WSJ.com, the Wall Street Journal’s website. This erratic movement followed months of relative inactivity, so our CTO asked Google what had happened and was told that it was a “bug” in the algorithm. Now, we’re certainly in favor of more referrals, but it is extraordinary that the algorithm was so anarchic. And BuzzFeed reported last month that a change in the Google News algorithm led to a bizarre dominance, in a supposedly eclectic, broad feed, by the BBC. Every one of the top 50 results for a search was a BBC news article, and 97 of the top 100 came from the UK broadcaster – that’s a little like the North Korean election result. Again, no coherent explanation, just an admission that the results were bizarre and unintended.
And Amazon can punish or reward companies with a tweak or a turn or a twist in its algorithm, as it did so brutally a couple of years ago with Hachette – now there is a story for the Washington Post… In the US, Amazon has over 95% of the audiobook market, and doubles as a publisher of books itself. In the language of an antitrust expert in Brussels, it dominates a horizontal and is thus able to influence a vertical containing its own products. Such dominance is clearly in need of close scrutiny, and that scrutiny can only come from a rigorous assessment of its algorithm.
So, in conclusion, there is obviously a need for an Algorithm Review Board to track the intended and the unintended psychological and social and commercial and political impact of pervasive platforms. If the imperative of an algorithm is to be irresistible, when does irresistibility become irresponsibility?
New York Times CEO Mark Thompson also delivered a keynote presentation at the “Breaking the News: Free Speech and Democracy in the Age of Platform Monopoly” event. Click here to read it.