Alexa, can I trust this news service?
November 19, 2020 | By Jason Kint, CEO – DCN (@jason_kint)

No, not that Alexa. Many years before Amazon hatched the infamous brain inside of their smart devices, there was another Alexa on the Internet. The original Alexa started as a 1990s plug-in for web browsers before it was acquired by, oddly enough, Amazon in 1999.
After the acquisition, OG Alexa evolved into a simple web traffic service that, because it was free, was often cited by academics who lacked access to pricier (and better) alternatives. Those who were around in the early days of the world wide web may have forgotten about the Alexa Site Rankings — I certainly did. And you would likely never guess that today this service influences global news on what is arguably the most controversial, if popular, platform: Facebook.
In 2014, I noticed that the Alexa Site Rankings resurfaced in an industry advertising report that evaluated premium publisher experiences. I questioned how the researchers determined which sites were “premium” (they also called them “respected sites”). I discovered they used the Alexa Site Rankings as a proxy, effectively substituting popularity for quality. They also didn’t account for the fact that the list included porn and piracy sites which, while certainly popular, aren’t known for quality. All of this raised significant concerns about the findings, which I outlined on Re/Code in “False Advertising: When Lazy Headlines Undermine Great Research. Let’s not confuse clicks with respect.”
Popularity contested
Fast forward to October 2020 and what (seemed to be) the peak of U.S. election hysteria and concerns over election and information integrity. I read a brilliant report by Alex Hern at The Guardian that examined a controversy around Facebook’s fact-checking process and made some startling findings – including that Facebook uses the same dubious popularity data from the Alexa Site Rankings as a proxy for trust. Hern’s analysis relies on sections of leaked Facebook content moderation guidelines, including this (emphasis mine):
Stories can be “enqueued” for Facebook’s third-party factcheckers in one of two ways: either by being flagged by an AI, or by being manually added by one of the factcheckers themselves.
Facebook’s AI looks for signals “including feedback from the community and disbelief comments” to automatically predict which posts might contain misinformation. “Predicted content is temporarily (for seven days) soft demoted in feed (at 50% strength) and enqueued to fact check product for review by [third-party factcheckers],” the document says.
But some posts are not automatically demoted. Sites in the “Alexa 5K” list, “which includes content in the top 5,000 most popular internet sites”, are supposed to keep their distribution high, “under the assumption these are unlikely to be spreading misinformation”.
Wait. What? Indeed, it looks like Facebook gives a “fast pass” to any site on the Alexa 5K list.
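Taken at face value, the leaked excerpt describes a simple allowlist check layered on top of the demotion logic. Here is a minimal sketch of that policy as described above (the function name, the placeholder domain list, and the return shape are all hypothetical; this is an illustration, not Facebook's code):

```python
from datetime import datetime, timedelta

# The figures below come straight from the leaked excerpt quoted above;
# everything else (names, the placeholder allowlist) is hypothetical.
DEMOTION_STRENGTH = 0.5                  # "soft demoted in feed (at 50% strength)"
DEMOTION_DAYS = 7                        # "temporarily (for seven days)"
ALEXA_TOP_5K = {"popular-site.example"}  # stand-in for the "Alexa 5K" list

def triage_post(domain, predicted_misinfo, now=None):
    """Return (distribution_multiplier, enqueued_for_fact_check, demotion_expiry)."""
    now = now or datetime.utcnow()
    if not predicted_misinfo:
        return 1.0, False, None
    if domain in ALEXA_TOP_5K:
        # Allowlisted sites "keep their distribution high"; the excerpt only
        # addresses demotion, so whether they are still enqueued is unclear.
        return 1.0, False, None
    # Everyone else is soft-demoted for seven days and queued for review.
    return DEMOTION_STRENGTH, True, now + timedelta(days=DEMOTION_DAYS)
```

The point the article goes on to make falls out of the branch order: a flagged post from a top-5,000 site never reaches the demotion step, while the same post from a small local paper is cut to half distribution before any human looks at it.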
Integrity, please
In 2016, I wrote to Facebook CEO Mark Zuckerberg, pleading with him to show actual leadership by inspiring a “moonshot” to address the growing information integrity crisis on his platform and asking him to use brands, not popularity or engagement, as proxies for trust.
Soon after, Facebook created its own “News Page Index” to maintain a list of trusted news organizations that could be used to surface additional context, exempt the press from rules applied to political advertisers, and source stories for its News tab. Yes, there was some controversy in that Facebook included Breitbart on its list. But that was Facebook’s decision to make, not ours. Welcome to responsibility, Facebook.

Kevin Roose, a brilliant technology columnist at The New York Times, has spent years documenting the effects of social media platforms. Recently, he has drawn significant attention with his daily list of the top ten news posts on Facebook. The pages appearing in his list are often sites that are popular beyond belief. They also clearly show the drawbacks of relying on engagement metrics.
Misinformation
By using the Alexa Site Rankings as its benchmark, Facebook provides the sites on the Alexa 5K with the digital equivalent of airport TSA PreCheck, under the assumption they “are unlikely to be spreading misinformation” (Facebook’s words, not mine). This is a list that includes Zerohedge, Infowars, PJ Media, Gateway Pundit, the Palmer Report, and Russia Today. If you are unfamiliar with these sites, they’re generally considered to be flush with misinformation and disinformation (Russia Today is even funded by the Kremlin). This isn’t partisan; the list cuts across both extremes, where there’s often a common intent to deceive.
Do you know what sites don’t get this same Alexa 5k benefit of the doubt that they’re not spreading misinformation? The websites for the newspapers of record for nearly every local market in America, from the Portland Oregonian to the Lexington Herald-Leader to the Kenosha News to the Philadelphia Inquirer.
These news desks are full of journalists from cities large and small, from red states and blue states, who have been on the streets over the last six months covering justice in America. Yet their information isn’t given the same benefit of the doubt as some of the most toxic sites on the web. Instead, they are presumed guilty and put into a long line for fact-checking whenever Facebook’s AI determines a story may be problematic. For years, the Knight Foundation has found that local news is the most trusted among news sources. So this isn’t a rational or responsible decision by Facebook.
Outsourced responsibility
There have been many important reports on how Facebook outsources content moderation in order to distance itself from decision-making. Casey Newton at The Verge has been recognized for shining a light in this area, work that even led to a major Facebook settlement. Now, it appears that Facebook is also outsourcing the criteria for its automated decision-making, relying on the Alexa Site Rankings to determine whether a news site is trustworthy.
By choosing “popularity” over its own internal decision-making, Facebook avoids disrupting its business need to drive engagement. It also avoids political backlash for favoring a particular ideology; conservative sites are every bit as popular and engaging, as Roose’s reporting demonstrates. And, finally, Facebook outsources the liability for maintaining the list.
I’ve put these concerns to Facebook’s leadership, asking for clarity or a correction. But three weeks later: crickets. Earlier this week, Twitter CEO Jack Dorsey testified to the powerful Senate Judiciary Committee that algorithmic amplification has a profound impact. We agree. The largest news platform in the world reaches nearly 3 billion people, and we don’t know how or why it makes these decisions. We all deserve better.
With great power comes great responsibility. Facebook is a monopoly with the power to limit choice for consumers and advertisers. Now we know that this platform, where half of Americans consume news, uses a vetting process that equates clicks with quality, devalues many of the most trusted newspapers, and gives a fast pass to sites that serve Facebook’s priorities – engagement and popularity. The public should know about it so they can decide if that’s where they want to put their precious time and attention.