
Digital Content Next



Seeing through Facebook’s transparency report

August 26, 2021 | By Chris Pedigo, SVP Government Affairs – DCN @Pedigo_Chris

“Transparency is an important part of everything we do at Facebook.” Well, at least that’s according to the first line of the company’s just-released Widely Viewed Content Report. The report is intended to provide insight into what content flowed across Facebook in the second quarter of 2021. In particular, it examines what content was viewed most in consumers’ News Feeds.

Facebook claims that most of the content on its platform comes from its users’ friends and family. The report states that news (or content that appears to be news) represents only a small fraction of what users see and share. It appears the company wants us to believe its platform is filled with family pictures and GIFs of kittens. Move along, industry watchers. According to Facebook, there’s nothing to see here!

Cleaning house

The New York Times uncovered that this wasn’t the company’s first such report. According to the Times, Facebook prepared a Widely Viewed Content Report for Q1 but did not release it. Apparently, the picture it painted wasn’t pretty.

According to the Times, “the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor.” Although the article came from a major publisher, it was the headline – and the way it was shared in a particular context by anti-vaxxers – that put it on top. Meanwhile, the 19th most popular page during Q1 was from the Epoch Times, an outlet widely known for pushing conspiracy theories.

Weird. These aren’t exactly posts about grandma’s mouth-watering apple pie or daredevil kittens hanging from ledges.

Remember, Facebook only reluctantly published the Q2 report after spending a year discrediting external research based on its own CrowdTangle data, which measures engagement (liking and sharing). When Facebook dismissed these reports, academics and researchers responded by pointing out that they were simply using the data Facebook had itself put out into the market. Meanwhile, Facebook dispersed CrowdTangle’s personnel to other parts of the company and pared back its tools, in what looked like an effort to regain control of the data and the narrative.

It kind of feels like Facebook is hustling us in a giant game of Three-card Monte.

Revisionist history on repeat

Maybe there are those still inclined to give the company the benefit of the doubt. Facebook spokesman Andy Stone soft-pedaled the company’s actions when he tweeted, “we might have been guilty of cleaning up our house a bit before inviting company.” Unfortunately, the dumpsters full of incendiary trash impacted untold numbers of “guests” long before Facebook purportedly tidied things up and invited the “company” of critical analysis.

Facebook has a long track record of covering up and thwarting transparency at critical moments, including the countless times it enabled companies to collect data about consumers and then repurpose that data for a number of reasons. Notably, these included providing Facebook’s data to third parties like Cambridge Analytica, in what turned out to be one of the largest data breaches in history.

Deception is part of the business model

Recently, Facebook blocked researchers’ ability to gather information about what political ads are being shown across its services. It was especially galling that Facebook justified its decision as a protection of “user privacy.”

Given the company’s track record, it shouldn’t be surprising that Facebook once again opted to block full transparency. And yes, once again, the real reason is because it could hurt the business model. Even the FTC weighed in and warned Facebook not to use privacy “as a pretext to advance other aims.”

Truth told

For the Facebook model, it doesn’t matter whether the content is truthful. In fact, if there is a bias at all, it is in favor of the least trustworthy, most salacious content (e.g., vaccine misinformation, the Epoch Times) because that content tends to catch fire more quickly. And profits are way up. The business model is clearly working.

On top of that, at key junctures in Facebook’s history, executives have unequivocally chosen to hide the truth to maintain these profits. Vague commitments to transparency come only when they find themselves once again caught with a hand in the proverbial cookie jar. And now, even those promises to “do better” come with a giant caveat: We’ll only tell you something if it makes the company look good.

At this point, there are legitimate questions about whether Facebook is a good-faith actor that can ever play a constructive role. Facebook’s business model is at odds with transparency, and its executives seem committed to running the business like a Three-card Monte hustle. With deep political divisions in our country hindering our ability to tackle a shared crisis (or even have a thoughtful discussion), we need leaders to step forward and demonstrate an unequivocal commitment to transparency and trust.