
Facebook: Take out the trash

July 26, 2018 | By Jason Kint, CEO – DCN | @jason_kint

After last week’s uproar over Mark Zuckerberg’s comments on censorship, Axios asked experts what they would do to decrease fake news on Facebook. Clearly, this is not a simple problem to address, but address it we must. In fact, because Facebook hasn’t taken proper action over the past two years, governments all over the world have stepped in to address the problem of misinformation.

As the trade organization representing media brands that seek long-term consumer trust through the creation of high-quality news, information, and entertainment, we take the problem very seriously. In 2016, we wrote a letter to the CEOs of Facebook and Google – Mark Zuckerberg and Sundar Pichai – to provide our perspective on the dangerous proliferation of “fake news” throughout the digital ecosystem. That letter went unanswered.

In the letter we asked Facebook to aggressively harness their brilliant minds and massive resources to clean up the garbage that was flowing, with little friction, through their platform. We also cautioned against acting as a censor. Given the escalation of concerns and the fact that the 2018 midterm elections are less than 100 days away, it’s critical that Facebook take more significant and concrete actions to help ensure their platform isn’t used against our democracy again.

In the spirit of transparency, here are a few things we mentioned to Axios that Facebook could do today.

  • Eliminate the viral and monetization benefits for known fake news peddlers. Take Infowars as an obvious example. (1) Infowars should only reach users who explicitly follow its account. We’re not asking for the account to be banned entirely, although the arguments for doing so are reasonable at this point. (2) Infowars shouldn’t be able to buy advertising whose creative contains links to Infowars content, and (3) user activity (likes/clicks) on Infowars content should not enhance its presence in the feeds of users who don’t follow it. Essentially, its content shouldn’t be exposed to users unless they explicitly ask for it. That’s why on Tuesday I asked what percentage of views of a grotesque Infowars clip came from Infowars followers. The answer could be very revealing.
  • Publish a clear escalation policy (as YouTube does) that would suspend and permanently ban accounts that repeatedly violate their hate speech and harassment rules. Although YouTube’s escalation process seems to have loopholes and oddities, as proven yesterday, it’s at least transparent and open to scrutiny by the public.
  • Elevate the brand presence around content. The brand is a proxy for trust, and Facebook (and Google) have long minimized the brand in their experiences. This is important for those who have built up trust through their reputation, but it’s also important to newer publishers who want to build their brands. If you’re reading an Axios story on Facebook in your feed, you should know the source. Likewise, if you’re reading a Russia Today story, you should know the source.
  • Develop a transparent ranking system by domain/brand. This is not a novel approach: Google does it. Email services do it. If Infowars wants to keep publishing garbage, then let’s see its domain score fall off a cliff. The score has to mean something. The fact is that most respectable news publishers, regardless of subject matter or leanings, would score well and not be impacted. However, the trash would get taken out.
  • Hire more human moderators. Algorithms are amazing, but personal responsibility should involve people. The company needs to take ownership of its “news feed” or stop calling it a “news feed.” We also need transparency on where these moderators are being hired. As platforms have challenged the economics of local news, we’ve also lost local accountability to the public. Moderators need the proper context for the areas, countries, and cultures they’re serving.
  • Engage with member associations and non-profits to get advice on codes of conduct that responsible news organizations follow. Facebook had a significant misstep here when they rolled out their political advertising labels and archive. They dangerously conflated boosted news coverage about political issues with advertising about political issues. Facebook chose to ignore counsel from publishers and shut off communication with member associations despite more than 20,000 news publishers expressing concern to its CEO and COO.

A couple of executives at what is arguably the most influential news distributor on the planet are making business decisions that have a massive impact on the political dialogue in our democracy. They should listen thoughtfully to concerns, advice, and legal inquiry in order to become a responsible member of the digital ecosystem from which they reap great profit. The problem of misinformation is not small, and it is not easy, but it’s a problem that we all have a stake in solving.
