
Content moderation is serious business. Social platforms need to act like it

June 16, 2020 | By Rande Price, Research VP – DCN

Content moderation is a big job for social platforms (e.g. Facebook, Twitter, YouTube). Much of the content posted on these sites consists of user-generated posts, pictures, and videos. However, there is also a sizeable amount of content from rogue publishers to contend with. At present, it is common practice for social platforms to outsource the work of content moderation.

NYU Stern Center for Business and Human Rights’ new report, authored by Paul Barrett, focuses on Facebook and questions its strategy of outsourcing content moderation. Facebook outsources, by far, the largest number of moderators of all the tech platforms. Barrett makes the case that content moderation is central to social platforms’ business. Thus, as with other core business functions, Facebook should staff this vital role with full-time company employees. To improve moderation, Barrett contends that Facebook must bring moderators in-house and increase their number from the current 15,000 to 30,000.

Outsourcing

According to Barrett, Facebook’s outsourcing was a purposeful decision. The reasoning goes beyond the cost of bringing these employees in-house; outsourcing moderation is also a logistical, even strategic, choice.

Sarah Roberts, an expert on content moderation at the University of California, comments that Facebook and other social media companies outsource moderation to minimize its level of importance. Roberts refers to this as “plausible deniability”: the work is mission critical, yet full-time employees don’t handle it directly. The company intentionally places physical distance between the problem and its staff.

Facebook contracts third-party vendors, which hire temporary workers located at 20 sites worldwide. With the assistance of AI, this army of contract moderators sifts through approximately three million posts a day.

Business paradox

Facebook’s core business model centers on advertising. The goal of the platform is to add new users and increase scale, which drives advertising revenue and demonstrates growth to Wall Street. However, there is an intrinsic problem here: more users create ever more content to be moderated, putting moderators on a hamster wheel with no end in sight.

With such a high volume of moderation, and a minimal level of content expertise, errors occur. Mark Zuckerberg, Facebook’s CEO, cited a 10% error rate in flagging content that should be taken down or taking down content that should not be flagged. Applied to the roughly three million posts reviewed each day, that margin works out to at least 300,000 mistakes a day, some of them very serious.
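As a quick back-of-the-envelope check, the sketch below (in Python, purely illustrative) reproduces that estimate using only the figures cited above: roughly three million posts reviewed per day and a 10% error rate.

    # Back-of-the-envelope estimate using the figures cited in the article.
    posts_per_day = 3_000_000   # approximate posts reviewed per day
    error_rate = 0.10           # error rate cited by Zuckerberg

    mistakes_per_day = int(posts_per_day * error_rate)
    print(f"Estimated moderation mistakes per day: {mistakes_per_day:,}")
    # Prints: Estimated moderation mistakes per day: 300,000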

The consequences

What are the repercussions of posts remaining online when they should be taken down? There’s little risk for Facebook, at least within the U.S. Under Section 230 of the Communications Decency Act of 1996, internet platforms are protected from liability for most content posted by users. Even with President Trump’s recent executive order seeking to roll back Section 230, tech companies’ protection from liability still appears to be in place.

Unfortunately, there are consequences for the workers. In a recent class action lawsuit against Facebook in San Mateo County, California, a group of former reviewers claimed that “as a result of constant and unmitigated exposure to highly toxic and extremely disturbing images,” they had suffered “significant psychological trauma and/or post-traumatic stress disorder.”

In May 2020, Facebook agreed to settle the suit without admitting any liability. The settlement could distribute millions of dollars to more than 10,000 current and former moderators in the U.S. (a minor slap on the wrist for a company that takes in more than $70 billion per year).

Importantly, there are also serious consequences for multitudes of Facebook users. One such incident identified members of the Myanmar military as the operatives behind a systematic campaign on Facebook targeting the country’s Muslim Rohingya minority. Facebook was warned of anti-Rohingya rhetoric and false claims posted on its platform, which were not removed. The United Nations and others place blame on Facebook for the murder of more than 10,000 Rohingya Muslims in Myanmar and the displacement of hundreds of thousands more. Facebook eventually concluded that there was a deliberate and covert Myanmar military operation on its platform, but again there was no liability on its part.

Incidents like those in Myanmar have also occurred in Sri Lanka, Indonesia, Ethiopia, and elsewhere.

Time for change

Barrett concludes his report with eight recommendations to change Facebook’s current practices:

  1. Stop outsourcing content moderation and bring the process in-house.  
  2. Increase the number of moderators from 15,000 to 30,000 to improve the review process.
  3. Hire someone to oversee content and fact-checking who reports directly to the CEO or COO.
  4. Expand moderation in at-risk countries in Asia, Africa, and other areas.
  5. Provide all moderators with access to quality medical and psychiatric care.
  6. Support research on the health risks of content moderation.
  7. Explore targeted government regulation regarding harmful content.
  8. Expand fact-checking to discredit false information.

Companies routinely outsource areas outside their expertise. However, this practice is far from common for areas core to their business. Content moderation is an area of expertise that Facebook needs to possess, and excel in, for the safety of its workers and of Facebook users worldwide.
