
Digital Content Next

Research / Insights

What to expect from disinformation during the 2020 elections (and how to respond)

September 11, 2019 | By Rande Price, Research Director – DCN @Randeloo

Most academic, media, and political analysts forecast that disinformation will likely play a large role in the upcoming 2020 presidential election. The NYU Stern Center for Business and Human Rights echoes this projection in its new report, Disinformation and the 2020 Election: How the Social Media Industry Should Prepare. In fact, the analysis predicts that more disinformation will be generated domestically than by foreign entities.

What to expect from disinformation

  • Deepfakes increase in volume. Deepfakes are easier to produce now due to advancements in deep-learning and editing systems. A deep-learning system produces a persuasive fake video by studying photographs and videos of a target person and merging them with images of an actor speaking and behaving in the same manner as the target. Once a preliminary fake is produced, a method known as generative adversarial networks (GANs) makes it more believable. The GANs process detects any inaccuracies and corrects them. After a few rounds, the new fake video is complete and ready for amplification.
  • Disinformation spreads to the political left. While domestic disinformation most often comes from the political right, the left is also engaging in its creation and spread on social media.
  • Misled Americans stage real-world events. Americans are now being recruited by Russian organizations to stage real-life activities that spread disinformation. From deceptive IRA social media personas to anti-Muslim and pro-Muslim demonstrations, these events are promoted online to American followers to drive attendance and draw media coverage.
  • Instagram, owned by Facebook, will be used more to spread disinformation. Image and video services are ideal for spreading disinformation via memes: photos combined with short, punchy text, as well as video clips.
  • WhatsApp, also owned by Facebook, will be used to amplify disinformation. WhatsApp was used to send false content to large populations during elections in Brazil and India. It could be a very active force in the U.S. 2020 presidential election.
  • Increased international activity. Russia is not the only country involved in the creation and spread of falsehoods; Iran and China are also suppliers of disinformation.
  • Digital voter suppression continues as a threat in 2020. According to the University of Wisconsin, users tried to suppress voter turnout in 2018 by creating Twitter campaigns. One post tried to give Trump opponents incorrect information about voting day. Another tried to intimidate liberal voters by claiming that NRA members and Republicans were bringing their guns to the polls.

Responding to disinformation

Social media companies have put a few new measures in place since the 2016 and 2018 elections. They now communicate more with each other, the government, and outside experts in an effort to address disinformation. Regardless, more has to be done to prepare for 2020. The NYU Stern Center for Business and Human Rights offers the following recommendations for social platforms:

  1. Detect and remove deepfakes. Improve efficiency in removing deepfakes before they do their damage.
  2. Remove content that is provably false. Purge content that is definitively untrue.
  3. Hire a content overseer. Hire a senior official to oversee the process of guarding against disinformation.
  4. Make changes at Instagram. Act assertively to protect users from disinformation. Instagram doesn't remove or down-rank disinformation found in a user's main feed. While the service does make it harder for new users to access the false content, that action is not forceful enough to stop the problem.
  5. Limit the reach of WhatsApp. WhatsApp now limits the reach of a message to 1,280 users (5 chat groups x 256 members each), down from a previous maximum reach of 65,536 users (256 chat groups x 256 members each).
  6. Defend against for-profit disinformation. Social platforms need to pay close attention to false content distributed by corporations, consultants, and public relations firms. Many companies specializing in clickbait run successful businesses by luring naïve and curious consumers drawn to conspiracy theories and fake stories.
  7. Support legislation on political ads and voter suppression. Push the Senate to approve the Honest Ads Act, which would extend political disclosure standards to online ads.
  8. Improve industry-wide collaboration. Form a permanent inter-company task force devoted to fighting disinformation.
  9. Teach social media literacy to the public. Educate users about questionable content and what to do if they come across it.

NYU cites findings from the Oxford Internet Institute and from four universities: Princeton, the University of Exeter, Washington University, and the University of Michigan. This body of research shows a decrease in disinformation between 2016 and 2018. However, with 2020 being a presidential election year, all anticipate intensified interference by both foreign and domestic U.S. actors. Continued efforts are needed to combat disinformation and protect the electoral process.
