“I have some great news about the coronavirus,” says TikTok user Khalilslife, speaking with the confidence of a newscaster while wearing a backwards baseball cap. It’s May 5 when he uploads this video, and Americans are growing restless. By now, many have followed Covid-19 social distancing rules for six weeks or more. At this point, any “great news” is undoubtedly welcome.
“If (coronavirus vaccine) trials are successful, that means that it could be in emergency use by September,” Khalilslife says to his 2.8 million TikTok followers. “In other news, Madagascar president said they already have the cure, and it’s made from herbal natural remedies. It’s already ready to begin production. What would you rather do, a vaccine or an herbal drink?”
It’s easy to absorb “information” like “an herbal tonic can cure Covid-19” while idly scrolling through TikTok. In fact, Khalilslife’s video has been viewed 1.7 million times. It takes more effort to go to a news website and find explanatory articles about medically and scientifically researched efforts around coronavirus vaccines and treatments.
If Khalilslife’s viewers were to double-check his facts, they’d find his video lacks context. It’s true that Madagascar’s president claims an herbal concoction can work as a coronavirus cure, but the veracity of that claim is highly questionable. Some stories do mention a vaccine appearing in autumn; however, most experts find that projection highly doubtful. In fact, in May, scientists and medical experts predicted that widespread use of a vaccine was 12 to 18 months away.
For zoomers, the medium is the message
TikTok, a social platform where users create and consume 15- to 60-second videos, was the fourth-most downloaded app of 2019. It is especially popular with Gen Z, those born from 1997 to 2012. Colloquially called zoomers, Gen Z members post dance challenges, lip-sync videos, comedy sketches, slang-filled history lessons, and political punditry. Along with YouTube, Instagram, and Snapchat, TikTok commands much of Gen Z’s time.
However, social media platforms are also where most members of Generation Z get their news. A 2019 report from the Reuters Institute at Oxford University found that young people largely consume news passively or as a time filler as they scroll through social media. People under 35 also tend to think of the news as a source of entertainment, the research showed.
Unfortunately, though, misinformation is rampant on these platforms. They disseminate political conspiracy theories, false claims about climate change, flat-earth beliefs, and anti-vaccination propaganda. Unsurprisingly, coronavirus conspiracy theories have found a home on social media apps.
Perhaps what is surprising is that Gen Z-ers have just as difficult a time sniffing out false or out-of-context statements as their elders. News organizations may need to rethink their strategies for tackling misinformation directed at Gen Z. It is critical to support this generation’s media literacy skills and help them learn to identify quality information from trustworthy sources.
Digitally native, but digitally naive
The problem is not just the misinformation itself. As it turns out, although they use social media fluidly, members of Gen Z have trouble distinguishing fact from fiction.
Despite their familiarity with online media, 82% of middle schoolers who responded to a Stanford University survey could not tell the difference between an advertisement and a news story. The 2015-16 survey also found that over 30% of middle schoolers surveyed considered a fake news story more credible than a real one.
A Stanford study from 2019 echoed these claims. Researchers showed more than 3,000 American high school students a Facebook video of Russian election workers secretively loading ballots into bins. The caption incorrectly identified these workers as Democrats during the 2016 election. Over half of respondents believed the caption. They agreed that the video was “strong evidence of voter fraud” during the 2016 Democratic primaries.
Zoomers are not alone. A Pew study found that nearly 90% of U.S. adults thought fake news either caused some or a great deal of confusion during the 2016 election. Nearly a quarter of American adults admitted to sharing fake news during the election, whether they realized it at the time or not.
“The problem of digital media literacy is truly multi-generational,” said Katy Byron, editor and program manager of MediaWise, a fact-checking initiative from the Poynter Institute. “Teenagers spend a ton of time online, upwards of 4+ hours every day by some estimates and anecdotally … But just because they are online a ton doesn’t mean they know how to navigate digital information.”
Given their ease with social media, the internet, and digital devices, it’s a conundrum. Why do zoomers struggle to identify misinformation online just as much as older adults who remember floppy disks and prefer Facebook to TikTok?
“The ability to operate a device fluently and fluidly does not mean that you understand the information that that device produces,” said Dr. Sam Wineburg, professor of education at Stanford. In carrying out the two Stanford studies above, Wineburg aimed to investigate civic online reasoning in young people.
Wineburg likens online media literacy to driving a car. For anyone who drives with a stick shift, switching gears doesn’t require any thought; it’s an automatic response. But not everyone who can change gears can also fix a broken transmission. In short, somebody’s ability to use a tool or technology doesn’t mean that they understand how it works.
Misinformation in the age of Covid-19
Here’s a SparkNotes version of social media’s transmission, so to speak: algorithms reward posts with high engagement. Content that racks up shares, retweets, comments, and likes gets put in front of more eyes. This alone may seem obvious enough. But these high-engagement posts and videos sometimes promote extremist content or conspiracy theories, born on anonymous sites like Reddit and 4chan before spreading to more mainstream platforms.
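To make that idea concrete, here is a minimal, purely hypothetical sketch in Python of what engagement-based ranking looks like in principle. The Post fields, the weights in engagement_score, and the example posts are all invented for illustration and do not reflect any platform’s actual system; the point is simply that nothing in the ranking formula measures accuracy.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments and shares signal stronger engagement
    # than likes, so they count for more. Accuracy never enters the formula.
    return post.likes + 2 * post.comments + 3 * post.shares

def rank_feed(posts):
    # Sort the feed so the most-engaged-with posts surface first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Herbal tonic cures Covid-19!", likes=900, comments=250, shares=400),
    Post("Experts: widespread vaccine use likely 12-18 months away",
         likes=300, comments=40, shares=50),
])
print([p.text for p in feed])  # the sensational claim ranks first
```

In this toy version, the false but sensational post wins the feed on engagement alone.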
YouTube in particular has come under fire for its algorithm, which, if left to its own devices, seems to serve up more and more incendiary content. And in 2016 and 2017, Russian bots spread misinformation through fake Instagram accounts.
On top of that, it’s not always easy to pinpoint false claims if they aren’t immediately outrageous or obviously extremist. Khalilslife’s video about coronavirus cures, for instance, is rooted in news reports. However, it lacks critical context.
Thea Barrett, a recent graduate and zoomer who works as a MediaWise intern, sees this often. Misinformation “presents itself in subtle ways, where it isn’t completely right or completely wrong. So, one can share something not thinking too deeply about the post but actually be spreading misinformation,” Barrett said.
When it comes to coronavirus, much of the misinformation spread on social media follows this pattern. In April, the Reuters Institute found that about 38% of misinformation regarding Covid-19 was completely fabricated, while the majority (59%) was real information that had been taken out of context or twisted in some way.
Social media at a boiling point
Historically, social media platforms have been criticized for their failures in identifying and taking down false information. Recently, the debate about social media’s role has intensified. The Wall Street Journal reported that top Facebook execs knew the platform’s algorithms were divisive and polarizing, but ignored this information. And Facebook CEO Mark Zuckerberg said he didn’t think “Facebook or internet platforms in general should be arbiters of truth.” In other words, according to Zuckerberg, social platforms shouldn’t be responsible for fact checking.
But the misinformation debate flared up in earnest when Twitter labeled two tweets from the president as “potentially misleading.” Twitter also took the unprecedented step of hiding a tweet from the president that seemed to call for violence against protesters in Minnesota. Twitter’s public interest notice said that Trump’s tweet “violated the Twitter rules about glorifying violence.” In response, the president issued an executive order that threatens to make websites and social platforms vulnerable to lawsuits.
Covid-19 has also been an impetus for platforms to take more action. Instagram added a note to its community guidelines about not permitting false information. YouTube announced it would take down unsubstantiated medical information about the virus. TikTok revealed a $50,000 research grant meant to “better understand the misinformation ecosystem on social media.”
Even so, plenty of out-of-context or wrong information remains online and is widely disseminated via social media. And experts don’t believe these platforms are doing enough: “They need to be less reactive and abandon this game of whack-a-mole unless they plan to significantly increase content moderation efforts which is clearly not happening in an effective way,” Byron said.
What about news organizations?
Gen Z consumes much of its news on social media. However, it is clear that some of that news is wrongly contextualized (or just plain wrong). So, how can credible news organizations engage this audience and deliver the facts?
It’s not always easy to compete for the attention of a generation that views reading the news as a chore, as reported by the 2019 Reuters Institute study. But the study also found young people are, at least, aware that “fake news” is a problem. They tend to trust what they consider reputable organizations, like the New York Times, Bloomberg, or the Washington Post.
For reputable news sites, this means adapting to the content formats that Gen Z-ers prefer. They gravitate toward short-form stories, videos, and illustrated formats. It also means delivering the news in a way that fits the platform, not simply an 800-word traditional news story lazily reformatted to fit a TikTok video or Instagram story.
Some independent initiatives are not only targeting their content at young people but also getting zoomers involved. The Poynter Institute, for instance, runs the MediaWise Project and its Teen Fact-Checking Network. Teen fact-checkers show their work on the MediaWise Instagram account, walking viewers step by step through how they evaluate “sus” information on social media.
Barrett, who started as a MediaWise teen fact-checker and became an intern, says this approach works. “Once you even get one person started talking about it, that has a ripple effect,” Barrett said. “I’ve seen that in my own life. Friends have started sending me claims and asking me if things are legit just because they know I do this work.”
That was, perhaps, one of Washington Post executive editor Marty Baron’s aims in his Harvard commencement speech last week. Baron addressed the college’s seniors, the oldest members of Gen Z. He reminded them that false information can directly impact issues of public health like coronavirus, leading to more hospitalizations and deaths.
“I would have settled for emphasizing that our democracy depends on facts and truth. And it surely does. But now, as we can plainly see, it is more elemental than that,” Baron said. “Facts and truth are matters of life and death. Misinformation, disinformation, delusions and deceit can kill.”