Journalists strive to present fair and balanced reporting. If they do their job well, they become respected authorities on their subject matter. In today’s environment of fake news and misinformation, this is both extremely challenging and important. So, how can they best do their jobs and break through the clutter? How can they rise above noisy false rhetoric and rebuild consumer trust in the media?
To explore these critical issues, researchers Hong Tien Vu and Magdalena Saldaña conducted a nationally representative survey among U.S. journalists. The study examines how journalists are evolving their practices given the increased volume of misinformation.
Double-checking sources more often.
Limiting the use of anonymity and identifying the sources of information.
Verifying information (e.g., data, raw footage) and incorporating it into their content.
Importantly, journalists today offer increased transparency into their work. They want to connect with audiences on a new level of accountability. Harvard Kennedy School professor Thomas Patterson offers similar insights in his book, How America Lost Its Mind. As Patterson states, “More harmful to our democracy is a cousin of conspiracy theories — misinformation. It also involves fanciful ideas about the actual state of the world, but it is far more widespread and a far greater threat.” Patterson believes journalists are gatekeepers of information. They have a responsibility to weed out false facts and call out false reports or unreliable sources.
The research from Vu and Saldaña demonstrates how journalists are trying to reengage with audiences to build trust by offering objective and accurate reporting. These efforts help to counterbalance the fake news and misinformation amplified on social media.
Journalists recognize fake news as a direct attack on our democracy. Their efforts in transparent practices are a helpful solution to curtail misinformation. Further, reporters with a strong base of online followers feel they have a responsibility to provide accurate information to their social media feeds. They also feel it is their duty to point out fake news and misinformation on social platforms.
This study offers an important understanding of journalists’ renewed focus on accuracy in the transmission of information. It illuminates opportunities for self-disclosure and social exchanges between the reporter and the consumer — be it a reader, listener, or viewer. This open exchange is vital in helping readers recognize quality journalism and premium publishers.
Social media continues to grapple with the spread of misinformation on their platforms. And consumers know this. Regardless, they continue to use social media as a primary news source. According to the most recent Pew Research Center survey, more than half of U.S. adults (53%) report that they get their news from social media “often” or “sometimes.” The survey was taken by nearly 10,000 U.S. adults.
Facebook ranks highest (36%) as the number one news source consumers use regularly among 11 social platforms. YouTube ranks second at 24% and Twitter ranks third with 15% of adults regularly getting their news there. Fewer consumers say they get their news regularly on Instagram (11%), Reddit (6%), Snapchat (4%), LinkedIn (4%), TikTok (3%), WhatsApp (3%), Tumblr (1%), and Twitch (1%).
Interestingly, despite the fact that they often find their news on social media, consumers question the accuracy of the news they get on these platforms. Approximately six in 10 consumers (59%) say that they expect the news on social platforms to be largely inaccurate. Unfortunately, the data shows little change over the last three years. Even after two congressional hearings, there’s still an abundance of misinformation about vaccines, Covid-19, and the 2020 presidential election on social media.
Social media does little to help consumers interpret the news. In fact, less than one-third (29%) of consumers believe the information they receive on social platforms helps their understanding of the news. Further, 23% believe the news on social media leaves them more confused and 47% report that it doesn’t make much of a difference.
More women than men (63% vs. 35% and 60% vs. 35%, respectively) use social media to access their news. However, Reddit has a distinctly different demographic: among its regular news consumers, two-thirds are men (67% vs. 29% women).
Consumers use social media as an easy and accessible path to news and information. However, this Pew study clearly shows consumers are aware of misinformation on social media. Increased awareness is a good thing and an important step toward exposing and defusing misinformation.
Social platforms continue to try to combat misinformation with fact-checkers and other programs. Twitter launched a new program, “Birdwatch,” which allows Twitter users to comment and provide context on tweets that they believe are misleading or false. Unfortunately, none of these programs are winning the fight against misinformation. A recent investigation of Facebook found 430 pages with 45 million followers monetizing misinformation with Facebook tools. Clearly, more needs to be done to stop the dissemination and monetization of misinformation on social platforms.
Deepfakes, manipulated videos synthesized by deep learning, are the newest tools in the misinformation arsenal. Easily accessible via open-source applications, they offer a cheap and efficient way to create deceptive videos of public figures. How powerful are deepfakes? New research finds that misinformation consumed in a video format is no more effective than misinformation in textual headlines or audio recordings; deepfakes are roughly as persuasive as those other formats, no more and no less.
The research used two tests to measure the effectiveness of deepfake messaging among 5,750 respondents. The first test was conducted in a social media feed environment, surrounded by regular social media content.
Respondents saw or heard either a deepfake video, a fake audio clip, an SNL-style exaggerated skit, or a text headline. Deepfake stimuli featured Senator Elizabeth Warren in several scenarios. In them, Warren:
Calls Biden a pedophile
Calls Trump a pedophile
Revives an old controversy about identifying with indigenous people.
Creates an unexpected controversy about LGBTQ lifestyle
Goes back on her political position that eliminating student loan debt for everyone is fair or realistic
In all, just under half (47%) of respondents believed the deepfake video was real. However, the deepfake scored no better or worse than the audio or text false messaging.
Further, the research delved into respondent characteristics (e.g., gender, income, political party and more) to see if any are predictors of susceptibility to deepfakes. The results showed no significant difference among deepfakes, false text, and audio misinformation. However, selective acceptance of information based on previous beliefs may influence an individual’s response to deepfakes.
The second test alerted respondents to look for misinformation. Participants were asked to identify whether a video was true or fake. In all, 55% of the videos were identified correctly. Interestingly, political orientation did have an impact here. Both Republicans and Democrats underestimated the authenticity of real videos when the content went against their party or candidate. They were much more likely to call a real video fake if it made their political leader or party look bad.
Seeing is not necessarily believing these days. Based on these findings, deepfakes do not facilitate the dissemination of misinformation more than false texts or audio content. However, like all misinformation, deepfakes are dangerous to democracy and media trust as a whole. The best way to combat misinformation and deepfakes is through education. Informed digital citizens are the best defense.
In an attempt to act responsibly, social platforms now flag content that is known to be false. However, flagging disputed content has some unintended consequences. False headlines that aren’t flagged are often thought to be true. In fact, according to new research on the Implied Truth Effect, conducted by Gordon Pennycook, Adam Bear, Evan T. Collins, and David G. Rand, false headlines that fail to get tagged are viewed as more accurate. Thus, the research appropriately questions whether the policy of using warning tags to fight misinformation is effective.
False news headlines with flagging
This research includes two studies. In the first, the control group was shown both true and false news headlines without any warning labels. The test group was shown both true and false headlines; the false news headlines included warning labels. Participants were asked how accurate the headlines were and whether they would consider sharing the story on social media (such as Facebook or Twitter).
The first study confirms that warning labels decreased belief in flagged items (the Warning Effect) but increased belief in untagged items (the Implied Truth Effect). In other words, headlines that were not flagged in the test group were rated as more accurate, by at least one-third, than those in the control group.
True news headlines with flagging
The second test included a control group and two test groups. Participants were presented with news headlines and asked whether they would share them on social media. They were told that 75% of the headlines had been fact-checked by Snopes.com.
The control group was shown true and false news headlines without any labeling. The first test group was shown false news headlines, half with a “FALSE” stamp and the other half without any stamps. The second test group was shown false news headlines with a “FALSE” stamp and true news headlines with a “TRUE” stamp.
The findings show that participants in the first test group were less likely to consider sharing false headlines tagged with a warning compared to false headlines in the control group. Further, participants in the second test group were more likely to consider sharing true headlines tagged with a verification compared to true headlines in the control group.
This research identifies the consequence of attaching warnings to some inaccurate headlines but not all. It’s safe to assume that a large percentage of false headlines will continue to appear on social platforms and remain untagged. However, it is important to note that it may be even more valuable to tag true headlines.
Labeling truthful and verified headlines helps consumers identify what is true and suggests that everything outside this stamp is potentially false. While this research is experimental in its design, it’s important to take next steps to explore the impact of full stories. Future work should investigate the impact of warnings on users’ likelihood of clicking through to read full articles, and the impact on sharing after reading the article.
At the dawn of the new decade of 2020, DCN members gathered at the Mandarin Oriental Miami January 16 and 17 to network, discuss victories and challenges as media companies evolve, and explore industry predictions.
A new decade calls for a perfect ‘20/20’ vision, said Jason Kint, CEO, Digital Content Next, as he kicked off the closed-door, off-the-record gathering. That vision encompasses continued focus on audience desires, pushback against the myth that all content has to be free, and the elevation of trust and transparency in an era marked by ‘fake news’.
The European Union’s recently enacted copyright law is a win for the industry, with similar discussions expected this year in the U.S., noted Kint. Federal and state investigations, as well as emerging regulations, are all good signals toward protecting consumer privacy, regulating data use, and addressing antitrust concerns, he noted.
Media companies can also expect a steady rise in content investments. UBS estimates that in 2020, a combined 16 media firms will spend $100 billion to produce content. More than $35 billion will be allocated to streaming video content, as new players such as Disney Plus and NBC’s Peacock emerge.
“I’m feeling really good this year about where things are headed,” said Kint.
Jim Bankoff, CEO, Vox Media, said he valued being at the DCN Summit. He described it as a place where premium publishers come together to “find ways to partner and to check our healthy, competitive impulses … and figure out ways to work together” in the wake of ceding ground to third-party big tech platforms and ad networks “that have proven time and again not to have our best interests in mind.”
Investigative journalist Carole Cadwalladr, who freelances for the Guardian and Observer, captivated the audience by recounting her experiences unearthing the activities of Cambridge Analytica and Facebook. She was nominated for a Pulitzer Prize for her work, which sparked international investigations and inspired the Netflix documentary, ‘The Great Hack’.
“It was my introduction to this world of creepy disinformation, but also complete reluctance from the platforms to even acknowledge the problem, let alone deal with it,” she noted. She was instead subjected to legal pushback from Google and Facebook, as well as online bullying.
Cadwalladr also called for media companies not to compete against each other. Instead, she encouraged those in the room to join together to “compete against lies and falsehoods. We’ve seen it in Britain and you’re next,” she said.
Scott Galloway, professor, NYU Stern School of Business, said he believes that the big tech companies on the antitrust radar should be broken up. Monopolies kill economic growth and are a “key step to tyranny,” he contended, adding that a co-opted government can’t serve as a dominating force for good.
Galloway pointed out that efforts to regulate the behavior of big tech through fines have been largely ineffectual. To date, the fines haven’t been punitive enough to dissuade the big tech companies from continuing their behavior, he said. He also criticized the federal government for being slow to act.
Monetization and concerns about subscription fatigue were recurring themes at the summit. Yet DCN research shows that younger audiences in particular appreciate the value of a subscription and finds that there is still consumer appetite for subscription products.
Jonah Peretti, founder and CEO, BuzzFeed, noted that over the course of a few short years, the company has begun to generate significant licensing revenue from Facebook, Google, Amazon, and Netflix.
“I don’t think Facebook or Google wants to buy news companies,” said Peretti. Of the platforms’ movement toward paying for content, he said, “They get the benefit of sharing some of the costs of the production of that content. News is a great way to drive repeat visitors and to build trust in the platform to avoid some of the problems of misinformation.”
Kevin Turpin II, president, National Journal, noted that his longstanding publication adapted to the changing media landscape by transforming itself from a media company into a government research and consulting services company for which subscribers are willing to pay premium prices.
Jim VandeHei, co-founder and CEO, Axios, and executive producer, AXIOS on HBO, said, “You have to deliver content in a way that I would deliver in a conversation with you over a drink, like what is new.” However, to create value: “Tell me why it matters. Give me some context. Give me the power to go deeper.”
For Complex, the path to success hasn’t been simple. Rich Antoniello, CEO and founder, Complex Networks, said, “We call ourselves a brand that happens to monetize through media.” He said his company shifted away from an ad-dependent model in 2016, ahead of the curve.
One example is the wild success of its “Hot Ones” program. It features 10 questions for its celebrity guests that get progressively more personal, along with hot sauce that gets progressively hotter. And the business model is based not on advertising, but on sales of high-margin hot sauce.
Antoniello also outlined the success of ComplexCon, the company’s flagship event, which connects cultural icons with fans who spend $100 to $700 for VIP tickets, with hundreds of thousands sold. Fans also snap up merchandise from Complex and its app-based vendors such as Nike and Adidas.
The power of fandom arose again when Howard Mittman, CEO, Bleacher Report, spoke of how his company’s app and successful franchises attract sports fans. He described how individual athletes hold more sway over fandom habits than sports teams. Some 10 million fans have signed up for alerts, and the app accounts for half of the company’s user engagement. Bleacher Report’s focus is not on breaking sports news, but on creating engagement on its own platforms, according to Mittman.
Meredith continues to go through cultural shifts toward diversity, both in company staffing and in targeting readership such as women. “Women are generally not seeing themselves in media and advertising to the extent that they should be,” said Catherine Levene, president and chief digital officer, Meredith National Media Group. “We have been the first to support #SeeHer, a national organization committed to accurate representation of women in media and advertising,” she said. She added that’s not only good for supporting women, but also for the bottom line. Women who see themselves in media and advertising are 45% more likely to recommend a product to a friend and purchase it, said Levene.
Despite the controversy it has attracted from those who question the veracity of its science, Gwyneth Paltrow’s Goop brand is growing, noted Elise Loehnen, chief content officer. The platform embraces several media forms and covers topics from relationships to health, including alternative therapies. She said that the controversy has been good for keeping the brand at the forefront of popular culture. “Women are tired of being talked down to,” said Loehnen. “We’re a strong female brand undisturbed by the chaos.”
Rishad Tobaccowala, chief growth officer, Publicis Groupe, noted that the only way to get ahead as a legacy company is to “kill your core. You have to rethink your entire business.”
Levene from Meredith believes that the mobile world and 5G will create an even greater market for video. And, with 50% of searches conducted on the more than 200 million voice-enabled devices in U.S. homes, opportunities and challenges will arise.
Google’s action to purge third-party cookies, against the backdrop of GDPR and CCPA, will impact the entire digital ecosystem, Levene noted.
“Data is going to be the currency of the future. Those who have it at scale and the ability to drive a lot of insights from it are going to win,” she added.
In a social media environment that is being blamed for everything from decreasing personal contact to radicalizing disaffected youth and intensifying suicide rates among girls, Tatyana Mamut, head of product, Nextdoor, made the case that her platform is creating connections on a micro-level, one neighborhood at a time, even as people hardly know their neighbors.
“I believe that kindness is the next big thing in tech,” she added.
Palo Alto journalism educator Esther Wojcicki made the case that helicopter parenting has impacted the workforce and its ability to embrace risk and innovation. She calls for parenting – and management – to embrace trust, respect, independence, collaboration, and kindness. She also promotes the idea that every student should take a journalism course to build media literacy skills.
The future will be fraught with change. And as Tobaccowala pointed out, “human beings know how difficult change is.” But to survive, media companies must continue to evolve.
“We have the power to shape minds and hearts, to fill the world with laughter and tears, to inform the truth,” said Kint. “Here’s to 2020 bringing the roar of the crowd as we focus on what matters most: the audiences we serve.”
An alarming number of consumers don’t trust the media. Since trust hit its all-time low in 2016, the industry has been hard at work restoring this critical factor. The media industry and social platforms now employ a wide range of approaches to address the proliferation of inaccurate and misleading stories. Some media brands have undertaken marketing and educational efforts to make the connection between brand and quality of information more explicit. And labeling has been used as a means to help consumers quickly identify the source, and type, of information they are viewing.
This last approach — labeling — takes a classic print strategy and brings it into the digital medium. A new study from The Center for Media Engagement (CME) set out to evaluate the effectiveness of labeling stories. Unfortunately, the primary takeaway is that labeling alone does not improve consumer trust in the information before them. In fact, most of them don’t even notice labels or recall them accurately after reading an article.
Labels alone will not build trust
However, this is not to suggest that labeling should be abandoned altogether. Upon deeper inspection, the research found that some labels work better than others. The research also suggests that, when conceived of as explicit and even educational, labels may be effective as part of an overall trust-building strategy.
Does the in-story explainer label work better than the above-story label?
Key findings from the research:
Labeling stories did not affect trust.
Nearly half of the participants did not notice whether the story was labeled.
Those who reported seeing a label were not particularly accurate in recalling the type of label. Of the two labels, recall was better for the in-story explainer label.
Clearly, the research demonstrates that people glaze over most story labels (i.e., news, analysis, opinion, sponsored) if they notice them at all. Overall, 45% reported that they did not notice whether an article was labeled or not, and that percentage did not vary depending on whether the article actually was labeled.
More concerning was the finding that, when asked, most people believed that the article was labeled news. This is a potentially problematic default assumption given efforts to use labels to prevent the spread of disinformation and to help consumers distinguish opinion, commentary, and satire from hard news and analysis.
The research analyzed whether the effect of the story labels on the ability to recall the label varied based on participants’ backgrounds, including their age, race, ethnicity, education, income, gender, political ideology, and political partisanship. It is interesting to note that only one variable seemed to matter: age. The younger the participant, the more likely they were to recall the label correctly when the story was labeled news or opinion.
CME also compared the traditional above-story label to an in-story explainer label and no label at all. The in-story explainer label provided definitions of each label based in part on those proposed by the Trust Project.
Overall, the study found that in-story explainer labels increased the likelihood that people would recall the correct label compared to those who did not see a label and those who saw above-story labels. However, many people still failed to recall whether the story they read was labeled or not.
Sixty-three percent of those who saw an article without a label said they did not recall whether there was a label (25% correctly recalled that it was not labeled). Fifty-eight percent of those who saw an article with an above-story label could not recall whether the article was labeled (24% correctly recalled what the article was labeled). On a more encouraging note, only 24% of those who saw an article with an in-story explainer label failed to recall whether the article was labeled, and 66% correctly recalled the article label.
More work to be done
Unfortunately, regardless of label type, the use of labels alone did not improve consumers’ view of the information’s trustworthiness. Past research from CME suggests that a combination of strategies to signal trust – such as story labels, author biographies, and descriptions of how the story was reported – can increase trust.
Given readers’ digital consumption habits, the low recall for labels, particularly those placed above the story, is a significant finding. Other efforts, such as describing how a story was reported, in conjunction with the finding that explainer labels are somewhat more effective, suggest that transparency and consumer education will be critical in restoring trust in digital media.
Most academic, media, and political analysts forecast that disinformation will play a large role in the upcoming 2020 presidential election. The NYU Stern Center for Business and Human Rights echoes this projection in its new report, Disinformation and the 2020 Election: How the Social Media Industry Should Prepare. In fact, the analysis predicts that more disinformation will be generated stateside than by foreign entities.
What to expect in 2020
Deepfakes increase in volume. Deepfakes are easier to produce now due to advancements in deep-learning and editing systems. A deep-learning system produces a persuasive fake video by studying photographs and videos of a target person and merging them with images of an actor speaking and behaving in the same manner as the target. Once a preliminary fake is produced, a method known as generative adversarial networks (GANs) makes it more believable. The GANs process detects any inaccuracies and corrects them. After a few rounds, the fake video is complete and ready for amplification.
Disinformation spreads to the political left. While domestic disinformation comes most often from the political right, the left is also engaging in its creation and spread in social media.
Misled Americans staging events. Americans are now being recruited by Russian organizations to stage real-life activities that spread disinformation. From deceptive IRA social media personas to anti-Muslim and pro-Muslim demonstrations, these events are promoted online to American followers to attend and draw media coverage.
Instagram, owned by Facebook, will be used more to spread disinformation. Image and video services are ideal for spreading disinformation via memes: photos combined with short, punchy text, and video clips.
WhatsApp, also owned by Facebook, will be used to amplify disinformation. WhatsApp was used to send false content to large populations in the elections in Brazil and India. It could be a very active force in the U.S. 2020 presidential elections.
Increased international activity. Not only is Russia involved in the creation and spread of falsehoods but Iran and China are also suppliers of disinformation.
Digital voter suppression continues as a threat in 2020. According to the University of Wisconsin, users tried to suppress voter turnout in 2018 by creating Twitter campaigns. One post tried to give Trump opponents incorrect voting-day information. Another tried to intimidate liberal voters by saying that NRA members and Republicans were bringing their guns to the polls.
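The adversarial loop behind GANs can be sketched in miniature. The toy below is a hypothetical, one-dimensional stand-in (not a real deepfake pipeline, and all names are illustrative): a “generator” outputs a single number, a “discriminator” learns what real samples look like, and each round shrinks the detectable gap between fake and real.

```python
import random

random.seed(0)

real_mean = 1.0  # the "real data" distribution is centered here

def real_sample():
    # Real "data": noisy samples around real_mean
    return real_mean + random.uniform(-0.1, 0.1)

g = 0.0           # generator's sole parameter: the value it outputs
d_estimate = 0.0  # discriminator's learned notion of what "real" looks like
lr = 0.05

for _ in range(500):
    x = real_sample()
    # Discriminator step: refine its estimate of real data.
    d_estimate += lr * (x - d_estimate)
    # Generator step: nudge output toward what the discriminator
    # currently accepts as real, shrinking the detectable gap.
    g -= lr * (g - d_estimate)

print(round(g, 2))  # should land close to real_mean
```

Real GANs replace both numbers with deep networks and the "fakeness score" with a learned classifier, but the back-and-forth dynamic, the detector improves, then the forger improves, is the same one the report describes.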
Responding to disinformation
Social media companies have put a few new measures in place since 2016 and 2018. They are now communicating more with each other, the government, and outside experts in an effort to address disinformation. Regardless, more has to be done to prepare for 2020. The NYU Stern Center for Business and Human Rights suggests the following recommendations for social platforms:
Detect and remove deepfakes. Improve efficiency in removing deepfakes before they do their damage.
Remove content that is provably false. Purge content that is definitively untrue.
Hire a content overseer. Hire a senior official to oversee the process of guarding against disinformation.
Make changes at Instagram. Act assertively to protect users from disinformation. Currently, Instagram doesn’t remove disinformation from a user’s main feed or down-rank it when it is found. While the service does make it harder for new users to access the false content, that’s not a forceful enough action to stop the problem.
Limit the reach of WhatsApp. WhatsApp now limits the reach of a message to 1,280 users (5 chat groups x 256 members each) versus the previous maximum reach of 65,536 users (256 chat groups x 256 members each).
Defend against for-profit disinformation. Social platforms need to pay close attention to false content distributed by corporations, consultants, and public relations firms. Many companies specializing in clickbait run successful businesses by luring naïve and curious consumers attracted to conspiracy theories and fake items.
Support legislation on political ads and voter suppression. Push the Senate to approve the Honest Ads Act, which would extend political-ad disclosure standards to online platforms.
Improve industry-wide collaboration. Form a permanent inter-company task force devoted to fighting disinformation.
Teach social media literacy to the public. Educate users about questionable content and what to do if they come across it.
The business of publishing disinformation, inaccurate information spread purposefully and/or maliciously, is more profitable than ever according to a new study from the Global Disinformation Index (GDI). Analyzing website traffic and audience information from 20,000 domains it suspected of disinformation, GDI estimates the sites generated at least $235 million in ad revenue. GDI is a nonprofit that evaluates and rates websites’ risk of spreading disinformation.
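To put GDI’s $235 million estimate in context, a back-of-the-envelope average per domain can be computed directly. This assumes an even split across the 20,000 domains, which real ad revenue almost certainly isn’t; it is skewed toward a handful of high-traffic sites:

```python
# Naive per-domain average of GDI's lower-bound estimate (illustrative only).
total_ad_revenue = 235_000_000  # dollars, GDI's "at least" estimate
domains = 20_000                # suspected disinformation domains analyzed

avg_per_domain = total_ad_revenue / domains
print(avg_per_domain)  # 11750.0 dollars per domain
```

Even this unrealistic flat average, nearly $12,000 per site, shows why clickbait disinformation remains an attractive business.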
A fertile environment
Disinformation predates the internet. However, social platforms offer a new level of amplification. Social media, coupled with a programmatic marketplace, provides the perfect environment for malevolent actors looking for reach, a target audience, and revenue.
Ad-tech companies’ opaque practices make the environment even more attractive. Some connect buyers and sellers while others collect, aggregate, package and sell data. The result is a black box of operating systems. These shifts in the digital ecosystem offer a golden opportunity for marketers of fraud and disinformation.
Disinformation actors rely on one another to amplify their messages. Interfering with one actor could potentially make it more difficult for the others to spread their disinformation. However, despite their inter-reliance, disinformation actors each have a distinct focus:
State actors include governments as well as state-linked actors that spread inaccurate information and promote government propaganda. They are centralized actors using digital virality to amplify their message.
Private influence operators are for-hire companies (e.g. Cambridge Analytica) that run commercial marketing and public relations campaigns that aim to disinform the public. They use targeted campaigns to identify a specific psychological, behavioral or politically affiliated audience to amplify their message. Their misleading and false content sites look professional. These ad-supported domains mimic traditional journalism.
Grassroots trolls are individuals or groups that band together around a specific issue or cause. Their content and activities often focus on hate speech or try to push a false narrative. Their messaging often starts out on forums like 4chan or 8chan, moves to intermediate platforms like Reddit, and then finally makes its way into mainstream media.
Pure rent-seekers are all about clickbait. They churn out sensational disinformation to drive visitors (and bots) to click on their site in order to collect revenue.
It’s not surprising that today’s digital marketplace offers an effective delivery system for disinformation actors. Unfortunately, the internet is filled with disinformation that is rapidly amplified via social media. And it is human nature to find drama attractive. Disinformation is loud content that demands our attention. It also claims advertising dollars in an efficient and expedited programmatic manner. As an industry, we need to de-incentivize disinformation actors by removing financial and amplification motivators.
Disinformation comes in all shapes and sizes. Whether it takes the form of a text-based article, a meme, a video or a photo, it is designed to go viral across message boards, websites and social platforms like Facebook, Twitter and YouTube. And it’s polluting the internet. In fact, an Oxford Internet Institute study found that in the 30 days leading up to the 2018 U.S. midterm elections, a full 25% of Facebook and Twitter shares contained misleading and deceptive information claiming to be real news. Addressing concerns about domestic disinformation, Paul Barrett, of the NYU Stern Center for Business and Human Rights, identified the steps social platforms need to take to stop the problem in a new report, Tackling Domestic Disinformation: What the Social Media Companies Need to Do.
In the report, Barrett cites an MIT study that analyzed every English-language news story distributed on Twitter over an 11-year period and then verified the content of each story as either true or false. The study found that, on average, false news is 70% more likely to be retweeted than true news. Where are all these falsehoods coming from? A Knight Foundation study found that 65% of fake and conspiracy news links on Twitter could be traced back to just 10 large disinformation websites (e.g. Infowars).
Domestic disinformation is a constant in today’s digital experience. While there are many who call for its removal, others believe that it’s difficult to differentiate it from ordinary political communication protected by the First Amendment. Importantly, Barrett does not suggest that the government determine what content should be removed from social media. He believes social platforms can make better choices in determining whether content is accurate, and also in how they promote and rank it.
Practices in place
Social platforms use machine learning to improve their ability to identify false stories, photographs, and videos. In addition, while Facebook previously flagged content to warn readers that it was potentially false, it now offers “Related Articles,” a feature that provides factually reliable context about misleading stories. YouTube offers a similar program. When a user searches for topics that YouTube identifies as “subject to misinformation,” it prefaces video results with a link to information from reliable third parties. Even with these efforts, disinformation remains available on these platforms for anyone to view and share. Social platforms’ current practices are not enough in the fight to reduce disinformation.
Barrett’s report recommends that the social platforms:
Remove false content. Content that is proven to be untrue should be removed from social media sites, not just demoted or annotated.
Clarify the principles for content removal. Offer insight and transparency into what constitutes facts and rational argument versus the manipulation of information for disinformation purposes.
Hire a senior executive who has company-wide responsibility for combating false information.
Establish a more robust appeals process. Offer an opportunity for an appeal to a person or people not involved in the initial content removal decision.
Step up efforts to purge bot networks. Increase efforts to eliminate automated accounts that imitate human behavior online.
Adjust algorithms to reduce the monetization of disinformation. Doing so will diminish the incentive and, therefore, the amplification of fake news.
Provide more data for academic research. The platforms have an ethical and social responsibility to provide the data they possess to facilitate studies and the tracking of disinformation.
Increase industry-wide cooperation. Establish a data exchange and offer best practices across platforms to ensure common challenges are addressed.
Boost support for digital media literacy. Teaching people how to identify disinformation and be more discriminating consumers of online content should remain a priority.
Sponsor more fact-checking and explore new approaches to authenticate news content. Continue fact-checking efforts, a crucial first step to distinguish between truth and falsehood.
Support narrow, targeted government regulation. Identify specific content regulations similar to the degree of disclosure for online political advertising currently required for traditional broadcast media.
Barrett concludes that “neither the First Amendment nor international principles protect the lies on social media.” It’s essential for social platforms to step up their role in self-governing to ensure disinformation is not monetized or worse, used to manipulate people and trigger violence. Importantly, humans must remain in control of platforms, overseeing the impact of AI in all its forms. Scrutiny and transparency are key in uniting efforts to dismantle the prongs of disinformation.
On January 29th, at the 2019 DCN Next: Summit, Rappler CEO Maria Ressa outlined the role social media and concerted, well-orchestrated disinformation campaigns played in perpetuating false information and media distrust in the Philippines, as well as attacks aimed at Rappler.
She then went on to have a wide-ranging discussion examining the various pressures on media credibility (and safety) worldwide with interviewer extraordinaire Kara Swisher, Co-founder of Recode.
Less than two weeks later, on Wednesday, February 13 at 5 p.m. local time in Manila, plainclothes officers from the National Bureau of Investigation, an agency within the Department of Justice, arrested Ressa on charges of cyber libel. As Ressa wrote in a statement: “We are not intimidated. No amount of legal cases, black propaganda, and lies can silence Filipino journalists who continue to hold the line. These legal acrobatics show how far the government will go to silence journalists, including the pettiness of forcing me to spend the night in jail.”
The Board of Directors of Digital Content Next (DCN), a trade association representing nearly 80 high-quality media companies, said, “The arrest of Maria Ressa is deeply troubling. Maria traveled to the U.S. to share her developing story with our members only two weeks ago. It is vital we value and protect the independence of media organizations and journalists around the world. Any effort to silence journalists or use intimidation to reduce their reporting is an affront to freedom. We encourage global leaders and the press community to make it clear this cannot be tolerated.”
In light of Ressa’s arrest, and to reinforce our support of a free press everywhere, DCN is pleased to share the video of Ressa and Swisher’s interview (full transcript below):
And, for those who would like to show support for Rappler and Ressa’s work, she has provided a link to their crowdfunding page.
Below, we’ve shared a full transcript of Ressa’s conversation with Swisher.
Alexandra Roman: [00:00:00] I am truly honored to introduce this next conversation: interviewer extraordinaire, Recode’s Kara Swisher. She’ll be speaking with a very special person in our world these days. Named Time magazine’s Person of the Year as one of the guardians of journalism, please welcome the CEO of Rappler, Maria Ressa.
Kara Swisher: [00:00:33] So we’re going to start…first Maria is going to make a presentation then we’re gonna have a full fantastic discussion. Maria was on my podcast recently. It was, it was an amazing experience for me and I’m so glad she’s here and safe in the United States right now. We’ll be talking about that more. But first Maria go ahead.
Maria Ressa: [00:00:52] So I like that Jason [Kint, CEO of DCN] talked about trust. And this is stuff I’ll show to you from our perspective in the Philippines, because it’s got the data to prove the thesis. And then I think you guys are not quite… I think you’re not seeing the termites eating at the credibility that you have as news organizations, and those termites are coming from geopolitical power plays. We go back to: information is power. And with that, let me show you what’s happened in the Philippines.
January last year, there were two surveys that came out at exactly the same time but with almost completely opposite results. The top is the real-world Pew Global Attitudes Survey: How do Filipinos look at traditional media? It came back with 86 percent saying traditional media is, and the right quote is, “fair and accurate.” But the Philippine Trust Index, which is part of the Edelman Trust survey, came out with a survey that same month a year ago. They asked people on social media and came out with 83 percent who “distrust traditional media.” Right. So how did that happen? We tried to figure out: why is the world upside down? That’s really the question, right? Why is the world upside down?
We have a database that we started gathering in July of 2016, when the drug war began in the Philippines, because the attacks all came on social media. In our case it’s Facebook. But this is a timeline of attacks on traditional media, and on Rappler, because we were the main focal point for a period of time, [which] started January 2015 and then moves to April 2017. January 2016 was when the campaigns began and the social media machine of then-Mayor Duterte. He was elected to office May 2016. You see that one? And you can see the fracture line. Bayaran means corrupt. Bias. So Bayaran is the one in the middle, the first long line, and bias is the last one. If you look at that, it’s a fracture line of society, right?
There were mentions before, but it was constantly pounded until it became a straight line after President Duterte was elected. The weaponization of social media happened after he was elected, because it was repeatedly pounded until it became fact. A lie told a million times is truth.
Right. So, then what happened? Here: This is the database I was telling you about, right? We call it The Shark Tank. The one on your left is the URLs that are spreading fake news in the Philippines. The middle column is the Facebook pages that are spreading that content. And I always look at the average reposting time, which is the one all the way to your… my right, sorry, it’s flipped.
I want to show you when the real attacks began against Rappler, and it was after we came out with a three-part series on the weaponization of social media. It was October 2016. I went to Facebook with the data August 2016. So, October 2016, this is what it looked like. In October 2016, if it’s more than 10 times reposting, it turns red. You can see how it turned red. This Facebook page, Sally Might Die, accomplished its goals by April 2017. It’s been deleted from Facebook, but you can see… This was something we created for our social media team, so that you can see it’s a cut-and-paste account. And they post; look at how many times they post in one day! Each one of those squares is just one day. And this is where they post: the groups. They posted to go viral in the campaign pages of Duterte and Marcos, the son of former President Ferdinand Marcos.
I’m going to just show you the last thing, which is how we can figure out who’s attacking us. Well, you can gather the data and it looks like this, but if you put it in a network map, it looks like this. This is the network that was attacking Vice President Leni Robredo about a year ago, and it is the same network that constantly attacks me, Rappler, and every traditional media. It is so systematic that the content creators of the network are broken down by demographic. For the Motherland is pseudo-intellectual and tries to target the one percent. The middle class is targeted by Thinking Pinoy, and the mass base is the Mocha Uson Blog; she is a former singer and dancer. They built her Facebook page around her singing group, the Mocha Girls, who do pillow fights every Sunday. That was how they first built her Facebook page. Then she became the head of social media for the presidential palace and it became a whole other thing.
Anyway, you can see this is what attacks journalists systematically. And it happens so many times. I just want to show you one last thing, which is something we did for Rappler: natural language processing to pull out… So, we looked at the entire LexisNexis to try to figure out: What do we need to learn? What is the data telling us about the articles that were written about us at the time when I was about to come home to file bail? Yeah, I had an arrest warrant then.
Right. So, the Philippines wrote 34 percent of the stories. The United States wrote 27 percent. You guys are a potent force for us. But what was most interesting is that the Filipino stories are part of the reason it’s in a line like this, because they essentially just regurgitated the press release of the Department of Justice. It was the American news organizations that talked about it as a Duterte rights crackdown. That wrote about it in context. That was an amazing thing. I want to leave you with… sorry, wrong slide, I think. I want to move forward. I want to leave you with this: information warfare. Yeah, I guess this is the right one.
So, with information warfare, I’m going to bring it to Russia. Dezinformatsiya. This was really interesting because… For Duterte to end the drug war… Sorry about that, my slides were… OK, so… I don’t know if you remember Yuri Andropov. He was the former KGB chairman. This quote stuck with me because it fit the Philippines: “Dezinformatsiya works like cocaine. If you sniff once or twice, it may not change your life. If you use it every day, though, it will make you into an addict. A different man.” I think this is the impact on our democracies, and we’ve seen it.
The first reports came out in November of 2017, saying that cheap armies on social media are rolling back democracies all around the world. And at that point it was something like 28 countries. By last year, it was 48 countries. It’s doubling. We started looking at Ukraine to try to understand how we can use the data the way Ukraine started fighting back. It is information warfare. It is political. It is about power and the money part of it… or the people who are actually catering to the politicians. Russia-backed Facebook posts: this was November of 2017, the first time that I saw Americans really starting to look at it. But even when I saw this, OK, they reached 126 million Americans, I think what people missed is that it happens all the time. It wasn’t just ads; it was all the time…
I talk about termites. This bot is interesting to me because it tweeted about U.S. elections. First, remember the Philippine election of Duterte was one month before Brexit. After Brexit, there were U.S. elections and then the Catalan elections. This little bot Ivan tweeted about all of those. So, we found him from the Catalan elections. And when I looked at his account, it was specifically only tweeting about the Philippines. When we posted this story, within 24 hours Twitter took his network down.
On Facebook, this is the last part I want to show you, the most recent thing that I found fascinating. In December, two groups came out with reports based on data that was given to the US Senate Intelligence Committee. This is the chart that is from New Knowledge. And this thing at the bottom: I want to show you the connection between the Philippines and that chart. It’s this: So we tend to map the networks around us. Let me just try to get this so that you can see it. There. This is the attack network. Not connecting.
OK. So, this attack network was from November. Sorry, it’s frozen. There. Yay. OK. November 8 to December 7th. This network. And you know what I used to map the network? This free tool called Flourish. It’s a startup. This is little Rappler. And what’s so interesting, and this is where I will make the pitch that I don’t think we have any other choice but to actually collaborate together: Rappler is here. This, all of this, is a disinformation network that’s attacking us, and you can literally see it, right?
But what’s so interesting is that in the Philippines this overshadows the information landscape. The traditional media groups are so set aside, they’re desperate. I’ve been trying for the last two years to get our top television networks and our newspapers to work together, like retweet and re-share each other, so that we can rise up together in the algorithms. We refuse to do it because people think it’s competitive. But you know what? You’re competing against disinformation, not against each other now.
I want to show you this, and I’ll end with this one… so this disinformation network is so interesting, right? But this is the most fascinating one. When we saw this, I was surprised, because this was created a year ago. It’s only one year old, thedailysentry.net. And yet, the larger the circle, the larger the eigenvector centrality, the more powerful the account is. This has exponential pushes behind it. What’s interesting about it is that this is the first time we saw a direct connection to the Russian disinformation landscape, because thedailysentry.net uses “experts” in quotes from this network. Sorry, I can’t do the thing, but on that chart there is an American man who’s often interviewed by Sputnik, by Iranian television… His name is Adam Garrie. He is now an expert who’s popping into the Philippine ecosystem. He came in through the Daily Sentry, and from there he jumped into traditional newspapers. There’s a direct link to him because he writes for globalresearch.ca, a group in Canada, connected to two other groups: one is eurasianaffairs.net. Another site. Both of which come from a Russian IP address.
All that data in the chart came from the data that was given to the Senate Intelligence Committee and published last December. This is what’s happening in my country. I think you’re finding out what’s happening in yours. But I think we’re only a small case study of what is happening globally, and that scares me.
Kara Swisher: [00:14:13] OK. All right. So, how was prison? [laughter] No really. How was prison?
Maria Ressa: [00:14:20] I, oh, I hope I won’t get there but you know…
Kara Swisher: [00:14:25] You were arrested. Explain what happened to you. We did a podcast and I said you should not go back to the Philippines because you will be arrested. And what happened?
Maria Ressa: [00:14:32] Of course I went back. Right. But I wasn’t arrested. OK, I thought I would be, so our lawyers told me… My flight arrived on Sunday night at 9:30 p.m. The court, which is supposed to be an all-night court, well, it closes at 9:00 p.m. So, if they had picked me up that night, I couldn’t have filed bail until Monday morning when courts opened. In the Philippines, if you have an arrest warrant, you’re not told you have an arrest warrant. They just come get you. I came home and I wasn’t going to change anything, and it went OK. I filed bail. I posted bail five times, actually, in that…
Kara Swisher: [00:15:16] But to be… And you weren’t actually arrested.
Maria Ressa: [00:15:18] No, I wasn’t arrested. I wasn’t arrested.
Kara Swisher: [00:15:20] Please explain to everyone here who doesn’t know why they [are going to] arrest you. What are the charges?
Maria Ressa: [00:15:26] Well, the charges are ludicrous. Tax evasion. It’s really one event, the same event, that I have four other cases on. They’re alleging, the government is alleging, that I am working for, well, that Rappler is owned by Americans, one, and that I am essentially working for them to take down the government. Very Putin-esque. None of that is true. And then on top of that, the arrest warrant came from taking that same charge: the investment instrument that we used, which was constitutional. They then decided that we didn’t pay the right taxes. And the reason why they said we didn’t pay the right taxes was because they reclassified Rappler into a stock brokerage agency.
Kara Swisher: [00:16:16] Rather than a journalist.
Maria Ressa: [00:16:17] Rather than a newsgroup.
Kara Swisher: [00:16:18] Right.
Maria Ressa: [00:16:19] And that’s what I have to post bail for.
Kara Swisher: [00:16:21] The reason I’m asking what this is: I want people to understand how people can use social media to create trumped-up charges and then arrest you for them.
Maria Ressa: [00:16:32] Well, it is interesting that you said that, because all of these charges I laughed off, because they first appeared on social media. And they were thrown at me: CIA, you’re a foreigner. I am a dual citizen. But, all of that. Like termites, you know, they just came at it, and then a year and a half later it comes out of President Duterte’s mouth during the State of the Nation address. You know, I am a journalist; I’m covering the State of the Nation address. And then President Duterte says: look at Rappler, they are American. So, then I just tweeted back: President, no, we’re not owned by Americans.
Kara Swisher: [00:17:12] Right, right. So, let’s talk about the state. Well, last we talked, you made a very passionate plea to Facebook to do something about what’s happening. What you’re showing here is essentially organized disinformation campaigns to pull you down because you’re doing critical coverage of the president in the Philippines. And so they’re employing a very slow-moving but powerful network to do so. And, as you said, most people, not just in the Philippines but across the world, get their news from Facebook. This is the purveyor of news. And these malevolent forces have created pages and news organizations and fake organizations to try to battle that. Talk a little bit about that, about where you are right now. Because at the time, you were sort of subject to the biggest news organization being used to attack you.
Maria Ressa: [00:18:06] OK. So I think that there’s a whole information ecosystem that has been manufactured, and it is manufactured reality. And we went down to a point where we were looking at, you know, how powerful is it really. We manually counted the impact of 26 fake accounts. 26 fake accounts can actually reach up to three million other accounts in the Philippines. We were the first targets because we exposed them. I was so naive.
You know, I thought, wow, we can just do a hashtag no-place-for-hate campaign and people will come back, because you think these are real people. They are not. And after we did that, we became the target. And as you saw in the first slide, it’s not just us, it is traditional media, because the main goal is to kill any trust in any institution that can push back.
All we have done is challenge impunity. Impunity here in information warfare and impunity in the drug war. You don’t know how many people have been killed in the Philippines during this drug war because they keep changing the numbers. At the most recent count, the Philippine police will admit to killing 5,000 people. Even that number alone is huge compared to the fact that 3,200 were killed in nine years of Marcos rule. Right. But there’s this other number they never roll out. It’s the homicide cases under investigation, and there are 30,000 people who’ve been killed there. So, if you think about it, since July 2016, it is tens of thousands. Thirty-five thousand. I know the way they parse the numbers, and I’m even cautious in the way I tell you how many people have been killed.
Kara Swisher: [00:19:58] So what they’re doing is trying to use social media to stop you from writing about…
Maria Ressa: [00:20:03] Not just trying to use it; they’ve used it effectively. I think this is the first weapon. It’s a new tool against journalists and against truth. And part of the reason we’re having a crisis of trust is because this is global.
Kara Swisher: [00:20:17] Right. So, talk a little bit about your efforts with Facebook to do this initially. You ran into Mark Zuckerberg and told him about this.
Maria Ressa: [00:20:28] F8, April 2017. There was a small group of us who had lunch together. It was founders of companies that were working with Facebook, and I invited him to come to the Philippines, because I said, you know, you have no idea how powerful Facebook is. Ninety-seven percent of Filipinos who are on the internet are on Facebook. We’re 100 million people. And he was frowning, and I was going, so why are you frowning? And he just said, “Maria, what are the other three percent doing?” [laughter] We laughed: huh.
Kara Swisher: [00:21:05] Ah. Ha. Ha. That’s how the board talks. But go ahead.
Maria Ressa: [00:21:09] But that’s when you realize that they didn’t understand their impact. What they understood was their goal. And so I think now that’s changed.
Kara Swisher: [00:21:21] Right. So they did that, and then you brought this information to them. What happened?
Maria Ressa: [00:21:27] Nothing. You know, by the time Mark Zuckerberg was in Congress, for me, everything that you guys were finding out here was, you know, “been there, done that.” We’ve talked about this. I feel like Cassandra, you know. I’ve talked to maybe more than 50 different officers and friends inside Facebook.
But we’re the Philippines, and maybe people think, you know, you’re out there. But, when he appeared in Congress, he said it would take five years to fix this with AI. I was like, you can’t do five years. Because in the global South, in my countries, in Myanmar, Sri Lanka and the Philippines, every day that it isn’t fixed means people die… I think they’re getting it. I think partly your coverage, you know, in 2018 has spotlighted this, but I don’t think enough, because it’s still being used.
The good thing is there have been takedowns: takedowns of Russian networks, Iranian networks; there have been takedowns in the Philippines. The most recent takedown was about three weeks ago, of a network we identified and did a story on 13 months earlier. You know, so it’s a little too little, too late, but you know what? I will take everything, because at least it cleans it up. But the fundamental problem is our gatekeeping power…
So, we used to create [and] distribute the news, and when we distributed the news, we were the gatekeepers. Now that power has gone to the social media platforms. Facebook is now the world’s largest distributor of news, and yet it has refused to be the gatekeeper. And when it does that, when you allow lies to actually get on the same playing field as facts, it taints the entire public sphere. It’s like introducing toxic sludge into the mix. I think that’s the fundamental problem. They have to actually, at some point, take down the lies instead of allowing them to spread.
Kara Swisher: [00:23:33] So what do you face when you go there and say you need to take down these lies? Tell me what happens, or how are they now working with you?
Maria Ressa: [00:23:41] It’s significantly different now. And that’s part of the reason.
Kara Swisher: [00:23:46] Well, they’re very sorry now. But they’re very, very sorry and also very, very…
Maria Ressa: [00:23:53] I think they’re starting to understand what they’ve done. And I think they’ve started to hire the right people. In January of 2017, Nathaniel Gleicher, who was in charge of counterterrorism in the Obama White House, you know, he was hired. And shortly after that, well, it took a while, because this is a manual effort, right? Tracking these networks down, like counterterrorism, requires somebody like a law enforcement official to go look for them. And so that’s part of the reason you see the takedowns starting to happen. The main thing that they have to do is to go to the content moderation system that they’ve put in place.
Kara Swisher: [00:24:39] Right.
Maria Ressa: [00:24:40] As journalists, we have values and principles. We call it the standards and ethics manual. As tech people, they tried to atomize it into a checklist, and then this checklist goes to content moderators in, you know, the two largest [centers] for a long period of time were in Warsaw and Manila.
Kara Swisher: [00:25:01] Right.
Maria Ressa: [00:25:02] And in Manila… I don’t know if you saw the movie, it was done by…
Kara Swisher: [00:25:06] The…
Maria Ressa: [00:25:08] The Cleaners, right. And in that one you can see that these content moderators, who barely make, you know, minimum wage here in the States, they have seconds to decide whether to delete or whether to let content stay. And if they just go by a prescriptive checklist, they’ll just go delete, delete, or let it stay. And the guy who took down Napalm Girl was a Filipino, and he took down Napalm Girl because the checklist said: naked.
Kara Swisher: [00:25:35] So that’s the famous photograph of the girl running from napalm in Vietnam. A Pulitzer Prize-winning photograph. It was news.
Maria Ressa: [00:25:44] So these Filipinos who were in a call center in the Philippines are taking down potential terrorist content, are taking down supposed hate speech, without any cultural context, without understanding the content.
Kara Swisher: [00:25:59] So what is your solution for them? I’m using Facebook as a broad thing, but they really are the game. Twitter is sort of… do you have the same problems with Twitter and other social networks?
Maria Ressa: [00:26:10] Twitter is only 7 percent penetration in the Philippines.
Kara Swisher: [00:26:13] So it’s an unpopular service. So yeah.
Maria Ressa: [00:26:18] No, but it’s the same, right? The same content moderation policy as YouTube. YouTube is huge, also in the Philippines. And you know what? This disinformation cuts across all of them. So, I mean, you saw it in our Shark Tank. We had the URLs. I would love to give that to Google and have them downrank some of that. Right. Because…
Kara Swisher: [00:26:39] This is just you doing their work for them. Correct?
Maria Ressa: [00:26:43] You know, I… I guess for me, when you’re dealing with this stuff and you’re breathing it, it’s like toxic fumes every day. You just want a solution. And it takes… Imagine if somebody from America comes to the Philippines and tries to figure this out. It would take them a year. I already know it. Here, take it. Do something with it. I don’t look at it as their work. I think, OK, this is where I’ll be really generous: I know that they didn’t mean to do it. It is an extremely powerful tool, and the reason why I continue to work with Facebook is because I think if they had the political will and the economic will to do it… This is a game changer for the Philippines. Rappler couldn’t exist without Facebook. We zoomed; we grew 100 to 300 percent year on year because of Facebook at the beginning, in the good times. And I think they made a crucial error in 2015, and that was Instant Articles, when they brought all the news groups in, and then all of a sudden we were in the same algorithms as the joke that you heard or what you had for dinner. And when facts became determined by mob rule, then it changed the ecosystem of democracy in the world.
Kara Swisher: [00:28:03] And what do you propose now that these… So, YouTube is a problem.
Maria Ressa: [00:28:09] YouTube…
Kara Swisher: [00:28:09] A huge problem. Are you getting the same responses from them: so sorry, they’re really, really sorry? [laughter] No, they really are. But they’re not in any way…
Maria Ressa: [00:28:22] So yeah. Tell me, do you think they will act on it?
Kara Swisher: [00:28:27] You know I have an expression that was from one of my grandparents: You’re so poor all you have is money. I think they like their billions. I think they think they’re doing good for the world. And I think they’re careless. It’s sort of like from The Great Gatsby. They were careless people and they moved, they did damage and moved on.
Maria Ressa: [00:28:47] But they now know they’re not. And they’re killing people. They know that now.
Kara Swisher: [00:28:52] I think they, what I’m getting now from a lot of people, is: you’re so mean to us.
Maria Ressa: [00:28:59] Because I do see them see this.
Kara Swisher: [00:29:01] When they say that, I'm like, fuck you. [laughter, applause] You know what I mean. So it's very hard for me to. But there's a lot of victim-y behavior there.
Maria Ressa: [00:29:11] I
mean until now. But you don’t know.
Kara Swisher: [00:29:14] No, I think they literally get angry when people say, "Hey, you know, you helped hack democracy, you really need to fix it." And they… I think one of the things that I find interesting is: when there is money to be made or whatever, it's their company. Yes.
And when there's problems to be solved, it's "we all together have to solve it as a group." You know what I mean, and I'm like, we didn't get the 64 billion dollars. I have real old shoes. I don't know. I mean, we didn't share in the upswing. And so I think, again, I joke. I'm so sorry, but they feel badly, and then I think they are actually incapable in any way of taking care of it. They don't have the mentality. They don't have the talent. I think they're incompetent to the task. That's what I think.
Maria Ressa: [00:30:02] But if that's the case, they will die. I mean, it's going to be a slow, painful death. But, you know, I guess for me, I'm taking almost the opposite view: there's this phrase, enlightened self-interest, that is…
Kara Swisher: [00:30:17] One would think. One would think. No, because this will eventually… the product will become terrible to use.
Maria Ressa: [00:30:24] Right.
Kara Swisher: [00:30:25] Or it will become very addictive to use. And then what's the difference? Like you said with cocaine, I think. So, what do you want… what would you like from them? You'd like them to become gatekeepers, in other words.
Maria Ressa: [00:30:37] I don't think they have a choice. I think they have to be. Otherwise we will leave, right? Or they'll be broken up by regulation, or people will leave. In the Philippines, look at the immediate reaction. On the Alexa ranking of all the websites, where do Filipinos go? From 2012 to 2016: number one, Facebook. Undisputed. But then, when the toxic sludge began in mid-2016, by January 2017 Facebook dropped from number one to number eight on the Alexa ranking. And then by January 2018 it went back to number five. In January 2019, right now, if you look at the Alexa ranking in the Philippines, it's number four.
So slowly they're rising back up. But, I mean, my thing is: if they don't fix it, we will leave. We will leave. So that's why I think it is in their best interest; they have no choice. But they are going to have to suck it up, and they are going to have to hire real people. Machines can't do this. But those real people will train the A.I., and it will get better over time, and they will have to lose money because they will have to hire real people.
Kara Swisher: [00:31:52] So
talk to me a little bit about that business because you’re trying to create a
Maria Ressa: [00:31:57] Yeah.
2019 I’m trying to be a good CEO.
Kara Swisher: [00:32:00] Being arrested, attacked, and essentially they're trying to put you out of business.
Maria Ressa: [00:32:07] The
Kara Swisher: [00:32:07] Talk
about the actual business. Because it’s hard enough to do a digital effort. You
know that. I know that.
Maria Ressa: [00:32:14] Yeah. So, in the Philippines and in many other parts of the world, good journalism is really bad business, and I wear both an executive editor hat and I'm the CEO, so it's my job to make sure our business survives. In 2017, when the attacks started happening, we realized that, and we had a big board battle. From the businessmen it was, "You journalists, you know, you gotta tone it down." But the journalists were the largest group of shareholders in Rappler; we had 3 percent more votes. So we pushed forward, and 2018 was mission, and a lot of anger management issues. But 2019, I have to be a good CEO and we need to build the business. So, what we've decided… When you're under attack by the government, your advertisers get scared almost immediately; they don't want to be associated with the brand. They always say, "You know, Maria, we're behind you," but they're very, very far behind. [laughter]
Kara Swisher: [00:33:18] And
nice Time cover!
Maria Ressa: [00:33:23] So I found out about it on Twitter. And I had to check whether it was real! But the Time cover was the first time I saw the ecosystem come up, like real people who were afraid. Fear is very real in the Philippines. And I'm sorry, before I talk about the fear, I just want to finish the part about the business. So the businessmen, the businesses… they're not the protectors of democracy. And even if their values say that they want to do that, they just don't, because the money isn't there. So you can't attack Facebook in the same way; if you're run by businesses, your values, sorry, they follow afterwards, after the money. So what we did is: we were forced to be agile. And a lot of the things that you saw — the mapping, trying to understand unstructured big data, all of these things — we came up with and pivoted and became a consultant. I essentially carved out another team that can do the same things we do for Rappler for other companies.
Kara Swisher: [00:34:37] So
your business… so, in that environment what do you do? Because good
journalism like you said is bad business.
Maria Ressa: [00:34:44] Rappler continues doing good journalism. We've taken the business and pushed it away, and we actually found a new business. There are two things that we did; we're the first in the Philippines… The crowdfunding part, actually, I didn't think it would work in the Philippines. But when our legal fees became like a quarter of the entire monthly spend, we asked our community and they helped. That helped pay for some of the legal fees. And then, just this December, we began a membership program we called Rappler Plus. I didn't think it would work in the Philippines because, unlike the United States or Europe, unlike the more developed countries, we don't have a history of that, not even of subscriptions. People don't want to pay for news, especially in a country where you struggle to put food on your table three times a day. But Rappler Plus took off much faster than I had expected, and I think it is because of the fear. People are afraid, and we are standing up… being the kid telling the emperor he has no clothes, telling him he cannot do this with impunity.
This is the most powerful man that we have had since… I think he's more powerful than Marcos was. He controls the executive. He owns the legislative, and by the time he leaves office he will have appointed 11 of 13 Supreme Court justices. You guys in the States worry about one Supreme Court justice; he'll have appointed 11 of 13. This is our next generation. And it's extremely worrisome, especially with this information warfare, with the young men in our country who are sucking up these fumes. You know, the levels of misogyny: according to our data, women are attacked at least 10 times more than
Kara Swisher: [00:36:40] All right, we have questions from the audience, and then we are going to end. Are there questions from the audience? Yes, you over here. Right here. Put your hand up.
Question: [00:36:51] Hi. Krishan Bhatia from NBCUniversal.
Thank you for sharing this story and the insights and everything that you’re
doing to uncover this. My question for you is in the US market, as we sit here
today as premium publishers most of whom have some sort of news business and we
serve large cap marketers in the US: What should we be doing differently with
respect to Facebook in particular but platforms in general that we’re not
Maria Ressa: [00:37:20] I think we have [to address the issue]: who is the gatekeeper right now? But the idea is very simple to me. Information is power, and the gatekeeping determines what information is taken in by everyone. The debate in the US focuses on all of these different demographics and the polarization. The polarization happens because we don't have the same facts. So it comes down to that. Please push. I think Kara asked… the solution for me is for something like Facebook or YouTube to move beyond the prescriptive to where we used to be, which is: what are the values? What are the principles, like standards and ethics for journalism, right? It can't be prescriptive because, ironically, they keep saying they defend free speech, but free speech in this case is being used to stifle free speech. So you've got to take the toxic sludge out of the body politic, because that is killing us and everything else is organ failure, because you're not getting the oxygen that you need.
So please push; you have far more power than little Rappler does in terms of pushing for action. In my part of the world, I guess, you know, maybe I'm happy with little because it's been so long. We have elections in May, and these takedowns will do a lot. I've seen the reactions of the people running those Facebook pages. But please, also do the investigations here in the United States. The data is coming out now. I think that our credibility — and I mean "our" for traditional media and the new ones coming up — I think we're getting eaten up by termites without realizing that the floorboards are about to crack. That's why I think there's a crisis of trust.
Kara Swisher: [00:39:19] Yes, I would agree with that. Finish on this question of fear, because I think it's a really important thing: fear of not speaking up, of rocking the boat, of all kinds of stuff. Or people are just exhausted by it, because you're not doing journalism; you're spending time dealing with lawyers, you're spending time moving businesses around. You're not doing the actual job which you used to do.
Maria Ressa: [00:39:43] Yeah, that's also true. It just means I'm not sleeping that much. But, you know, I find that the journalism… So look, Rappler has been mission-driven, and all of the friction of a normal organization is gone, because everyone stayed with us — and everyone did stay with us on the journalism side; we lost sales and tax, strangely. But the mission is so clear and the purpose is so clear, and I think the challenge for all of our news groups is to be able to maintain that.
In a society, what fear does, what this stuff does, is that normal people will not… When you get attacked like this — and I didn't show you any of the attacks — but when you're attacked so viscerally, when you're threatened with rape, with murder, you just shut up. That's exactly it: it's meant to pound you into silence. But our community realizes this. So, in a strange way, we're not just journalists anymore. That's weird, too.
Like when I'm at the airport sometimes: a family came up and hugged me, and I hugged them back. I didn't know who they were, but it was because they are also afraid to speak. So when you speak for them, you fulfill a role; I think that's the mission of journalism. I have a natural tendency to be more positive; I should hang out with you a little bit more. [laughter]
But, you know, when you're in my place, I put one foot in front of the other. The mission is clear. We're going to have to deal with this. And I think this is what Facebook has to realize. They have to get through this, because it's not just us. We're just the canary in the coal mine. It's happening here. Your problems are because of stuff like this, I think. I think it's global.
Kara Swisher: [00:41:37] Are you afraid?
Maria Ressa: [00:41:39] No, because there's too much to do. Not right now. You know, there are times when I think it was far worse when no one was paying attention, because the attacks were so personal. The first two weeks… I got 90 hate messages per hour. Not one-nine: nine-zero hate messages per hour. And when I got that, it took me two weeks to just figure out how I am going to deal with this, and what's real and what's not, and do I need security? All of that stuff. So no, I'm not afraid, because now I know what it is. And the data helps me understand it. So that's the certainty. That's why I know it's important to have the facts. You cannot fight back if you don't have the facts.
Kara Swisher: [00:42:24] All right. On that note, Maria Ressa. [applause]