Engaging audiences: How publishers can shift heated arguments into meaningful discussions
October 20, 2021 | By Andrew Sullivan, Chief Product Officer, OpenWeb (@OpenWebHQ)

The Internet is renowned for bringing out our worst instincts. When anonymous strangers with differing views converge in online community forums, the result is often heated arguments that can quickly descend into insults, threats, and abuse.
Once upon a time, that tendency to “pile on” seemed like a good thing for publishers. If strong emotions drive engagement, and engagement means more page views and ultimately more revenue, then stirring up angry debate must be good for business, right?
The risks of allowing online toxicity to flourish
Is it actually effective to sow controversy in an article, then sit back and let commenters fight it out below the fold? Based upon what we’ve seen over the past few years, not very.
For one thing, social platforms and publishers have had to contend with advertiser boycotts, as brands rush to distance themselves from hate speech and misinformation. Publishers like The Atlantic and Vice, meanwhile, have felt they had no choice but to shutter comments sections completely, cutting themselves off from their readership in the process.
There’s another, much more appealing option: building a safe and healthy online community of engaged commenters. Our research shows that quality conversations and respectful debate actively attract engaged users – and those users typically view 4.6x more pages, spend 3.6x more time on-site, and drive 3.2x more revenue than non-engaged users.
What drives incivility in online community forums?
So how can publishers turn heated arguments into quality discussions? First, it helps to understand what drives online incivility; publishers can then factor those drivers into their audience engagement strategies. Here are three drivers for starters:
- People lose their inhibitions when they’re anonymous. Interacting online with strangers they will never meet, in an environment with few or no repercussions for aggression, people behave far less guardedly than they would face to face. Psychologists call this the online disinhibition effect.
- People get addicted to being right. Winning an argument can produce a flood of feel-good hormones, similar to winning a bet. As with gambling, that feeling can become all-consuming, leading to a greater focus on “winning” than on debating respectfully.
- Comments without context get misinterpreted. The oft-quoted claim that 93% of communication is non-verbal may be an oversimplification, but a comment can certainly read as hostile when cues like body language and tone of voice are missing. And that, in turn, can provoke an aggressive response.
How to shift heated arguments to meaningful discussions
There are several options open to publishers who want to elevate the quality of debate in the communities they host. Here are four approaches that we believe in at OpenWeb:
Encourage visitors to become registered users
When people feel they belong to a community and get to know their fellow members, the online disinhibition effect starts to fade, and interactions between community members become more civil as a result. But that’s not the only benefit: registered users also return more often, spend more time on-site, and deliver more revenue than non-registered users.
Have editorial teams join the conversation
Editorial engagement is a powerful way to increase the quality of online debate. Editorial teams set the conversational tone and guide discussion by responding to the highest-quality contributions. We’ve found that editorial involvement of this kind typically leads to a 17% decrease in toxic comments.
Define and reward civility
Most people don’t come to an online community forum looking for a fight: often, it’s the atmosphere they find on the site that tips them into incivility. Sophisticated moderation technology can help publishers cultivate a civil atmosphere. For example, you can:
- Make it clear what kinds of language and behavior are encouraged, and what won’t be tolerated. For example, our OpenWeb Clarity Mode puts community guidelines front and center, ensuring everyone knows what’s expected of them.
- Invite users to rethink their comment before posting if it looks like it may breach the community guidelines. This draws on the “nudge” theory of behavioral economics, which holds that a gentle prompt at the right moment can steer people toward more considered decisions. When we experimented with such prompts across some of our top publishers, we saw a 12.5% lift in civil and thoughtful comments being posted (the first sketch after this list shows the idea).
- Use multi-layered AI and ML-based moderation to analyze, sort, and highlight comments based on their quality rather than their propensity to incite hostility (the second sketch after this list shows the idea). That way, community members who post high-quality, thoughtful, and expert content are rewarded by seeing their comments highlighted and featured on the site.
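To make the pre-posting nudge concrete, here’s a minimal sketch in Python. The keyword-based `toxicity_score` is a toy stand-in for a real ML classifier, and `NUDGE_THRESHOLD` is an illustrative assumption; none of this reflects OpenWeb’s production system.

```python
NUDGE_THRESHOLD = 0.7  # assumed cutoff; a production system would tune this
FLAGGED_TERMS = {"idiot", "stupid", "moron"}  # placeholder lexicon

def toxicity_score(text: str) -> float:
    """Return a rough toxicity score in [0, 1] (toy heuristic, not a real model)."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FLAGGED_TERMS)
    return min(1.0, 10 * hits / len(words))

def handle_submission(comment: str) -> dict:
    """Decide whether to publish a comment or nudge the author to revise it."""
    if toxicity_score(comment) >= NUDGE_THRESHOLD:
        # Invite a rethink rather than blocking outright: that's the nudge.
        return {
            "action": "nudge",
            "message": "Your comment may not meet our community guidelines. "
                       "Would you like to revise it before posting?",
        }
    return {"action": "publish"}

print(handle_submission("What an idiot take."))                       # nudge
print(handle_submission("Interesting point; here's another view."))  # publish
```

The design choice worth noting: the gate invites a revision instead of rejecting the comment outright, which preserves the commenter’s agency and is what makes it a nudge rather than a block.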
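Likewise, a sketch of quality-first ranking, assuming per-comment quality scores arrive from an upstream model (the `Comment` shape, the scores, and the `feature_top` cutoff are all illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str
    quality: float          # assumed upstream model output in [0, 1]
    featured: bool = False  # set for the top-ranked comments

def rank_for_display(comments: list[Comment], feature_top: int = 2) -> list[Comment]:
    """Order comments by quality, not controversy, and feature the best."""
    ranked = sorted(comments, key=lambda c: c.quality, reverse=True)
    for i, comment in enumerate(ranked):
        comment.featured = i < feature_top
    return ranked

thread = [
    Comment("ana", "Well-sourced counterpoint with a citation.", quality=0.92),
    Comment("bob", "You're all wrong and dumb.", quality=0.08),
    Comment("cam", "Thoughtful question about the methodology.", quality=0.81),
]
for c in rank_for_display(thread):
    print(f"{'[featured]' if c.featured else '          '} {c.author}: {c.text}")
```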
Analyze and filter incivility
If rewarding quality conversations is the carrot, then filtering out incivility is the stick. For the most efficient moderation results, a hybrid approach that combines category-leading AI and ML moderation with a healthy dose of intuitive, manual moderation can reliably filter out a large proportion of toxic language – and analyze posting patterns to flag which users are most likely to post toxic comments in the future.
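One way to picture such a hybrid pipeline is as a routing rule: the model auto-handles high-confidence cases, and only the ambiguous middle band reaches human moderators, which keeps the manual workload small. A minimal sketch, with thresholds as illustrative assumptions rather than OpenWeb’s actual values:

```python
AUTO_REJECT = 0.9   # assumed: near-certain toxicity is filtered automatically
AUTO_APPROVE = 0.2  # assumed: clearly civil comments skip human review

def route_comment(model_score: float) -> str:
    """Route a comment by the model's toxicity score in [0, 1]."""
    if model_score >= AUTO_REJECT:
        return "reject"        # filter out clearly toxic language
    if model_score <= AUTO_APPROVE:
        return "publish"       # clearly civil; no review needed
    return "human_review"      # ambiguous: queue for a human moderator

for score in (0.95, 0.55, 0.05):
    print(score, "->", route_comment(score))
```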
Today, publishers no longer face a stark choice between enabling comments (and watching them degenerate into petty arguments) and turning them off completely. With positive reinforcement, editorial engagement, and multi-layered moderation, overheated arguments can become meaningful discussions – bringing healthier online communities, and greater reader loyalty, to publishers.
About the author
Andrew Sullivan is the Chief Product Officer at OpenWeb.