Consumers are very familiar with digital recommendations. Online suggestions are constant and come in all varieties: recommendations for new products to purchase, new songs for your playlist, new people to connect with, and news posts in your newsfeed. As consumers spend more time on social platforms consuming content and expressing their views on just about everything, more data is collected and more algorithms are employed to extract value from consumer information. Yet consumers appear mystified as to what usage data tech platforms collect and how it is parlayed into algorithms that shape the content they see.
The Tow Center for Digital Journalism examines this question and others in its study, “Readers are hungry for news feed transparency.” In all, the researchers conducted 13 focus groups with news consumers across four cities. The goal was to understand consumer usage and to clarify the role of algorithms in tech platforms’ accountability and transparency, particularly in distributed news environments.
Key findings include:
1. Tech platforms and news habits
Participants describe their news and information consumption on tech platforms as more a byproduct of visiting than their main intent. They believe the ease and convenience of accessing news on social platforms fuels this pattern of news consumption.
2. Algorithm awareness
News audiences understand few details about platform algorithms. In fact, many participants have little awareness that algorithms are used at all, while others see algorithms as filters that surface relevant, personalized content. Still others believe their usage remains independent of algorithms and generally overestimate their degree of control over their news feeds. Interestingly, many participants said they would be willing to leave tech platforms over the lack of transparency in their algorithms and privacy practices.
3. Local news
Participants think local news has little to no visibility on tech platforms.
4. Fake news and brand recognition
Audiences claim they can recognize publishers’ brands on platforms. They also admit to sharing fake news, believing at the time that it was accurate and truthful. Many participants exhibit what the Tow Center refers to as the “third-person effect”: they, of course, recognize solid news brands, but it is “others” who do not and who end up sharing fake news.
Participants link fake news to social platforms and hold them responsible for allowing fraudulent information to spread. While consumers debate platforms’ ethical responsibilities, they agree that offering ineffective tools for identifying fake news is not the answer. Audiences also see platforms as often politically biased and doubt their ability to remain politically neutral.
5. Privacy on social platforms
Participants are concerned about tech platforms’ black-box practices and lack of transparency, yet they appear resigned to data collection. Surprisingly, data collection is viewed as the price paid for accessing these platforms. Not surprisingly, the younger the participant, the more accepting they are of a platform’s data collection practices.
6. Business models
Participants acknowledge the value of news content; however, they often find ways around publisher paywalls. The ease with which consumers can access news content on social platforms reinforces the expectation that content should be free.
In terms of advertising, native advertising and sponsored links such as “recommended links” or “partner content” are viewed with little trust and appear to reflect negatively on the publisher.
The research suggests that both publishers and tech platforms must open up their algorithmic practices to maintain a trusted relationship with their audiences. It is essential to offer tools and education that help audiences verify brands and the legitimacy of content, and to disclose algorithms, tracking, and privacy policies.