- Jay Rosen’s Press Think: Full Stack Credibility (7 min read)
- New York Times: Gigaom’s challenges: poor leadership, spending beyond means, inattention to long term problems (7 min read)
- New York Times: Biggest Advertisers Are Sending Their Dollars to Digital (2 min read)
- Washington Post: Why some are terrified by the idea of a Google truth machine (5 min read)
- Ad Age: New-Media Powers Say They’ll Be Profitable Soon. Don’t Ask About That Facebook Plan (3 min read)
- Reynolds Journalism Institute: In the new news ecosystem, getting paid means: personalizing, bundling, and wholesale-retail pricing (16 min read)
- PBS MediaShift: Journalism and Social Media: It’s a Love-Hate Affair (5 min read)
DCN’s Recommended Reading: Week of March 19, 2015
Between Consumers, Innovators and Open Internet, FCC Chooses All of the Above
Last week, the FCC released the details of its Net Neutrality order. Looking past the sound bites from both sides of this debate, the FCC appears to have struck a delicate balance that will ensure consumers can continue to enjoy access to the digital content and experiences they love and, at the same time, preserve the ability of current and future content creators to innovate.
The crux of the order is the ban on blocking, throttling or prioritizing content. Rightfully, the FCC is seeking to preserve consumers’ unfettered access to content or experiences on the Internet. Just as importantly, however, the order also protects the ability of content creators to reach their audience without having to seek the blessing of the ISPs.
In the absence of this order, future content creators might have to pay ISPs just to receive fair treatment of their content or, worse, find themselves shut out in favor of content created by affiliates of the ISP. To be clear, large and established corporations would probably do very well in a world where broadband providers served as gatekeepers of the Internet – the small and game-changing innovators, well, not so much. The FCC should be applauded for going to great lengths to ensure that the consumer experience and content creators are protected.
Another key component of the order is the requirement that a broadband provider provide transparency about “the network management practices, performance, and commercial terms of its broadband Internet access services.” This kind of transparency is vital to helping consumers fully understand the internet services they have purchased and whether they are getting full value. This information is also critical to content creators who need to know that new applications, content and services will operate as expected.
The FCC also declined to waive (or “forbear” from) the consumer privacy requirements under Title II, although it won’t apply them immediately. Instead, the commission intends to hold a workshop to explore how the Title II privacy provisions should be implemented to protect consumer privacy. As the FCC rightly notes, “broadband providers serve as a necessary conduit for information passing between an Internet user and Internet sites or other Internet users, and are in a position to obtain vast amounts of personal and proprietary information about their customers.” As we’ve written before, and the FCC agrees, consumer trust is critical to harnessing the full potential of broadband internet services.
We argued in our comments to the FCC last summer that the FCC should focus their work on the consumer experience. With this order, the FCC has taken important steps to encourage investment and innovation in content creation for consumers, and ensure that the Internet is an open platform that supports consumer choice and the open exchange of ideas and information.
Financial Times Research Shows Context Matters for Digital Advertising
Publishers would have you believe that environment matters more than ever when it comes to the effectiveness of digital advertising. The Financial Times wanted to make this more than a theory. To that end, on March 4th, 2015, Daniel Rothman, director of marketing and insight at the Financial Times, presented The 2015 Halo Study, new research that set out to test (and prove) that premise. While this is not the first time this type of research has been conducted (e.g., BBC’s Advertising Dress Test, IAB’s VW campaign test, and DCN’s Branding on Display Research), it’s easy to understand the need for regular updates to this research given the constantly evolving digital landscape.
“This research highlights for advertisers the importance and relevance of websites affiliated with traditional media companies and helps educate (in Halo’s case, using neuroscience) how brands are clearly viewed depending on the environment that they appear within,” Rothman said.
The FT study coined and defined the term “Observable Halo Effect”: brands whose ads are placed next to high-quality content subconsciously benefit from the “halo” of that publisher’s brand, which creates trust and lifts brand measures. The FT research looked at six brands and nine content providers; brands with ads on the more established content providers (FT, New York Times, Wall Street Journal and Bloomberg) garnered significantly higher brand perceptions (the Observable Halo Effect) compared with brands with ads on emerging sites (Huffington Post, BuzzFeed, Drudge Report, Twitter and LinkedIn).
Rothman noted that “Commissioning a project such as our FT Halo Study is a significant financial and time commitment. Many brands and even smaller agencies may be unable to make investments in projects such as this. We are happy to share all the findings with all marketers and agencies, and have already begun to do that across the US and UK.”
Additional findings from The 2015 Halo Study:
- Not all content providers are equal. Established content providers are more likely to be perceived as high quality, trustworthy, and more prestigious than non-traditional media owners.
- You can’t fit a square peg in a round hole. When consumers act like media planners, they place 50% more ads in established content sites compared with emerging sites.
- Right environment = implicit added value to a brand. Customers implicitly perceive a brand more highly, and their quicker reaction times indicate that the best natural fit is with established content providers. The same attributes — especially high quality and trustworthy — are what people look for in new products and services.
After presenting his findings, Rothman moderated a panel of marketers and agency executives to discuss the results. The panelists were Christine Bacon, Head of Advertising at Allianz Global Investors, David Rosenbaum, EVP, Group Account Director – LVMH at Havas and Matt Hickerson, Managing Director, Marketing & Communications at Macquarie Group.
Some takeaways for agencies and marketers from Rothman’s panel:
- Ask questions. Before picking content partners, brands must ask themselves:
What is the brand’s DNA?
What are my goals and what am I trying to accomplish?
What is the right audience?
Is the content provider the right environment for my brand?
- Brand reputations matter. Brand perception can change based on platforms. Smart brands want to be in smart environments.
- Quality over quantity. Brands are looking for quality interaction and will only make an impact if there is the right integration. Being on many different platforms and in front of a myriad of audiences is not always the answer.
- Content adjacency is key. Content adjacency problems are an ongoing issue. Brands can mitigate risk with high quality and collaborative briefs that focus on desired goals and outcomes.
- Thoroughly evaluate platforms. Well-established brands are not always looking to rush into new environments. And in certain categories (such as financial institutions and luxury products), many choose not to unless there is proven success by similar brands. Brands should properly evaluate each platform they partner with – it’s not always about being the first.
“The Halo Study is an example of the commitment that FT has to providing our clients and the advertising community at large with unique market insights. One of the advantages of the Financial Times is that we invest significant funds into analytics, which include campaign performance analytics, brand lift studies, brand tracking studies as well as larger macro industry trend pieces,” said Rothman. While The Halo Study may not be repeated for another two to three years, we can expect to see more research from the FT this year, including a study of Opinion Leaders and a Media Engagement study.
Building a Business Case for Keeping Comments Sections
There was a time when online comments below stories enhanced and strengthened the stories with added commentary, smart insights and timely corrections from readers. Unfortunately, that only lasted until the first ad hominem attacks came. Then spammers. Then outright harassment.
So publishers are torn between wanting to keep readers engaged (and on the site) and streamlining operations and content so they don’t get pulled down into the swamp of trolls. Lately, publishers such as Re/code, Chicago Sun-Times, Bloomberg and Reuters have eliminated some or all comments. But others, like the New York Times, value reader comments but limit them to certain stories and close them after specified time periods.
The bottom line is that each community is different, with a site like Popular Science wanting to ward off climate-change deniers and a place like Reddit trying to keep things as open as possible. And what fits one business doesn’t fit another. Local news sites typically want more engagement with readers who live in their community, while a wire service like Reuters doesn’t really “live” in any community.
No Comment
There are many reasons to cut comments entirely. A recent study led by University of Wisconsin-Madison professor Dominique Brossard found that comments can change readers’ perception of the article. “Uncivil comments not only polarized readers, but they often changed a participant’s interpretation of the news story itself,” she and her co-author Dietram Scheufele wrote in a New York Times Op-Ed.
Another important reason to kill them is that comments and conversation have moved to social networks such as Facebook and Twitter, where articles are being shared. Thus, publishers have begun to question the resources required to moderate comments (the New York Times has 13 moderators) if they don’t pay enough of a dividend.
Dan Colarusso, executive editor for digital at Reuters, told me in our recent PBS MediaShift Mediatwits podcast that Reuters hadn’t seen a drop-off in engagement after removing comments from news stories. “The normal organic users to our site weren’t engaging,” he said. “It was a fraction of 1% increase in engagement on our site from comments… So we decided to cull them out of there. And our wire service reporters don’t have the time to interact.”
It makes more sense for Reuters.com to keep comments on blog posts and opinion pieces, Colarusso said, because those stories, and their writers, are already associated with providing opinions and personal views. Reuters, as a brand, is not.
Moderate Comments
There are middle grounds, too. The Huffington Post, for example, banned anonymous comments in 2013 and has since instituted a system where readers must log in using their Facebook accounts. Though this has earned mixed reviews from its readers, some of whom don’t like giving personal information to Facebook, it’s proved useful for HuffPost in terms of accountability. “We’ve seen a marked [decrease] in the number of fake accounts in our system. It’s also helped with getting more quality comments and positive conversation as opposed to criticisms or insults,” Huffington Post community director Tim McDonald told Digiday last year.
The Engaging News Project recently released a study showing that there are ways to increase civility in online comments. If journalists interact with commenters, the tone usually improves. Some publishers, such as Gawker, do a better job of designing comment systems to highlight the best contributions. And algorithmic systems that automatically flag cursing or spam in comments can help.
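A first line of automated flagging like this can be sketched in a few lines. The word lists and patterns below are illustrative assumptions, not any publisher’s actual moderation system; real systems use curated, regularly updated lists or trained classifiers:

```python
import re

# Hypothetical word list; a production system would use a curated,
# regularly updated list or a machine-learned classifier.
PROFANITY = {"damn", "idiot"}

# Common spam signals: links, long character runs, sales phrases.
SPAM_PATTERNS = [
    re.compile(r"https?://\S+"),
    re.compile(r"(.)\1{4,}"),
    re.compile(r"buy now|free money", re.IGNORECASE),
]

def flag_comment(text: str) -> list[str]:
    """Return a list of reasons this comment should be held for review."""
    reasons = []
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & PROFANITY:
        reasons.append("profanity")
    if any(p.search(text) for p in SPAM_PATTERNS):
        reasons.append("possible spam")
    return reasons

print(flag_comment("Buy now!!! Visit http://example.com"))  # → ['possible spam']
```

Flagged comments would then go to a human moderator rather than being auto-deleted, which keeps false positives from silencing legitimate readers.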
In the end, news outlets have to analyze the purpose of their comments to make the business decision that’s right for them. Is the purpose of commenting to encourage a community that fits in with the news organization’s brand? If so, then perhaps keeping a commenting system in-house is a must, so the organization can maintain control of that brand and identity. But if the purpose of comments is to drive engagement back to a news website, then referrals from social media — and therefore, commentary that takes place on social media — might be the best fit.
Or if the point of commenting is to encourage high-level discussions, then perhaps you have to make the process of commenting attractive only to those who really care — like Tablet Magazine’s announcement that it plans to charge would-be commenters.
We’ve seen social media take on so much heat and discussion, but publishers will have to weigh whether they want to improve commenting and discussion on-site or cede another territory to the social giants.
The full Mediatwits podcast discussion of online comments is available on PBS MediaShift.
DCN’s Recommended Reading: Week of March 12, 2015
- AdAge: Another Round of Web Redesigns Brought to You by ‘Viewability’ (4 min read)
- CJR: Can Tony Haile save journalism by changing the metric? (16 min read)
- WSJ: The Most Powerful Player in Media You’ve Never Heard Of (6 min read)
- AdAge: Former Mediacom CEO Alleges Widespread U.S. Agency ‘Kickbacks’ (3 min read)
- Fusion: How an advertising company put a ‘marijuana cookie’ on your computer to get weed legalized (8 min read)
- LinkedIn: The New New in Digital Advertising (10 min read)
- Digiday: What worries European publishers most (2 min read)
- WSJ: Trying On the Apple Watch: Natural Feel, Fewer Distractions (3 min read)
Sharethrough Takes A Neuroscience Perspective to Look at Mobile Native Advertising
Sharethrough, a software company that enables leading websites and apps to manage their in-feed native ads, commissioned from Nielsen a study, “A Neuroscience Perspective: Assessing Visual Focus, Message Processing & The Ability To Strengthen Associations Through Mobile Native Advertising,” to determine how consumers visually process mobile ads. What’s interesting about this particular study is that instead of the usual survey-based research, it applied eye tracking and neuroscience—the study of subconscious reactions in the brain—to mobile advertising.
According to Sharethrough:
Unlike survey-based mobile measurement, which evaluates a consumer’s conscious reactions to ads, neuroscience taps into the brain’s subconscious reactions as well. This is critical: the subconscious is the motivating force behind many of our actions, including which brands we buy from.
To understand the effectiveness of mobile advertising, the study (conducted in accordance with Nielsen’s proprietary methodology) compared native ads and banners, both placed in-feed. Nielsen worked with five premium advertisers, including Boeing, creating mock ads from similar creative elements that were optimized for each format. Study participants were shown a video simulating the experience of scrolling through an editorial feed. The feed then paused, and each participant was shown either a native ad or an in-feed banner. Using a combination of EEG data—measurements of neural activity in the brain—and eye tracking, Nielsen quantified where and how the participants’ focus was being directed.
Among the key findings, which Sharethrough explores in more detail, are:
- Native Ads Appear to Receive Two Times More Visual Focus than Banners
- Banners Are Processed Peripherally
- Native Ads Are Being Read
- Native Ad Headlines Can Be Optimized to Trigger Associations
- Brand Assets Impact Brand Resonance Lift
Quartz’s Alice Truong takes a look at the research in her article:
Scientific proof that no one pays attention to banner ads
Collective Patents its Cross-platform Targeting Technology
Collective, Inc. has been granted a patent for its technology that targets online advertisements to consumers based on their TV viewing habits, anonymously linking linear television data to digital behavior across multiple screens. The company filed for the patent in 2011 when it launched TV Accelerator, and CEO Joe Apprendi describes the four years since as “pretty interesting” given the industry’s increasing desire for cross-platform measurement in that time.
While Collective has been successful deploying its own solution, Apprendi says, “now that we’ve got the patent, we’re excited about trying to make this the standard in the market and establishing a licensing model around it.” U.S. Patent 8,949,890 protects the company’s offering and reinforces Collective’s expertise in multiscreen advertising.
Certainly, as audiences increasingly consume content across multiple devices, media companies and marketers are interested in understanding their behavior and, of course, the impact of advertising in various contexts. Interestingly, Apprendi says the origins of Collective’s TV Accelerator solution can be traced to Nielsen research that showed the remarkable lift in ad recall when consumers were exposed to a television commercial and online video. With the Nielsen process, however, this could only be measured when it “randomly happened.” Apprendi says his team realized that “Collective was in a unique position to make this happen intentionally.”
Nielsen’s recent acquisition of Exelate, a company that connects identities across all channels and devices, signals that others would also like to be in that position. Apprendi notes that Nielsen was in the “ad effectiveness and audience measurement business, which they are good at. Now it looks like they are moving into targeting, using data to target, not just measure.”
Collective, says Apprendi, has a head start given that they’ve been doing this for a decade already and have core capabilities focused on data-driven programmatic advertising and on providing audience contextual data to deliver targeted ads across platforms.
Apprendi believes that most marketers are buying publishers’ channels as a proxy to reach a target audience but they should actually flip that thinking in order to make better decisions about where to make big direct investments. “Let’s not make those decisions until I buy programmatically across platforms, screens and media brands to see which avenues and media brands perform the best.” Then, he says, marketers can make better decisions about where to make a significant direct investment.
And, while he sees signs of progress with more and more agencies having a unified video buying practice across screens, he says that “The Holy Grail is that eventually, we’ll see marketers start with audience first, rather than ‘this is what I did on TV now what should I do to support that.’”
At a minimum, Apprendi encourages marketers to use data and analytics to determine which media outlets make the most sense for their objectives. And on the publisher side, that data can help them respond or optimize their content strategy. “Data makes everyone smarter at the end of the day.”
The long term goal, however, is “full transparency,” says Apprendi. This is not just cost transparency, but includes issues of inventory quality and viewability. “We need transparent relationships if we want to help marketers make smarter media decisions.”
DCN’s Recommended Reading: Week of March 5, 2015
- Guardian: Nobody cares about their online privacy… until it’s gone (4 min read)
- WSJ: Facebook Policies Taken to Task in Report for Data-Privacy Issues (4 min read)
- Variety: Why the FCC’s Net Neutrality Vote Matters to Hollywood (5 min read)
- LATimes: FCC vote could be game changer for Internet privacy (6 min read)
- Guardian: Financial Times to change way it charges for online content (5 min read)
- ZDNet: ‘Trust is the new currency’: Can the mobile industry win back users with privacy promises? (4 min read)
- WSJ: Is Facebook Friend or Foe for Telecom Operators? (6 min read)
- Re/Code: Comcast-Time Warner Deal Critics Ramp Up Opposition (3 min read)
CBS Interactive Unveils its Custom Content Studio: Studio 61
The number of US digital video viewers is predicted to hit 204.2 million this year. Add to that the fact that more than half of marketing professionals believe video is the type of content with the best ROI, and it’s no surprise that video content marketing is in the spotlight. But the reality is that creating compelling content marketing isn’t easy, particularly when it comes to video, which requires distinct technical and storytelling expertise. CBS Interactive believes it can help marketers better capitalize on this opportunity with the launch of Studio 61, its new initiative focused on custom branded content.
Studio 61 brings together a hand-picked team of creative and production talent that, according to Dave Morris, Chief Revenue Officer for CBS Interactive, includes experienced producers with backgrounds in video storytelling and access to sophisticated studios and production equipment. These teams, based in San Francisco and New York, will be even more effective because they will work closely with the product marketing teams from each of CBS Interactive’s brands, which include CNET, CBSSports.com and GameSpot, and which bring deep subject matter expertise and audience insights.
“We are not a replacement for an advertising agency,” says Morris. “But when agencies or marketers come to Studio 61, we can offer data on our specific audiences and passions.” This approach helps ensure that the marketing content will resonate with audiences, whether the content is short or long form, social media content, infographics or custom data integrations. Morris notes that a growing number of agencies are seeking to leverage video content marketing as part of their overall strategies; one of them, MEC, has signed on as one of the first Studio 61 partners.
CBS Interactive has seen increased demand for this type of service for many reasons, according to Morris. Chief among these is the steep growth in video viewing, which is being fueled by bigger smartphones and faster connectivity. However, he also points to the fact that today, marketers have better channels for video distribution. “A few years ago, if you created great video and it didn’t go viral, it didn’t go anywhere. Now you have advertisers with millions of social followers. So they can distribute across our sites, but they also have an opportunity to distribute content through their owned and earned media.”
Morris points out that the company “had all the ingredients for a custom content marketing branded studio, and we saw that the time was right to put it together and bring Studio 61 to market.” The creation of this focused branded content studio gives the company an opportunity to elevate its work on branded content. At Studio 61, “we’re going upstream and working closely with the marketers before we create something.” This, Morris believes, will create the kind of finely-tuned content experiences that satisfy marketers’ demands and, even more importantly, provide value to audiences.
Nielsen Acquires Ad Tech Company, eXelate
According to a Nielsen statement:
Nielsen [has] announced that it has completed its acquisition of eXelate, a leading provider of data and technology to facilitate the buying and selling of advertising across programmatic platforms. This acquisition allows Nielsen to enable its clients to make better and faster marketing and media decisions.
With the acquisition of eXelate, Nielsen clients gain the ability to activate in real-time Nielsen audience insights as well as eXelate’s aggregated consumer segments from over 200 data providers. eXelate’s advanced technology leverages data to inform the highest quality programmatic buying decisions in the marketplace…
eXelate aggregates and distributes third-party online data, composed of premium demographic, interest, and intent data from over 200 online and offline data providers. Nielsen intends to further develop and expand eXelate’s already rapidly-growing data marketplace and innovative technology solutions.
Nielsen’s acquisition raises a couple of key issues:
- This move may have a significant impact on data-driven programmatic for television. The acquisition brings Nielsen closer to being able to transact based on measurement and data.
- It will enhance measurement capabilities from Nielsen, answering questions about who’s watching and what they’re doing post exposure.
- It will be interesting to see whether comScore responds with an acquisition of its own in order to keep up (though there are few independent DMPs left out there, such as Krux and Lotame).
Coverage of the acquisition:
MediaPost: Nielsen acquires Exelate for undisclosed terms
AdExchanger: Nielsen Acquires Data Platform eXelate For Estimated $200 Million
Why The Future of Data Journalism Is… Less Data
“Data is the new oil.” That is the modern maxim across a host of industries. From shopping to shipping, businesses are being urged to gain better insight and improve performance by delving into their underlying numbers.
News media are no different. In the last couple of years, journalists have been encouraged to adopt analytics software as part of their daily editorial efforts. Now it is common to find newsroom editors checking their page views, time-spent metrics and social referrals on a minute-by-minute basis.
This kind of data used to be kept under lock and key, used only by website technical administrators and advertising auditors, while journalists paid it little mind. Now, with always-on dashboards from tools like ChartBeat, Parse.ly and Outbrain’s own Visual Revenue in the hands of editors, content producers are becoming skilled numerical interpreters. The industry has come a long way. However, it’s time to go to the next level.
Modern journalists are constantly being told which new skills they have to learn—a dizzying array of video production, coding, even drone-flying. Those who also up-skill sufficiently to become their newsroom’s virtual data analyst can manufacture higher user engagement for their employer. After all, when particular stories cause traffic to spike, writing more of the same is a quick win.
But more data doesn’t necessarily produce better journalism. The danger with the growing role of audience numbers in publisher strategy is the risk of over-reliance, creating a belief that every reader data point should be responded to with an editorial outcome: quantity and category of story over quality and ambition.
The Data Endoskeleton
The first wave of newsroom analytics has served its purpose (we now know that stories about kittens and celebrities trend well, for example). What the professionals now need is a support system that does not encourage them to make snap decisions based on reams of numbers, but one which is more in harmony with the craft of editorship, playing a softer and more symbiotic role as editors’ sidekick, not their auditor.
Just think of the way Apple’s Siri hides from users so much of the underlying data that rival services like to bombard them with. When you ask Siri a question, it returns not unlimited options but fewer, more directed opportunities.
In the same way, in the next wave of publishing analytics, software would suggest publication improvements based not only on the raw, blunt performance of site content but also on the priorities and goals of the staff managing it.
In the future, editors should be able to pre-populate their software with their own, qualified goals and ambitions, helping tailor system recommendations that are in line with publications’ true missions.
By allowing editors to make smarter decisions based on their own instincts, rather than being slaves to the spreadsheet, the industry can keep readers coming back and rediscover a lost metric: lifetime value.
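One way to picture this kind of goal-aware analytics is a scorer that weights normalized metrics by editor-declared priorities. Everything below (story names, metric names, weights) is a hypothetical sketch, not a description of any vendor’s product:

```python
# Illustrative sketch: rank stories by editor-defined goals rather than
# raw traffic alone. Metric names, values, and weights are hypothetical.
RAW_METRICS = {
    "cat-slideshow": {"pageviews": 90_000, "avg_time_sec": 15, "return_visits": 0.05},
    "longform-investigation": {"pageviews": 12_000, "avg_time_sec": 420, "return_visits": 0.30},
}

# An editor prioritizing lifetime value weights loyalty and attention
# over sheer volume.
EDITOR_GOALS = {"pageviews": 0.1, "avg_time_sec": 0.4, "return_visits": 0.5}

def score(metrics: dict) -> float:
    # Normalize each metric against the maximum across stories so the
    # weighted sum compares like with like.
    total = 0.0
    for name, weight in EDITOR_GOALS.items():
        peak = max(m[name] for m in RAW_METRICS.values())
        total += weight * metrics[name] / peak
    return total

ranked = sorted(RAW_METRICS, key=lambda s: score(RAW_METRICS[s]), reverse=True)
print(ranked)  # under these goals, the long-form piece outranks the slideshow
```

Change the weights to favor pageviews and the ranking flips, which is precisely the point: the numbers serve the editor’s stated mission rather than dictating it.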
You can already see the beginnings of this new philosophy being applied. Despite often being accused of publishing low-brow click-fodder, BuzzFeed looks at engagement and virality in a whole new way. Publishing purely to the numbers could have prompted it to publish even more cat slideshows and quizzes. However, selective insight led it to make risky bets, such as its commitment to long-form journalism, which have turned out to be surprising success stories.
Publishers who are excitedly following the data trail toward clear traffic growth should pause to reclaim their own part in the process. In a world becoming familiar with the concept of automated “robo-journalism,” I envision the future of journalism not as a replacement of flesh with circuitry, but as an endoskeleton—a perfect combination of the best qualities of each.
Matt Crenshaw is the Vice President of Product Marketing, Engage at Outbrain. Matt is responsible for setting the product vision and delivering solutions to publishers that build their audience relationships and grow their revenues.
The President’s Proposed Consumer Bill of Rights Moves the Conversation in the Right Direction
Last week, the President publicly unveiled a proposed Consumer Privacy Bill of Rights. While the bill language needs some work, the President and his team should be applauded for introducing some good concepts that would improve consumer privacy.
For one, it makes a lot of sense that consumers should understand how their data is collected and used. Reputable companies are transparent with consumers about the ways in which their data may be used and there are multiple ways in which they can (and do) provide consumers with a means to opt-out of various data uses. But there are many companies that aren’t so forthright. Educating consumers is a good first step because the industry won’t fully gain the trust of consumers without first educating them. And without consumer trust, the digital ecosystem can’t flourish.
Another good concept in the Privacy Bill of Rights is the idea of “context,” which again maps back to consumers’ expectations. The bill notes that data collected in one context and then used in another context should be subject to some level of control by the consumer. Depending on the sensitivity of the data and/or its use, this could mean providing the consumer with an opt-out or an opt-in.
By emphasizing the importance of context, the President highlights the fact that every day, consumers make conscious and subconscious decisions about whether their favorite websites or digital services provide a sufficient value proposition to continue that relationship. Most consumers are perfectly agreeable to exchanging some of their data for access to free content or for a more engaging or personalized experience. However, the value proposition is eroded when consumers don’t trust that all parties will respect their data–especially if they are not even aware of all the parties that may collect their data. Providing consumers with more controls over unexpected data collection and use would go a long way toward regaining their trust, which is why we’ve argued for industry to develop a DNT standard.
Finally, the bill notes that de-identified data should be outside the scope of this law. This is an enormously important concept because it allows companies to continue innovating with “big data” sets. By allowing for the use of de-identified data, researchers, scientists and entrepreneurs can better understand how data flows, how it might be used differently and develop new technologies that we cannot even imagine today.
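As a toy illustration of what de-identification involves (the field names, salting scheme, and ZIP coarsening below are assumptions for the sketch; real de-identification regimes are considerably more rigorous):

```python
import hashlib

# Hypothetical user record; field names are illustrative.
record = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "zip": "10023",
    "articles_read": 42,
    "topics": ["privacy", "media"],
}

DIRECT_IDENTIFIERS = {"name", "email"}
SALT = b"rotate-me-regularly"  # a real system would manage salts securely

def de_identify(rec: dict) -> dict:
    """Drop direct identifiers, pseudonymize the email into a stable key,
    and coarsen quasi-identifiers like ZIP code."""
    out = {k: v for k, v in rec.items() if k not in DIRECT_IDENTIFIERS}
    # Salted hash gives a stable analytics key without exposing the email.
    out["user_key"] = hashlib.sha256(SALT + rec["email"].encode()).hexdigest()[:12]
    out["zip"] = rec["zip"][:3] + "XX"  # keep only the ZIP prefix
    return out

clean = de_identify(record)
print(clean["zip"])  # → 100XX
```

The analytically useful signals (reading behavior, topic interests, coarse geography) survive, while the fields that point back to an individual do not, which is the trade the bill’s carve-out is meant to enable.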
As many have noted–and will no doubt continue to be highly vocal about–the bill has some serious flaws in that some of the definitions are overly broad or seem to contradict the President’s intentions. But there are concepts within the President’s proposal that are well worth discussing. As an industry, we should be taking this proposal very seriously because it’s yet another sign that more work needs to be done to regain the trust of consumers.

