Every day we’re reminded about the limited resources facing our industry, from the time people have to produce stories to the lack of insights we have about decisions made at the big platforms. There’s an irony then that it feels like there’s more data available to us than ever before. Yet many people still aren’t sure how to make the most of it.
Trying to improve business models, audiences, or content simply by adding more data doesn’t guarantee any success. Now, more than ever, the answer is finding the most relevant data—and making sure we’re uncovering all the available opportunities the data we have can provide.
Historically, some of this data has been hard to get to. The major technology platforms see it as theirs to wield. So, without a major undertaking, it can be hard to piece together what the different types of data (audiences coming from Facebook, Google, your own editorial efforts, and untrackable sources) actually show.
The Parse.ly data team, using Currents, looked at a recent major news story, immigration, as an example, to uncover opportunities for media companies that might have been missed in other data sources.
Finding the under-covered angle of the story
When Trump’s “zero tolerance” immigration policy separated children from parents, #KeepFamiliesTogether picked up steam on social media. But how did that translate to attention for the articles publishers wrote, and what did people want to know more about?
Over the course of one week, June 18 – June 25, there were 590 articles about immigration getting attention in our network. And that attention was vast: 16 million views.
So many articles means the topic was well covered, right? Not necessarily. Take the topic of “asylum seekers” for example.
Only 20% of these stories were related to asylum seekers. However, they received over 30% of the attention. High traffic per story suggests this angle was potentially under-covered and under-promoted.
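The math behind this kind of gap-spotting is simple enough to sketch. The snippet below (a rough illustration, not Parse.ly's actual methodology) computes an attention-per-story index from the shares reported above; an index above 1 suggests a topic earns more attention per story than the average topic and may be under-covered.

```python
# Illustrative sketch: flag potentially under-covered topics by comparing
# a topic's share of attention to its share of stories. The figures are
# from the article: asylum-seeker stories were ~20% of coverage but
# received ~30% of the attention.
def coverage_index(story_share, attention_share):
    """Ratio > 1 means the topic draws more attention per story than
    the average topic, i.e. it may be under-covered."""
    return attention_share / story_share

asylum = coverage_index(story_share=0.20, attention_share=0.30)
print(f"asylum seekers index: {asylum:.2f}")  # prints "asylum seekers index: 1.50"
```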
Understand the differences between referral types
So where exactly were people finding stories about immigration? The biggest source of traffic was social media, which drove one-third of traffic to immigration stories. Given how much the media industry talks about Facebook, this may not seem surprising. But this is actually about double the typical traffic the network sends to any given story. It also bucks the trend of Google as the dominant source of traffic.
Other important ways readers learned about this issue? Directly from news sites, no platforms needed. About one-quarter of the overall traffic to immigration articles was from editorial promotion: site homepages, section pages, and links within articles.
For the specific topic of asylum seekers, we looked at where else people talked about this topic:
Twitter sent almost as much traffic as Google to stories about this topic. And Instagram shows up as a significant referral – one of the more surprising results in this data. With data about what’s relevant to readers right now, and where, teams can be more pointed in how they spend precious resources, including their time.
Narrow in on what topics matter to different localities
Where people are paying attention to stories doesn’t just apply to places on the internet—search, social, etc. Readers’ physical locations impact attention, too.
For stories about immigration, attention wasn’t at the same level across the entire country. Certain areas, including parts of Texas, New Mexico, and Arizona, over-indexed, represented by darker colors on the maps below.
Geographic data can also inform where a story gets distributed. If you’re pushing your immigration story out on Facebook and Instagram, geo-target cities or states where attention, and therefore interest, is already high. Or if your newsroom is part of a network that spans multiple regions, this information can help guide syndication strategy.
Don’t get more data. Get relevant data
It would be amazing if organizations had more of everything: more staff, more resources, more time, more universally accessible sources of data and information. But the reality is you have to pick and choose your data wisely. You need sources that can surface opportunities for your site, rather than the same lists and information everyone else uses.
In the absence of time and resources, focus on making sure you have the right data at your fingertips. Pay attention to the data that’s relevant: what audiences care about right now, where they’re finding that information, what stories are related, and where the gaps are.
Google and Facebook’s dominance of the digital advertising market is well documented. Pivotal Research’s Brian Wieser estimated at the end of 2017 that these two companies “accounted for 73% of all digital advertising.” In the same report, he also estimated that the duopoly accounted for 83% of all growth, which means their grip is only getting tighter.
Scale is the main reason these companies dominate. Facebook has a huge user base, particularly when you factor in its ownership of Instagram and WhatsApp. And let’s not forget its “like” buttons on more than 8.4 million websites, a number Facebook recently disclosed in its answers to lawmakers’ questions during the Cambridge Analytica hearings. Despite lingering legal questions, those “like” buttons continue to collect data about the sites visited by Facebook users regardless of whether those consumers are logged in to the platform. Even more concerning, the buttons also collect data about non-Facebook users. The result is a massive and rich database of consumers’ personal information and activity, which Facebook offers to advertisers for targeting ads to very specific audiences on and off Facebook’s properties.
Google also offers hugely popular consumer services including its search, Gmail, Chrome, and maps applications. Of course, it mines them for very sensitive information about consumers (e.g. personal messages, friend networks, interests, location). Oh, and let’s not forget that Google has a hand in every part of the digital advertising supply chain. Like Facebook, Google is ubiquitous and nearly impossible for consumers to avoid.
No competition
Some might be tempted to argue that these companies serve as competitors to one another. Except that they don’t compete on price. The fact that these services are offered free of charge has enabled their lawyers to shield them (so far) from any antitrust scrutiny in the U.S. What they really compete over is which company can collect more data about consumers. This unbridled competition for data is all the more corrosive because it lacks transparency and offers consumers no effective choice mechanisms.
Consumer groups assert that this dynamic is bad for privacy. And they might have a point: Consumer trust is at an all-time low while adoption of ad blockers continues to rise. Yet Google and Facebook haven’t felt the impact of this lack of consumer trust. That’s because these companies don’t care where ads are served and have virtually unlimited places to serve them (meaning ads can appear near any content, which can create a negative impact by association). Instead, other tech companies, marketers, and publishers bear the brunt of consumer ire.
A dangerous game
But, there is more going on here. Facebook and Google have created a feeding frenzy in the vast, and expanding, data pool. Other companies recognize that they have to jump in or risk their ability to compete for digital advertising dollars. Verizon bought AOL and Yahoo! solely to have more ad technologies and inventory. Combined with the significant information about the activities of their user base, this might give them an opportunity to compete for the duopoly-dominated digital advertising dollars. Not to be outdone, AT&T bought AppNexus, the largest independent ad technology company. It’s hard to fault these companies for joining forces given the world they’re competing in.
From my perspective, the duopoly is creating a giant game of musical chairs. If this dynamic is allowed to continue, it’s not hard to imagine that publishers will be forced to pair up with a telecom provider or to form giant conglomerates in order to ensure that they have a chair when the music stops.
Unfortunately, this will have profound effects on well-known premium publisher brands and an even more devastating impact on the ability for new voices to emerge. Ironically, the internet was supposed to increase the number and diversity of voices. Instead, the duopoly is quietly reshaping the web into a dystopian data collection machine.
Marketing professionals, sales teams, and business owners are constantly chasing new audiences to target. This is only natural: the more you grow your business and the more customers you reach, the greater your success. It is the core element of all business: the need to achieve stronger and greater consumer support.
But we all know that targeting new audiences is no “piece of cake,” especially when they are beyond who you normally encounter or engage with. There are a lot of unknowns in increasing the size of your target audience. And marketers and publishers need to evolve with the technology around them.
While lookalike modeling isn’t a new term in the marketing technology industry, it is still widely misunderstood. It is a tool companies use to expand their digital audiences while maintaining clear and relevant targeting practices. This piece will take a look at what lookalike modeling is, how it works, and why it is so important for advertising campaigns.
What is Lookalike Modeling?
If you are looking to increase your targeting efforts with high performing audiences, the answer doesn’t lie in a mass chain of random messaging to consumers who have zero interest in your product. Instead, your focus should be on your high performing audiences. What is it about your highest performing audiences that sets them apart from the rest? Do they have common interests? Are they from the same geographic area? The best way to acquire new, high performing visitors is to focus on users who resemble your existing visitors: the users who have already shown an interest in your product or service.
Lookalike modeling is a process that uses machine learning to statistically analyze a given seed audience (an existing high-performing audience), identifying the demographics, characteristics, and combinations of those and other data points that define it. This creates new audiences composed of users who match these learned insights, which are continually updated.
For example, let’s say you’re hoping to target people who are more likely to click on your ad or watch your video. Lookalike modeling uses machine learning to find more users who will take that specific action. This means your campaigns can scale to reach more people while maintaining a high engagement rate.
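To make the idea concrete, here is a minimal, hypothetical sketch of lookalike scoring: represent each user as a feature vector, average the seed audience into a centroid, and rank prospects by cosine similarity to it. Real vendor systems are far more sophisticated; the feature names and users below are invented for illustration.

```python
import math

def centroid(vectors):
    """Average a list of equal-length feature vectors into one centroid."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def cosine(a, b):
    """Cosine similarity between two feature vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical features: [visits/week, pages/visit, clicked_ad, purchased]
seed = [[5, 8, 1, 1], [7, 6, 1, 1], [6, 9, 1, 1]]   # known converters
prospects = {"user_a": [6, 7, 1, 0], "user_b": [1, 1, 0, 0]}

c = centroid(seed)
ranked = sorted(prospects, key=lambda u: cosine(prospects[u], c), reverse=True)
print(ranked)  # prints ['user_a', 'user_b']: user_a resembles the seed more
```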
How does Lookalike Modeling work?
Let’s say you are a clothing brand looking to boost online purchases for an upcoming sale. The first step would be to place a pixel – a small segment of tracking code – on your purchase confirmation page. This will allow you to track the behaviors of purchasers – during the current sale – as they move throughout the web.
The demographic and behavioral data points of anyone who completes a sale and reaches the confirmation page can be ingested into a DMP and analyzed to identify which behaviors and patterns are most common across the audience. Once those customer characteristics and data points are identified, you can use them to create your new seed audience and ultimately engage an even greater target audience.
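As a rough illustration of that seed-audience step (the trait names and users are invented, and real DMPs work with far richer data), one could count which attributes recur most among users who reached the confirmation page:

```python
from collections import Counter

# Hypothetical attribute lists for users who completed a purchase.
converters = [
    ["age_25_34", "urban", "mobile", "sports_fan"],
    ["age_25_34", "urban", "desktop"],
    ["age_25_34", "suburban", "mobile"],
]

# Count every trait across all converters, then keep the ones shared
# by a majority to define the seed profile.
trait_counts = Counter(t for user in converters for t in user)
seed_profile = [t for t, n in trait_counts.most_common() if n >= 2]
print(seed_profile)  # prints ['age_25_34', 'urban', 'mobile']
```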
It’s a truly incredible process that uncovers the hidden attributes that can optimize your performance for future campaigns. Icing on the cake? It’s all done in a centralized platform.
Why Use Lookalike Modeling?
Well, I guess the question would be, “Why not?” As an advertiser, when consumers within a specific audience convert, it’s a fulfilling feeling. And while you want to hold on to that high performing target audience, you still want to ensure you are growing your brand following.
A lookalike modeling tool helps you identify a larger pool of possible customers. You can use it to seek out audiences whose behaviors match those of your target audience, giving you a greater chance to convert them. It’s essentially the same as building a robust profile of existing customers, only you are doing it for the audiences you have yet to reach and engage with.
In summary, Lookalike Modeling offers marketers and publishers a valuable approach to reaching new or current consumers in a cost-effective way, ultimately helping companies to grow the scope and reach of their businesses. Why not take advantage of your already high performing audiences and increase your brand awareness? As the industry continues to look for new and innovative ways to reach new consumers, lookalike modeling will help marketers stay true to their core and use what they already know to improve campaign success.
G-day is looming! On May 25, the EU’s General Data Protection Regulation (GDPR) will kick in and – despite the hype – many publishers aren’t ready for it.
Over the course of Advertising Week Europe, held March 19-22 in London, several panels grappled with the issues surrounding GDPR. It became clear that, while marketers are focused on new ways to reach consumers, publishers are faced with the challenge of understanding consumer consent, and if necessary de-personalizing the message, while maintaining trust and keeping revenue streams flowing.
During the Digital Content Next seminar, I had the opportunity to ask DCN CEO Jason Kint, Anthony Hitchings, Digital Advertising Operations Director for the Financial Times, and Jo Coombs CEO of Ogilvy One UK, whether GDPR (referred to in some circles as the God Damn Privacy Rules) represents a massive headache or a huge opportunity.
Headache or Opportunity
Trust is the number one priority, explained Kint. “Trusted relationships and transparency become key as publishers work with vendors and with consumers to ensure satisfaction. Premium publishers could see user loyalty rise, if intrusive messages become standard on all other sites. Premium publishers, with trusted and recognizable brands, stand to come out ahead as the industry experiments.”
“It’s certainly keeping us busy, but I wouldn’t say it’s a headache,” said Hitchings. “For months now, we’ve been doing supplier reviews — due diligence with all our suppliers. We’ve been doing system audits, we’ve been practicing system access requests based on our own platform, and looking at the length of time we hold data for.” Even though this might seem like more than a bit of a headache to many, Hitchings said he’s optimistic, because the FT has a “direct relationship with users.”
The Relationship Business
All the panelists agreed that the nature of the relationship with users is key. Publishers need to be open and honest about data collection. However, it poses a serious threat if the industry at large thinks it’s just fine to grab profiles or take surfing data. “When you know how consumers feel, you start to feel slightly more concerned about what the industry does,” Hitchings added.
“Every single piece of data represents a person, and that person obviously needs to be protected and respected by the brands. We are talking to brands about not just the media they are purchasing and how they are using consumer data. This is every single touchpoint with a customer. It is their data and they need to be in control of it,” said Coombs.
However, she added that while a lot of work is being done on GDPR from the business side, consumers are not being educated. They need to better understand what they are giving up when they click a box to dismiss a cookie-use banner, she explained. “We need to help consumers understand what they are ticking and what they are saying yes to,” she added.
Hitchings pointed out that trying to convey the complexities of ad exchanges to users is “not going to be an easy job.”
“The biggest concern that publishers have right now is the actual execution of how to have that very discussion,” said Kint.
“Publishers that have a trusted relationship [with users] can decide what to put in front of that user. Typically, if a user is visiting a site like the Financial Times, or even a search engine like Google, they have a general idea that their data is being collected and used in certain ways. And they are probably okay with their data being used for personalization of the page, or posting on the message board, or fraud protection, etc. But for a data broker to be watching what they’re doing and then reusing their data across the web is probably outside their expectations,” he continued.
Data Dilemma
“Google currently collects data from about 80% of the top one million sites and uses that data however it wants to maximize its own value. The very notion that Google can continue to do that after GDPR is a concern. Users do not want Google watching the web,” Kint continued.
“There’s a fine line between being personalized, useful and relevant, and being a useful value exchange, and actually being just too creepy,” agreed Coombs.
The issue that hasn’t really been addressed by the industry is third-party risk, said Hitchings. “We’re trying to understand what adtech is doing with data and one piece of due diligence took more than a year! So, I don’t think publishers are going to have the bandwidth to assess the risk for more than a handful of partners,” he explained.
This aligned with Kint’s view that only trusted publishers and partners can be successful post-GDPR. And that’s going to be a good thing for consumers as well. Hitchings pointed out that a couple of thousand trackers on every page is just daft — and creepy.
Better known as Brusselsgeek, Jennifer Baker has been a journalist for 20 years, the last 8+ specializing in EU tech policy and digital rights. A member of the Expert Council of the Good Technology Collective, Jennifer is on the editorial advisory board of the Journal of Data Protection and Privacy, and was named by Onalytica as one of the world’s Top 100 Influencers on Data Security 2016. She was also listed by Politico as one of the Top 20 Women Shaping Brussels 2017.
Jennifer writes for some of the biggest names in media, including ArsTechnica, Computerweekly, TheNextWeb, Macworld, PCworld, and The Register. She regularly features as an EU policy expert on BBC radio.
The digital media business is finally making the shift from attention to user engagement. We see users as individuals rather than sets of eyeballs and focus on winning hearts and minds. This is a huge, ultimately positive change that will produce a much healthier media ecosystem. But it’s not going to be easy. It requires new technology, new marketing and product skills, and most importantly a change in mindset from content-first to customer-first. That means moving away from some very entrenched habits.
The first 20 years of the consumer internet, especially in media, have been almost entirely about aggregating audience. Sites seek to attract millions, often tens of millions, occasionally hundreds of millions of people, with all those eyeballs “looking” at billions of banner ads. That focus on big, unidentified, often undifferentiated audiences made it possible for media companies to take the existing ad models—based mostly on audience size—and adapt them pretty easily to digital. Yes, there were significant creative and technical challenges in making that shift—learning to create digital stories and to sell and serve digital ads. But fundamentally the model itself didn’t change much.
The relationship with the audience was still largely a one-way, anonymous relationship, despite the new ability to engage directly, to measure behavior and to learn more about that audience. Most media companies were shortsighted, opting to avoid friction-inducing roadblocks like registration in order to maximize unique visitors, pageviews and ad impressions, missing a chance to develop a direct relationship with readers.
The end result: massive harvesting of user data, ad-cluttered sites powered by the ad tech Lumascape, “recommended for you” widgets, ad fraud, and ultimately unhappy, ad blocking users.
But simply launching a paywall, adding affiliate links or announcing an event series isn’t enough (hello, Buzzfeed). That’s just throwing new revenue streams up against the digital wall like spaghetti. There are four essential elements required for success in the new user-engagement era of digital media: customer knowledge, product strategy, enabling technology and marketing skill.
Let’s dig into each in turn.
Customer knowledge
In the attention era, media companies didn’t need to change their fundamental model. We could still follow an editor-first content strategy—writing about what editors thought was important or interesting. And the ad tech revenue stream didn’t require any understanding of who was reading beyond some basic demographics. Yes, there were audience analytics, paying attention to SEO trends and later social traffic. But the starting point was always what WE thought was interesting. We didn’t truly know our customers. In the user engagement era, understanding the reader (or viewer) has to come first. Whom are we serving? What can we learn about them? What do they need to live their lives, do their jobs or be entertained? Then we can apply editorial and product creativity to serve, surprise and delight them with great products and stories they didn’t know they wanted.
Product strategy
Once you know your customer, developing the right product to serve them takes more than creativity. It also requires focus, experimentation and iteration. In product management terms, it’s “finding product-market fit.” Focus means keeping your eye on the customer you’ve identified when deciding what product ideas to pursue and rejecting ideas that aren’t a fit for those customers. Experimentation and iteration go hand-in-hand. Build enough of the product to test it with your customers (or at least a few of them), see what works and what doesn’t, and iterate to make changes and improve the product. This method applies across multiple dimensions of product and business decisions—from editorial and product focus, to features, to pricing. It’s also an ongoing process, continuing even after achieving success.
Technology
While there has been a massive investment over the past 20 years in ad tech, there’s been relatively little investment in software and services to understand and engage with users as individuals, to measure behaviors like loyalty and conversion to repeat usage. Driving user engagement and powering consumer-paid content requires a robust technology platform that provides measurement and reporting, customer messaging, content gating rules, entitlements, and payment processing. Moving forward, machine learning will be a powerful tool for anticipating which users are most likely to become loyal and ultimately willing to pay.
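As a toy illustration of the loyalty-prediction idea (the signals, weights, and threshold here are invented; a production system would learn them from historical conversion data), a simple propensity score might look like:

```python
# Invented weights for a few engagement signals; a real platform would
# fit these against actual subscription conversions.
WEIGHTS = {"visits_30d": 0.5, "newsletter": 2.0, "registered": 3.0}

def propensity(user):
    """Weighted sum of a reader's engagement signals."""
    return sum(WEIGHTS[k] * user.get(k, 0) for k in WEIGHTS)

readers = {
    "casual":  {"visits_30d": 2, "newsletter": 0, "registered": 0},
    "regular": {"visits_30d": 12, "newsletter": 1, "registered": 1},
}

# Flag readers above an (invented) threshold as likely-to-pay candidates.
likely = [name for name, u in readers.items() if propensity(u) >= 5.0]
print(likely)  # prints ['regular']
```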
Marketing skill
It’s become conventional wisdom among media business observers that—through his often disparaging tweets—President Donald Trump deserves a lot of credit for the recent success driving subscriptions at The New York Times, The Washington Post, and other media companies. The follow-up question is often “What happens when the ‘Trump bump’ fades?” Piano’s CEO, Trevor Kaufman, points out that’s a pretty limiting way to look at it. No one asks, “Will consumers pay for Nike shoes?” the way media pundits ask “Will people pay for journalism?”
The problem is that most media companies don’t know how to think like product marketers. They generally don’t have the skills in house, don’t have the tools available, and aren’t building marketing into their business plans and P&Ls. So, once you understand your customer, have created a compelling product they’re willing to pay for, and have the technology to support it, the last element to put in place is the ongoing marketing plan to drive customers through the engagement funnel. Then Google and Facebook transform from behemoths with the power to slash your audience and destroy your business into just another channel for marketing your product.
The ultimate vision is a healthy media market based on true relationships with known customers. For publishers, creating products that meaningfully connect with a loyal audience will unlock multiple revenue opportunities—whether consumer-paid products, events, merchandise sales or even advertising based on that real customer connection.
Strategies for differentiating their premium news and entertainment companies in an environment of disruption, trust issues, and monetization challenges were the focus of the annual closed-door members-only Digital Content Next (DCN) Summit held Feb. 8-9 in Miami, Florida.
DCN CEO Jason Kint updated attendees on consumer privacy, net neutrality, and press freedom policy initiatives. He said that pressure on platforms will increase this year and that advertisers will seek greater transparency. Kint cited findings from DCN’s new Distributed Content Revenue Benchmark Report, which found that publishers only garner 5% of their revenue from social platforms. However, he also touched upon the growth in paid content, on-demand video, and promising signs of sustainable advertising models.
Trust
For the digital media industry, trust has reached a crisis level, Kint said. He and other speakers throughout the event pointed to the 2018 Edelman Trust Barometer, which reveals a low consumer perception of the media, platforms, and advertisers—particularly around digital.
An absence of trust has been a driving factor toward regulatory scrutiny in the U.S. and abroad. It has also profoundly affected digital advertising, one of the mainstays of the industry. Kint applauded DCN members for embracing DCN’s new tool for rebuilding trust: TrustX. The cooperative private programmatic marketplace serves as a collaboration platform for marketers and publishers to create innovative advertising solutions that drive measurable value and improve the consumer experience with confidence and safety at scale.
Kint was far from alone in extolling the importance of trust in the digital content marketplace, however. Fatemeh Khatibloo, principal analyst at Forrester Research cited the building blocks for trust, which include integrity, competence, transparency, privacy, and data security.
David Sable, Global Chief Executive Officer, Y&R, noted that trusted brands employ honesty, environmental sustainability, and kindness. He also pointed out that millennials are keen to identify trusted news sources. Building trust starts early, according to Sean Cohan, President, International and Digital Media, A+E Networks, who cited how brands such as the History Channel have become a trusted source for students.
While Edelman’s barometer noted a five-point jump in trust of journalists, a social media-weaponized world has given way to readers and viewers expressing anger, often anonymously and without consequences, as vividly reported by a panel of journalists: Arianna Davis of Refinery29, Jorge Ramos of Noticiero Univision, CNN’s Brian Stelter, and Katy Tur of MSNBC Live.
Brand Quality and Context
People won’t pay for brands that don’t focus on quality, noted Andrew Essex, former CEO of Tribeca Enterprises and Droga5. Quartz President and Publisher Jay Lauf also emphasized value-based selling over commodified volume selling.
Context is critical, he said, adding that marketers “are terrified” about ads appearing on an exploitive YouTube video or inadvertently funding fake news on Facebook. And Hearts & Science research on negative reach confirms that advertising appearing next to content a consumer finds offensive does more harm than good, according to the agency’s president, Zak Treuhaft.
And, in a world dominated by memes and disembodied news delivered via social platforms, “Context is king,” according to Sean Cohan, President, International and Digital Media, A+E Networks. For example, he pointed to the History brand’s increased emphasis on providing a larger historical context for today’s news, such as the history of sports figures’ involvement in political protests.
Disruption and Opportunity
Disruption has led to a competitive marketplace imbalance as DCN member companies try to transform their business models, as Kint noted. At the same time, disruptive technologies, such as voice assistants, can create significant opportunities.
Loren Mayor, COO, NPR, spoke of the organization’s mission to connect with people through storytelling journalism, and said NPR is using on-demand audio and podcasting to enhance audience growth and engagement.
Smarter use of data and respectful personalization were subjects that came up in a number of conversations and presentations. More-informed data will help drive value, according to Lou Paskalis, SVP, Enterprise Media Planning, Investment and Measurement Executive, Bank of America Merrill Lynch.
Marcus East, EVP, Product & Technology/CTO, National Geographic, said that successful brands create personalized experiences and help consumers save time and money, create emotional connections, offer life-changing elements, and promote positive social impact.
That said, in today’s uncertain digital environment, the hallmarks of reputable journalism have reemerged as critical for consumer trust and attention. Michael Anastasi, VP News, USA Today Network, Tennessee, pointed to the importance of the Indianapolis Star’s investigative coverage of U.S. Olympic gymnastics doctor Larry Nassar, which stands out at a time of uncertainty about local news outlets’ survival.
Anastasi said that USA Today leverages its local/national symbiosis to inform some of its stories. He cited the brand’s coverage of the opioid crisis across all platforms—and with national, local, and individual ramifications. The comprehensive coverage was made possible through a sponsorship from BlueCross BlueShield of Tennessee.
In addressing financial sustainability in non-profit journalism, ProPublica President Richard Tofel noted significant growth in donation-based revenues since the 2016 U.S. presidential election. The non-profit model seems to be working for ProPublica as Tofel said that they launched with a staff of 25 nine and a half years ago and now number more than 100.
Diversification and Monetization
Unsurprisingly, revenue was a key topic at the Summit. And while advertising remains a critical focus, diversification was a dominant theme. In all aspects of monetization, good consumer experience and engagement were essential. As Ed Davis, EVP & CPO Advertising Products, Fox Networks Group put it: “Attention is currency.”
Maggie McLean Suniewick, President, NBCUniversal Digital Enterprises, showed off the many ways the company’s Olympic coverage is tapping into a wide range of platforms to engage target audiences wherever they might be. Bloomberg Media’s initiatives include global partnerships that help it transcend the competitive U.S. market, according to Scott Havens, Global Head of Digital, Bloomberg Media. And The Washington Post has launched 15 products specifically designed to drive consumer interaction, according to Jarrod Dicker, The Post’s VP of Innovation and Commercial.
The History Channel is leaning into new platforms and partners with The New York Times on stories and photo spreads. Sean Cohan, President, International and Digital Media, A+E Networks said that the company is seeing doubled social engagement, significant newsletter interest, and substantial boosts in YouTube video revenues.
Marty Moe, Vox Media President, said his company focuses on finding ways to grow quality, scale, and audience across its eight brands while retaining relevancy on each platform. However, diversification brings challenges such as tracking and measuring performance on multiple platforms, noted Christy Tanner, EVP & GM, CBS News Digital, CBS Interactive.
Dr. Jens Mueffelmann, CEO of Axel Springer Digital Ventures GmbH and President of Axel Springer USA, said his company’s success in global acquisitions is based on later-stage investment, development, and partnership. While its classified-ad profits have stunned critics, Mueffelmann urged companies to “stay paranoid” and continue to keep a close eye on emerging digital technologies and players.
On the heels of the news that The New York Times added 157,000 digital subscriptions in the 2017 fourth quarter, pushing its subscription revenues – which comprise 60% of overall revenues – to more than $1 billion, COO Meredith Kopit Levien encouraged everyone to get into the subscription business. It’s important to understand what drives subscribers, she said. For The New York Times, it’s the resources to create better original content, including 250 daily stories, a popular crossword puzzle and a cooking app, she said, noting “our strength is as a brand.”
While challenges in trust, brand quality, disruption, and diversification continue to throw up roadblocks in the news and entertainment industry, Kint emphasized that for DCN members there is strength in numbers, citing The New York Times’ subscription success as a victory for all DCN members because of what it symbolizes for the industry.
At the core, DCN members are focusing on what they do best and continue to innovate and experiment in order to best serve audiences.
“All of our members have a direct and trusted relationship with your audience and with your advertisers,” Kint told the packed conference room. “They come to your brands because they know what they’re going to get when they give you their valued attention or valued advertising dollars.”
Spend on data-driven advertising is surging as brands continue to realize the benefits of more granular and personalized campaigns. However, the way the advertising industry uses, captures, and analyzes consumer data will change dramatically in 2018. And that will impact just about everything.
Here are two big changes on the horizon that are poised to significantly impact digital advertising:
GDPR is coming – and that’s OK.
In May, GDPR will go into effect, and with it come new rules surrounding digital data collection. EU audiences will have more control over how their personal information is collected and used.
In the ad industry, reception to this new regulation has generally been negative. Brands and agencies rely on consumer data to develop personal – and ultimately successful – advertising experiences. GDPR, however, will treat anonymous and personal data identically, which could have a major impact on ad experiences for consumers in the EU. What’s more, there are significant compliance costs that come with GDPR: those who don’t comply will face fines of as much as 4% of global revenue.
Now, while these new guidelines could cause some early problems for advertisers, GDPR will also bring with it key long-term advertising benefits. For example, GDPR raises the bar for opt-in data collection. While this will reduce the scale of data collected, it will dramatically raise the quality. This means better ad experiences for consumers and stronger ROI. More marketers will realize this as we get closer to the launch date.
Second-party data is king for ad transparency in 2018.
Second-party data was all the rage in 2017 and that will continue this year as well. Essentially, second-party data is someone else’s first-party data that you access directly from them. There is no data aggregator or other “middleman” in the exchange of second-party data. Through a direct relationship with the owner of the first-party data, you can define exactly what data is being bought or sold, the price of the data, and any other commercial terms.
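The direct nature of such an exchange can be sketched in code. This is purely illustrative—the class, field names, and figures below are hypothetical, not any real platform’s API—but it shows why second-party deals leave no ambiguity about what is shared, at what price, and on what terms:

```python
from dataclasses import dataclass


@dataclass
class SecondPartyDeal:
    """Hypothetical record of a direct, second-party data agreement.

    Because the buyer deals directly with the owner of the first-party
    data, every commercial term is explicit -- there is no aggregator
    in the middle obscuring provenance or price.
    """
    data_owner: str       # the company whose first-party data this is
    buyer: str            # the company licensing that data directly
    segments: list        # exactly which audience segments are shared
    price_cpm_usd: float  # agreed price per thousand records
    term_months: int      # duration of the agreement


# Made-up example deal between two fictional parties
deal = SecondPartyDeal(
    data_owner="ExamplePublisher",
    buyer="ExampleBrand",
    segments=["auto-intenders", "frequent-travelers"],
    price_cpm_usd=2.50,
    term_months=12,
)
print(deal.data_owner, "->", deal.buyer, "at", deal.price_cpm_usd, "CPM")
```

Every term lives in one explicitly negotiated record between the two parties, which is the transparency the paragraph above describes.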
The possibilities with second-party data are endless. Advertisers can choose the data sources they feel are most relevant to their campaign optimization, filtering out all the unnecessary stuff. This is where you can take data-driven marketing to a place that is not only unique, but extremely efficient. And what second-party data may lack in “scale,” it makes up for with precision.
In 2018, demand for second-party data will explode, driven by advertiser calls for transparency and clarity in the data they use. By cutting out the middleman, marketers can go directly to companies that they know will have the most important or highest-quality data. Because the data is unique and a direct relationship is in place, quality is never an issue.
With GDPR looming and transparency still an evolving battle, 2018 promises to be an interesting year for advertising data. The industry is changing – and while some aspects of data collection and activation will become harder, as a result, at large, GDPR and the shift towards second-party data are likely to enable a new depth of data quality. That’s a good thing.
As Lotame’s Chief Revenue Officer, Eric Marterella is responsible for overseeing all revenue-focused and client retention initiatives worldwide. Since earning his degree from Virginia Polytechnic, Eric has garnered more than eighteen years of sales leadership experience across industries, at companies including Digex, AT&T, and Cisco. Most recently, he served as a Global Vice President at Sprinklr, where he was instrumental in the company’s growth from $20M to a $1.8B valuation. Eric is a co-chairman of The Leadership Breakfast, a monthly executive leadership forum.
As chief marketing officer for IBM Watson Content & IoT Platform, Jordan Bitterman oversees messaging to both internal and external audiences. He is responsible for all marketing functions across the IBM Watson Content & IoT Platform division – connecting b2b marketing, ad sales marketing, consumer marketing, as well as corporate communications. Based in New York, he is part of the IBM Watson Content & IoT Platform executive leadership team and reports to GM, Cameron Clayton. Here, Bitterman offers his thoughts on some issues he thinks will be top of mind for marketers in 2018:
What’s the biggest challenge that marketers face in 2018?
JB: The benefits of the big players in the digital space – scale, relative brand safety, and deep targeting capabilities – have always had an equally challenging downside: a lack of data flexibility. Even as these players have been addressing marketers’ current needs, they have been amassing a massive stockpile of data on their customers. They have been collecting it, cleaning it, and preparing it for use across all aspects of their business: marketing, sales, CRM, logistics, and yield management. The big challenge today: putting it to use.
As an industry, how should we best address this issue?
JB: The big, walled gardens in the industry act like a roach motel: data comes in, but it doesn’t come out. For the most sophisticated brands, 2018 is the year when they will address these issues and start demanding that their media partners become flexible with data. These brands will require that their own data can be mixed with publisher data and applied on and off the publishers’ own platforms. This “pivot to flexibility” will begin this year and will ultimately upend the norms created over the last decade.
What do you think the biggest opportunity for marketers will be in 2018?
JB: As the Internet has become a tangled web of nontransparent, unknown, and unsafe environments composed of millions of publishers, Facebook and Google have been the prime receptacles for brand dollars. Brands have largely directed spend there in the interest of brand safety – a known is better than an unknown. And, as we have seen, the duopoly accounts for about 85% of all digital ad spend. But that trend will begin to ease in 2018. The opportunity that will come into clearer focus for marketers is their ability to port data into and out of platforms.
Major brands have cut – or considered cutting – their spend in the programmatic space and have moved those dollars to the duopoly. It is important that we get the message out that programmatic isn’t the problem. Programmatic is just a way of buying media. The sheer volume of the site list – and the efficacy of those sites – is the most important issue for buyers to consider as they shift dollars back into programmatic environments. The top 100 global website traffic list – which includes brand-safe properties such as The New York Times, ESPN, c|net, and The Weather Channel – is populated with media brands that manage brand-safe environments and can deliver audiences both at scale and with extreme competence.
Given the abundance of options for today’s digitally-savvy consumers, it has never been more important to understand their preferences and expectations in order to best serve their needs and retain their business. The intersection of privacy and personalization is a particularly tricky spot to navigate. Today’s consumers are more worried about data privacy than they are about losing their income. At the same time, they expect increasingly personalized digital experiences. There is only one way to bridge that divide, according to the 2017 Accenture Strategy Global Consumer Pulse Research: Build Trust.
Accenture’s research, which surveyed 24,877 consumers in 33 countries, found that 48% of consumers expect specialized treatment for being a good customer. Accenture believes this entails “next generation personalization,” which it calls “hyper-relevance.” Many current personalization tactics are static, keyed to certain consumer behaviors. Hyper-relevance—like today’s digital consumers—is “always on”: much more dynamic, constantly changing, and always available.
Rather than focusing solely on customers’ purchase behaviors and preferences, or on relatively fixed attributes such as their address or number of children, Accenture says it is essential to understand customers’ needs in a given circumstance and the evolving context in which they make decisions.
As such, data gathered from website visits, social media posts, or previous purchase histories will not suffice. Rather, what’s needed is information that is much more personal in nature—such as health data transmitted via wearable biometric technologies. Needless to say, that’s getting highly personal. And when things get that personal, the potential rewards go up immensely. However, risk also rises.
It is heartening to note that two-thirds of those surveyed said that they are willing to share personal information with companies. But there’s a catch. They will only do so in exchange for some perceived value. And if that value exchange—or the trust upon which it is based—is broken, customers will quickly move on.
Accenture emphasizes that securing and maintaining consumer trust is a prerequisite to achieving the promise of intelligent personalization. They point out that it takes time to build trust, but that it can be shattered with one wrong move. Thus, companies must counter that risk by constantly presenting themselves as trustworthy, keeping their promises, and upholding their end of the value exchange agreement.
As companies seek to deliver hyper-relevance, Accenture makes three recommendations:
1. Look beyond the traditional customer journey
Companies that distinguish themselves with hyper-relevant experiences look beyond the traditional customer journey. They identify and prioritize those areas where hyper-relevance can deliver added value and quickly address the unexpected. Ask questions like: What can we offer once we realize our customer has missed her flight? Received a job promotion? Been forced to flee a hurricane? In these situations, customers need different things and relevance becomes supremely important.
2. Rethink data
Hyper-relevant companies don’t rely solely on descriptive analytics or traditional sources of information. They invest in predictive analytics, collaborate with an ecosystem of stakeholders to capture real-time snapshots of every consumer, and mine data in new ways to understand the customer journey that extends beyond core products and services and across channels. In addition, hyper-relevant companies redouble their data security efforts. They ensure customers have full control of their data across touch points. They eliminate duplicate requests for customer information and permissions. And they make sure all customer data is secure and visible to employees on a need-to-know basis.
3. Earn trust continuously
Trust must be a key consideration when designing hyper-relevant experiences, creating new customer value propositions, and serving as a critical resource when customers need them most. A company’s commitment to delivering the experiences that were promised and meeting customers’ expectations is paramount. Hyper-relevant companies understand their baseline level of trust, and eliminate issues or irrelevant offers that detract from the trust quotient. They make trust sustainable by establishing a rigorous process and a robust, cross-functional governance structure to continuously measure trust and hyper-relevant effectiveness—and act on their findings. Most importantly, they manage trust as the critical growth enabler it is.
The virtuous circle
Companies that are attentive to their customers’ concerns and reinforce their trust quotient are more likely to persuade customers to share personal information. That, in turn, helps to inform the design of the kinds of hyper-relevant experiences that today’s consumers expect. So, while companies are finding valuable ways to leverage data to super-serve their best customers, they have also begun to realize that the digital trust consumers place in companies is as critical as the data itself.
According to Kantar Millward Brown’s fourth annual Getting Media Right study, 40% of marketers believe their media investments should be allocated to cross-channel and cross-device marketing, rather than any single channel like mobile or television. But less than 50% of advertisers in the study had confidence that their ad dollars are currently being effectively allocated in this manner.
It would appear that, despite the seemingly endless amount of data available, advertisers still feel that gaps exist in available research, hampering their decision making. One of the biggest white spaces is in cross-platform and cross-channel data, with more than half of advertisers (61%) saying there are gaps in holistic measurement.
Almost every marketer queried (90%) said that while their digital strategies are integrated into their overall brand strategy, most still don’t understand the impact of these strategies across channels. Seventy-four percent say this is because it’s tough to maintain an integrated brand strategy in a fragmented media landscape.
Universal currency
For cross-channel comparison and understanding, marketers effectively need a currency that can be used across all channels. But we still lack a true apples-to-apples comparison between the old and new platforms. Traditional platforms like television and radio still focus primarily on reach and frequency metrics, while online and mobile advertising is measured primarily on ROI or sales.
When it comes to measuring cross-channel performance, reach and frequency is the most widely used metric, but it is still used by only about half of advertisers; ROI or sales is used by 47% of marketers, highlighting the difficulty the industry has had in bridging the gap between digital and traditional media.
Measuring ROI
ROI is the bane of many marketers’ existence, identified for the third year in a row as the single hardest metric to get right. Though a majority of marketers feel confident in their ability to track digital channels like online and mobile, 54% have difficulty tracking traditional channels and half have difficulty tracking cross-channel ROI.
When you break it down even further, the ROI divide between traditional and digital channels is apparent. While 64% of marketers use it to measure online and half apply it to mobile, only one-third use ROI to measure TV and other traditional media.
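The calculation itself is not the obstacle; attributing revenue to each channel is. As a reference point, here is the basic per-channel ROI arithmetic in a minimal sketch. All figures are made up, and the hard part the study describes—estimating the attributed revenue for TV and other offline media—is simply assumed here:

```python
def roi(revenue: float, spend: float) -> float:
    """Return on investment: net return per dollar spent."""
    return (revenue - spend) / spend


# Hypothetical per-channel figures in USD. The "revenue" numbers assume
# an attribution model already exists -- which, per the study, is
# exactly what marketers struggle with for traditional channels.
channels = {
    "online": {"spend": 100_000, "revenue": 240_000},
    "mobile": {"spend": 80_000, "revenue": 176_000},
    "tv":     {"spend": 250_000, "revenue": 400_000},  # attributed revenue is an estimate
}

for name, c in channels.items():
    print(f"{name}: ROI = {roi(c['revenue'], c['spend']):.2f}")
```

With comparable spend and attributed-revenue inputs for every channel, the same formula applies everywhere—which is the apples-to-apples comparison the industry currently lacks.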
It’s not like marketers don’t believe in those platforms. If ROI measurement improved, over half would increase spend on most channels. More than 75% would increase spend in cross-channel and the individual channels that fuel it.
It goes to show why attribution is so important and central to the overall health of the advertising industry. Accurate measurement leads to increased trust among marketers and their media agency partners, which leads to higher spends on advertising.
Digital advertising was supposed to be more efficient than television, print and radio, with technology giving marketers the ability to reduce waste by targeting only those who would want to buy their product.
But, as we now know, the digital advertising industry has yet to fully deliver on that promise. Unlocking cross-platform measurement is a crucial next step toward doing so.
When I was 14, I walked door to door selling subscriptions for the local newspaper. It paid a lot better than ditch digging (which I also tried), but it also required thick skin. I quickly learned that I needed a good elevator pitch that captured the essence and value of what I was selling and that helped me make a connection with the person on the other side of the door-frame. If I could do all of that in 10 seconds, then the conversation almost always went well. Sometimes it led to a sale, sometimes it led to “come back later,” and sometimes it just led to a pleasant conversation. All of those outcomes were infinitely better than getting a door slammed in my face. I imagine it was also a much better experience for the prospective subscriber who had just been interrupted by some 14-year-old kid ringing their door bell.
One key to improving my elevator pitch was having a thorough understanding of the product. In this case, the local newspaper. Then I needed to master the value proposition, which meant tuning into the customers’ needs. The key value for most subscribers was the connection to their local community. This was the 1980s, so the paper was the primary way to keep abreast of local politics, sports, events, weather, and to read the classifieds. It was impossible to communicate the value proposition to a prospective subscriber without understanding how a subscription would be helpful to them.
Now, we get all this local information in a very different way. But, what’s now often missing in today’s digital ecosystem is the basic communication with consumers. Some companies do it well – see Apple’s just-in-time notices that allow apps and websites to ask consumers for permission to collect and use sensitive data like location and contacts. However, entire categories of companies aren’t communicating with consumers AT ALL. Just look at the thousands of ad tech companies that track unwitting consumers across the web.
Maybe there’s value to data collection in these contexts. But companies that can’t effectively communicate a solid value proposition to consumers had better be ready for the door to slam.
The coming enforcement of the General Data Protection Regulation (GDPR) and the likely passage of ePrivacy Regulation in Europe will force companies to better explain what they are offering to consumers. In short, they’ll have to develop an elevator pitch.
For years, industry trade groups have heralded the consumer benefits of big data. But these benefits are often articulated in long-winded, research white papers designed to justify unbridled data collection to regulators. The fact is that, like their predecessor, direct mail, these “highly targeted ads” are more often viewed as creepy than carefully customized. Certainly, customers sometimes appreciate experiences tailored to their past behavior and preferences. However, giving them control over what data is collected and telling them why does not negate that possibility. Effectively communicating the “why” is the essence of the elevator pitch after all.
Starting in the spring of 2018, ubiquitous, non-transparent data collection is not going to fly in Europe. And in a global economy, that means marketers and advertisers everywhere need to pay attention. Companies will have to figure out concise and compelling ways to explain their value proposition to consumers in order to get their consent.
As a young newspaper salesman, I had to knock on the door and—if a customer was willing to open it—I had a brief opportunity to demonstrate the value the local newspaper could deliver. Otherwise, that door would close and I might not get another chance.
Based on my experience, the companies that understand why consumers value their product will be the ones that emerge victorious in this new landscape. Along the way, the whole digital experience might be improved for everyone. And the result will certainly be better than a potential customer opting to download an ad blocker instead.
The regulatory burdens of Europe’s General Data Protection Regulation (GDPR) have the advertising industry in a flurry. The industry press is rife with doomsday scenarios predicting the end of online advertising.
Is the regulation complex and in some instances, onerous? Yes.
Does it signal the end of online advertising? No.
GDPR does, however, force companies to examine the costs of collecting and using online behavioral data. Because GDPR increases the compliance costs associated with the collection of data, it becomes necessary to examine whether the collected data is worth the added compliance expense. Why absorb compliance costs for data that may or may not be valuable to clients and partners? While the industry has been talking about a “flight to quality” for years, GDPR has cemented Lotame’s commitment to data quality initiatives.
The Personal Approach
“Personal data” under the GDPR is broadly defined to include personally identifiable data points like name and email address, as well as less precise data points like cookies, device IDs, and IP address. The rationale is that modern technology allows marketers to identify individuals with one or more data points, even if those data points do not refer to each individual by name.
The problem with this broad definition of “personal information” is that GDPR treats anonymous data and personally identifiable information as similarly situated. A company that collects and uses cookie data has nearly the same compliance obligations as a bank that collects and stores highly sensitive financial data. The bank, and the consumers it serves, are all well aware of the risks inherent to the storage and processing of financial data. A breach has significant consequences for both the bank and the consumer. For this reason, the bank is able to pass some portion of its high compliance costs along to the consumer in the form of fees. These fees help to make banking one of the world’s most enduring, and lucrative, industries.
How the Cookie Crumbles
The potential loss or unauthorized disclosure of cookie data did not, until relatively recently, raise the ire of consumers or regulators. But now, with GDPR, cookie data is placed in the same category as financial data – meaning that a company like Lotame has the same compliance obligations with respect to its cookie data as does a bank that transacts in highly sensitive financial data. While regulators may view cookie data as having equal footing with consumer names and addresses, marketers do not, and they have not been willing to pay premium dollars for data that is viewed as less precise than personally identifiable information.
On the one hand, GDPR validates what Lotame has known and proselytized for years – that data is tremendously valuable – even when that data is not tied to personal information like name, email address, street address, or telephone number. On the other hand, GDPR makes it costly to collect, store and push large volumes of cookie and device data. There is substantial risk that compliance costs, relative to the earning potential of this data, are being driven too high. In order to drive revenue in the face of mounting regulatory burdens, Lotame and other players in online advertising must focus on quality.
Quality Over Quantity
Over the past year, we have invested heavily in technology and partnerships that emphasize quality over scale. Earlier this year, for example, we announced a partnership with Are You a Human to provide advertisers and their agencies the ability to quickly and easily identify non-human traffic and create bot-free audiences for ad targeting, analysis, content personalization, and more. This allows our clients to remove all profiles identified as bots, reducing wasted ad spend, decreasing bot traffic, and increasing conversions. For Lotame, it also ensures that we are not absorbing compliance costs for data that offers minimal value to our clients and partners.
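Mechanically, this kind of bot removal amounts to filtering flagged profiles out of an audience before it is activated. A minimal sketch follows; the `is_bot` flag is a stand-in for whatever verdict a human-verification partner would actually supply, and the profiles are invented:

```python
# Hypothetical audience profiles; "is_bot" stands in for the verdict
# a third-party human-verification service would attach to each one.
profiles = [
    {"id": "u1", "is_bot": False},
    {"id": "u2", "is_bot": True},
    {"id": "u3", "is_bot": False},
    {"id": "u4", "is_bot": True},
]


def bot_free(audience):
    """Drop every profile flagged as non-human traffic."""
    return [p for p in audience if not p["is_bot"]]


clean = bot_free(profiles)
print(len(clean), "human profiles of", len(profiles))  # 2 human profiles of 4
```

The upside of filtering before activation, rather than after, is that flagged profiles never accrue ad spend or compliance cost in the first place.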
GDPR is a game changer for the online advertising industry, but it is not the death knell that many fear. With its emphasis on consumer transparency and consent, Lotame expects GDPR to produce higher quality data assets from more engaged consumers. We have embraced the regulations as an opportunity to review our business practices with fresh eyes and have encouraged our clients and partners to share in this approach.
Tiffany Morris is General Counsel & Vice President of Global Privacy at Lotame, overseeing all aspects of the company’s global legal strategy and operations. She brings more than 10 years of legal and business experience in the advertising, sports, and entertainment industries. Prior to joining Lotame, Tiffany held legal and business development roles at Vox Media, Inc., during a period of significant growth and acquisition activity. She previously served as Legal Counsel for The Kraft Group, where she handled a full range of legal matters for a private portfolio of companies, including the New England Patriots, Gillette Stadium, and Patriot Place. Tiffany started her career with the consulting firm McKinsey & Company. Tiffany has a BA from Wellesley College and a JD from Boston College Law School.