Every year, poor data quality costs organizations an average of $12.9 million, according to Gartner. Companies are actively looking for ways to eliminate that waste, and the market has responded. Gartner also reports that by 2025, 90% of data quality technology buying decisions will focus on ease of use, automation, operational efficiency, and interoperability as the critical decision factors in mitigating the data quality problem.
Data accuracy is the lifeblood for digital publishers
Losing revenue due to inaccurate data is particularly painful for digital publishers and media companies, whose business models largely depend on their ability to leverage high-quality data to deliver outcomes for advertisers. Fresh, accurate data is the foundation for effective segmentation, targeting, and business intelligence. It’s essential for optimizing content, improving user experiences, and ultimately maximizing revenue.
Inaccurate, poor-quality, or outdated data can lead to missed opportunities and a loss of credibility among ad partners. These costly mistakes show up in multiple ways, including:
- Misaligned content strategies: Targeting the wrong audience can result in wasted resources and low engagement.
- Ineffective advertising campaigns: Poor data can lead to targeting errors, resulting in lower click-through rates and conversions.
- Poor user experience: Inaccurate data can lead to personalized recommendations that are irrelevant or, worse, drive users away from your content.
Data accuracy is directly tied to revenue
Inaccurate data can have a direct and significant impact on a media company’s bottom line. Decreased ad revenue from targeting errors can result in lower ad impressions and clicks. A poor user experience due to inaccurate data can lead to subscriber churn. Without an ability to harness fresh data, publishers may not be able to quickly capitalize on the news cycle by jumping on trends and creating new opportunities to serve advertisers. Publishers will also have challenges measuring the ROI of their campaigns and reporting results to advertisers.
Know the signs of data accuracy issues
Even publishers who think their data is accurate know that data quality can deteriorate over time. This is due to various factors, including human error, system failures, and changes in data sources. It’s well worth the time and effort to conduct periodic checks to make sure your data pipeline is running smoothly and your data is as accurate as possible.
Here are five signs to watch for:
1. Inconsistent or conflicting data
One of the most common signs of data accuracy problems is inconsistencies or conflicts between different data sources. For example, you may find discrepancies between data from your first-party systems, analytics tools, and ad platforms. These inconsistencies can make it difficult to get a clear and accurate picture of your audience, campaigns, and performance.
2. Missing or incomplete data
Another red flag is missing or incomplete data. This can occur due to data collection errors, system failures, or changes in data ingestion methods.
3. Outdated data
Data can become outdated over time, particularly in rapidly changing industries like media. Using outdated data can lead to inaccurate insights, ineffective targeting, and wasted resources.
4. Data quality issues
These issues can arise due to errors such as gaps, inconsistencies, problems with validity, latency, or a lack of data normalization across systems.
5. Lack of data governance & reliability
Without proper policies and procedures in place to manage data, it can become fragmented, inconsistent, and unreliable. In addition, media companies may be working with dozens of first-party and third-party systems that organize data differently. A single source of truth is essential to truly optimize campaigns.
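Several of the signs above lend themselves to simple automated checks. The sketch below, in plain Python with only the standard library, flags incomplete records, stale records, and conflicting totals between two sources. The field names, freshness threshold, and drift tolerance are all hypothetical placeholders; a real implementation would use your own schema and business rules.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical schema and thresholds -- adjust to your own data.
REQUIRED_FIELDS = ("user_id", "segment", "updated_at")
FRESHNESS_LIMIT = timedelta(days=30)

def check_records(records, now=None):
    """Return human-readable warnings for missing or stale records."""
    now = now or datetime.now(timezone.utc)
    warnings = []
    for i, rec in enumerate(records):
        # Sign 2: missing or incomplete data
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            warnings.append(f"record {i}: missing fields {missing}")
            continue
        # Sign 3: outdated data
        if now - rec["updated_at"] > FRESHNESS_LIMIT:
            warnings.append(f"record {i}: stale, last updated {rec['updated_at']:%Y-%m-%d}")
    return warnings

def check_consistency(source_a_count, source_b_count, tolerance=0.05):
    """Sign 1: flag conflicting totals between two systems
    (e.g. your analytics tool vs. your ad platform)."""
    biggest = max(source_a_count, source_b_count)
    if biggest == 0:
        return None
    drift = abs(source_a_count - source_b_count) / biggest
    if drift > tolerance:
        return f"count drift {drift:.1%} exceeds {tolerance:.0%} tolerance"
    return None
```

Checks like these can run on a schedule against each data source, turning the periodic audits described above into a routine, automated safety net.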
Limited engineering resources exacerbate data quality problems
Despite the critical importance of data accuracy, some media companies simply don’t have the engineering resources to ensure their data is consistently accurate. With millions of pieces of data constantly arriving from dozens of sources, managing that data at scale all day, every day becomes a monumental, ongoing challenge.
Traditional data engineering approaches often fall short in meeting the specific needs of publishers. These challenges typically fall into some combination of these categories:
- Data Silos: Data is often pulled directly from content platforms and dumped into data warehouses without proper structuring or enrichment, making it difficult to use immediately.
- Manual Reporting: Some publishers still rely on manual methods like exporting data to Google Sheets for reporting, which is time-consuming and error-prone.
- First-Party Data Integration: Integrating all first-party data securely and efficiently into the data pipeline is painful.
- Prioritization: Revenue teams often struggle to get their data engineering needs prioritized. In many cases, data engineers simply don’t want to perform the mundane tasks associated with maintaining and updating APIs. Revenue teams are then forced to create DIY workarounds that are time-consuming, frustrating, and less effective than they hoped.
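To make the manual-reporting pain point above concrete, the fragment below sketches the shape of an automated alternative: fetch, normalize to one shared schema, and serialize for a warehouse load. The `fetch_daily_revenue` function and its fields are purely hypothetical stand-ins for whatever source APIs a publisher actually uses.

```python
import csv
import io

def fetch_daily_revenue(date):
    """Hypothetical stand-in for a source API call (ad server, analytics, etc.)."""
    # A real pipeline would call the vendor's reporting API here.
    return [{"date": date, "site": "Example.com", "revenue_usd": "1234.559"}]

def normalize(rows):
    """Apply one shared schema so reports agree across sources."""
    return [
        {
            "date": r["date"],
            "site": r["site"].lower(),
            "revenue_usd": round(float(r["revenue_usd"]), 2),
        }
        for r in rows
    ]

def to_csv(rows):
    """Serialize to CSV for a warehouse load, replacing manual spreadsheet exports."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["date", "site", "revenue_usd"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Even a small job like this, run on a schedule, removes the copy-paste step where human error typically creeps in.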
Selecting a data operations platform
The current demand for skilled data engineers far exceeds supply, making it difficult to allocate sufficient time and expertise to data quality initiatives. Media companies can drastically reduce the need for extensive engineering expertise by leveraging no-code or low-code data engineering tools that automate data pipelines and workflows.
Here are some qualities to look for and questions to ask when selecting a data operations platform to overcome your data quality issues:
- Automation and Speed: Look for platforms that offer automated data pipelines and ETL processes that can streamline data management and reduce manual effort.
- Scalability: Make sure the platform can handle large volumes of data and scale as your needs grow.
- High-Quality Data: Find out how the platform ensures reliable and trustworthy data. Does it consolidate data from various sources (e.g., first-party data, agency data, and data from different channels)?
- Domain Expertise: Is the platform backed by a team of experts in digital publishing who can ensure that data is handled and transformed correctly for optimal analysis?
- Customization: Can the platform be customized to meet your specific requirements and business logic?
- Comprehensive Measurement: Are your essential metrics built in for effective campaign evaluation?
- Privacy and Security: What kinds of security measures have been taken to protect your sensitive data and ensure adherence to data privacy rules and regulations?
Publishers and media companies simply cannot ignore the impact that inaccurate data has on their bottom line. By investing in data operations platforms, quality control processes, and governance, you can unlock the full potential of your data and drive sustainable growth for your business.
About the author
Manny Balbin, a seasoned veteran with over 15 years in digital media and advertising, currently shapes vision and strategy for BI products at Switchboard Software. Switchboard’s data engineering automation platform aggregates disparate data at scale in real-time for better business decisions. Prior to Switchboard, Manny led Product, Ad Technology, and Revenue Operations at Freestar, PMC, and Quantcast.