
Humans vs. Bots: It pays to find out who is really looking at your content

August 31, 2022 | By Kevin Rehberg, Vice President, Client Development – Alliance for Audited Media @AuditedMedia

Digital publishers work hard to create quality content that engages audiences and keeps them coming back for more. When an article or video receives a surge in views, that means mission accomplished, right? Not so fast.

Since invalid traffic (IVT) makes up nearly 40% of all website traffic, it’s likely that some of these viewers are non-human, or “bots.” Unfortunately, when bot traffic is included in analytics reports, it can skew the data and give an inaccurate picture of content performance.

Here are steps publishers can take to get more accurate website data, which in turn supports more informed editorial and marketing decisions.

Identify traffic sources and spikes

To understand how content and promotions are truly performing, it’s important to identify and remove invalid traffic from data reports. A first step is to determine where traffic comes from. Visits that arrive from an unusual website or an unfamiliar referral source can be a sign that the traffic is not human. The same applies to geography: traffic from locations far beyond the typical readership area, or a slew of visitors from a single city that happens to host a data center, deserves a closer look.

Another indicator is the time of day that visitors view content. If many arrive on the site at an unusual hour, such as the middle of the night, the traffic could be robotic. Other non-human behaviors, such as spending zero seconds on a page or a 100% bounce rate, are also signs that the traffic might not be human.
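
To make these checks concrete, here is a minimal sketch in Python (using pandas) that flags sessions matching the indicators above. The file name, column names, referrer list, and thresholds are all assumptions about a hypothetical session-level export, not part of any specific analytics product; adapt them to whatever your platform provides.

```python
# A minimal sketch of flagging sessions that match the bot indicators above.
# Column names, the referrer allowlist, and thresholds are assumptions about a
# hypothetical session-level export; adapt them to your own analytics data.
import pandas as pd

sessions = pd.read_csv("sessions_export.csv", parse_dates=["visit_start"])

KNOWN_REFERRERS = {"google.com", "facebook.com", "twitter.com", "newsletter"}
READERSHIP_COUNTRIES = {"US", "CA"}

flags = pd.DataFrame(index=sessions.index)
flags["odd_referrer"] = ~sessions["referrer_domain"].isin(KNOWN_REFERRERS)
flags["outside_region"] = ~sessions["country"].isin(READERSHIP_COUNTRIES)
flags["odd_hour"] = sessions["visit_start"].dt.hour.between(2, 5)  # middle of the night
flags["zero_time"] = sessions["time_on_page_sec"] == 0
flags["bounced"] = sessions["bounce"].astype(bool)

# Sessions that trip several indicators at once are the most likely to be bots.
sessions["bot_score"] = flags.sum(axis=1)
suspects = sessions[sessions["bot_score"] >= 3]
print(f"{len(suspects)} of {len(sessions)} sessions look suspicious")
```

Any single indicator is weak on its own (real readers bounce too), so the sketch only surfaces sessions that trip several at once.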

Analyzing traffic spikes might also uncover increases in legitimate visitors, which offers insight into why content performed well. For example, an increase in an article’s views could indicate that it was picked up by another news source, shared on a popular website, or received greater attention on social media. Identifying traffic sources and patterns reveals which articles are truly performing well, and that, of course, is likely to influence future content decisions.
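
As a rough illustration of spike analysis, the sketch below compares each article’s daily views to its own trailing average and flags days that jump well above it. The file layout, the 14-day window, and the 3x threshold are assumptions made for this example, not recommended settings.

```python
# A rough sketch of spotting traffic spikes worth investigating, whether they
# come from syndication, social sharing, or bots. The input layout and the
# thresholds are assumptions made for this example.
import pandas as pd

daily = pd.read_csv("article_daily_views.csv", parse_dates=["date"])
daily = daily.sort_values(["article_id", "date"])

# Baseline: trailing 14-day average of views for each article.
daily["baseline"] = (
    daily.groupby("article_id")["views"]
         .transform(lambda s: s.rolling(14, min_periods=7).mean())
)

# Flag days where views exceed three times the article's recent baseline.
spikes = daily[daily["views"] > 3 * daily["baseline"]]
print(spikes[["article_id", "date", "views", "baseline"]])
```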

Filter bot traffic

Not all bot traffic is malicious. Search engines send bots to crawl websites to learn more about them, which helps inform search results. However, bots also visit websites for harmful reasons, including scraping content to create fake sites, inserting malware to steal user data, and injecting spam. Publishers can also unintentionally invite bots to their website by purchasing traffic.

Many analytics platforms have built-in tools to filter bot traffic. Because these tools often must be activated by a site administrator, it’s worth confirming that the filter is actually being applied to analytics reports.
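
Publishers who also keep raw server logs can reproduce the same idea, screening out self-declared crawlers by user-agent string, with a few lines of code. This is only a simplified sketch; the substrings below are a tiny sample, not a complete known-bots list such as the IAB/ABC International Spiders & Bots List.

```python
# A simplified sketch of known-bot screening on raw access-log lines by
# user-agent substring. The signature list here is only illustrative; real
# known-bot lists are far longer and updated regularly.
BOT_SIGNATURES = ("googlebot", "bingbot", "ahrefsbot", "semrushbot", "crawler", "spider")

def contains_known_bot(log_line: str) -> bool:
    # Combined-format access logs include the user-agent string, so a simple
    # substring check on the whole line is enough for this sketch.
    line = log_line.lower()
    return any(sig in line for sig in BOT_SIGNATURES)

with open("access.log") as log:
    lines = log.readlines()

human_lines = [line for line in lines if not contains_known_bot(line)]
print(f"Kept {len(human_lines)} of {len(lines)} lines after known-bot screening")
```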

While automatic known-bot filtering tools are a great start toward a more accurate look at website data, they might not detect all bots. Publishers should study the traffic patterns of both good and bad user behavior to spot unknown bots that the analytics platform misses, then create custom filters based on the characteristics of the bots they find. Those characteristics may include the traffic’s location, screen resolution, service provider, time on page, and bounce rate.
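
For illustration, a custom filter built from characteristics like these might look something like the sketch below. The ISP names, screen resolutions, and rules are hypothetical; real rules should come from patterns observed in your own traffic.

```python
# An illustrative custom filter keyed to characteristics like those above
# (hosting provider, screen resolution, time on page, bounce). The lists and
# rules are hypothetical examples, not a vetted bot signature set.
import pandas as pd

DATACENTER_ISPS = {"amazon technologies", "google cloud", "digitalocean", "ovh"}
HEADLESS_RESOLUTIONS = {"800x600", "1024x768"}  # common defaults for headless browsers

def is_likely_bot(row: pd.Series) -> bool:
    return (
        str(row["isp"]).lower() in DATACENTER_ISPS
        or row["screen_resolution"] in HEADLESS_RESOLUTIONS
        or (row["time_on_page_sec"] == 0 and bool(row["bounce"]))
    )

sessions = pd.read_csv("sessions_export.csv")
clean = sessions[~sessions.apply(is_likely_bot, axis=1)]
print(f"Removed {len(sessions) - len(clean)} likely bot sessions")
```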

Identifying traffic patterns and creating custom filters to remove bots from data ensures that publishers are getting the most accurate look possible at their website visitors.

Check site tags

Analytics providers require that publishers install a tag on their website so they can collect data. If a tag is inadvertently installed more than once, analytics data will be inflated; if it is installed incorrectly, it might produce incomplete data. Publishers should check their tag containers to ensure there is only one tag per tracking suite so that data isn’t counted twice in reports.
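
One quick, informal way to spot duplicate tags is to fetch a page and count how many times each tag script appears, as in the sketch below. The URL and the tag patterns are placeholders; a tag-management console or a browser extension will give a more thorough view.

```python
# A quick sketch for spotting duplicate analytics tags on a page. The URL is a
# placeholder and the patterns cover only a few common tag scripts; extend the
# list for your own stack.
import re
import requests

TAG_PATTERNS = {
    "Google tag (gtag.js)": r"googletagmanager\.com/gtag/js",
    "Google Tag Manager": r"googletagmanager\.com/gtm\.js",
    "Adobe Analytics": r"assets\.adobedtm\.com",
}

html = requests.get("https://www.example.com/article", timeout=10).text

for name, pattern in TAG_PATTERNS.items():
    count = len(re.findall(pattern, html))
    if count > 1:
        print(f"WARNING: {name} appears {count} times; pageviews may be double-counted")
    elif count == 1:
        print(f"OK: {name} found once")
```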

Get a third-party evaluation

Sometimes it helps to get input from a third party that specializes in analyzing website traffic. A digital publisher audit is a great way to get an expert’s opinion on site traffic, tagging, and custom filters. Website auditors keep up with industry trends and platform changes, which saves publishers the time and resources needed to identify new bots or create custom filters.

By taking the steps above, publishers can better identify high-performing content, discover promotions that drive real traffic, and make more informed business decisions. They can also take steps to reduce bot traffic and avoid the common missteps that occur when it goes undetected.
