Research shows that ad delivery optimization can skew the exposed audience by gender or race. Specifically, prior work found that the algorithms Facebook uses to optimize delivery to a target audience discriminated in their delivery of job advertisements. New research, Auditing for Discrimination in Algorithms Delivering Job Ads, expands on this work to present an auditing methodology that distinguishes gender skew caused by differences in audience qualifications from skew introduced by the platform itself.
The auditing methodology analyzes the delivery skew produced by algorithmic optimization. Further, it detects whether the skew is due to ad-targeting qualifications or to the platform's optimization learning process. Importantly, it offers insight into social platforms' black-box algorithmic systems.
Proprietary ad platforms, algorithms, and data make Facebook and LinkedIn difficult to audit. To overcome this, the researchers created an external auditing process. Facebook's and LinkedIn's custom audience features allow advertisers to build target audiences on the platforms. Because the advertiser knows who is in a custom audience, this feature makes it possible to infer the gender of ad recipients even on platforms that do not provide post-delivery statistics.
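The idea can be sketched in a few lines: build separate custom audiences from records of known gender, then read the per-audience delivery counts that ad platforms report back to advertisers. The audience labels and counts below are illustrative assumptions, not data or code from the paper.

```python
# Sketch of inferring the delivered audience's gender breakdown without
# platform-provided demographics. Assumption: two custom audiences were
# built from records of known gender (e.g., public voter files), and the
# platform reports impressions per custom audience after delivery.

def delivered_female_fraction(delivery_by_audience):
    """Return the female share of the delivered audience.

    delivery_by_audience: dict mapping an audience label to the number of
    impressions the platform reports delivering to that custom audience.
    """
    female = delivery_by_audience["known_female"]
    male = delivery_by_audience["known_male"]
    total = female + male
    return female / total if total else 0.0

# Illustrative delivery report for one ad campaign:
report = {"known_female": 3200, "known_male": 6800}
print(delivered_female_fraction(report))  # 0.32
```

Partitioning the uploaded audience by a known attribute is what turns an opaque delivery report into a measurable demographic breakdown.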
The authors, Basileal Imana, Aleksandra Korolova, and John Heidemann, registered as advertisers on both Facebook and LinkedIn. They ran ads for real employment opportunities on both platforms and audited the results.
To test for bias in the algorithmic choices of the platforms, a set of paired ads is run and their audience deliveries compared. The ads in each pair must have similar audience requirements, so that every audience member is equally qualified (or unqualified) for both ads under a consistent assessment.
The paired ads must also map to a genuine bias in the real world. Comparing the delivered audience against an actual real-world skew offers an important point of reference. Real-life bias and non-bias factors constantly inform the algorithms through a platform's continuous learning process. Therefore, if there is a significant skew in the platform's audience delivery despite identical targeting, it likely stems from the optimization process. In other words, the system overrides the requested audience requirements to supply its preferred optimized audience.
The researchers set up three tests to compare audience delivery, using pairs of ads for delivery drivers, sales associates, and software engineers. The ads ran on both LinkedIn and Facebook. Campaign goals were identical across each pair: conversion (clicks) to maximize the number of job applicants, and reach to increase audience exposure.
The first test included ads for delivery drivers for Domino's and Instacart, both with identical job requirements. Note that, in practice, Domino's drivers are predominantly male while Instacart's are predominantly female.
The test results show evidence of a statistically significant gender skew in audience delivery on Facebook but not on LinkedIn. Facebook's audience delivery is in line with the actual male skew of Domino's drivers, even though the campaign requested a gender-balanced audience: Facebook's algorithmic optimization, trained on real-life data, adjusted the campaign's audience delivery.
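One common way to test whether two paired ads delivered significantly different gender mixes is a two-proportion z-test on the male share of each delivered audience; the paper's exact statistical procedure may differ, and the counts below are illustrative, not the study's data.

```python
import math

def two_proportion_z(male_a, total_a, male_b, total_b):
    """Z statistic for H0: both ads delivered the same male fraction.

    A large |z| (e.g., above 1.96 for a 5% two-sided test) indicates a
    statistically significant delivery skew between the paired ads.
    """
    p_a = male_a / total_a
    p_b = male_b / total_b
    pooled = (male_a + male_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Illustrative counts: a Domino's-style ad vs an Instacart-style ad, each
# requesting delivery to the same gender-balanced custom audience.
z = two_proportion_z(620, 1000, 480, 1000)
print(abs(z) > 1.96)  # True -> significant skew between the paired ads
```

Because both ads target equally qualified audience members, a significant difference here cannot be explained by qualifications alone, which is the core logic of the audit.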
This unique research offers a new auditing methodology to detect biased audience delivery on social platforms. It offers insight into how ad platform algorithms adjust for platform objectives and override the advertiser's requirements.
Further, testing on Facebook and LinkedIn demonstrates that the methodology is applicable to multiple platforms and that not all social platforms produce biased results. Importantly, this research offers an auditing solution to protect marketers from unwanted biases baked into social platforms' black-box algorithmic systems.