
    It’s ‘Skewed’ Up: Facebook’s Algorithm Shows Job Ads Based on User Gender

    Multiple studies have shown that ad delivery on employment-oriented social media platforms is sometimes skewed by gender or race due to hidden algorithmic optimisation by the platforms, even when the employer’s aim is to reach a demographically balanced audience. A recent study by the University of Southern California attests to the same trend on Facebook, which not only sustains the gaping gender imbalance in the workplace but may also run afoul of U.S. anti-discrimination law.

    The study found that job ads for roles such as delivery driver at Domino’s Pizza or software engineer at NVIDIA were disproportionately shown to men, while ads for equivalent positions at Instacart and Netflix were disproportionately shown to women. Separate research conducted by UK-based online hiring platform Totaljobs found that, across the 77,000 job adverts analysed, 478,175 words were biased.

    The researchers distinguished skew in ad delivery owing to protected categories such as gender or race from skew due to differences in qualifications. The study notes that this distinction matters in U.S. law, where ads may be targeted on the basis of qualifications, but not of protected categories. The findings suggest that Facebook is more likely to show a job ad to users whose gender matches the prevailing demographics of the advertised industry or role.
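
    As an illustration of the underlying idea (a hedged sketch, not the researchers’ actual methodology or code, and the audience counts below are invented), a standard two-proportion z-test in Python shows how one might ask whether two identically targeted job ads reached significantly different shares of women:

        from math import sqrt

        def delivery_skew_test(women_a, total_a, women_b, total_b):
            """Two-proportion z-test: did two identically targeted job ads
            reach significantly different shares of women?"""
            p_a = women_a / total_a
            p_b = women_b / total_b
            pooled = (women_a + women_b) / (total_a + total_b)
            se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
            return p_a, p_b, (p_a - p_b) / se

        # Hypothetical counts for a paired audit: ad A and ad B were run
        # with the same targeting and budget, for jobs with the same
        # qualification requirements, so qualifications cannot explain a
        # difference in who the ads were delivered to.
        p_a, p_b, z = delivery_skew_test(4300, 10000, 2100, 10000)
        print(f"ad A reached {p_a:.0%} women, ad B {p_b:.0%} (z = {z:.1f})")
        # |z| > 1.96 means the gap is statistically significant at the
        # 5% level, i.e. unlikely to be random noise in delivery.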

    The report reads, “We confirm that Facebook’s ad delivery can result in skew of job ad delivery by gender beyond what can be legally justified by possible differences in qualifications, thus strengthening the previously raised arguments that Facebook’s ad delivery algorithms may be in violation of anti-discrimination laws. We do not find such skew on LinkedIn.”

    In response to the finding, a Facebook spokesperson told The Wall Street Journal, “We’ve taken meaningful steps to address issues of discrimination in ads and have teams working on ads’ fairness today.” 

    So how exactly are these ads skewed? In one of three paired examples, all with similar results, the study found that Facebook showed an Instacart delivery-job advertisement to a female-heavy audience, while the equivalent Domino’s Pizza delivery-job ad drew a male-heavy viewership. LinkedIn, by contrast, showed the Domino’s delivery-job ad to roughly the same proportion of women as the Instacart one. It should be noted that most Instacart drivers are women, while at Domino’s they are mostly men. The researchers detected the same disparity in postings for high-skill jobs: Facebook showed women an ad for a technical job at Netflix Inc., which has a comparatively high share of female employees for the tech industry, more often than an ad for a job at Nvidia Corp., a graphics-chip maker with a higher proportion of male employees (according to federal employment reports).

    Under U.S. federal law, discrimination based on gender, race, age and other factors is prohibited in housing, employment and credit ads. Its application to behavioral advertising, the practice of displaying ads and personalized marketing messages based on users’ web-browsing behavior, remains debatable, even as the federal government argues that ads must be distributed in ways that do not disadvantage protected classes of people in their ability to see them.

    The research thus suggests that Facebook’s “algorithm learns and perpetuates the existing difference in employee demographics,” even if an employer wishes to target a demographically balanced audience. It also indicates that Facebook’s understanding and management of the societal consequences of its content-recommendation systems needs reworking. Tech giants usually employ teams dedicated to analyzing data for biases and devising ways to eliminate them from their algorithms.
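
    To see how such a feedback loop can emerge without any explicit gender targeting, consider a deliberately simplified toy model (all numbers are invented, and nothing here reflects Facebook’s actual systems): a delivery policy that maximizes estimated click-through rate will concentrate impressions on whichever group clicks marginally more, turning a small behavioral gap into a lopsided delivery split:

        import random

        random.seed(0)

        # Toy model only: invented click-through rates with a small gap,
        # perhaps because the current workforce (and hence the ad's
        # engagement signal) already skews one way.
        TRUE_CTR = {"men": 0.030, "women": 0.025}

        clicks = {"men": 0, "women": 0}
        impressions = {"men": 0, "women": 0}

        def estimated_ctr(group):
            return clicks[group] / impressions[group] if impressions[group] else 1.0

        for _ in range(200_000):
            if random.random() < 0.05:                 # explore 5% of the time
                group = random.choice(["men", "women"])
            else:                                      # otherwise exploit the
                group = max(TRUE_CTR, key=estimated_ctr)  # best-looking group
            impressions[group] += 1
            clicks[group] += random.random() < TRUE_CTR[group]

        total = sum(impressions.values())
        for group in impressions:
            print(f"{group}: {impressions[group] / total:.0%} of impressions")
        # A half-percentage-point difference in click rates typically ends
        # up as an overwhelming majority of impressions for one group.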

    If we dig deeper into the real-world effects of this phenomenon, some telling aspects emerge. For starters, gender-skewed job ads restrict opportunities for both male and female jobseekers. Beyond algorithmic bias, job advertisements can also discriminate through the very language they use. A study published in 2011 by researchers from the University of Waterloo and Duke University presented evidence that gendered words in job ads perpetuate gender inequality in the workplace.

    The study claimed that masculine words (such as “confident,” “lead,” “outspoken,” and “decisive”) used in job descriptions deterred women from applying, regardless of whether the job was stereotypically male, female, or gender-neutral. For example, when the description for a nurse practitioner position included masculine-gendered words, participants perceived the role as being for male candidates. Masculine-worded descriptions appear to signal subliminally to women that they are not right for the role and thus should not apply, regardless of their qualifications.

    Another key finding is that gendered wording had a negative impact only on women. When a job description with feminine wording was shown to male participants, it had no impact on their sense of belonging in the role or on the general appeal of the job.
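
    Tools that flag such wording typically scan a job description against lexicons of masculine- and feminine-coded terms. The sketch below is a minimal illustration of the approach; the word lists are tiny, hand-picked samples echoing the study’s examples, not a validated lexicon:

        import re

        # Illustrative (not exhaustive) word lists; real gender-decoding
        # tools use much longer, validated lexicons.
        MASCULINE = {"confident", "lead", "outspoken", "decisive",
                     "competitive", "dominant", "ambitious"}
        FEMININE = {"supportive", "collaborative", "nurturing",
                    "interpersonal", "understanding", "committed"}

        def gendered_word_counts(job_ad):
            """Count masculine- and feminine-coded words in a job ad."""
            words = re.findall(r"[a-z]+", job_ad.lower())
            return {
                "masculine": sum(w in MASCULINE for w in words),
                "feminine": sum(w in FEMININE for w in words),
            }

        ad = "We want a confident, decisive self-starter to lead a competitive team."
        print(gendered_word_counts(ad))  # {'masculine': 4, 'feminine': 0}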

    Although the following statistics are not a direct result of gender-skewed job ad delivery on social media, they are part of the larger ecosystem that still prevents gender parity in the professional world. According to the World Economic Forum’s (WEF) Global Gender Gap Report (2021), the average distance completed to gender parity stands at 68%, a step back of 0.6 percentage points compared with 2020. The WEF estimates that it will take 135.6 years to close the gender gap worldwide.

    The University of Southern California study recommends that, since platforms do not consistently police their own algorithms for adverse societal consequences, perhaps because their business objectives are at stake, independent third-party scrutiny could take on the task.

    However, a permanent solution to the challenge has so far not been found, according to Piotr Sapiezynski, a computer-science researcher at Northeastern University who worked with the USC team to study racial disparities in job ad delivery. Sapiezynski told The Wall Street Journal, “Until we figure out how to do this right, the short-term solution is to turn off relevance matching for housing, credit and employment ads.”
