Social media: the everyday sexism of advertising algorithms


Social media advertising algorithms can create paradoxical situations, where messages aimed at women are mostly displayed to men. These are the findings of successive research projects carried out by Grazia Cecere at the Institut Mines-Télécom Business School, in partnership with EPITECH, the University of Paris-Saclay and the MIT Sloan School of Management. The team has shed light on some of the mechanisms by which these algorithms maintain, or even amplify, gender imbalances.

 

Advertising algorithms prefer men. At least, those of social networks such as Facebook, Snapchat, Twitter, and LinkedIn do. This is the conclusion of several successive research projects by Grazia Cecere, a privacy economist at the Institut Mines-Télécom Business School, who has been working on algorithmic bias for several years. Her research provides insights into the mystery of the advertising algorithms used by the major social platforms. “These algorithms decide and define the information seen by the users of social networks, who are mostly young people,” she stresses.

Through collaborative work with researchers from EPITECH (Clara Jean) and the University of Paris-Saclay (Fabrice Le Guel and Matthieu Manant), Grazia Cecere looked at how an advertiser’s message is processed and distributed by Facebook’s algorithms. The team launched two sponsored advertisements aimed at recruiting engineering school students. The advertisements used the same image, the same price per impression, and the same target population: high school students between 16 and 19 years old, with no gender specified. Both advertisements were therefore aimed at teenagers and young students.

There was one difference in the text of the advertisements, both of which promoted the starting salaries of engineering graduates and their rate of entry into the working world. The first ad read: “€41,400 gross annual salary on average.” The second: “€41,400 gross annual salary on average for women.” The researchers’ question was: how would the algorithm distribute these two ads between men and women?
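To make the design concrete, here is a minimal sketch in Python of the two campaign configurations. The field names and values are purely illustrative, not the actual Facebook Ads API schema: the point is simply that the variants share every delivery parameter and differ only in their text.

```python
# Minimal sketch of the experimental design (illustrative field names only,
# not the real Facebook Ads API schema): two campaigns identical in every
# delivery parameter, differing only in the ad text.

BASE_CAMPAIGN = {
    "image": "engineering_school.jpg",   # same creative for both ads
    "bid_per_impression_eur": 0.05,      # hypothetical, same price per appearance
    "target_ages": (16, 19),             # high school students aged 16 to 19
    "target_gender": None,               # no gender specified to the platform
}

AD_VARIANTS = {
    "neutral":   {**BASE_CAMPAIGN,
                  "text": "€41,400 gross annual salary on average."},
    "for_women": {**BASE_CAMPAIGN,
                  "text": "€41,400 gross annual salary on average for women."},
}

# Sanity check: the only field that differs between the two variants is the text.
differing = {k for k in AD_VARIANTS["neutral"]
             if AD_VARIANTS["neutral"][k] != AD_VARIANTS["for_women"][k]}
assert differing == {"text"}, differing
```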

The results: first, the advertisement with a message aimed at women received fewer views overall, regardless of the target, and it was shown predominantly to young men. The specification “for women” in the advertising text was not enough to steer the algorithm towards targeting high school girls more than high school boys. However, the researchers note in their publication that the algorithm appeared to treat targets aged 16 to 17 (minors) differently from targets aged 18 to 19 (adults). In the advertisement “for women”, the algorithm slightly favored adult high school girls, whereas minor high school girls were less likely to see it.

“This indicates that the algorithm uses different decision processes for younger and older targets,” say Grazia Cecere and her colleagues. “This is consistent with the strict legislation, such as the GDPR and COPPA, surrounding minors’ use of digital technology in Europe and the United States.” While adult high school girls were more likely to see the advertisement than their younger peers, it is important to remember that they were still targeted less often than their male counterparts. The difference in how the algorithm treats minors and adults does not correct the gender bias in the advertising.
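The kind of comparison described above can be summarised in a few lines of Python. The sketch below uses pandas and entirely hypothetical impression counts (not the study’s data) to show how the share of impressions delivered to women can be compared across age bands.

```python
import pandas as pd

# Hypothetical delivery report for the "for women" ad: one row per
# (age band, gender), with placeholder impression counts.
report = pd.DataFrame({
    "age_band":    ["16-17", "16-17", "18-19", "18-19"],
    "gender":      ["female", "male", "female", "male"],
    "impressions": [1200, 2100, 1600, 2000],
})

# Share of impressions delivered to women within each age band. A gap between
# the 16-17 and 18-19 bands is the kind of signal consistent with the
# algorithm treating minors and adults differently.
pivot = report.pivot_table(index="age_band", columns="gender",
                           values="impressions", aggfunc="sum")
female_share = pivot["female"] / pivot.sum(axis=1)
print(female_share)
```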

Another observation: the neutral advertisement – the one that did not specify “for women” – was more widely disseminated than the advertisement targeted at women, and here again it was shown mainly to men. This can be explained both by the length of the advertising text and by its gendered orientation. Generally speaking, women were relatively more likely to be shown this type of content when the advertisement was not explicitly aimed at them. Moreover, the word “women” in the text also led the algorithm to introduce an additional criterion, reducing the pool of targets – but clearly without favoring high school girls either.

Nevertheless, after several campaigns designed to understand how these two ads were targeted, the researchers showed that the algorithm was capable of adapting its targeting to the gender-specific text of the ad, which nonetheless reveals a market bias: targeting adult women costs advertisers more.
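That market bias can be expressed as a simple effective-cost comparison. The figures below are placeholders, not the study’s results; the sketch only shows how the price of reaching each group would be measured.

```python
# Hypothetical spend and delivery for two explicitly targeted segments
# (placeholder numbers). eCPM = effective cost per 1,000 impressions.
spend_eur   = {"women_18_19": 50.0, "men_18_19": 50.0}
impressions = {"women_18_19": 8_000, "men_18_19": 11_000}

for group in spend_eur:
    ecpm = 1000 * spend_eur[group] / impressions[group]
    print(f"{group}: eCPM = €{ecpm:.2f}")
# A higher eCPM for the women's segment reflects the market bias described
# above: reaching adult women costs the advertiser more per impression.
```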

Complexity for advertisers

These results show the opacity of advertising algorithms and the paradoxical biases they entail. For engineering schools, diversity and parity are major recruitment challenges. Every year, schools invest effort and resources in campaigns specifically targeted at women to attract them into fields that remain heavily male-dominated, without realizing that algorithmic decision parameters exist that are very difficult to control.

Read on I’MTech: Restricting algorithms to limit their powers of discrimination

This type of research sheds light on the closely guarded mechanisms of advertising algorithms and helps identify good practices. However, Grazia Cecere reminds us that the biases generated by the algorithms are not necessarily intentional: “They are often the consequences of how the algorithm optimizes the costs and views of the ads.” And these optimization methods are not, at the outset, based on favoring men.

In 2019, research by Grazia Cecere, conducted with the same team and Catherine Tucker, a distinguished researcher at the MIT Sloan School of Management, showed the complexity of the link between optimization and algorithmic bias, through an example of Snapchat advertising campaigns. The content of the advertisements was identical: promoting an engineering school for recruitment purposes. Four similar advertising campaigns were launched with identical populations in all major cities in France. All other conditions remained the same, but a different photo was used for each campaign: a man photographed from behind wearing a T-shirt with a message aimed at men, a woman from behind wearing a T-shirt with a message aimed at women, and the same two photos cropped so that the person’s head was not visible.


To test the differences in the way algorithms process the images for men and women, the researchers published four photos on Snapchat.

 

During the advertising campaign, the full photo of the man was displayed most often, ahead of the man’s torso only, then the woman’s torso only, and finally the full photo of the woman. Behind these results lies an explanation of how the algorithm optimizes dissemination dynamically. “On the first day, the full photo of the man was the one that attracted the most visits by Parisians to the associated website,” says Grazia Cecere. “This then led us to demonstrate that the algorithm bases its choices on cities with large populations to optimize its targets. It tends to optimize an entire campaign on the initial results obtained in these areas, replicating them in all other areas.”
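The feedback loop Grazia Cecere describes can be illustrated with a toy simulation. This is not Snapchat’s actual algorithm, and the click-through rates below are invented; the sketch only shows how a day-one winner in the largest city can end up dominating the budget everywhere.

```python
# Toy simulation of the optimization loop described above (not the platform's
# real algorithm): the variant with the best day-one results in the largest
# city is pushed to all other cities.

# Hypothetical day-one click-through rates observed in Paris for each photo.
day_one_ctr_paris = {
    "man_full": 0.031, "man_torso": 0.028,
    "woman_torso": 0.026, "woman_full": 0.024,
}

# The optimizer picks the early winner...
winner = max(day_one_ctr_paris, key=day_one_ctr_paris.get)

# ...and allocates most of the remaining budget to it in every other city,
# regardless of how those audiences might actually have responded.
other_cities = ["Lyon", "Marseille", "Toulouse", "Bordeaux"]
allocation = {city: {variant: (0.7 if variant == winner else 0.1)
                     for variant in day_one_ctr_paris}
              for city in other_cities}

print("Day-one winner in Paris:", winner)
print("Budget shares in Lyon:", allocation["Lyon"])
```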

This case is typical of an indirect bias. “Maybe the Parisian users were more sensitive to this photo because there were more male students who identified with the ad in that city? Perhaps there are simply more male users in Paris? In any case, it is the behavior of Parisian users that oriented the algorithm towards this bias; it is not the algorithm that sought this result,” stresses the researcher. However, without knowledge of the algorithm’s mechanisms, it is difficult for advertisers to predict these behaviors. The research raises a question: is it acceptable, when trying to reach a balanced population – or even to target women preferentially in order to correct inequalities in professional fields – for the platforms’ algorithms to produce the opposite effect?

Interview by Benjamin Vignard, for I’MTech.

