2024-03-07 | Discrimination by AI: Women’s Lack of Trust in Algorithms
Algorithms have the potential to unfairly disadvantage or favour certain groups of people, often due to biased training data.
In two experiments published in a paper in the International Journal of Communication, Professor Sonja Utz and her Everyday Media lab investigated how people perceive bias in algorithms. "We wanted to find out whether women and men differ in how they evaluate an algorithm that discriminates against women," explains Sonja Utz.
One group of participants was asked to evaluate an algorithm that unfairly favoured men, while another group evaluated a gender-fair algorithm. Surprisingly, the results showed significant gender differences: in particular, women rated even the fair algorithm as more discriminatory than men did.
Women seem to expect algorithms to discriminate
"Women seem to assume that algorithms often discriminate against women. Whether this is because they read reports about discriminatory algorithms more carefully or because they experience discrimination more often, we cannot tell from this study," says Sonja Utz.
Although men also rated the algorithm that favoured men as discriminatory, this did not affect their willingness to accept its use in everyday decision-making.
Second study with an additional unfair algorithm
In a follow-up study, participants evaluated an additional algorithm that favoured married people, alongside the fair and unfair algorithms from the first experiment. The aim was to see whether participants were more concerned about perpetuating existing biases or about the unfairness itself.
The results showed that men found both unfair algorithms more discriminatory than the fair one, while women found the algorithm that favoured married people particularly discriminatory. According to Sonja Utz, "these results suggest that women see a greater danger in algorithms introducing new forms of discrimination into the world."
Link to the article:
https://ijoc.org/index.php/ijoc/article/view/20806