Is Twitter's Algorithm Racist?
The winners of an open competition to discover algorithmic bias in Twitter's photo-cropping algorithm have been announced. After research by Twitter users last year revealed that the algorithm favored white faces over Black faces, the firm disabled automatic photo-cropping in March. It subsequently issued an algorithmic bug bounty to investigate the issue further.
The challenge, which was held with the help of DEF CON's AI Village, backed up previous findings. Twitter's cropping algorithm favors faces that are "slim, young, of light or warm skin color and smooth skin texture, and with stereotypically feminine facial features," according to the winning entry. The second- and third-placed submissions found that the system was biased against people with white or grey hair, suggesting age discrimination, and that it favored English script over Arabic script in photographs.
Real-Life Effects of Algorithmic Bias
Presenting these results at DEF CON 29, Rumman Chowdhury, director of Twitter's META team (which studies Machine Learning Ethics, Transparency, and Accountability), praised the entrants for showing the real-life effects of algorithmic bias.
“When we think about biases in our models, it’s not just about the academic or the experimental […] but how that also works with the way we think in society,” said Chowdhury. “I use the phrase ‘life imitating art imitating life.’ We create these filters because we think that’s what beauty is, and that ends up training our models and driving these unrealistic notions of what it means to be attractive.”
Bogdan Kulynych, a graduate student at EPFL, a Swiss research university, won first place and the top prize of $3,500. Kulynych generated a vast number of realistic faces with the AI model StyleGAN2, varying them by skin tone, feminine versus masculine facial traits, and slimness. He then ran these variants through Twitter's photo-cropping algorithm to see which ones it preferred.
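The core of this method can be sketched generically: score candidate images with a saliency model and see which one a saliency-based cropper would keep. The harness below is a minimal illustration, not Twitter's actual system; `toy_saliency` is a hypothetical stand-in (simple contrast) for the real model, and the images are synthetic rather than StyleGAN2 outputs.

```python
import numpy as np

def toy_saliency(img: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a saliency model: scores each pixel
    by its contrast against the image's mean brightness."""
    gray = img.mean(axis=2)
    return np.abs(gray - gray.mean())

def preferred_half(top: np.ndarray, bottom: np.ndarray, saliency=toy_saliency) -> str:
    """Stack two same-sized images vertically and report which half
    holds the maximum-saliency point, i.e. which image a
    saliency-based cropper would keep in frame."""
    stacked = np.vstack([top, bottom])
    sal = saliency(stacked)
    y, _ = np.unravel_index(np.argmax(sal), sal.shape)
    return "top" if y < top.shape[0] else "bottom"

# Two synthetic 64x64 "images": one high-contrast, one flat grey.
rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, (64, 64, 3)).astype(float)
flat = np.full((64, 64, 3), 128.0)
print(preferred_half(noisy, flat))  # prints "top": the high-contrast patch wins
```

Kulynych's approach amounts to running many such pairwise (or ranked) comparisons across controlled variations of a face, so that any systematic preference of the saliency model surfaces as a pattern in which variants win.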
Pointing Out These Biases
These computational prejudices, as Kulynych points out in his summary, amplify societal biases by cutting out “those who do not satisfy the algorithm’s preferences of body weight, age, or skin colour.”
Biases like these are also more common than you might assume. Vincenzo di Cicco, another contestant who received special mention for his creative approach, demonstrated that the image-cropping algorithm favored emoji with lighter skin tones over emoji with darker skin tones. The third-place entry, by Roya Pakzad, founder of the tech advocacy group Taraaz, showed that the biases extend to text as well. Pakzad's research compared memes written in English and Arabic script, revealing that the algorithm frequently cropped the image to emphasize the English text.
Although the results of Twitter's bias competition are discouraging, confirming how pervasive societal bias is in algorithmic systems, they also demonstrate how tech companies can address these problems by opening their systems to external inspection. "The ability of people entering a competition like this to go deep into a certain form of damage or bias is something that corporate teams don't have," Chowdhury added.
A Contrast with Other Companies' Approaches
Twitter's stance stands in stark contrast to how other tech companies have responded to comparable findings. When researchers led by MIT's Joy Buolamwini discovered racial and gender biases in Amazon's facial recognition algorithms, the company launched a major campaign to discredit the academics, calling their findings "misleading" and "false." After months of wrangling over the results, Amazon finally backed down and placed a temporary halt on law enforcement's use of these algorithms.
Patrick Hall, a judge in Twitter's competition and an AI researcher specializing in algorithmic discrimination, emphasized that such biases exist in all AI systems and that organizations must work to identify them ahead of time.
“No matter how skilled you think your data science staff is, AI and machine learning are the Wild West,” Hall added. “Who is finding your bugs if you aren’t finding them, or if bug bounties aren’t finding them? Because you’re obviously infested.”