Twitter to investigate why its photo preview favors white faces over Black faces

Twitter said Sunday that it was looking into why the neural network it uses to select which part of an image to show in photo previews apparently chooses to show white people’s faces more frequently than Black people’s faces.

Twitter users noticed the issue over the weekend

Several users demonstrated the issue over the weekend by posting examples of images that contained both a Black person’s face and a white person’s face. In such cases, the preview frequently showed the white face.

The informal testing began after one user posted about a problem he noticed with Zoom’s face detection, which wasn’t showing the face of a Black colleague on calls. When he posted screenshots of the problem to Twitter, he noticed that Twitter’s preview, too, favored his white colleague’s face over his Black colleague’s face.

Another user discovered that the preview algorithm favored non-Black cartoon characters as well.

Twitter’s take on this situation

When Twitter first began using the neural network to automatically crop photo previews, its machine learning researchers explained in a blog post that they had started with face detection to crop images but found it lacking, mainly because not all images have faces:

Previously, we used face detection to focus the view on the most prominent face we could find. While this is not an unreasonable heuristic, the approach has obvious limitations since not all images contain faces. Additionally, our face detector often missed faces and sometimes mistakenly detected faces when there were none. If no faces were found, we would focus the view on the center of the image. This could lead to awkwardly cropped preview images.
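The heuristic described above (focus on the most prominent detected face, fall back to the image center) can be sketched in a few lines. This is a minimal illustration, not Twitter’s actual code; the box format, the "largest face wins" rule, and the function name are all assumptions:

```python
def choose_crop_center(image_w, image_h, face_boxes):
    """Pick the focal point for a preview crop.

    face_boxes: list of (x, y, w, h) rectangles from a face detector.
    Returns the (cx, cy) point the crop should center on: the middle
    of the largest detected face, or the image center if none found.
    """
    if face_boxes:
        # "Most prominent" is approximated here as largest area.
        x, y, w, h = max(face_boxes, key=lambda b: b[2] * b[3])
        return (x + w // 2, y + h // 2)
    # Fallback described in the blog post: center of the image.
    return (image_w // 2, image_h // 2)


# A tall image with one face near the top crops around that face:
print(choose_crop_center(600, 1200, [(250, 100, 100, 120)]))  # (300, 160)
# An image with no detected faces crops around its center:
print(choose_crop_center(600, 1200, []))  # (300, 600)
```

As the quote notes, both failure modes of this approach (missed faces and false detections) lead to awkward crops, which is what motivated the switch to a saliency-based neural network.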

The public tests got the company’s attention. Twitter’s chief design officer Dantley Davis tweeted that the company was investigating the neural network, and he conducted some experiments with images himself.

Liz Kelley of the Twitter communications team and the company’s chief technology officer, Parag Agrawal, also tweeted about the issue on Sunday.

Twitter’s promise to investigate is encouraging, but users should take these informal analyses with a grain of salt. It’s problematic to infer bias from a couple of examples; to properly assess bias, researchers need a large sample size with many examples under a wide range of circumstances.
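To make the sample-size point concrete, here is a hedged sketch of how a more rigorous test might be scored: run many paired-face trials, count how often each face is shown, and check whether the split could plausibly be chance. The trial labels, function name, and the 50/50 null hypothesis are assumptions for illustration, not Twitter’s methodology:

```python
import math

def preference_rate(outcomes):
    """Score a batch of paired-image cropping trials.

    outcomes: list of 'white' / 'black' labels, one per trial,
    recording which face the preview showed.
    Returns (rate, z): the fraction of trials showing the white face,
    and a normal-approximation z-score against a 50/50 null
    hypothesis of no preference.
    """
    n = len(outcomes)
    rate = outcomes.count("white") / n
    # Standard error of a proportion under p = 0.5 is sqrt(0.25 / n).
    z = (rate - 0.5) / math.sqrt(0.25 / n)
    return rate, z


# Three trials all favoring one face prove little: z stays small.
print(preference_rate(["white", "white", "white"]))
# The same 100% split over 100 trials would be strong evidence.
print(preference_rate(["white"] * 100))
```

The same 100% rate yields a weak signal at three trials and an overwhelming one at a hundred, which is exactly why a handful of viral examples cannot settle the question either way.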

The heavy Sniper, Kshitij is the marksman of the team Craffic. He joined the team in 2018 and his continuous hard work and dedication to the work has made his precision in work unmatched. Kshitij has experience in editing the work of others to foster stronger bonds with fellow authors and working together to improve each other's work.

