A viral tweet containing photos of US politicians Mitch McConnell and Barack Obama displayed the Kentucky senator in both image previews.
The tweet from Tony Arcieri followed an earlier post from user Colin Madland, in which he raised the issue of a Black colleague having his face removed by the video conferencing software Zoom when using its virtual background feature.
After sharing the screenshots to Twitter, Colin voiced further concern when the platform's preview crop displayed only the right-hand side of the picture on his mobile device.
“Flipped the image… Twitter is trash,” he added.
Replying to the tweet, Twitter’s Chief Design Officer Dantley Davis said that following his own experiments, Colin’s facial hair affected the model “because of the contrast with his skin”.
“I removed his facial hair and the Black man shows in the preview for me. Our team did test for racial bias before shipping the model,” he said.
In response to another user, Dantley added that the issue was “100% our fault” and that “the next step is fixing it”.
Dantley also shared a 2018 blog post from Twitter about the automatic cropping of images with AI, saying that the platform does not “optimize for faces because of how problematic that is”.
The article reads: “Previously, we used face detection to focus the view on the most prominent face we could find. While this is not an unreasonable heuristic, the approach has obvious limitations since not all images contain faces.
“Additionally, our face detector often missed faces and sometimes mistakenly detected faces when there were none.
“If no faces were found, we would focus the view on the center of the image. This could lead to awkwardly cropped preview images.”
It goes on to say that “a better way to crop” is for the feature to focus on ‘salient’ regions of an image — a region with high saliency being one that a person is likely to look at when viewing the image freely.
“In general, people tend to pay more attention to faces, text, animals, but also other objects and regions of high contrast.
“This data can be used to train neural networks and other algorithms to predict what people might want to look at.”
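The approach the blog post describes — scoring regions of an image for saliency, then cropping around the highest-scoring area — can be sketched roughly as follows. This is a minimal illustration only, not Twitter's actual implementation: the function name `crop_by_saliency` is hypothetical, and a real system would obtain the saliency map from a trained neural network rather than the toy array used here.

```python
import numpy as np

def crop_by_saliency(image, saliency, crop_h, crop_w):
    """Return the crop_h x crop_w window of `image` with the highest
    summed saliency. `saliency` is a 2-D array of per-pixel scores
    (in practice, the output of a trained saliency-prediction model)."""
    h, w = saliency.shape
    best_score, best_pos = -np.inf, (0, 0)
    # Slide the crop window across every valid position.
    for top in range(h - crop_h + 1):
        for left in range(w - crop_w + 1):
            score = saliency[top:top + crop_h, left:left + crop_w].sum()
            if score > best_score:
                best_score, best_pos = score, (top, left)
    top, left = best_pos
    return image[top:top + crop_h, left:left + crop_w]

# Toy example: a 6x6 "image" whose saliency peaks in the bottom-right corner.
img = np.arange(36).reshape(6, 6)
sal = np.zeros((6, 6))
sal[4:, 4:] = 1.0          # mark the bottom-right 2x2 region as salient
crop = crop_by_saliency(img, sal, 3, 3)
```

Because the crop follows whatever the saliency model scores highly, any bias in that model's training data carries directly through to which part of the image users see in the preview — which is the concern the users above were raising.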
Elsewhere, scientist Vinay Prabhu conducted an experiment to test whether the “cropping bias” was taking place, in which he tweeted over 90 images “consisting of a self-identified Black-Male + blank image + self-identified White-Male”.
Sharing his findings on Saturday, Vinay reported a 40:52 White-to-Black ratio in the resulting previews.
In a further comment to this website, a Twitter spokesperson said: “Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing. But it’s clear from these examples that we’ve got more analysis to do.
“We’re looking into this and will continue to share what we learn and what actions we take.”