Issued on: 04/09/2021
Facial recognition software has been blasted by civil rights advocates who point out problems with accuracy, particularly when it comes to people who are not white
Kirill KUDRYAVTSEV AFP
San Francisco (AFP)
Facebook on Friday said it disabled its topic recommendation feature after it mistook Black men for "primates" in a video on the social network.
A Facebook spokesperson called it a "clearly unacceptable error" and said the recommendation software involved was taken offline.
"We apologize to anyone who may have seen these offensive recommendations," Facebook said in response to an AFP inquiry.
"We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again."
Facial recognition software has been blasted by civil rights advocates who point out problems with accuracy, particularly when it comes to people who are not white.
Facebook users in recent days who watched a British tabloid video featuring Black men were shown an auto-generated prompt asking if they would like to "keep seeing videos about Primates," according to the New York Times.
The June 2020 video in question, posted by the Daily Mail, is titled "White man calls cops on black men at marina."
While humans are among the many species in the primate family, the video had nothing to do with monkeys, chimpanzees or gorillas.
A screen capture of the recommendation was shared on Twitter by former Facebook content design manager Darci Groves.
"This 'keep seeing' prompt is unacceptable," Groves tweeted, aiming the message at former colleagues at Facebook.
"This is egregious."