Facebook has apologized for an incident in which its artificial intelligence mistakenly tagged a video of Black men with a “primate” label, calling it an “unacceptable error” and saying it was examining the issue to prevent it from happening again. As reported by the New York Times, users who viewed a June 27 video posted by the UK tabloid the Daily Mail received an automated message asking if they wanted to “continue watching videos about Primates.”
Facebook disabled the entire topic recommendation feature as soon as it realized what was happening, a spokesperson said in an email to The Verge on Saturday.
“This was clearly an unacceptable error,” the spokesperson said, adding that the company is investigating the cause to prevent the behavior from happening again. “As we have said, while we have made improvements to our AI, we know it is not perfect and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
The incident is just the latest example of artificial intelligence tools displaying racial or gender bias; facial recognition tools in particular have been shown to misidentify people of color. In 2015, Google apologized after its Photos app tagged photos of Black people as “gorillas.” Last year, Facebook said it was studying whether its AI-trained algorithms, including those on Facebook-owned Instagram, were racially biased.
In April, the US Federal Trade Commission warned that artificial intelligence tools that have demonstrated “troubling” racial and gender bias may be in violation of consumer protection laws if used for decisions about credit, housing, or employment. “Hold yourself accountable — or be ready for the FTC to do it for you,” wrote FTC privacy attorney Elisa Jillson in a post on the agency’s website.