The incident occurred on June 27 and involved a video posted by the British newspaper the Daily Mail, according to The New York Times. After a public outcry, Facebook apologized and disabled the recommendation feature while it assesses how its AI operates.
This event only adds to the long list of racist and sexist behaviors that AI replicates. We have discussed more than once how social media platforms such as Facebook highlight photographs of white people while discriminating against other races.
The recommendation appeared beneath a video in which a Black man and a white man were seen arguing. According to reports in the British media, the white man was abusive toward the other person and the police had to be called. Yet, whatever the video's actual subject, Facebook followed it with this prompt: “Want to keep watching videos about primates?”
The Daily Mail was the first to criticize Facebook, though the mistake was not initially disclosed to the newspaper. Facebook later apologized to The New York Times: “As we’ve said before, while we’ve made progress in AI, it isn’t perfect and there’s more work to be done.”
To examine what happened and try to prevent it from recurring, the company, which also owns WhatsApp and Instagram, has disabled the recommendation feature on its main social network.
Other errors
“Every day we see the limits, the biases, and the sheer idiocy of relying solely on artificial intelligence. And yet we continue to allow ourselves to be guinea pigs,” wrote New York Times cybersecurity reporter Nicole Perlroth in response to the incident.
Just as humans tend to pass their biases on to one another, artificial intelligence inherits them too. In recent weeks, Twitter has also had to face this reality: its smart algorithms favor photos of young, white people.
AI programs become biased because they typically learn from a set of training samples. If that sample is limited and over-represents one race or gender, the resulting model will struggle to recognize the people who fall outside it.
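To make that mechanism concrete, here is a minimal, hypothetical sketch in Python (plain NumPy; the toy data and the classifier are invented for illustration and have nothing to do with Facebook's or Twitter's actual systems). A simple classifier is trained on a sample in which one group outnumbers the other 19 to 1; because each group's label depends on a different feature, the model learns the majority group's cue and performs far worse on the minority group.

    import numpy as np

    rng = np.random.default_rng(42)

    def sample_group(n, group):
        # Toy data: each group's true label depends on a *different* feature.
        # Group 0 (over-represented): label follows feature 0.
        # Group 1 (under-represented): label follows feature 1.
        X = rng.normal(size=(n, 2))
        y = (X[:, group] > 0).astype(float)
        return X, y

    # Skewed training set: 95% group 0, 5% group 1 -- analogous to an
    # image dataset dominated by one demographic.
    X0, y0 = sample_group(950, group=0)
    X1, y1 = sample_group(50, group=1)
    X = np.vstack([X0, X1])
    y = np.concatenate([y0, y1])

    # Minimal logistic regression trained by gradient descent.
    w, b = np.zeros(2), 0.0
    for _ in range(2000):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid probabilities
        w -= 0.5 * (X.T @ (p - y)) / len(y)
        b -= 0.5 * np.mean(p - y)

    def accuracy(Xg, yg):
        preds = (Xg @ w + b) > 0  # threshold at probability 0.5
        return np.mean(preds == yg)

    # Evaluate on fresh, balanced test sets, one per group.
    print("group 0 accuracy:", accuracy(*sample_group(1000, 0)))  # high
    print("group 1 accuracy:", accuracy(*sample_group(1000, 1)))  # far lower

Running this typically prints an accuracy in the mid-to-high nineties for the over-represented group and something close to chance for the other: the same pattern, in miniature, as a recognition model trained mostly on images of a single race or gender.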