Facebook criticised for automated advertising categories
ProPublica, the investigative journalism site, managed to buy adverts targeted at people who had expressed an interest in the topics ‘Jew hater’, ‘How to burn jews’ and ‘History of why jews ruin the world’.
After ProPublica revealed their work to Facebook, the social media giant removed the anti-Semitic categories and said it would ‘explore ways to fix the problem’. The categories had been created by an algorithm rather than humans, and the spotlight has once again turned on Facebook’s automated systems.
Last year Facebook let go a number of human editors in an effort to make its news feed less biased, but in doing so was criticised for allowing fake news to prevail.
This latest story once more points to the need for smarter artificial intelligence, as Facebook strives to keep pace with its own success.
After being made aware of the anti-Semitic ad categories, ProPublica tested them by buying three adverts targeted at those and other related groups. All three were approved by Facebook within 15 minutes. After removing the offensive categories, Facebook suggested it could limit the number of categories available to advertise against, or scrutinise them before they are displayed to buyers.
Rob Leathern, product management director at Facebook, said: ‘There are times where content is surfaced on our platform that violates our standards. In this case, we’ve removed the associated targeting fields in question. We know we have more work to do, so we’re also building new guardrails in our product and review processes to prevent other issues like this from happening in the future.’
As ProPublica points out, Facebook is not accused of being anti-Semitic or malicious in any way. The categories appeared as options for advertisers because users had listed the themes on their Facebook profiles, which the social media service's algorithms automatically turn into ad categories.
Facebook is only 13 years old and still learning how to manage its billion-strong user base. Artificial intelligence will eventually catch up, but until it does, humans will remain at the forefront of moderation.