Facebook’s frontline anti-terror team exposed to terrorists
A group of Facebook content moderators, responsible for removing terrorist-related content from the social network, have had their safety compromised after a technical glitch exposed their personal details to terrorist sympathisers and supporters.
The bug affected more than 1,000 staff who use moderation software to identify and close down inappropriate content, including sexual material, hate speech and terrorist propaganda. Among the many staff affected were 40 individuals who work for the social network’s counter-terrorism unit in Dublin.
It is believed that six members of the counter-terrorism team may have had their profiles viewed by potential terrorists.
Following the leak, one member of the team, an Iraqi-born Irish citizen, fled Ireland and went into hiding after it was revealed that seven members of an Egypt-based group with connections to Hamas and with members sympathetic to Islamic State had viewed his profile.
The moderator, whose family had suffered at the hands of terrorists in Iraq, quit his job and moved to Eastern Europe for a five-month period – keeping a low profile and living off savings.
Speaking to journalists, he said: “It was getting too dangerous to stay in Dublin. The only reason we’re in Ireland was to escape terrorism and threats.”
He continued: “When you come from a war zone and you have people like that knowing your family name you know that people get butchered for that. The punishment from Isis for working in counter-terrorism is beheading. All they’d need to do is tell someone who is radical here.”
A spokesperson for Facebook said: “We care deeply about keeping everyone who works for Facebook safe. As soon as we learned about the issue, we fixed it and began a thorough investigation to learn as much as possible about what happened.”
Despite reassurances, the moderator who fled Dublin said: “I’m not waiting for a pipe bomb to be mailed to my address until Facebook does something about it.”