Advocates, Whistleblowers Demand Facebook End Abuse of Content Moderators
Outsourced Content Moderation Imperils Human Rights, Democracy in Africa and Worldwide; Real Facebook Oversight Board, Foxglove and The Signals Network Outline Plan to Protect Whistleblowers, Address Content Moderation Issues
10 May — Disinformation experts, whistleblowers and advocates joined today to call on Meta, the parent company of Facebook, to address the abuse of content moderators — and to address the crisis in content moderation that is allowing disinformation and hate to spread.
“Facebook has demonstrated once again that they cannot be trusted to oversee or moderate their own content, or keep users and even employees safe,” said a spokesperson for the Real Facebook Oversight Board. “Meta’s reckless and inadequate approach to content moderation is putting lives and democracies at risk.”
In February, former Facebook content moderator Daniel Motaung came forward to TIME, sharing his story of trauma, poverty wages and alleged union busting inside a Facebook content moderation center in Kenya. It was "emotionally and mentally devastating," he says. "I went in ok and went out not ok. It changed the person I was." Facebook's actions in Kenya, carried out through its outsourced content moderation contractor Sama, are a window into the company's global business practices and human rights impact.
Daniel is now the claimant in a legal case against Sama and Meta, in what is thought to be the world’s first lawsuit demanding that Facebook reform the conditions of content moderators’ work. At a press event sponsored by the Real Facebook Oversight Board, Foxglove and The Signals Network, Daniel recounted his experience, and panelists issued a clear set of recommendations for reform, and for protecting whistleblowers who come forward.
Facebook must absorb the costs of what it takes to keep its platform safe. Moderators around the world have demanded they be valued and treated like the safety-critical workers they are. That means at least the same level of pay, job security, benefits and mental health protection as Facebook staff.

Lawmakers around the world, including those in the UK, EU, Africa and the US, should strengthen transparency by mandating public audits of social media's content moderation supply chains.

Regulators like the UK's Ofcom and the US Federal Trade Commission (FTC) should accept whistleblower cases from employees around the globe.

Facebook should release its unredacted audits of Sama, or explain why none exist. It should also make its full list of content moderator outsourcing partners public, so human rights organizations can properly scrutinize their practices.

Lawmakers across the globe should protect the whistleblowers who came forward by voicing support for them directly and pledging to use the full force of law and public attention to spare them, their livelihoods, and their families from retaliation. Organizations like Foxglove and The Signals Network are doing important work to ensure whistleblowers are heard.
The organizations also encouraged Facebook employees in Silicon Valley to voice support for these moderators, as they bravely did for their US counterparts in 2019, and encouraged whistleblowers to come forward to The Signals Network, Foxglove or the media.
“In a world where companies like Meta face very little oversight and accountability, we must support whistleblowers who are brave enough to speak out,” said The Signals Network Executive Director Delphine Halgand-Mishra. “Our Tech Accountability Project gives end-to-end support to whistleblowers who share information with the press for the public good.”
Panelists, including disinformation expert and former South African MP Phumzile van Damme, noted the extreme risk to elections in Africa and worldwide from the continued torrent of disinformation flowing through the region. Kenya’s election is August 9th, and recent reports showed a rise in election disinformation in the country. Kenya has also become a new home to operations from RT and Sputnik, which continue to share non-English language propaganda on Meta’s platforms.
“As Meta fails to adequately address disinformation and hate on its platforms, more than 30 significant national elections are scheduled worldwide for the remainder of 2022,” said a spokesperson for the Real Facebook Oversight Board. “The stakes could not be higher for democracy and human rights.”