Facebook Oversight Board’s “Transparency Report” Hampered by Meta’s Own Lack of Transparency

Real Facebook Oversight Board
Mar 21, 2024

21 March 2024 — Meta’s Oversight Board patted itself on the back this week when it released its “Transparency Report” for the second half of 2023. Ironically, Meta’s own lack of transparency kept the Oversight Board from fully reporting on its progress: the report cites multiple examples where Meta’s failure to publish information made it impossible to verify whether a recommendation was accepted.

If a company won’t be transparent with its own Oversight Board, who will it share information with? This demonstrates the impotence of the Oversight Board — funded by Meta, selected by Meta, and increasingly iced out by Meta as it slides into irrelevance.

Here’s some of what we found in our review.

The Oversight Board said: Over a 12-week period in 2023, during which pre-enforcement notifications were sent to users for 100 million pieces of content, users took the opportunity to delete their posts more than 20% of the time.

What’s actually true: Pre-enforcement notifications only work if Facebook can identify harmful content in the first place.

“Meta has not implemented protections due to the potential hit on its advertising revenue, according to Torrez, whose office filed the lawsuit after an undercover investigation in which it set up phony accounts of fictional teens and preteens, using photographs generated by artificial intelligence. Meta’s algorithms recommended sexual content to those accounts, which were also subject to a stream of explicit messages and propositions from adults on the platforms… ‘Meta has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey,’ the lawsuit alleges.” [Source]

“This is the fourth time that the London-based nonprofit has tested Meta’s ability to catch blatant violations of the rules of its most popular social media platform — and the fourth such test Facebook has flubbed. In the three prior instances, Global Witness submitted advertisements containing violent hate speech to see if Facebook’s controls — either human reviewers or artificial intelligence — would catch them. They did not.” [Source]

The Oversight Board said: Meta has now launched a Content Library, so that researchers in all parts of the world can apply for access to the company’s archive of public data on Facebook and Instagram.

What’s actually true: In reality, Meta’s “Content Library” is a half-baked attempt to replace CrowdTangle — a tool that has allowed researchers and regulators to exercise some oversight over Meta’s platforms.

“While CrowdTangle is used by news outlets, academic researchers, and regulators to monitor viral content, such as the spread of misinformation and conspiracy theories, Meta Content Library will only be available to nonprofit researchers and academics.” [Source]

“Cody Buntain, a researcher at the University of Maryland’s College of Information Studies, told the Wall Street Journal that although Meta Content Library’s features are helpful, it does not include a previous feature that allows for analysis of social media activity in specific locations. He added the replacement of CrowdTangle in August could interfere with research into political activity on Meta’s platforms leading up to election day.” [Source]

“The company plans to instead offer select researchers access to a set of new data tools, but news publishers, journalists or anyone with commercial interests will not be granted access to that data.” [Source]

The Oversight Board said: Since then, the company has been finalizing a single approach to preserving evidence from its platforms of potential atrocity crimes and grave violations of human rights and humanitarian law, so this evidence can be shared with international courts and recognized authorities in the future.

What’s actually true: If Meta is working to preserve evidence of grave human rights abuses on its platforms, then it should reflect on its own policy failures. The company has been implicated in the very atrocity crimes whose content it claims to suppress.

“In this context, Human Rights Watch found Meta’s behavior fails to meet its human rights due diligence responsibilities. Despite the censorship documented in this report, Meta allows a significant amount of pro-Palestinian expression and denunciations of Israeli government policies. This does not, however, excuse its undue restrictions on peaceful content in support of Palestine and Palestinians, which is contrary to the universal rights to freedom of expression and access to information.” [Source]

“Three years after its staggering failures in Myanmar, Meta has once again — through its content-shaping algorithms and data-hungry business model — contributed to serious human rights abuses. Even before the outbreak of the conflict in northern Ethiopia, civil society organizations and human rights experts repeatedly warned that Meta risked contributing to violence in the country, and pleaded with the company to take meaningful action,” said Agnès Callamard, Amnesty International’s Secretary General. [Source]

“It is way beyond time that Meta fulfilled its responsibilities and provided an effective remedy to the Rohingya people of Myanmar. It is reprehensible that Meta still refuses to repair the harms it contributed to despite the overwhelming evidence that the company played a key role in 2017’s ethnic cleansing… The Rohingya people were killed, tortured, raped, and displaced in their thousands as part of the Myanmar security forces’ campaign of ethnic cleansing. In the months and years leading up to the atrocities, Facebook’s algorithms were intensifying a storm of hatred against the Rohingya, which contributed to mass offline violence.” [Source]

The Oversight Board said: Since 2021, Meta has fully or partially implemented or has reported progress on 146 of our 251 recommendations.

What’s actually true: In reality, Oversight Board recommendations are not binding on Meta. And the company’s distinct lack of transparency means that not even the Oversight Board can determine whether the recommendations being implemented are actually effective.

“…the board has at times been shockingly lazy — as in that quarter, when out of the 347,000 cases submitted by users for appeal, it chose to hear a measly three of them. And while both it and Meta tout long lists of policy recommendations it has made that the company has adopted, the fact that I wouldn’t have been able to name any without first looking them up suggests that for the most part the board is often only nibbling at the margins of relevance.” [Source]

Regarding the altered video of President Biden, Meta’s Oversight Board could only recommend steps to combat disinformation — little more than a slap on the wrist for the company. “While the Oversight Board ruled the video can remain on the site, it argued in a set of non-binding recommendations that Meta’s current policy regarding manipulated content should be ‘reconsidered.’ The board called the company’s current policy on the issue ‘incoherent, lacking in persuasive justification and inappropriately focused on how content is created, rather than on which specific harms it aims to prevent, such as disrupting electoral processes.’” [Source]

The Oversight Board has stopped reporting a firm percentage of recommendations that have been fully accepted, leaning instead on “partially implemented or has reported progress on.” In plain language, that means the vast majority of the Oversight Board’s recommendations were rejected or ignored by Meta, with many impossible to verify because Meta won’t transparently share the information with its own Oversight Board.

Real Facebook Oversight Board members are available for interviews. Statements do not reflect the individual positions of all Real Facebook Oversight Board members. Contact us at media@the-citizens.com. The Real Facebook Oversight Board is an emergency response to the ongoing harms on Facebook’s platforms from leading global scholars, experts and advocates.
