FACEBOOK/META FAILING IN UKRAINE CONFLICT: NEW RESEARCH SHOWS RUSSIAN DISINFORMATION PROLIFERATING; GLOBALLY, 62% OF #1 NEWS POSTS IN Q1 WERE FROM KNOWN DISINFORMERS
Releasing Quarterly Harms Report Ahead Of Earnings Call, Real Facebook Oversight Board To Shareholders: Meta Is A Sizable Investment in Disinformation, Destabilization and Denial
27 April — Releasing its regular Quarterly Harms Report today ahead of Meta’s Q1 earnings call, the Real Facebook Oversight Board shared new data from global civic organization Avaaz showing that Facebook in Q1 2022 failed to slow a torrent of disinformation about the war in Ukraine, imperiling refugees and threatening to destabilize the global effort to slow Russia’s invasion.
The Real Facebook Oversight Board’s research also showed that globally in Q1, Facebook made little progress addressing its disinformation crisis, with a strong majority of #1 and top ten news posts coming from extremists and disinformation actors.
“As Facebook tallies its Q1 dollars, Facebook shareholders have made a sizable investment in disinformation, destabilization, deceit and denial,” said a spokesperson for the Real Facebook Oversight Board. “In spite of the revelations from brave whistleblowers, Facebook continues to create an information ecosystem rife with hate and confusion. It can’t possibly be worth the money.”
The Quarterly Harms Report findings showed:
- Russia’s RT and Sputnik media platforms are thriving with non-English content, racking up engagement on Meta’s platforms.
- In Poland, false narratives about Ukrainian refugees are proliferating, including the disproven claims that most refugees entering Poland are non-Ukrainians and that they pose a terrorist threat.
- Across Facebook for the quarter, 62% of #1 news posts and 49% of top-ten news posts came from known disinformers.
RFOB also noted the TIME magazine exposé into Facebook/Sama content moderation conditions in Q1, and the continued risk to elections globally from disinformation spreading on Facebook. RFOB and its members, however, praised the EU’s passage of the Digital Services Act.
“If I were any Eastern European leader right now, I’d be asking tough questions of Big Tech,” said Fadi Quran, Campaign Director at Avaaz, “like why, with conflict raging on their doorsteps, the platforms don’t seem to be taking the steps needed to effectively defend Eastern Europeans from information warfare? Have the platforms applied the ‘break-glass’ measures used during the US insurrection that limited the spread of disinformation and dangerous content?”
The primary focus of this quarter’s report was on research provided to RFOB by partner organization Avaaz, showing Meta/Facebook’s continued challenges with non-English language content moderation. New research shows that in spite of blocks on two of Russia’s most notorious disinformation superspreaders in some nations and in English, RT and Sputnik are still thriving.
Avaaz measured the interactions and interaction rates on RT and Sputnik’s Facebook pages globally in the 40-day period (January 19, 2022 to February 27, 2022) before restrictions were enforced for certain EU citizens using the platform and the 40-day period after those restrictions (February 28, 2022 to April 8, 2022). While RT and Sputnik saw decreased interactions in English, the following pages saw increases both in their interactions and interaction rates within the time period analyzed:
- RT Online (Arabic-language page): 59.91% increase in interactions and 62.09% increase in interaction rate
- RT (English-language page): 16.10% increase in interactions and 19.14% increase in interaction rate
- Sputnik Brasil: 13.79% increase in interactions and 28.38% increase in interaction rate
- Sputnik Japan: 6.05% increase in interactions and 1.88% increase in interaction rate
In other words, dangerous and false Ukraine-related RT and Sputnik content is still spreading on Facebook.
Also alarming, and reminiscent of other content on Meta’s platforms that targets and scapegoats refugees and vulnerable communities globally, the research found a dangerous rise in anti-refugee narratives on Facebook.
In an op-ed earlier this month in The Boston Globe, Nora Benavidez, senior counsel and director of digital justice and civil rights at Free Press, and Kate Coyer, RFOB member and a fellow with the Democracy Institute’s Center for Media, Data and Society at Central European University, reminded us that Facebook knows how to fix some of these challenges — because it has done so before:
Facebook’s engagement algorithms are built to amplify and profit from content that generates strong responses from users. And in the days just after the 2020 US election, then-president Donald Trump was spreading the toxic lies of the “Stop the Steal” campaign to millions over social media. In this instance, Facebook caved to external and internal pressure to do more to stop the proliferation of election disinformation on its many platforms. For five days after the election, Facebook’s News Feed and other features looked very different. They prioritized, or “upranked,” more-credible news sources in an effort to drown out hateful and untrue content. As we know from the testimony of whistleblower Frances Haugen, Facebook’s leadership implemented “break-glass” measures that are designed to slow the spread of the worst disinformation during important elections and times of crisis.
“With critical elections coming up in the Philippines, Lebanon, Kenya, India, Brazil and the United States, Facebook continues to show it cannot be trusted to protect free elections and democracy,” said a spokesperson for the Real Facebook Oversight Board. “Shareholders concerned about human rights, democracy and free elections should be sounding the alarm now — not celebrating profits.”
The Real Facebook Oversight Board in the report called for three action steps:
- Fix the algorithm, up-ranking credible and trusted news sources in the weeks leading up to and following an election.
- Make an equal investment in non-English language content moderation, including Russian and 25 other languages.
- Let academic researchers back in, allowing outside, independent scholars to monitor Facebook’s content around the 2022 elections.
Interviews are available with RFOB members, who will have breaking reactions to Meta’s Q1 earnings call. Statements do not reflect the individual positions of all Real Facebook Oversight Board members. Contact us at firstname.lastname@example.org. The Real Facebook Oversight Board is an emergency response to the ongoing harms on Facebook’s platforms, from leading global scholars, experts and advocates.