USA: Rohingya survivor demands US regulator investigate Meta's role in Myanmar atrocities
Amnesty is supporting activist Maung Sawyeddollah in filing a complaint against Meta over its role in the violence in Myanmar
Meta was warned repeatedly by activists and researchers that its algorithms were amplifying hateful content against the Rohingya
The violence that unfolded in Myanmar in 2017 has been classified as a genocide
'We hope the Securities and Exchange Commission will consider the submission and investigate Meta for any potential violations of federal securities laws' - Mandi Mudarikwa
Rohingya human rights activist Maung Sawyeddollah has filed a whistleblower complaint with the US Securities and Exchange Commission (SEC), asking the agency to investigate Meta for alleged violations of securities laws stemming from the company’s misrepresentations to shareholders about its substantial contribution to the 2017 atrocities against the Rohingya in Myanmar, which the US government has classified as genocide.
Amnesty International, the Open Society Justice Initiative and Victim Advocates International have jointly supported the submission.
Mandi Mudarikwa, Head of Strategic Litigation at Amnesty International, said:
“The submission provides information on Meta’s alleged role in the atrocities perpetrated against the Rohingya, and highlights misrepresentations to the SEC and public investors. We hope the SEC will consider the submission and investigate Meta for any potential violations of federal securities laws.”
Meta: Repeatedly warned about amplifying harmful content
The submission to the SEC, an independent US agency responsible for ensuring that shareholders are treated fairly and honestly, details how Meta was repeatedly warned by activists and researchers about the risk of Facebook being used to foment and incite violence against the Rohingya in the lead-up to 2017. The filing argues that, despite this, Meta continued to omit key information about this risk of real-world violence from its statements to public investors.
A 2022 report by Amnesty found that Meta contributed to the atrocities against the Rohingya in Myanmar through Facebook’s algorithms, which amplify harmful content, and through its inadequate moderation of content that breached its own Community Standards – the rules that define what is permissible on the platform.
The report revealed that Meta’s business model relied on invasive profiling and targeted advertising, which promoted the spread of harmful content, including incitement to violence. Meta’s algorithmic systems are designed to maximise user engagement in order to increase the company’s advertising revenue. As a result, these systems often prioritise inflammatory, divisive and harmful content.
Maung Sawyeddollah, recalling his frustration at his futile attempts to alert Meta about the proliferation of harmful content on Facebook, said:
“I saw a lot of horrible things on Facebook, and I just thought that people who posted were bad. I didn’t realise then that Facebook was to blame. One day I saw a post that made me feel so bad. I tried to report that to Facebook. I said it was hate speech but I got a response that said...it does not go against Community Standards.”
Even though such content clearly violated Facebook’s Community Standards – which have recently been changed as part of a new policy shift – Meta did not sufficiently enforce them in Myanmar, nor did it adequately remove anti-Rohingya content in the months and years before the 2017 atrocities in northern Rakhine State. The insufficient number of content moderators with the necessary language skills, itself the result of the company’s budgeting and staffing choices, also contributed to Meta’s shortcomings. This reflects the company’s broader failure, notwithstanding its public claims, to adequately invest in content moderation across many countries in Asia, Africa and Latin America.
Eva Buzo, Executive Director at Victim Advocates International, explained:
“In Myanmar, where Facebook served as the primary social media platform and news source, the reckless deployment of Meta’s harmful algorithms, with negligible safeguards in place, promoted widespread anti-Rohingya online campaigns which contributed to offline violence.”
The SEC complaint underscores Meta’s failure to heed multiple civil society warnings from 2013 to 2017 regarding Facebook’s potential role in fuelling violence. During that time, civil society repeatedly warned Meta employees that the platform was contributing to a pending “genocide”, likening its role to that of radio broadcasts in the Rwandan genocide.
James Goldston, Executive Director of the Open Society Justice Initiative, added:
“Although investors had asked Meta to look into the human rights implications of its business, Meta fell far short of being fully transparent towards them, even though by that time Meta had been warned multiple times about the escalating situation in Myanmar and Facebook’s role in it.”
Despite these warnings, between 2015 and 2017 Meta told investors that Facebook’s algorithms did not result in polarisation, even though it had been warned of Facebook’s role in proliferating anti-Rohingya content in Myanmar. At the same time, Meta did not fully disclose in its financial reporting to shareholders the risks posed by the company’s operations in Myanmar. Instead, in 2015 and 2016 Meta objected to shareholder proposals to conduct a human rights impact assessment and to set up an internal committee to oversee the company’s policies and practices on international public issues, including human rights.
Violence in Ethiopia
Public pressure in 2018 forced Meta to partially and belatedly acknowledge Facebook’s role in the Rohingya atrocities. However, between November 2020 and November 2022, Meta again failed to adequately curb the spread of content advocating hatred and violence, this time against Tigrayans in Ethiopia, ultimately contributing to severe offline violence – despite the company’s public claims to the contrary. Plainly, Meta has neither learned its lesson nor taken meaningful steps to curb its role in fuelling ethnic violence around the world.
Recent policy changes by Meta in the US abolishing independent fact-checking, which may well be rolled out internationally, risk further exacerbating Meta’s contributions to human rights harms and offline violence, potentially as egregious as the crimes committed against the Rohingya.