Facebook’s Role in Rohingya Atrocities Underlines Big Tech’s Accountability Gap

Investigations have found that Meta, through its Facebook platform, contributed to the 2017 ethnic cleansing and persecution of the Rohingya, a Muslim minority from Myanmar’s Rakhine State. Key findings show that Facebook’s algorithms – designed to maximize engagement above all else, with minimal safeguards in place – actively and disproportionately amplified the most harmful content targeting the Rohingya, including hate speech, disinformation and incitement to ethnic violence.

But with access to justice for survivors of tech harms apparently blocked in the United States – where Meta is headquartered – what paths toward justice, accountability and remedy remain?

This case study examines one of the most pressing governance dilemmas in tech policy today: the question of platform liability for hosting and amplifying harmful third-party content.

Case Study #10

Download Includes: Case Study, Teaching Note

ISSN 2819-0475  •  doi:10.51644/BCS010

Author

Research Themes

Human Rights
Law
Platform Governance
