Facebook Fails to Detect Hate Speech Against Rohingya, Report Claims
22 March 2022
A new report claims that the social media giant Facebook is still failing to detect hate speech and calls to violence against Myanmar’s Rohingya Muslim minority, years after such behavior was found to have played a central enabling role in their persecution.
In a report published on March 20, the London-based watchdog Global Witness [...] concludes that “Facebook’s ability to detect Burmese language hate speech remains abysmally poor.” It added, “Facebook and other social media platforms should treat the spread of hate and violence with the utmost urgency.”
To its credit, Facebook has admitted its role in the violence. It responded to the findings of the U.N. Fact-Finding Mission by removing the official pages of military commander-in-chief Senior Gen. Min Aung Hlaing, who led the military coup of February 2021, and other senior officials. In late 2018, Facebook published the findings of a report that it commissioned into its role in the ethnic cleansing of the Rohingya, admitting that “we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”
“In places such as Myanmar where there is clear evidence that Facebook was used to incite real world harms that cost ten thousand people their lives, and hundreds of thousands their homes and livelihoods, and where the Rohingya face an ongoing heightened risk of violence and continued discrimination,” the Global Witness report rightly concluded, “the very minimum the platform should do is ensure it is not being used for future incitement, and provide remedy to victims.”