Sri Lanka: Facebook used to fuel violence against Muslims; inc. company statement
We are deeply disturbed by the violence that occurred in Sri Lanka this past March. We want to make sure that Facebook is a place where people can express themselves and connect with their friends, families, and communities, and we know this requires that our platform is a place where people feel safe. That’s why our Community Standards have clear rules against hate speech and content that incites violence, and we remove such content as soon as we’re made aware of it... Our approach to hate speech and incitement to violence—especially in conflict and post-conflict environments—has evolved over time and continues to change... In Sri Lanka specifically, we’re actively building up teams that deal with reported content, working with civil society and government to better understand local context and challenges, and building out our technical capabilities so that we can more proactively address abusive content on Facebook. We’re also carrying out an independent human rights impact assessment of Facebook’s role in Sri Lanka to help inform our approach... [W]e’re committed to having the right policies, products, people, and partnerships in place to help keep our community in Sri Lanka and around the world safe.
- This is a response from the following companies: Facebook
Author: Sheera Frenkel, The New York Times
Facebook... said... that it would begin removing misinformation that could lead to people being physically harmed. The policy expands Facebook’s rules about what type of false information it will remove, and is largely a response to episodes in Sri Lanka, Myanmar and India in which rumors that spread on Facebook led to real-world attacks on ethnic minorities... “We have identified that there is a type of misinformation that is shared in certain countries that can incite underlying tensions and lead to physical harm offline,” said Tessa Lyons, a Facebook product manager. “We have a broader responsibility to not just reduce that type of content but remove it.”
Facebook has been roundly criticized over the way its platform has been used to spread hate speech and false information that prompted violence... In Myanmar, Facebook has been accused by United Nations investigators and human rights groups of facilitating violence against Rohingya Muslims... In Sri Lanka, riots broke out after false news pitted the country’s majority Buddhist community against Muslims... In an interview published Wednesday by the technology news site Recode, Mark Zuckerberg, Facebook’s chief executive, [said]... “I think that there’s a terrible situation where there’s underlying sectarian violence and intention... It is clearly the responsibility of all of the players who were involved there.”... Under the new rules, Facebook said it would create partnerships with local civil society groups to identify misinformation for removal. The new rules are already being put into effect in Sri Lanka... The company has started identifying posts that are categorized as false by independent fact checkers. Facebook will “downrank” those posts... so that they are not highly promoted across the platform.
Author: Shihar Aneez & Ranga Sirilal, Reuters
Sri Lankan officials said on Thursday they had lifted a ban on Facebook after discussions with the social network, a week after blocking access on the grounds it was being used to fuel communal violence. At least two people were killed in clashes last week when Sinhalese Buddhists, angered by the killing of a Buddhist driver by Muslims, attacked mosques and Muslim-owned properties. Some of the violence was instigated by threatening posts on Facebook, according to the government, which cut access to Facebook, Viber and WhatsApp on March 7. It initially said the ban would last for three days but extended the block without informing the public, users said... Government officials have said Facebook’s action against those who spread hate speech had been too slow. “Facebook officials agreed to speed up the response time,” [said] telecommunication minister Harin Fernando... Facebook Inc said in a statement to Reuters its officials met Sri Lankan government officials to outline the company’s community standards and commitment to removing hate speech and incitement to violence from its platform... “We have clear rules against such content, and will remove it when we’re made aware of it. We are glad access to our services, and important connections for people and businesses, have been restored,” the statement said.
Author: Hanna Kozlowska, Quartz
The government of Sri Lanka has shut down Facebook, WhatsApp, Instagram, and Viber in an attempt to quell ethnic strife in the country. Authorities say people were using the social media platforms to stoke violence against Sri Lanka’s Muslim minority... The government declared a state of emergency after riots in the district of Kandy left two people dead and property destroyed. “We have clear rules against hate speech and incitement to violence and work hard to keep it off our platform,” a Facebook spokesperson said in a statement. “We are responding to the situation in Sri Lanka and are in contact with the government and non-governmental organizations to support efforts to identify and remove such content.”