New Zealand: Social media's artificial intelligence unable to stop spread of videos on mosque attacks
Author: Washington Post, New Zealand Herald, Published on: 19 March 2019
"Christchurch mosque shootings: How social media's business model helped the massacre go viral" 20 March 2019
People celebrating the mosque attacks that left 50 people dead were able to keep posting and reposting videos on Facebook, YouTube and Twitter despite the websites' use of largely automated systems powered by artificial intelligence to block them... Those pushing videos of Friday's attack made small alterations...to evade detection by artificial-intelligence systems designed by some of the world's most technologically advanced companies to block such content.
Mia Garlick, the head of communications and policy for Facebook in Australia and New Zealand, said the company would "work around the clock to remove violating content using a combination of technology and people." Garlick said the company is now also removing edited versions of the video that do not feature graphic violence. Twitter did not respond to a request for comment...and Reddit declined to comment, but both have described working hard over several days to remove objectionable content related to the shooting.
A YouTube executive...acknowledged that the platform's systems were overwhelmed and promised to make improvements. "We've made progress, but that doesn't mean we don't have a lot of work ahead of us, and this incident has shown that," said Neal Mohan, YouTube's chief product officer...Those who study social media say that slowing the spread of appalling videos might require the companies to change or limit some of the features that help such stimulating content spread... Stephen Merity, a machine learning researcher in San Francisco, said tech companies do not want to use more drastic measures, such as tougher restrictions on who can upload or bigger investments in content-moderation teams, because of how those measures could alter their sites' usability or business model.