

Article

19 March 2019

Authors:
Washington Post, New Zealand Herald

New Zealand: Social media's artificial intelligence unable to stop spread of videos on mosque attacks

 "Christchurch mosque shootings: How social media's business model helped the massacre go viral"  20 March 2019

People celebrating the mosque attacks that left 50 people dead were able to keep posting and reposting videos on Facebook, YouTube and Twitter despite the websites' use of largely automated systems powered by artificial intelligence to block them... Those pushing videos of Friday's attack made small alterations...to evade detection by artificial-intelligence systems designed by some of the world's most technologically advanced companies to block such content.
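The evasion described here is easiest to see with the simplest form of automated matching: blocklists keyed on a file's hash. The sketch below is illustrative only and is not any platform's actual pipeline; it assumes a naive exact-match filter to show why a trivially altered re-upload no longer matches the fingerprint of the original.

```python
import hashlib

# Illustration: an exact-match filter keyed on a cryptographic hash of the file.
# Even a one-byte change (re-encoding, cropping, adding a watermark or intro)
# yields a completely different digest, so the altered copy is not caught by a
# blocklist built from hashes of the original upload.
original = b"...bytes of the original video upload..."
altered = original + b"\x00"  # a trivial modification

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())  # entirely different digest
```

Perceptual hashing and machine-learning classifiers are designed to tolerate some of these alterations, but heavier edits can still push a clip outside what the systems recognize, which is consistent with the failures the article describes.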

Mia Garlick, the head of communications and policy for Facebook in Australia and New Zealand, said the company would "work around the clock to remove violating content using a combination of technology and people." Garlick said the company is now removing even edited versions of the video that do not feature graphic violence. Twitter did not respond to a request for comment...and Reddit declined to comment, but both have described working hard over several days to remove objectionable content related to the shooting.

A YouTube executive...acknowledged that the platform's systems were overwhelmed and promised to make improvements. "We've made progress, but that doesn't mean we don't have a lot of work ahead of us, and this incident has shown that," said Neal Mohan, YouTube's chief product officer...Those who study social media say that slowing the spread of appalling videos might require the companies to change or limit some features that help spread stimulating... Stephen Merity, a machine learning researcher in San Francisco, said tech companies do not want to use more drastic measures, such as tougher restrictions on who can upload or bigger investments in content-moderation teams, because of how they could alter their sites' usability or business model.
