
Article

6 Oct 2022

Author:
Raghu Malhotra, Indian Express (India)

UK: Court rules suicide of 14-year-old girl attributed to harmful posts on Instagram & Pinterest; incl. company statement

"Why a UK court has blamed tech companies for a 14-year-old girl’s suicide", 3 Oct 2022

A London coroner on Friday (September 30) ruled that harmful social media content contributed to a teenager’s death in 2017 “in a more than minimal way.”

This ruling is perhaps the first of its kind to directly blame social media platforms for a child’s death.

Molly Russell, a 14-year-old schoolgirl from London, died by suicide in 2017 after viewing online content about suicide and self-harm on platforms like Instagram and Pinterest...

"She died from an act of self-harm while suffering from depression and the negative effects of online content”, senior coroner Andrew Walker on Friday...

In the six months before she died, Molly had saved, liked or shared 16,300 pieces of content on Instagram, of which more than 2,100, or around 12 a day, related to suicide, self-harm and depression... It was revealed that she had also created a digital pinboard on Pinterest with more than 400 images on similar subjects...

Walker told the court that Instagram and Pinterest had used algorithms that drew her into "binge periods" of harmful material, some of which she had never requested...

Judson Hoffman of Pinterest apologised for some of the content the teenager had viewed and agreed that Pinterest was “not safe” when she had used it. He stated that the platform now uses artificial intelligence to remove harmful content.

Elizabeth Lagone, Meta's head of health and wellbeing policy, said during the inquest that the content about suicide and self-harm which Molly had accessed before her death was "safe." However, she admitted that some of the posts Molly had viewed could have violated Instagram's policies...

While Lagone said she was sorry that Molly had viewed distressing content, she nonetheless claimed that it was important for online platforms to allow people to express their feelings.

The inquest into Molly Russell's death was significant because, for the first time, senior executives from Meta and Pinterest were summoned and gave evidence under oath in a UK court...

Andy Burrows... [of] The National Society for the Prevention of Cruelty to Children (NSPCC), called the ruling “social media’s big tobacco moment.” He added, “for the first time globally, it has been ruled that content a child was allowed and encouraged to see by tech companies contributed to their death”...

Michele Donelan, serving as the UK’s Secretary of State for Digital, Culture, Media and Sport, said that... “through [the Online Safety Bill], we will use the full force of the law to make social media firms protect young people from horrendous pro-suicide material.”

The Online Safety Bill seeks to improve internet safety while also defending freedom of expression...

Among other things, the bill aims to prevent the spread of illegal content and protect children from harmful material...

Companies that do not comply with the rules will face fines of up to £18m or 10% of global annual turnover (whichever is higher)...