Meta's platforms are allegedly designed to be addictive to children
"Meta designed platforms to get children addicted, court documents allege", 27 November 2023
Instagram and Facebook parent company Meta purposefully engineered its platforms to addict children and knowingly allowed underage users to hold accounts, according to a newly unsealed legal complaint.
The complaint is a key part of a lawsuit filed against Meta by the attorneys general of 33 states... and was originally redacted. It alleges the social media company knew – but never disclosed – it had received millions of complaints about underage users on Instagram but only disabled a fraction of those accounts. The large number of underage users was an “open secret” at the company, the suit alleges, citing internal company documents.
The complaint said that in 2021, Meta received over 402,000 reports of under-13 users on Instagram but that only 164,000 of those accounts – fewer than half – were “disabled for potentially being under the age of 13” that year. The complaint noted that at times Meta has had a backlog of up to 2.5m accounts of younger children awaiting action.
The complaint alleges this and other incidents violate the Children’s Online Privacy Protection Act, which requires that online services provide notice and get parental consent before collecting data from children.
The lawsuit also focuses on longstanding assertions that Meta knowingly created products that were addictive and harmful to children, brought into sharp focus by whistleblower Frances Haugen...
Company documents cited in the complaint described several Meta officials acknowledging the company designed its products to exploit shortcomings in youthful psychology...
Meta said in a statement that the complaint misrepresents its work over the past decade to make the online experience safe for teens, noting it has “over 30 tools to support them and their parents”.
With respect to barring younger users from the service, Meta argued age verification is a “complex industry challenge”.
In a 2019 email, one Facebook safety executive alluded to the possibility that cracking down on younger users might hurt the company’s business.
But a year later, the same executive expressed frustration that while Facebook readily studied the usage of underage users for business reasons, it didn’t show the same enthusiasm for ways to identify younger kids and remove them from its platforms.