Meta CEO Mark Zuckerberg and other top brass at the social media giant were well aware that hundreds of thousands of Instagram users are underage, according to newly released claims in a lawsuit filed by dozens of states over the company’s alleged failure to protect teens from harm.
The details were included in an unredacted complaint filed late last week by 33 states including New York. Originally filed in October, the suit alleges that the company behind Facebook, Instagram and WhatsApp “ignored the sweeping damage” it caused to young users while implementing addictive features designed to keep them hooked on the apps.
“Inside the company, Meta’s actual knowledge that hundreds of thousands of Instagram users are under the age of 13 is an open secret that’s routinely documented, rigorously analyzed and confirmed, and zealously shielded from disclosure to the general public,” the lawsuit alleges.
The Instagram parent received “over 1.1 million reports of under-13 users on Instagram” through in-app reporting systems from the first quarter of 2019 through the second quarter of 2023 alone, the complaint alleged.
Nevertheless, the company “disabled only a fraction of those accounts” and “routinely continued to gather children’s data without parental consent,” the suit alleges. Meta’s actions allegedly violated a federal law barring the collection of personal data from users under 13.
The unredacted version of the lawsuit also shed new light on Meta’s internal response to whistleblower allegations by former employee Frances Haugen, who famously testified on Capitol Hill in 2021 that Zuckerberg’s firm prioritized profits over user safety despite internal research showing its platforms were causing harm.
Haugen’s allegations were first reported in a sweeping investigation by the Wall Street Journal in 2021, which revealed in part that Meta executives knew Instagram was particularly toxic for teen girls – some of whom developed body image issues, anxiety and even suicidal thoughts.
The suit cites messages from Meta spokesperson Stephanie Otway to Instagram chief Adam Mosseri in which she noted that the Journal’s “arguments [are] based on our own research so [they] are difficult to rebut” and that she was “mostly worried about the fallout from the article . . . [and] that our own research confirmed what everyone has long suspected[.]”
In an internal chat from November 2021, Mosseri reportedly wrote, “tweens want access to Instagram, and so they lie about their age to get it now.” The state attorneys general argue that’s one of many signs that Meta employees knew about the company’s underage user problem.
A month later, however, Mosseri told senators during a highly publicized hearing that children under 13 were “not permitted on Instagram.”
The lawsuit claims that company research from March 2021 revealed Instagram’s recommendation algorithm was “shown to recommend content related to eating disorders when it received indications that the user had engaged with content referring to eating disorders in the past.”
Meta employees created test accounts in which they followed profiles with names suggestive of eating disorders. Soon after, the algorithm suggested accounts “related to anorexia,” such as @milkyskinandbones, @skinny._.binge, @_skinandbones__, and @applecoreanorexic.
The lawsuit notes that Meta executives briefed Zuckerberg “as early as 2017” that targeting children under the age of 13 as potential users would boost “the speed of acquisition when users turned 13.”
The complaint cited an internal Meta document from 2018 in which the company admitted that it does “very little to keep U13s off our platform.”
The Post has reached out to Meta for comment on the newly unredacted complaint.
The states seek unspecified financial damages, as well as “injunctive relief” blocking Meta from engaging in the harmful business practices outlined in the suit.
Meta previously said it was “upset” that the states opted to file suit instead of working with the company to address their concerns. The company said it blocks users under 13 from using Instagram and does not knowingly collect data from anyone under 13.
As The Post previously reported, the states also allege that Instagram and Facebook bombard young users with notifications that disrupt their sleep and interrupt their schooling.
The notifications include so-called “haptic alerts” such as phone vibrations and pulses, as well as sound or banner notifications, emails and “badge notifications” that display a red indicator showing unread messages.
Meta is one of several social media companies to face pressure from state and federal officials over their purported failure to police content served to young users.
Instagram’s chief rival, China-owned TikTok, recently faced scrutiny after users began praising Osama bin Laden in the wake of Hamas’s attack on Israel.