42 states sue Meta over youth mental health crisis, data privacy violations
A lawsuit filed today by North Carolina Attorney General Josh Stein and 41 other state attorneys general alleges the social media giant Meta illegally collects data on preteens and is contributing to a youth mental health crisis by designing its platforms to be as addictive as possible.
Stein says Meta, the parent company of Facebook and Instagram, has violated the Children's Online Privacy Protection Act (COPPA), a 1998 law that aims to give parents control over what online information is collected from children under the age of 13.
COPPA requires that websites directed at children under 13 that collect personal information (name, location, etc.) give parents control over how their child’s data is used. The suit alleges Instagram is designed to attract preteens through young content creators, such as JoJo Siwa, and design features like Instagram Live, Stories, and its recommendation algorithms.
"For many years, they didn't verify the ages of young people. Once they finally started doing so, they were incredibly lax in how they did it, and that violates federal law,” Stein said in an interview with PRE. “You cannot collect data from children without the consent of parents."
COPPA does not require platforms to ask a user's age and is triggered only if a platform knows the user is under the age of 13.
In 2021, whistleblower Frances Haugen revealed Meta knew Instagram was worsening body image issues among teenagers, especially young girls. Stein says Meta has misled Congress and the public about the safety of its platforms for preteens.
A 2021 report by Thorn, an international nonprofit that combats human trafficking, found 45% of children under 13 use Facebook and 40% use Instagram.
“Meta goes to great lengths to avoid meaningfully complying with COPPA, looking for loopholes to excuse its knowledge of users under the age of 13 and maintain their presence on the Platform,” the complaint reads.
Facebook and Instagram’s privacy policies and terms of service say kids under 13 are not allowed to join the platforms without adult supervision, and Meta executives have repeatedly testified to Congress that it kicks off users it verifies to be underage.
The suit also alleges that the platforms are designed to hook young children, and in turn, negatively affect their mental health. Features like filters, “infinite scroll,” autoplay, and persistent notifications are used to entice users to stay on the platform. On both platforms, users can scroll endlessly and are constantly fed new content – even if they don’t follow the user who posted it.
In 2021, a series of articles from The Wall Street Journal based on leaked Meta documents reported that the company held onto internal research for years suggesting Instagram made body image issues worse for teen girls.
"They knew that it was harming our kids,” Stein said about Meta. “They have all kinds of internal studies that show the real harms to young people, and they lied about the risks to the public."
In a statement, Meta said it shares “the attorneys general’s commitment to providing teens with safe, positive experiences online.” The company says it has tools available to support teens and guardians.
“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company said.
The case was filed today in federal court in California. The attorneys general are seeking to force Meta to make changes to its platforms, Stein says.
"What we want of Facebook is that it changes how it is structured so all these features that serve to hook kids don't exist, and full disclosures of the risks that Meta and Instagram pose to young people," Stein said.