California is one of 33 states, led by Colorado and Tennessee, that filed a joint lawsuit in the U.S. District Court for the Northern District of California, saying Meta — which owns Facebook, Instagram, WhatsApp and Messenger — violated consumer protection laws by unfairly ensnaring children and deceiving users about the safety of its platforms. The District of Columbia and eight other states filed separate lawsuits against Meta on Tuesday making most of the same claims.
In their complaint, the states said Meta had “designed psychologically manipulative product features to induce young users’ compulsive and extended use” of platforms like Instagram. The company’s algorithms were designed to push children and teenagers into rabbit holes of toxic and harmful content, the states said, with features like “infinite scroll” and persistent alerts used to hook young users. The attorneys general also charged Meta with violating a federal children’s online privacy law, accusing it of unlawfully collecting “the personal data of its youngest users” without their parents’ permission.
“Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens,” the states said in their 233-page lawsuit. “Its motive is profit.”
“We’re here because we’re facing a problem that is national in scope, so it requires a national response,” California Attorney General Rob Bonta said during a news conference announcing the suit Tuesday.
“There’s a mountain of growing evidence that social media has a negative impact on our children, evidence that more time on social media tends to be correlated with depression, with anxiety, body image issues, susceptibility to addiction and interference with daily life,” Bonta said.
Meta said it was working to provide a safer environment for teenagers on its apps and has introduced more than 30 tools to support teenagers and families.
“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company said in a statement.
Why the Case Matters
It’s unusual for so many states to come together to sue a tech giant for consumer harms. The coordination shows states are prioritizing the issue of children and online safety and combining legal resources to fight Meta, just as states had previously done for cases against Big Tobacco and Big Pharma companies.
“Just like Big Tobacco and vaping companies have done in years past, Meta chose to maximize its profits at the expense of public health, specifically harming the health of the youngest among us,” Phil Weiser, Colorado’s attorney general, said in a statement.
Lawmakers around the globe have been trying to rein in platforms like Instagram and TikTok on behalf of children. Over the past few years, Britain, followed by states like California and Utah, passed laws to require social media platforms to boost privacy and safety protections for minors online. The Utah law, among other things, would require social media apps to turn off notifications by default for minors overnight to reduce interruptions to children’s sleep.
Regulators have also tried to hold social media companies accountable for possible harms to young people. Last year, a coroner in Britain ruled that Instagram had contributed to the death of a teenager who took her own life after seeing thousands of images of self-harm on the platform.
Laws to protect the safety of children online in the United States, however, have stalled in Congress as tech companies lobby against them.
“We’ve been warning about Meta’s manipulation and harming of young people from its start, and sadly it has taken years to hold it and other companies like Google accountable,” said Jeffrey Chester, the executive director of the Center for Digital Democracy, a consumer advocacy group. “Hopefully justice will be served, but this is why it’s so crucial to have regulations.”
How the Investigation Started
States began investigating Instagram’s potentially harmful effects on young people several years ago as public concerns over cyberbullying and teen mental health mounted.
In early 2021, Facebook announced that it was planning to develop “Instagram Kids,” a version of its popular app that would be aimed at users younger than 13. The news prompted a backlash among concerned lawmakers and children’s groups.
Soon after, a group of attorneys general from more than 40 states wrote a letter to Mark Zuckerberg, the company’s chief executive. In it, they said that Facebook had “historically failed to protect the welfare of children on its platforms” and urged the company to abandon its plans for Instagram Kids.
Concerns among the attorneys general intensified in September 2021 after Frances Haugen, a former Facebook employee, leaked company research indicating that the company knew its platforms posed mental health risks to young people. Facebook then announced it was pausing the development of Instagram Kids.
That November, a bipartisan group of attorneys general, including those of Colorado, Massachusetts and New Hampshire, announced a joint investigation into Instagram’s impact — and potential harmful effects — on young people.
Under local and state consumer protection laws, the attorneys general are seeking financial penalties from Meta. The District of Columbia and the states are also asking the court for injunctive relief to force the company to stop using certain tech features that the states contend have harmed young users.
What Happens Next
Meta is expected to fight to dismiss the case. Mr. Weiser, the Colorado attorney general, said in a news conference that he filed the lawsuit because he wasn’t able to reach a settlement with the company. He noted that Meta had filed a motion to dismiss a separate lawsuit brought by consumers, which makes similar allegations of harm to children and teenagers.
Separately, a group of attorneys general from more than 40 states is pursuing an investigation into user engagement practices at TikTok and their possible harmful effects on young people. That investigation, which was announced in 2022, is ongoing.
Cecilia Kang and Natasha Singer are reporters with The New York Times. Copyright 2023, The New York Times.