SEATTLE - Seattle Public Schools filed a novel lawsuit against the tech giants behind TikTok, Instagram, Facebook, YouTube and Snapchat, seeking to hold them accountable for the mental health crisis among youth.
The school district filed the lawsuit Friday in U.S. District Court. The 91-page complaint says the social media companies have created a public nuisance by targeting their products to children.
It blames them for worsening mental health and behavioral disorders including anxiety, depression, disordered eating and cyberbullying; making it more difficult to educate students; and forcing schools to take steps such as hiring additional mental health professionals, developing lesson plans about the effects of social media, and providing additional training to teachers.
"Defendants have successfully exploited the vulnerable brains of youth, hooking tens of millions of students across the country into positive feedback loops of excessive use and abuse of Defendants’ social media platforms," the complaint said. "Worse, the content Defendants curate and direct to youth is too often harmful and exploitive ...."
While federal law — Section 230 of the Communications Decency Act — helps protect online companies from liability arising from what third-party users post on their platforms, the lawsuit argues that provision does not protect the tech giants’ behavior in this case.
"Plaintiff is not alleging Defendants are liable for what third-parties have said on Defendants’ platforms but, rather, for Defendants’ own conduct," the lawsuit said. "Defendants affirmatively recommend and promote harmful content to youth, such as pro-anorexia and eating disorder content."
In emailed statements Sunday, Google and Snap said they had worked to protect young people who use their platforms.
Snap launched Here For You, an in-app support system, in 2020 to help users who might be experiencing a mental health or emotional crisis find expert resources. It has also enabled settings that allow parents to see whom their children contact on Snapchat, though not the content of those messages, and it recently expanded in-app content about the new 988 suicide and crisis phone system in the U.S.
The company sent the following written statement:
"While we can’t comment on the specifics of active litigation, nothing is more important to us than the wellbeing of our community. Snapchat was designed to help people communicate with their real friends, without some of the public pressure and social comparison features of traditional social media platforms, and intentionally makes it hard for strangers to contact young people. We also work closely with many mental health organizations to provide in-app tools and resources for Snapchatters as part of our ongoing work to keep our community safe. We will continue working to make sure our platform is safe and to give Snapchatters dealing with mental health issues resources to help them deal with the challenges facing young people today."
José Castañeda, a spokesperson for Google, which owns YouTube, said the company has given parents tools to manage their children's use of its products.
"We have invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their well-being. For example, through Family Link, we provide parents with the ability to set reminders, limit screen time and block specific types of content on supervised devices," Castañeda said.
Meta issued the following statement:
"We want teens to be safe online. We’ve developed more than 30 tools to support teens and families, including supervision tools that let parents limit the amount of time their teens spend on Instagram, and age verification technology that helps teens have age-appropriate experiences. We automatically set teens’ accounts to private when they join Instagram, and we send notifications encouraging them to take regular breaks. We don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us. We’ll continue to work closely with experts, policymakers and parents on these important issues."
TikTok did not respond to requests for comment.
The lawsuit says that from 2009 to 2019, there was an average 30% increase in the number of Seattle Public Schools students who reported feeling "so sad or hopeless almost every day for two weeks or more in a row" that they stopped doing some typical activities.
The school district is asking the court to order the companies to stop creating the public nuisance, to award damages, and to pay for prevention education and treatment for excessive and problematic use of social media.
While hundreds of families are pursuing lawsuits against the companies over harms they allege their children have suffered from social media, it's not clear whether any other school districts have filed a complaint like Seattle's.
Internal studies revealed by Facebook whistleblower Frances Haugen in 2021 showed that the company knew that Instagram negatively affected teenagers by harming their body image and making eating disorders and thoughts of suicide worse. She alleged that the platform prioritized profits over safety and hid its own research from investors and the public.