According to the Child Mind Institute, teens and young adults who spend the most time on social media platforms like TikTok or Instagram are anywhere from 13% to 66% more likely to exhibit symptoms of depression.
“It's hard to know what types of information or even what types of design features cause anxiety or depression or even suicide in some cases,” said Jason Kelley, director of activism at the Electronic Frontier Foundation. The nonprofit is a San Francisco-based group founded by information technology insiders that seeks to educate the government on tech matters, fight what they consider bad tech legislation in court, and generally promote internet civil liberties.
Kelley’s group is speaking out in response to a bipartisan group of more than 60 U.S. senators, including Chris Murphy and Richard Blumenthal of Connecticut, who support the “Kids Online Safety Act,” or KOSA. The measure is designed to make the companies behind popular online platforms more liable for design and content features thought to harm users under the age of 17.
While the Senate seems to have the votes to pass KOSA, critics, including Kelley of the Electronic Frontier Foundation, hope the lawmakers will reconsider.
“What we worry about is that the law will require platforms to sort of go out of their way to limit certain types of content just for users who are under 17,” Kelley said. “Or really, any user who can't prove or doesn't want to prove that they're over 17.”
Kelley said it raises free speech concerns.
“People who can enforce it will be able to chill speech that otherwise is entirely legal, and speech that often young people benefit from considerably,” he said.
Central to the KOSA bill is the “Duty of Care” provision. U.S. Sen. Blumenthal’s website defines “Duty of Care” as a provision that “requires social media companies to prevent and mitigate certain harms that they know their platforms and products are causing to young users as a result of their own design choices, such as their recommendation algorithms and addictive product features. The specific covered harms include suicide, eating disorders, substance use disorders, and sexual exploitation.”
Kelley said the vague wording of the provision could lead politically minded lawmakers to pressure litigation-averse companies to broadly eliminate access to content on topics like LGBTQ+ issues and Black history.
“Someone in a particular state might say, you know, ‘I'm worried that LGBTQ content is going to lead to depression in young people,’” Kelley said. “‘So I'm going to sue this platform, so that they can't recommend LGBTQ content to young people.’ That means everybody in this country now can't see that if they're under a certain age, or at least that wouldn't be recommended to them.”
Blumenthal’s office said the latest version of the KOSA bill has shifted the focus of the duty of care provision away from moderating content and toward adjusting product design features, like notifications and recommendation algorithms. The legislation also now designates the Federal Trade Commission as the only entity authorized to enforce the duty of care; previous iterations had extended that enforcement power to state attorneys general as well.
“We're glad that the Attorney General enforcement has been limited in the most recent version of the bill,” Kelley said. “We're still worried about what they're able to do because they can still enforce other portions of the bill. And we think they can get around some of those limitations like that. We're also of course worried that the FTC [Federal Trade Commission], maybe when it's appointed by someone like President Trump, could be a pretty dangerous, politically-motivated actor.”
Kelley said he thinks there are two better ways for the federal government to make social media platforms safer for young people. The first way is to limit the ability of social media platforms to collect user data. The second way is to use antitrust laws to foster more competition in the industry.
“They [parents] have come to Washington to ask legislators to do something, and that's because they can't ask the platforms to do anything, because they don't have any power,” Kelley said. “With competition, they would have that power.”
Kelley gave an example.
“If there was a competitor to TikTok that had different choices, or better features, maybe those features involve the kinds of content that young people see, maybe the kinds of content that they can see,” he said. “Maybe it involves better parental settings.”
One thing Kelley said the KOSA bill does not directly address is the dissatisfaction young people can feel from comparing their lives to others’ through frequent social media use.
“There's no easy way to legislate that kind of regular exposure,” Kelley said, “because one person saying that they're in Aspen isn't something that really anyone I think would agree should be removed from people's ability to view.”