New California bill would let parents sue social media companies for addicting their kids

California parents whose children have become addicted to social media apps would be able to sue for damages under a bill introduced in the state Assembly on Tuesday by a bipartisan pair of lawmakers.

Assembly Bill 2408, or the Social Media Platform Duty to Children Act, was introduced by Republican Jordan Cunningham of Paso Robles and Democrat Buffy Wicks of Oakland, with support from the Children’s Advocacy Institute at the University of San Diego School of Law. It is the latest in a string of legislative and political efforts to combat social media’s exploitation of its youngest users.

“Some of these companies are really intentionally designing features into apps that they know kids are using, features that get kids to use them more and more [and] show signs of addiction,” Cunningham said in an interview. “So the question for me becomes… who should pay the social costs of this? Should it be borne by schools, parents and children, or should it be borne in part by the companies that profited from making these products?

“We do this with any product you sell to kids. You have to make sure it’s safe. Any kind of plush animal or whatever that you sell to parents who are going to put it in their 5-year-old’s crib – it shouldn’t have toxic chemicals in it… We just haven’t done that as a society when it comes to social media. And I think now is the time to do it.”

A press release from the Children’s Advocacy Institute explains that the bill would first obligate social media companies not to addict child users, modifying their design features and data-collection practices if necessary, and then give parents and guardians the right to sue on behalf of any affected children if companies fail to comply.

Damages could include $1,000 or more per child in a class action lawsuit or up to $25,000 per child per year in civil penalties, the institute said.

However, the institute said the bill would also include a safe harbor provision protecting “responsible” social media platforms from penalties if they take “basic steps to avoid addicting children.” Companies with less than $100 million in annual revenue would also be exempt.

“I suspect you’ll see a number of potential [compliance] solutions,” Cunningham said. “Maybe some companies will stop letting kids sign up; that’s probably the safest route. But I don’t know that they’re going to do that. Or they could turn off whatever features of their algorithms are addictive, especially for teenagers. It could vary.”

Calls for regulation of social media companies have grown louder in recent years, fueled by mounting backlash against companies such as Twitter, TikTok and Meta (formerly Facebook). Critics have focused on issues including the companies’ collection of user data, their role in shaping public discourse, and their largely unilateral decisions about how to moderate, and not moderate, user-generated content.

But the platforms’ impact on children has been a particularly salient issue, and a uniquely unifying one. It came to a head late last year when whistleblower and former Facebook employee Frances Haugen leaked documents indicating that the company was aware of the extent to which its subsidiary Instagram could harm the mental health of young users, especially teenage girls struggling with body image issues.

After the Haugen leaks and her subsequent testimony before Congress, broad bipartisan criticism of Big Tech coalesced around social media’s impact on underage users.

This month, California Atty. Gen. Rob Bonta helped launch a multistate investigation into how TikTok may prey on children. A few months earlier, Bonta helped launch a similar investigation into Instagram, also focused on its younger users.

In November, the Ohio attorney general sued Meta for allegedly misleading investors about the effects its products can have on children, artificially inflating its stock price in violation of federal securities laws.

And in January, a Connecticut mother sued Meta and Snapchat developer Snap over “defective design, negligence and unreasonably dangerous features” after her daughter died by suicide last summer.

Court documents, reported by the Washington Post, allege that Meta and Snap are responsible for the “growing mental health crisis of children and adolescents in the United States” and, in particular, for the wrongful death of 11-year-old Selena Rodriguez, caused by Selena’s addiction to the platforms.

Efforts to launch an Instagram Kids spinoff app were put on hold after the Haugen leaks. A similar product YouTube launched in 2015, YouTube Kids, has proved more durable, with human curation replacing the main platform’s algorithmic content recommendations.

The topic of protecting children from the harms of social media even surfaced in President Biden’s recent State of the Union address.

“We must hold social media platforms accountable for the national experiment they’re conducting on our children for profit,” the president said.

Cunningham called the Haugen leaks a “catalyst” for the new bill, but not its only motivator.

“That’s something that’s been on my mind – and on the mind of my co-author Buffy Wicks – for a number of years,” he said. “We’re approaching this from the perspective of legislators who are also parents. I have four kids: three teenagers and a first-grader. And I have a lot of friends who, over the past couple of years, have confided in me that their kids, because of their use of TikTok or Instagram or both, developed mental health problems: depression, body image issues and, in some cases, even anorexia.”

Representatives for Twitter and Reddit declined to comment on the bill. A TikTok spokesperson said the company hadn’t yet had a chance to review the bill in depth, but added that it already offers tools that make it easier for underage users to manage screen time and that turn off their push notifications at night.

A Meta spokesperson did not say whether or how the company would change its apps’ policies, features or algorithms if the bill becomes law, instead pointing to the company’s past denials of Haugen’s characterization of Instagram’s influence on teens’ mental health. The spokesperson noted that Meta on Wednesday launched a new resource hub to help parents access social media supervision tools, along with other safety features already in place.

Thanks to the protections of Section 230 of the federal Communications Decency Act, internet platforms enjoy broad legal immunity for the content their users post. Some lawyers have called the law a “brick wall” blocking any meaningful lawsuits against the tech giants.

The Cunningham-Wicks bill attempts to get around that wall by targeting platforms’ algorithms rather than any specific content.

According to the Children’s Advocacy Institute, the bill will be taken up by the Assembly’s Judiciary Committee this spring. Cunningham said he hopes to get it to Gov. Gavin Newsom’s desk by September.