TikTok sued over deaths of two young girls in viral ‘blackout challenge’

Eight-year-old Lalani Erica Walton wanted to become “famous on TikTok.” Instead, she ended up dead.

It’s one of two such tragedies that sparked a pair of related wrongful-death lawsuits filed Friday in Los Angeles County Superior Court against the social media giant. TikTok’s app served Lalani and Arriani Jaileen Arroyo, 9, videos associated with a viral trend called the blackout challenge, in which participants try to choke themselves into unconsciousness, the cases allege; both girls died after attempting the challenge themselves.

TikTok, the hugely popular, algorithmically curated video app headquartered in Culver City, is a defective product, says the Social Media Victims Law Center, the law firm behind the lawsuits and a self-described “legal resource for parents of children harmed by social media.” TikTok served Lalani and Arriani videos of a dangerous trend, was designed to be addictive, and didn’t offer the girls or their parents adequate safety features, all in the name of maximizing ad revenue, the Law Center says.

TikTok did not immediately respond to a request for comment.

The girls’ deaths bear a striking resemblance to each other.

According to the Law Center’s complaint, Lalani, who lived in Texas, was an avid TikToker, posting videos of herself dancing and singing on the social network in hopes of going viral.

At some point in July 2021, the app’s algorithm began showing her videos of the self-strangulation challenge, the lawsuit continues. Midway through the month, Lalani told her family that bruises that had appeared on her neck were the result of a fall, the lawsuit says; shortly afterward, she spent about 20 hours in the car with her stepmother, watching what her stepmother would later learn were blackout challenge videos.

When they got home from their trip, Lalani’s stepmother told her they could go swimming later, then took a nap. But upon waking, the suit continues, her stepmother went to Lalani’s bedroom and found the girl “hanging from her bed with a rope around her neck.”

Police, who took Lalani’s phone and tablet, later told her stepmother that the girl had been watching blackout challenge videos “on repeat,” the lawsuit says.

Lalani “was convinced that if she posted a video of herself participating in the Blackout Challenge, she would become famous,” it says, but the young girl “did not appreciate or understand the dangerous nature of what TikTok encouraged her to do. …”

Arriani, of Milwaukee, also enjoyed posting song and dance videos on TikTok, the lawsuit says. She “gradually became obsessed” with the app, it adds.

On Feb. 26, 2021, Arriani’s father was working in the basement when her younger brother, Edwardo, came downstairs and said that Arriani wasn’t moving. The suit states that the two siblings had been playing together in Arriani’s bedroom, but when their father rushed upstairs to check on her, he found his daughter “hanging from the family dog’s leash.”

Arriani was rushed to the hospital and put on a ventilator, but it was too late – the girl lost all brain function, the suit says, and was eventually taken off life support.

TikTok’s product and algorithm directed “exceedingly and unacceptably dangerous challenges and videos” to Arriani’s feed, the lawsuit continues, encouraging her to “engage and participate in the TikTok Blackout Challenge.”

Lalani and Arriani are not the first children to die while attempting the blackout challenge.

Nylah Anderson, 10, accidentally hanged herself in her family’s home while trying to imitate the trend, her mother alleges in a lawsuit recently filed against TikTok in Pennsylvania.

A number of other children between the ages of 10 and 14 have reportedly died under similar circumstances while attempting the blackout challenge.

“TikTok undoubtedly knew that the Blackout Challenge was spreading through its app and that its algorithm was specifically feeding the Blackout Challenge to children,” the Social Media Victims Law Center said in its complaint, adding that the company “knew or should have known that failing to take immediate and significant action to contain the spread of the deadly Blackout Challenge would result in more injuries and deaths, especially among children.”

TikTok has previously denied that the blackout challenge is a TikTok trend, pointing to pre-TikTok child deaths from a “choking game” and telling the Washington Post that the company has blocked #BlackoutChallenge from its search engine.

Viral challenges, usually built around a hashtag that makes it easy to find every entry in one place, are a big part of TikTok’s user culture. Most are harmless, often encouraging users to sing along to a particular song or mimic a dance move.

But some are riskier. Injuries have been reported from attempts to recreate stunts known as the fire challenge, the milk crate challenge, the Benadryl challenge, the skull breaker challenge and the dry scoop challenge, among others.

Nor is this just a TikTok issue. YouTube has been home to trends such as the Tide pod challenge and the cinnamon challenge, both of which experts warned could be dangerous. In 2014, the internet urban legend known as Slenderman famously led two 12-year-old girls to stab a friend 19 times.

While social media platforms have long been accused of hosting socially harmful content, including hate speech, defamation and misinformation, a federal law called Section 230 makes it hard to sue the platforms themselves. Under Section 230, apps and websites enjoy wide latitude to host user-generated content and moderate it as they see fit, without having to worry about being sued over it.

The Law Center’s complaints attempt to sidestep that firewall by framing the blackout challenge deaths as failures of product design rather than of content moderation. The theory is that TikTok is culpable for developing the algorithmically curated social media product that steered Lalani and Arriani toward a dangerous trend; its failures, in other words, were those of a manufacturer rather than a publisher.

“An unreasonably dangerous social media product … that is designed to create addiction in young children, and does so, that sets them on a path of harm, is not immunized third-party content but rather volitional conduct on the part of the social media companies,” said Matthew Bergman, the attorney who founded the firm.

Or, as the complaint puts it: Plaintiffs “are not alleging that TikTok is responsible for what third parties said or did, but for what TikTok did or didn’t do.”

In large part, the lawsuits do this by taking aim at TikTok’s algorithm as addictive, with a slot machine-like interface that feeds users an endless, tailor-made stream of videos in hopes of keeping them online ever longer. “TikTok designed, manufactured, marketed and sold a social media product that was unreasonably dangerous because it was designed to be addictive to minor users,” the complaint says, adding that the videos served to users included “harmful and exploitative” ones. “TikTok had a duty to monitor and evaluate the performance of its algorithm and ensure that it was not directing vulnerable children to dangerous and deadly videos.”

Leaked documents indicate that the company treats both user retention and the amount of time users spend in the app as key success metrics.

This is a business model that many other free web platforms use – the more time users spend on the platform, the more ads the platform can sell – but which is increasingly being criticized, especially when children and their still-developing brains are involved.

A pair of bills now making their way through the California Legislature seek to reshape how social media platforms engage younger users. One, the Social Media Platform Duty to Children Act, would empower parents to sue web platforms that addict their children; the other, the California Age-Appropriate Design Code Act, would require web platforms to offer children substantial privacy and safety protections.

Bergman has spent much of his career representing victims of mesothelioma, many of whom became ill from asbestos exposure. The social media sector, he says, “makes the asbestos industry look like a bunch of choirboys.”

But as bad as things are, he added, cases like this against TikTok also offer some hope for the future.

According to him, mesothelioma work “has always been about compensation for past wrongs.” But lawsuits against social media companies offer “the opportunity to prevent people from becoming victims; to actually implement change; to save lives.”