TikTok faces lawsuit over Chester Co. girl’s death in ‘blackout challenge’: Court – NBC10 Philadelphia
A U.S. appeals court on Tuesday reinstated the lawsuit filed by the mother of a 10-year-old Pennsylvania girl who died while attempting a viral challenge she allegedly saw on TikTok that asked people to strangle themselves until they lost consciousness.

While federal law generally protects online publishers from liability for content posted by others, the court said TikTok could potentially be held liable for promoting the content or using an algorithm to target it to children.

“TikTok makes decisions about the content recommended and promoted to particular users, thereby engaging in its own first-hand expression,” Judge Patty Shwartz of the 3rd U.S. Circuit Court of Appeals in Philadelphia wrote in the opinion released Tuesday.
Lawyers for TikTok’s parent company ByteDance did not immediately respond to phone and email messages seeking comment.

Lawyers for the mother, Tawainna Anderson, had argued that the so-called “blackout challenge,” which was popular in 2021, appeared on Nylah Anderson’s “For You” feed after TikTok determined that she might watch the challenge – even though other children had died trying.

Nylah Anderson’s mother found her unconscious in the closet of her home in Chester, near Philadelphia, and tried to resuscitate her. The girl, described by her family as a fun-loving “butterfly,” died five days later.

“I can’t stop replaying that day over and over in my head,” her mother said at a press conference in 2022 when she filed the lawsuit. “It’s time for these dangerous challenges to end so other families don’t have to experience the heartache we experience every day.”

A district judge initially dismissed the lawsuit, citing Section 230 of the Communications Decency Act of 1996, which is often used to protect Internet companies from liability for content they publish on their sites.

The three-judge appeals court partially overturned that decision on Tuesday and sent the case back to the lower court for a hearing.

“Nylah, still in her first year of adolescence, probably had no idea what she was doing or that following the images on her screen would kill her. But TikTok knew Nylah would watch because the company’s customized algorithm placed the videos on her ‘For You Page,’” Judge Paul Matey wrote in a partial concurrence to the opinion.

Jeffrey Goodman, a lawyer for the family, said it was “inevitable” that courts would take a closer look at Section 230 as technology intrudes into every aspect of our lives. He said the family hopes the ruling will help protect others, even if it doesn’t bring Nylah Anderson back.

“Today’s opinion is the clearest statement yet that Section 230 does not provide the comprehensive protections that the social media companies have claimed,” Goodman said.