A U.S. appeals court has revived a lawsuit against TikTok following the death of a 10-year-old girl. The suit, brought by the girl's mother, claims that TikTok’s algorithm recommended a dangerous viral “blackout challenge” to her daughter, Nylah Anderson. The challenge, which dared users to choke themselves until they passed out, led to Nylah’s death in 2021.
Typically, a federal law known as Section 230 of the Communications Decency Act shields internet companies from lawsuits over user-generated content. However, the 3rd U.S. Circuit Court of Appeals in Philadelphia ruled on Tuesday that the law does not protect TikTok in this case. The court found that TikTok’s algorithm, which recommended the challenge, represents the company’s own speech, not just content from third-party users.
U.S. Circuit Judge Patty Shwartz, writing for the three-judge panel, acknowledged that the ruling departs from previous decisions. She reasoned that the Supreme Court’s recent treatment of algorithms as tools for curating content implies that a platform’s recommendations reflect the company’s own editorial choices. Under that interpretation, TikTok can be held liable for what its algorithm recommended.
The ruling overturns a lower court’s dismissal of the case, allowing Tawainna Anderson’s lawsuit against TikTok and its parent company, ByteDance, to proceed. Jeffrey Goodman, Anderson’s lawyer, said the ruling strips “Big Tech” of its longstanding legal protection. TikTok has not commented on the decision. The case now returns to the lower court for further proceedings.