TikTok Tragedy: Mother Appeals to US Court for Renewed Lawsuit Following Child’s Death


The 3rd US Circuit Court of Appeals is assessing whether TikTok can be held liable for the death of a 10-year-old girl who attempted the lethal “blackout challenge,” making this a noteworthy legal case.

The debate revolves around the applicability of Section 230 of the Communications Decency Act, a law intended to shield internet companies from lawsuits over user-generated content.

The heart of the matter lies in determining whether Section 230 covers platforms like TikTok, which employ complex algorithms for content recommendations. 

During oral arguments, the three-judge panel acknowledged the technological evolution since the enactment of Section 230 in 1996. Judge Paul Matey noted, “I think we can all probably agree that this technology didn’t exist in the mid-1990s, or didn’t exist as widely deployed as it is now.”

Tawainna Anderson filed a lawsuit against TikTok and its parent company, ByteDance, after her daughter died in 2021 while attempting the blackout challenge.

Arguing that Section 230 should not shield TikTok, Anderson’s lawyer, Jeffrey Goodman, emphasized the platform’s allegedly careless algorithmic recommendation of the challenge to the young user.

TikTok Accused of Exposing Children to Perilous Challenges


While acknowledging Section 230’s protections, Goodman maintained that TikTok repeatedly exposed an impressionable 10-year-old to perilous challenges, leading her to believe the blackout challenge was cool and fun.

In defense, TikTok’s lawyer, Andrew Pincus, urged the panel to uphold a lower court judge’s ruling that Section 230 barred Anderson’s case. Pincus warned that a ruling against TikTok could undermine Section 230’s protections and open the door to algorithm-related product defect claims against a wide range of platforms.

The legal dispute comes amid heightened global scrutiny of social media platforms, including TikTok, over their responsibility to protect children from harmful content.

State attorneys general are investigating TikTok over potential harm to young users, and social media giants like Meta Platforms are facing lawsuits alleging harm to children’s mental health due to platform-induced addiction.

As the court deliberates, the case’s outcome could establish a precedent for the liability of platforms in recommending and disseminating potentially dangerous content.



