In a potentially groundbreaking case for the entire internet industry, TikTok may be held liable in the U.S. for the death of a 10-year-old user. A U.S. appeals court allowed the lawsuit to proceed, reinterpreting a crucial law that has long shielded internet platforms from liability.
The case centers on Section 230 of the Communications Decency Act of 1996, which typically shields internet companies from liability for content uploaded by users. However, Judge Patty Shwartz ruled that this protection does not extend to algorithms that recommend specific content. This marks a departure from previous legal interpretations and follows a July 2024 U.S. Supreme Court ruling on content moderation.
The court argued that a platform's recommendation algorithm reflects "editorial decisions" about the content it promotes, making these choices a form of the company’s own expression, which is not covered under Section 230. "TikTok makes decisions about the content recommended to certain users, thereby exercising its own free speech," the ruling stated.
The lawsuit stems from the 2021 death of a 10-year-old girl who took part in the "Blackout Challenge," a viral trend in which users choke themselves until they pass out. The girl strangled herself with a handbag strap. TikTok has not yet commented on the case.
Jeffrey Goodman, the attorney for the girl's mother, stated, "Big tech companies have just lost their 'Get Out of Jail Free' card."