This case against TikTok might spur the Section 230 reform we desperately need
Section 230 of the Communications Decency Act is a crucial law that allows the Internet to function as it does today. Without it, your favorite website would either cease to exist or change in ways that make it unrecognizable. We need these protections because, without them, sites would be forced to moderate so aggressively that there would be little room left to express ourselves online whenever we disagreed with whoever is tasked with moderating the content.
But it’s also a very broad law that needs to be reformed. When it was written in 1996, nobody could predict the power that a few tech firms would wield or how much influence social media sites would have on us all. As situations change, the laws governing them must do the same.
The Third Circuit US Court of Appeals recently ruled that ByteDance, the parent company of TikTok, can be held responsible for distributing harmful content even though Section 230 shields it as the content's publisher. The case is a tragic one: a 10-year-old girl tried the "blackout challenge" she saw in a TikTok video and died of asphyxia as a result.
The child's mother sued for negligence and wrongful death, and the case worked its way through the courts to the Third Circuit. The next stop is the Supreme Court. While the case is a terrible one, the Third Circuit's ruling may be exactly what's needed to revamp Section 230 so that it holds Big Tech accountable while still shielding it where protection is warranted.
Android Central has reached out to TikTok for a statement and will update this article when we receive one.
There's a difference between a publisher and a distributor. If I write a post on X or make a video on TikTok encouraging illegal activity, X or TikTok is only publishing it. Once their algorithm picks it up and pushes it to others, they are distributing it.
You really can't have one without the other, but the Third Circuit has decided that Section 230, which states "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider," does not protect the publisher from the consequences of distributing that content.
I don't agree with the Third Circuit's reasoning here, simply because the content is distributed as a result of it being published. Then again, I have no say in the matter because I'm just some dude, not a circuit court judge. The ruling does highlight that social media giants need some incentive to better police their content, or the law needs to be changed.
No, I'm not calling for censorship. We should be able to say or do any dumb thing we want as long as we are willing to deal with the consequences. But the Metas and ByteDances of the world don't have to like what we say or do, and they can yank it down any time they like as a consequence.
Without Section 230, they would do it a lot more often, and that's not the right solution.
I have no idea how to fix things, but I don't need to know how to fix them to know that they are broken. People collecting much larger salaries than mine are responsible for that.
I know a 10-year-old child should not be enticed to asphyxiate herself because TikTok told her it was cool. I know nobody working for ByteDance wanted her to do it. I also know that no amount of parental control could prevent this from happening 100% of the time.
We need legislation like Section 230 to exist because there is no way to prevent terrible content from slipping through even the most draconian moderation. But it needs to be looked at again, and lawmakers need to figure it out. Now could be the right time to do it.