Did Section 230 Meet Its Match?

A new ruling from a federal appeals court finds that Section 230 protections may not apply to algorithms like the one TikTok uses. That could (potentially) be a big problem for the internet.

By Ernie Smith

If you’ve never used the internet before, I have a piece of useful advice. The secret to navigating any “challenge” a social media platform recommends—whether it involves eating a spoonful of cinnamon, walking across a stack of milk crates, or blacking out? Immediately disengage, because bad things will happen if you don’t.

But I don’t blame anyone, especially teenagers, for not knowing this.

The issue is, of course, that people don’t know these things and end up getting injured or even killed. And it’s the consequences of one of these challenges that have led to perhaps the most notable ruling on a fundamental rule of the internet era in years.

A recent appeals-court ruling in Anderson v. TikTok, Inc. could put the brakes on Section 230 of the Communications Decency Act as we know it. The case, which pits the mother of a young woman who died attempting a “blackout challenge” against the most addictive social network on the internet, puts a fundamental law of the internet at risk. And in some corners of the internet, people seem ready for this tree trunk of internet law to fall over.


As Matt Stoller wrote in his newsletter BIG:

This case is going to be catalytic. If/when this goes to the Supreme Court, there are going to be a gazillion amicus briefs, and endless stories on how This Is the Case That Could Destroy/Save the Internet. And now plaintiff lawyers will think about the litigation they can bring.


TikTok’s famously aggressive algorithms may have unwittingly taken a slice out of Section 230. (Nik/Unsplash)

Let’s take a step back here. The ruling in question, from the U.S. Court of Appeals for the Third Circuit, stands out because it marks a dramatic evolution in the case law that shapes how the internet works. The ruling, building on the Supreme Court’s Moody v. NetChoice decision from earlier this year, hits a gap that NetChoice did not touch: the use of algorithms, in which no human hand has actually helped present the content. The key line of this whole thing, from U.S. Circuit Judge Patty Shwartz, an Obama appointee:

Given the Supreme Court’s observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms … it follows that doing so amounts to first-party speech under § 230, too.

Did you catch that? Essentially, she argues that the algorithms a platform uses to curate and present content are the platform’s own first-party speech, which would leave TikTok and other networks on the hook for what those algorithms surface. U.S. Circuit Judge Paul Matey, a Trump appointee, goes further in a lengthy concurrence that walks through the history of Section 230, suggesting that TikTok’s role as a distributor has limits under the law:

What does all this mean for Anderson’s claims? Well, § 230(c)(1)’s preemption of traditional publisher liability precludes Anderson from holding TikTok liable for the Blackout Challenge videos’ mere presence on TikTok’s platform. A conclusion Anderson’s counsel all but concedes. But § 230(c)(1) does not preempt distributor liability, so Anderson’s claims seeking to hold TikTok liable for continuing to host the Blackout Challenge videos knowing they were causing the death of children can proceed.

Mike Masnick, the guy I rely upon for all my good Section 230 takes, obviously thinks this ruling is stupid, and has said as much on his site:

It’s a bastardization of an already wrong argument put forth by MAGA fools that Section 230 conflicts with the argument in Moody. The argument, as hinted at by Justices Thomas and Gorsuch, is that because NetChoice argues (correctly) that its editorial decision-making is protected by the First Amendment, it’s somehow in conflict with the idea that they have no legal liability for third-party speech.

But that’s only in conflict if you can’t read and/or don’t understand the First Amendment and Section 230 and how they interact. The First Amendment still protects any editorial actions taken by a platform. All Section 230 does is say that it can’t face liability for third party speech, even if it engaged in publishing that speech. The two things are in perfect harmony. Except to these judges in the Third Circuit.

As Masnick notes, Section 230 literally exists because lawmakers wanted to shield tech companies from liability for what people did on their platforms, reasoning that such liability would slow the growth of the internet. Shwartz’s majority opinion effectively treats algorithms as an extension of the provider’s own hand. Matey, meanwhile, seems to want Section 230 to cover only the hosting of uploaded content, leaving everything on the distribution side as a potential liability.

If Shwartz’s POV wins the day, our platforms are about to rely on far fewer algorithms. If Matey’s? Well, it may be tough to even run a forum on the modern internet.

I’m obviously with Masnick on this, because I know that Section 230 was created as a repudiation of Stratton Oakmont, Inc. v. Prodigy Services Co., a 1995 ruling that effectively hinged on a reading like Matey’s.

But being with someone on something means nothing if there’s case law to be decided here. Section 230 is a law that has been road-tested for decades, but it appears someone finally found a potential way to pierce it. Let’s hope they don’t hit the heart of the internet in the process.

Legalese-Free Links

After 27 years in business, and as one of the few GeoCities-forged sites to make it to the modern day, AnandTech publishes its final CPU review. It had a good run, but it’s sad to see it go.

A few months ago, I appeared briefly on an episode of Search Engine (one of the “board meeting” episodes on the paid subscriber feed) to nudge P.J. Vogt about the fediverse, after he gave a resigned response to it on a recent episode. The challenge is that the fediverse seems like a very difficult thing to explain in an audio format, which is why I wanna give the WVFRM Podcast’s David Imel a huge shout-out for actually pulling it off.

Apparently, it is possible to avoid getting pulled into Scientology by being annoying. It worked for Stamos.

--

Find this one an interesting read? Share it with a pal!

We’ll be back with something during this long Labor Day weekend. If you like seeing us in your inbox, be sure to give us a show of support on Ko-Fi. We’d deeply appreciate it.

 

Ernie Smith

Your time was just wasted by Ernie Smith

Ernie Smith is the editor of Tedium, and an active internet snarker. Between his many internet side projects, he finds time to hang out with his wife Cat, who's funnier than he is.

Find me on: Website Twitter