On February 26, 2024, the Supreme Court heard oral arguments in NetChoice v. Paxton and Moody v. NetChoice (hereinafter the “NetChoice Cases”)—cases that turn on whether, and to what extent, states may monitor, regulate, or police the content moderation policies of large social media companies. The NetChoice Cases have huge implications not only for First Amendment law, but also for tort law and the Internet itself. Companies from every corner of the Internet ecosystem are watching closely, and more than 60 entities—including Reddit, Yelp, and Discord—filed amicus (“friend of the court”) briefs. To quote Justice Sotomayor during the Moody v. NetChoice oral argument: “The one thing I know about the Internet is that its variety is infinite.” I discuss three high-level takeaways from the oral arguments here.


For those who have been preoccupied, here is some background. In the wake of recent political turmoil, lawmakers in Texas and Florida passed laws that impose various requirements on social media companies such as Facebook or YouTube, essentially requiring that they host speech the platforms may disagree with. See, e.g., Fla. Stat. § 106.072(2), enacted by S.B. 7072 (“A social media platform may not willfully deplatform a candidate for office who is known by the social media platform to be a candidate, beginning on the date of qualification and ending on the date of the election or the date the candidate ceases to be a candidate.”). These laws also broadly require social media companies to disclose their content moderation policies with particularity; provide written justifications for content moderation actions such as bans, shadow-bans, or de-prioritization; allow users to appeal such actions; and comply with other such rules.

While the discourse around these laws appears to be directed largely toward platforms such as X, Facebook, and YouTube, the text of the statutes is broad, and many Internet-based services may be implicated, such as Etsy, Gmail, and online video game platforms. Further, as discussed in the numerous amicus briefs, compliance with the statutes is nigh impossible for some platforms, such as Reddit, which relies on community moderation.

NetChoice, representing a number of social media companies, argued in federal court that these laws were unconstitutional as written (a so-called “facial challenge”). The 5th Circuit Court of Appeals ruled against NetChoice on the Texas law, while the 11th Circuit Court of Appeals ruled for NetChoice on the Florida law. The cases are now before the Supreme Court.

Takeaway 1: The justices question whether existing First Amendment case law applies.

Early in the arguments, the justices pointed out the difficulty in relying on the existing body of First Amendment jurisprudence. Justice Sotomayor said, speaking about Etsy, “I’m going to try to analogize it to a physical space, which I think in this area, is a little crazy.” Chief Justice Roberts, speaking about Rumsfeld v. Forum for Academic and Institutional Rights, Inc.—a landmark First Amendment opinion he authored—said, “I don’t think [that case] has much to do with the issues today at all.” Justice Alito noted that these social media platforms were “worlds away” from newspapers and telegraph companies. Justice Barrett noted that while she was not entirely closed off to the idea that services like Gmail are similar to common carriers, she also did not find the analogy convincing because “each of these platforms has different functionalities within it.” Not only did the justices appear to have a strong grasp of the complexities of Internet companies, they also appeared unwilling to analogize the issues in the NetChoice Cases to those of First Amendment cases involving non-Internet businesses. However the justices rule, it appears they will do so with awareness of how unique 21st-century Internet businesses are.

Takeaway 2: Some of the justices are unhappy that the NetChoice Cases are facial challenges.

Justice Thomas opened the Moody v. NetChoice argument with his concern that the case had been brought as a facial challenge. Justice Thomas later said, “With these facial challenges, I always have a problem that we don’t—we’re not talking about anything specific. In an as-applied challenge, at least we know what’s in front of us and what your interpretation or at least the state’s interpretation of its law is in that case. Now we’re just speculating as to what the law means.” Justice Jackson said, “I guess the hard part for me is really trying to understand how we apply this analysis at the broad level of generality that I think both sides seem to be taking here.”

The justices appeared frustrated by two sources of breadth in the laws: first, that they could be applied to Internet services such as Gmail, which do not host public-facing speech; and second, the uncertainty over what “content moderation” even means, as the types of actions social media platforms can take with respect to users are as varied as the imagination allows.

Takeaway 3: The justices noted inconsistencies in how Internet companies describe their practices under CDA Section 230—as opposed to the First Amendment.

Justice Alito addressed what he perceived as inconsistent positions that Internet companies have taken regarding whether content moderation is “editorial discretion” depending on whether they are arguing under CDA Section 230 or the First Amendment: “[Content moderation is] your message when you want to escape state regulation, but it’s not your message when you want to escape liability under state tort law.” Justice Gorsuch added, “So it’s speech for the purposes of the First Amendment, your speech, your editorial control, but when we get to Section 230, your submission is that that isn’t your speech?” While CDA Section 230 is not at issue in this case, how the Court rules may implicate future Section 230 issues.


Throughout oral argument, the Court demonstrated a very clear grasp of just how distinctive Internet-based businesses are. Many issues, including the safe harbor offered by CDA Section 230, have surfaced together, and whatever the outcome, the NetChoice Cases have the potential to heavily disrupt the way almost all large Internet platforms are run. Compliance with these content moderation laws can be burdensome and expensive, if not outright impossible. Moreover, if the platforms successfully argue that the First Amendment defeats the Florida and Texas laws, then the CDA Section 230 safe harbor may also be vulnerable to constitutional challenge. We will continue to monitor the NetChoice Cases and report.