By: Anthony Aceto
On September 29, 2023, the Supreme Court granted certiorari in the companion cases of NetChoice, LLC v. Paxton and Moody v. NetChoice, LLC. The cases mark a seminal development in the application of the First Amendment to social media platforms, presenting the question of whether a platform’s content-moderation policies constitute speech protected by the First Amendment. The decision will have serious implications for the future of online discourse.
These cases concern state laws enacted in 2021 by Texas and Florida to regulate platforms like Facebook, YouTube, and X (formerly known as Twitter). The details of the two laws differ, but each law includes: (1) content-moderation provisions restricting platforms’ choices about how to present user-generated content to the public; (2) individualized-explanation provisions requiring platforms to explain content-moderation decisions to affected users; and (3) general-disclosure provisions requiring platforms to disclose information about their content-moderation practices.
Texas passed H.B. 20 after concluding that the large platforms’ selective refusals to deal with disfavored consumers rose to a level that implicated the State’s interest in protecting the free exchange of ideas and information. The Attorney General of Texas argues that H.B. 20 is designed to ensure that the platforms provide undifferentiated service to the public without discriminating based on viewpoint and that they disclose their content-moderation practices.
On September 16, 2022, reviewing de novo, a divided Fifth Circuit reversed the district court and held that the platforms’ facial First Amendment challenge failed. The majority concluded that the platforms were not entitled to pre-enforcement relief because the bill “does not chill speech; if anything, it chills censorship.”
Florida enacted S.B. 7072 in May 2021. S.B. 7072’s legislative findings state that platforms “have unfairly censored, shadow banned, deplatformed, and applied post-prioritization algorithms to Floridians.” S.B. 7072 responds to that perceived unfairness by imposing the three types of requirements described above. The major difference between the Florida and Texas bills is that Florida’s provides that a platform “may not willfully deplatform a candidate” for public office or use “post-prioritization or shadow banning algorithms for content and material posted by or about” a candidate. More broadly, and consistent with the Texas bill, S.B. 7072 provides that a platform must “apply censorship, deplatforming, and shadow banning standards in a consistent manner.”
The platforms, however, argue that the editorial discretion they exercise is constitutionally protected expressive activity. Like publishers, parade organizers, and cable operators, the companies that run the major social media platforms “are in the business of delivering curated compilations of speech” created by others. When the platforms decide what content appears, whose content appears, and how it is presented, they are exercising the same sort of editorial discretion that the Supreme Court has recognized in the past.
Both sides make compelling First Amendment arguments. One notable issue raised by Paxton and Moody is that the public has now learned that major platforms have begun “partnering” with federal officials to exclude certain users deemed undesirable. The White House, for example, admitted in July 2021 that it is “in regular touch with these social media platforms” and that it “flag[s] problematic posts for” them to censor.
It is unclear to what degree the government and the various platforms are coordinating to isolate and censor particular targets. Discovery undertaken to investigate the subject has been, more or less, fruitless; all we really know at the moment is that federal agents have made recommendations to the platforms concerning potentially harmful content. Yet if this entwinement proves more extensive than previously understood, it could render the “it’s a private company” argument wholly without teeth.
Much of the buzz surrounding this dispute is fairly overblown, with various media outlets calling it “the beginning of the end of the internet” and describing the Fifth Circuit’s decision as “disastrous” for the future of the internet and all of its users. Even if the Court is inclined to affirm the Fifth Circuit’s decision, crime and hatred will not run rampant on these platforms. The platforms will still be subject to the same limitations that bind any designated public forum: prohibitions against violence and obscenity will remain permissible and enforceable under the First Amendment, even if the Florida and Texas bills were enforced to the fullest extent possible.
Yet for many years the dispute around social media has been whether the platforms we all use today are the functional equivalent of the town square for the modern age. Social media certainly shares many commonalities with the town square, allowing people to freely voice their opinions and share them with others. While not as unfettered as the traditional public square, social media allows us to reach more individuals than ever before. For First Amendment purposes, the essential question boils down to whether these social media companies are publishers or platforms. In an age when more and more conversations are shifting to online fora, how the Supreme Court decides this issue is one of the most important questions of our time.
The resolution of this dispute will answer whether Twitter, Facebook, YouTube, and other social media platforms are finally and officially the new public square of the technological age.
Student Bio: Anthony Aceto is a 2L at Suffolk University Law School. He is a staff writer on the Journal of High Technology Law. Anthony received a Bachelor of Arts Degree in Philosophy and Economics from Boston College.
Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.