Insight

Moody v. NetChoice: Implications for Speech Regulation

Executive Summary

  • The Supreme Court remanded for further review two major content moderation cases dealing with state laws designed to force social media platforms to remain politically neutral, citing the need for a more rigorous factual analysis of the laws’ full scope.
  • Despite the remand, the majority opinion casts doubt on the constitutionality of the state laws, specifically criticizing the Fifth Circuit for upholding a Texas law based on a faulty understanding of the First Amendment.
  • As Congress continues to consider legislation that could impact social media content moderation decisions, the Court made clear that the government cannot simply force private speakers to present views the speakers don’t wish to host, or limit the spread of content the government doesn’t like.

Introduction

The Supreme Court recently ruled on Moody v. NetChoice, a case reviewing two conflicting appellate court decisions dealing with state laws designed to force social media companies to host content. Conservatives have long alleged that social media companies disfavor conservative speech on their platforms, and Texas and Florida passed laws specifically requiring platforms to remain politically neutral. After the Fifth Circuit upheld the Texas law and the Eleventh Circuit struck down the Florida law, the Supreme Court’s decision sends the cases back to the lower courts for further factual analysis.

While this may seem like a neutral outcome, the majority opinion suggests that these laws largely overstep the bounds of the First Amendment. As the Court made clear, the First Amendment protects social media companies’ moderation of user content when a company is forced to host content it would prefer to exclude, and the government cannot get its way simply by asserting an interest in better balancing the marketplace of ideas. The issue in this case is that a facial challenge to a law’s constitutionality requires showing that “a substantial number of [the law’s] applications are unconstitutional,” and the lower courts didn’t fully examine the scope of the laws. On remand, the lower courts must undertake a fuller examination of the laws’ effects and of the services to which the laws would apply in order to determine whether the laws are facially unconstitutional. Even so, the opinion indicates that as applied to social media services such as Meta’s news feed, the laws would unconstitutionally burden the platforms’ speech.

For Congress, the decision should make clear that laws designed to target political bias, misinformation and disinformation online, or any other public policy goal that requires control over a social media platform’s content moderation practices will face significant First Amendment scrutiny.

Texas and Florida Common Carriage Laws

At issue in these cases are two separate laws designed to prevent the censorship of conservative voices online.

The Florida law restricts a wide range of practices that could be seen as disfavoring posts based on their content or source, specifically including deleting, altering, labeling or deprioritizing the content. The law applies only to platforms that have annual gross revenue of over $100 million or have 100 million monthly active users. Further, it requires that a platform provide an explanation to a user any time it removes or alters any of the user’s posts.

The Texas law likewise prohibits a platform from censoring a user or a user’s expression based on viewpoint, which covers any action to “block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise discriminate against expression.” The law applies to a broader set of platforms than the Florida law and also allows users to appeal moderation decisions, which must be addressed by the platform within 14 days.

Both laws essentially require social media platforms to host content that they may otherwise choose to exclude, which raises First Amendment concerns. The Eleventh Circuit upheld an injunction against the Florida law, reasoning that the restrictions on content moderation trigger First Amendment scrutiny: “[W]hen a social-media platform ‘removes or deprioritizes a user post,’ the court explained, it makes a ‘judgement rooted in the platform’s own views about the sorts of content and viewpoints that are valuable and appropriate for dissemination.’” The Fifth Circuit, on the other hand, upheld the Texas law, arguing that content moderation decisions are not speech at all, and that even if they were, a state has an interest in protecting a diversity of ideas.

Majority Opinion: Reasons for Remand

The Supreme Court held that both the Fifth and Eleventh Circuits got the analysis wrong. NetChoice brought the cases as facial challenges to the laws, meaning it argued the laws are unconstitutional in all of their applications and therefore void. This differs from an “as applied” challenge, in which a firm such as Meta could argue that a law, as applied to its own content moderation decisions, is unconstitutional.

Both the Fifth and Eleventh Circuits’ analyses focused on the laws as applied to social media and content moderation generally. As the Court explains, however, the laws could apply to a wide range of services, such as direct messaging or events management, that covered platforms also offer. The laws could also cover things such as Gmail’s incoming message filters or Venmo financial exchanges between friends. Because these are facial challenges, the lower courts were required to fully consider the scope of the laws beyond just the classic example of social media companies and their content moderation practices, and to determine whether the laws as applied to these other services violate the First Amendment.

Because the lower courts failed to consider the full scope of the laws, and because the Supreme Court would not undertake these inquiries in the first instance, the cases were remanded for the lower courts to conduct such an analysis.

Criticisms of the Fifth Circuit’s Analysis

After explaining the decision to remand, the majority sharply criticized the Fifth Circuit’s First Amendment analysis, specifically warning that if the Fifth Circuit were to conduct the same analysis and draw the same conclusion on remand, that conclusion would “rest on a serious misunderstanding of First Amendment precedent and principle.”

As the majority makes clear, the editorial function is itself an aspect of speech, and that applies to a platform’s curation of third-party content just as much as to its own content. Deciding to exclude speech from a social media platform is an expressive activity, and the government cannot interfere with such editorial choices without triggering First Amendment scrutiny.

Under First Amendment analysis, asserting an interest in improving or better balancing the marketplace of ideas is insufficient. As the majority explains, “in case after case, the Court has barred the government from forcing a private speaker to present views it wished to spurn in order to rejigger the expressive realm.” The First Amendment can allow distasteful or even harmful ideas to spread online, and the government cannot shape the public discourse as it sees fit.

Taken together, the majority makes clear that the Texas law, as applied to major social media companies, raises significant First Amendment concerns and would likely not survive an as-applied challenge. The remaining question, however, is whether the law is unconstitutional on its face, given the broader scope of products and services that the lower courts failed to consider.

What This Means Going Forward

The Supreme Court’s stark defense of the First Amendment as applied to social media platforms’ content moderation decisions should give significant pause to legislators attempting to regulate how these companies deliver and moderate user-generated content. For example, laws that attempt to force social media platforms to moderate in a politically neutral way, or to take down harmful content such as misinformation and disinformation, will likely violate the First Amendment.

That said, a more robust analysis will be required for bills such as the Kids Online Safety Act, which doesn’t directly designate the content that must be removed or hosted and promotes other government interests such as protecting children online. As Congress considers these and other bills that would impact speech online, the Moody v. NetChoice decision highlights the importance of fully considering the First Amendment implications.

Disclaimer