On the final opinion day of the 2023-2024 term, the Supreme Court issued its decision in Moody v. NetChoice, LLC. The case concerned the constitutionality of two state laws that sought to limit social media companies’ freedom to moderate content. What was decided (as well as what wasn’t) in Moody carries important implications for digital media companies and sets the stage for the constitutional future of content moderation.
Moody v. NetChoice
In 2021, Florida and Texas enacted laws banning certain content moderation practices for covered platforms. The laws also require individualized explanations for users whose posts have been altered or removed.
Although the laws are similar, there are several key differences between them. Most notably for digital media companies, the Texas law largely excludes traditional media players, including news, sports, and entertainment platforms. The Florida law contains no such exception.
NetChoice and the Computer & Communications Industry Association, two trade associations representing industry leaders such as Meta, Google, and X, challenged both laws. While courts initially put both laws on hold, the Texas law was allowed to take effect following an appeal. However, following an order from the Supreme Court, the Texas law was put on hold once again.
In September 2023, the Supreme Court agreed to jointly hear these cases.
The Supreme Court decision
In a somewhat unexpected move, the Supreme Court vacated both judgments and sent the cases back to the lower courts. In a 9-0 decision, the Court found that neither court of appeals had properly analyzed the First Amendment challenges presented by the laws.
The Court took issue with NetChoice’s legal approach, which required the association to prove that nearly every application of these laws would be unconstitutional. The nature of these laws, and of the industry they attempt to regulate, undoubtedly makes this a monumental task. Given the Court’s criticism of NetChoice’s approach, organizations seeking to challenge similar laws will likely concentrate on challenging specific applications, a strategy that may yield more targeted victories for social media companies.
The Court also noted its dissatisfaction with the analyses carried out by the lower courts. To conduct a proper assessment, the lower courts will first have to determine the scope of these laws (which activities and which actors the laws regulate). They will also need to determine which applications of the laws violate the First Amendment and then weigh those applications against the ones that do not.
The Supreme Court also directed the lower courts to evaluate whether the laws’ content moderation provisions intrude upon protected editorial discretion and whether individualized explanation provisions disproportionately burden freedom of expression.
Implications for media companies
Although the Court did not decide the constitutionality of these laws, there is still much for digital media companies to take away. Throughout the decision, the Court repeatedly highlighted the importance of editorial discretion. Writing for the majority, Justice Elena Kagan noted that companies engaging in expressive activity are protected by the First Amendment. Because content curation and moderation have been found to fall within the scope of expressive activity, governmental interference with editorial choices would implicate the First Amendment.
The Court’s clear support of editorial discretion and freedom provides digital media companies necessary cover not only from laws of this nature but from broader governmental infringement upon editorial rights. The Court was also clear in its judgment that “the choice of material” constitutes the exercise of editorial control not only for traditional newspapers but for online platforms as well.
However, while the Court was clear that editorial discretion (including content moderation practices) is subject to First Amendment protections, it did not make clear whether all content moderation practices warrant those protections. As technology has evolved, the diminishing role of human judgment in content moderation raises questions about how “expressive” this conduct really is.
In her concurring opinion, Justice Amy Coney Barrett questioned whether an algorithm that “presents automatically to each user whatever the algorithm thinks the user will like,” or AI tools meant to remove hateful content, would constitute expressive activity. Delving further into this point, Justice Barrett added that technological advancements will increasingly shine a light on the overlap between content moderation and consumers’ rights to decide for themselves what they wish to post and view on social media. This overlap, which is certain to carry constitutional implications, should be top of mind for tech and media companies as they continue to develop and fine-tune their content moderation practices.
In his concurring opinion, Justice Samuel Alito was even more emphatic in raising concerns over the extent to which content moderation constitutes expressive activity. Justice Alito cast content moderation practices in a dubious light, arguing that little is known about how platforms actually moderate content.
Regardless of the accuracy of this assertion, media companies should be prepared for calls for increased transparency around their moderation policies. Justice Alito also criticized the majority for assuming that “secret” algorithms are just as expressive as print news editors. This distinction between the era of print news and today’s digital landscape will surely be echoed by the attorneys general of Florida and Texas, as well as by other proponents of government intervention in content moderation.
What digital media companies can expect going forward
The majority’s reading of the First Amendment is a positive outcome for digital publishers. In reiterating that editorial discretion, including content moderation, warrants First Amendment protection, the Court enabled media companies to continue curating and moderating content at their discretion. Had the Court ruled adversely, this freedom would have been placed on the chopping block, and media companies would soon be battling a slew of unfavorable laws seeking to manipulate their editorial discretion.
It is difficult, however, to predict to what extent the Court’s opinion will inform the lower courts’ analyses. Regardless of how the lower courts rule on these laws, the Moody decision serves as a warning: while the Supreme Court remains committed to protecting the right to editorial discretion, there is a lack of clarity, and a degree of skepticism, surrounding the scope of those protections for emerging forms of content moderation.
As algorithms and AI grow ever more prevalent and sophisticated, the Supreme Court will eventually have to clarify, and possibly draw new constitutional lines around, First Amendment precedents that originated at a time when computers were just arriving in newsrooms.