Social media content moderation laws come before Supreme Court


CASE PREVIEW


By Amy Howe on Feb 23, 2024 at 4:14 pm

Oral arguments in the two cases will begin at 10 a.m. EST on Monday. (Trekandshoot via Shutterstock)

Once again, the relationship between the government and social media will headline arguments at the Supreme Court on Monday. NetChoice v. Paxton and Moody v. NetChoice are the second of three sets of social media disputes the court will hear this term. The justices on Monday will consider the constitutionality of controversial laws in Texas and Florida that would regulate how large social media companies like Facebook and X (formerly known as Twitter) control content posted on their sites.

Defending the laws, Texas and Florida characterize them as simply efforts to “address discrimination by social-media platforms.” But the tech groups challenging the laws counter that the laws are “an extraordinary assertion of governmental power over expression that violates the First Amendment in multiple ways.”

The legislatures in Texas and Florida passed the laws in 2021 in response to a belief that social media companies were censoring their users, especially those with conservative views. As drafted, the laws do not apply to conservative social media platforms like Parler, Gab, and Truth Social.

The Florida law originally created an exception for theme parks and entertainment so that the law did not apply to Disney and Universal Studios, which operate in the state. But the state’s legislature stripped that protection in 2022 after Disney officials criticized the state’s “Don’t Say Gay” law.

Although the two states’ laws are not identical, they share common themes. Both contain provisions that limit the choices that social media platforms can make about which user-generated content to present to the public and how. The Florida law, for example, bars social media platforms from banning candidates for political office, as well as from limiting the exposure of those candidates’ posts. Both laws also contain provisions requiring social media platforms to provide individualized explanations to users about the platforms’ editorial decisions.

Two trade groups representing social media platforms – including Google (which owns YouTube), X (formerly known as Twitter), and Meta (which owns Facebook) – went to federal court to challenge the laws.

A federal district judge in Tallahassee, Florida, barred the state from enforcing most of the law. The U.S. Court of Appeals for the 11th Circuit left that ruling in effect, agreeing that the main provisions of the Florida law likely violate the First Amendment. The state then came to the Supreme Court in 2022, asking the justices to weigh in.

A federal judge in Austin, Texas, put that state’s law on hold before it could go into effect, but the U.S. Court of Appeals for the 5th Circuit disagreed. That prompted the tech groups to come to the Supreme Court, which in May 2022 temporarily blocked the law while the tech groups’ appeal continued.

After the 5th Circuit ultimately upheld the law, the tech groups returned to the Supreme Court, which agreed last fall to review both states’ laws.

Defending the laws, the states describe social media platforms as the new “digital public square,” with enormous control over news that members of the public see and communicate. States, they say, have historically had the power to protect their residents’ access to that information. And what social media platforms are ultimately seeking, the states contend, is to avoid any regulation whatsoever – an argument, Florida says, that “if accepted, threatens to neuter the authority of the people’s representatives to prevent the platforms from abusing their power over the channels of discourse.”

The states maintain that their laws do not implicate the First Amendment at all, because they simply require social media platforms to host speech, which is not itself speech but instead conduct that states can regulate to protect the public. The business model for these platforms, the states say, hinges on having billions of other people post their speech on the platforms – something very different from, say, a newspaper that creates its own content and publishes it.

To support this argument that they are merely regulating the platforms’ conduct, the states point to Supreme Court cases holding, for example, that shopping malls must allow high school students to solicit signatures for a political petition, and that a federal law requiring law schools to choose between providing military recruiters with access to their campuses and forfeiting federal funding does not violate the First Amendment.

The states also assert that the First Amendment does not apply to the laws because the states are just treating the platforms like “common carriers,” such as telephone and telegraph companies. The state laws simply impose a basic requirement that the platforms, as common carriers, not discriminate in providing their services, “which is how common-carrier regulation has functioned for centuries.”

But even if the laws do regulate speech, the states continue, they are subject to a less exacting standard of review because they do not target specific content on any platform, and they merely ensure that speakers continue to have access to the “modern public square.”

Finally, the states insist that provisions requiring the social media platforms to provide individualized explanations about their content-moderation decisions are consistent with the Supreme Court’s 1985 decision holding that states can require companies to disclose “purely factual and uncontroversial information” about their services. Indeed, Texas suggests, the platforms can use an automated process to fulfill their obligations under these provisions.

The tech groups push back against the states’ suggestion that the Texas and Florida laws do not implicate the First Amendment at all. The First Amendment, the groups write, protects the right of private social-media platforms, rather than the government, to decide what messages they will or will not disseminate. “Just as Florida may not tell the New York Times what opinion pieces to publish or Fox News what interviews to air, it may not tell Facebook or YouTube what content to disseminate,” they emphasize.

The tech groups explain that there is a “cacophony of voices on the Internet engaged in everything from incitement and obscenity to political discourse and friendly banter.” As a result, they say, social media platforms must make billions of editorial decisions per day. These decisions, they observe, take two forms. First, there are judgments about what content they will remove. Facebook, for example, restricts hate speech, bullying, and harassment, while YouTube bars pornography and violent content. Second, they continue, there are judgments about how the remaining content appears on their sites for individual users.

The Texas and Florida laws burden the platforms’ speech, the tech groups say, because they interfere with the platforms’ right to exercise their editorial discretion. In particular, the groups emphasize, the laws require large social media platforms to disseminate “virtually all speech by the state’s preferred speakers, no matter how blatantly or repeatedly the speaker violates the website’s terms of use.”

And while the states rely on the line of cases indicating that there is no First Amendment right not to host someone else’s speech, the tech groups point to a different line of cases, in which the Supreme Court has recognized that the First Amendment protects a right to editorial judgment – so that, for example, a state cannot require a newspaper to give a political candidate a right to respond to criticism, nor can it mandate that the private organizers of a parade allow a group to participate when the organizers do not approve of the group’s message.

Because “countermanding the editorial judgments of ‘Big Tech’ about what speech to allow on their websites” is the “raison d’être” of the state laws, the tech groups conclude, the laws are subject to the most stringent form of review, known as strict scrutiny. And the laws fail this test, the groups contend, because even if states had an interest in ensuring that their residents have access to a wide range of views on social media, that interest still would not justify requiring private social media platforms to publish content with which they disagree.

The states also cannot justify regulating social media platforms on the theory that they are common carriers, the tech groups continue. There is no tradition of treating a private party that publishes speech, like a social media platform, as a common carrier, they say. But even if there were, the laws at issue in these cases are not traditional common-carrier regulations, because (among other things) they regulate only some social media platforms.

Finally, the tech groups tell the justices that the provisions requiring social media platforms to provide individualized explanations and disclosures when they exercise their editorial discretion are also unconstitutional because (among other things) they require the platforms to speak and, by imposing “massive burdens,” make it less likely that the platforms will exercise that discretion. The provisions are, the tech groups suggest, “akin to requiring a newspaper to explain every decision not to publish any one of a million letters to the editor.”

The Biden administration filed a “friend of the court” brief supporting the tech groups. It stresses that although the First Amendment protects the social media platforms’ efforts to moderate the content on their sites, that does not mean that the platforms can never be regulated. But in these cases, it says, the states cannot show that their regulations survive even a more lenient form of First Amendment scrutiny. In particular, U.S. Solicitor General Elizabeth Prelogar wrote, the Supreme Court “has repeatedly rejected” the premise of the states’ argument: the idea that “the government has a valid interest in increasing the diversity of views presented by a particular private speaker — even if that speaker controls a powerful or dominant platform.”

The Biden administration will be back before the court in March in another case involving its own relationship with social media. In Murthy v. Missouri, slated for argument on March 18, the justices will consider whether and to what extent government officials can communicate with social media companies about their content-moderation policies.

This article was originally published at Howe on the Court


Source: https://www.scotusblog.com/2024/02/social-media-content-moderation-laws-come-before-supreme-court/