The Supreme Court will soon decide the future of social media

Should social media platforms have the right to decide what speech to allow online? Should the government?
The United States Supreme Court building. (Wiki Commons / Joe Ravi)
Key Takeaways
  • Two cases, Moody v. NetChoice and NetChoice v. Paxton, are currently before the US Supreme Court.
  • The outcome could decide whether social media companies have the legal right to moderate content on their platforms or whether such moderation amounts to unlawful censorship.
  • Whether the Court confronts the issue now or postpones a decision until a later date remains to be seen.

Do Facebook and YouTube have the legal right to block or remove users or posts from their websites because of what the users or posts are saying? Or, should the law compel Facebook and YouTube to feature users or posts that the platforms do not want to be associated with?

Those questions are at the heart of a pair of potentially blockbuster cases that the US Supreme Court is currently deciding. The stakes are high not only for the future of social media, but for the future of First Amendment law as well. 

The cases are Moody v. NetChoice and NetChoice v. Paxton. At issue are state laws, passed by Florida and Texas, respectively, which seek to control the content moderation practices of the biggest social media platforms, including Facebook and YouTube.

According to the state officials involved, those content moderation practices currently amount to illegal “censorship” of conservative and right-leaning views. “Big Tech censors will now be held accountable” when they “discriminate in favor of the dominant Silicon Valley ideology,” declared Florida Governor Ron DeSantis when he signed his state’s law in May 2021. Texas Governor Greg Abbott made a similar point when he signed his state’s law several months later. “There is a dangerous movement by social media companies to silence conservative viewpoints and ideas,” Abbott announced. “That is wrong, and we will not allow it in Texas.”

The Florida law forbids large social media platforms from taking actions that “unfairly censor, shadow ban, deplatform, or apply post-prioritization algorithms to Florida candidates, Florida users, or Florida residents.” The Texas law likewise makes it illegal to “block, ban, remove, deplatform, demonetize, [or] de-boost” any user or post “based on…the viewpoint of the user” or “the viewpoint represented in the user’s expression.”

According to Paul Clement, the lawyer representing the social media platforms before the Supreme Court, the consequences of letting these laws go into effect would be sharply negative. If social media sites “have to be viewpoint-neutral,” Clement has argued, that means “if you have materials on your site that are pro-Semitic, then you have to let materials onto your site that are anti-Semitic. And that is a formula for making these websites very unpopular to both users and advertisers.”

Legally speaking, there are two potentially fatal problems with these state laws. The first problem is that both laws operate under the assumption that companies like Facebook and YouTube have been guilty of “censoring” certain viewpoints. However, censorship is technically something that only the government can do. 

Think about the language of the First Amendment, which says, “Congress shall make no law…abridging the freedom of speech.” Because Facebook and YouTube are private entities, not government entities like Congress, the First Amendment does not stop them from “censoring” anyone; in the constitutional sense, they are not capable of censorship at all. Yes, a particular content moderation decision may result in a user getting booted from a popular social media site, cutting that user off from the site’s audience. But because that decision was not made by the government, it does not implicate “the freedom of speech” recognized by the Constitution.

In an effort to overcome this problem, officials in Florida and Texas have asserted that Facebook and YouTube should not count as private entities at all because they have morphed into something more like quasi-government bodies. “Social media platforms have transformed into the new town square,” the text of the Florida law declares, and “have become as important for conveying public opinion as public utilities are for supporting modern society.” The Texas law similarly states that social media platforms “are affected with a public interest,” likening the platforms to “common carriers,” such as telegraph or railroad companies, which were traditionally required by law to serve all comers. According to Florida and Texas, today’s social media giants can and should be legally stopped from turning away users because of their views.

Which brings us to the second potentially fatal legal problem with the two state laws. As private entities, Facebook and YouTube have First Amendment rights of their own. And a long line of Supreme Court precedent has held that media companies of every sort possess the First Amendment right to make their own “editorial” decisions about what content to feature — and not feature — on their own platforms.

In Columbia Broadcasting System, Inc. v. Democratic National Committee (1973), for example, the Supreme Court held that broadcasters have a First Amendment right to refuse to run political ads. “Editing is selection and choice of material,” the Court said. “That editors — newspaper or broadcast — can and do abuse this power is beyond doubt.” But “the presence of these risks is nothing new,” the ruling held, referring to the “risk” of powerful media entities elevating some voices while denying their platforms to others. “The authors of the Bill of Rights accepted the reality that these risks were evils for which there was no acceptable [legal] remedy.”

One year later, in Miami Herald Publishing Co. v. Tornillo (1974), the Supreme Court overturned a state law that required newspapers to grant political candidates equal space to respond to their critics. When a state government forces a newspaper to publish something that the paper’s editors do not want to publish, the Court said, the First Amendment is violated. “Compulsion to publish that which ‘reason’ tells them should not be published,” the decision stated, “is unconstitutional.”

More recently, in 303 Creative v. Elenis (2023), the Supreme Court noted that the same free speech principles that apply to print and broadcast media also apply to speech “conveyed over the Internet.” In that case, the Court held that the First Amendment barred a state government from requiring a graphic designer to create same-sex wedding websites despite her personal objections to creating such expressive content. The First Amendment secures the right of private parties “to think and speak as they wish,” the Court said, “not as the government demands.”

NetChoice, the trade group that hired Paul Clement to represent the big social media platforms before the Supreme Court, has cited these and related precedents in support of its argument that the Florida and Texas laws both violate the First Amendment because they interfere “with the rights of private parties to exercise editorial discretion in the selection and presentation of speech.” In other words, the argument goes, online content moderation is just another form of constitutionally protected editing done by a private media enterprise. If Facebook or YouTube decides to edit (or moderate or arrange) the expressive content on its website in a manner that spotlights some views instead of other views, it is the social media company’s editorial right to do so under the First Amendment. Unhappy users are free to post their unwelcome content elsewhere.

The Supreme Court heard oral arguments in the NetChoice cases on February 26. While it is never wise to predict the outcome of a case based solely on the oral arguments, there were some notable moments.

One such moment occurred when Justice Samuel Alito grew impatient with the use of the term content moderation to describe conduct by the social media companies that he seemed ready to call censorship. “The particular word you use matters,” Alito said, a note of irritation discernible in his voice, because “some may want to resist the Orwellian temptation to recategorize offensive conduct in seemingly bland terms.” 

A few minutes later, Justice Brett Kavanaugh spoke up to rebut Alito’s statement. “When I think of ‘Orwellian,’ I think of the state, not the private sector,” Kavanaugh said, such as “the state taking over media, like in some other countries.” And “in Tornillo,” Kavanaugh continued, referring to the earlier free speech case, “the Court made clear that we don’t want to be…that country” and “we don’t want the state interfering with these private choices.”

This verbal sparring between two nominally “conservative” justices captured in miniature what the underlying constitutional dispute is all about. Alito suggested that big social media sites now wield government-like powers over their users, and therefore the sites should be subjected to greater restrictions. Kavanaugh, by contrast, suggested that the platforms are private media enterprises, and so they are protected from government efforts to dictate what appears on their websites. If a majority of the Supreme Court agrees with Alito, the Florida and Texas laws will win. If a majority agrees with Kavanaugh, the social media sites will come out on top.

A third, more anticlimactic legal outcome is also possible. During the oral arguments, several justices suggested that the Supreme Court might sidestep the constitutional debate by finding that one or both of the laws are written so broadly that, even if they are likely unconstitutional as applied to Facebook or YouTube, they may still have constitutional applications to other services not covered by the facts of these cases. Under this view, the Court would effectively send the dispute back down to the lower courts on the grounds that it would be a mistake at this stage to either strike down or uphold the laws in their entirety.

That result would mean a technical win for the states (since the laws would not be struck down outright, as NetChoice urged), but it would also open the door to new lawsuits challenging specific applications of the laws. NetChoice (and others) would immediately file those suits, and in the meantime the state laws would most likely be blocked from going into effect while the new litigation played out. Needless to say, one or more of those new lawsuits would inevitably arrive back at the Supreme Court. Still, if the justices are looking for an off-ramp right now, this approach would give them one.

Sooner or later, the Supreme Court will have to squarely confront the underlying constitutional debate over censorship and social media content moderation. Whether the Court confronts it in these two cases or postpones the fight until a later date remains to be seen.

This article was originally published by our sister site, Freethink.

