Texas social media law violates the First Amendment



Prohibiting social media companies from engaging in content moderation would harm consumers.

A new Texas law that aims to regulate social media companies has been halted by a US District Court pending a full hearing. The Texas Tribune reported:

A federal judge on Wednesday blocked a Texas law that seeks to restrict how social media companies moderate their content and was championed by Republicans who say the platforms are biased against conservatives.

The law, signed by Gov. Greg Abbott on Sept. 9, would ban platforms with more than 50 million monthly users in the U.S. from removing a user over a “viewpoint” and require them to publicly report information about content removal and account suspensions.

It was set to take effect Dec. 2. In his ruling, U.S. District Judge Robert Pitman wrote that the First Amendment protects social media platforms’ right to moderate content and rejected the defendants’ argument that such companies are “common carriers.” Pitman also ruled that some aspects of the law were “prohibitively vague.”

“This Court is convinced that social media platforms, or at least those covered by [House Bill] 20, curate both users and content to convey a message about the type of community the platform seeks to foster and, as such, exercise editorial discretion over their platform’s content,” Pitman wrote.

This marks the second successful First Amendment lawsuit by the internet freedom association NetChoice in the last six months; a similar law in Florida was also blocked by a federal judge in June.

The main thrust of the Texas law passed by Republicans in the state legislature earlier this year would prevent social media companies from “censoring” a user based on their “viewpoint.” It defines a user as anyone who can create a profile and make a post. This effectively means that social media companies would not be allowed to engage in any form of content moderation on their platforms.

The court expressed concern that the law was being passed because some legislators perceived social media platforms as “too large and too liberal” in their ability to exclude certain viewpoints from the most popular media platforms on the internet today. The federal judge presiding over the case also said the law’s “prohibitions on ‘censorship’ and constraints on how social media platforms disseminate content violate the First Amendment.”

NetChoice’s first lawsuit against legislation limiting content moderation by social media companies involved a similar bill passed in Florida and led a District Court to declare, “Balancing the exchange of ideas among private speakers is not a legitimate government interest. And even aside from the actual motivation for this legislation, it is plainly content based.” 

In other words, the court rightly recognized that the government has no legal authority to interfere with the way a private company decides to curate the content it presents to its customers, especially when that interference is politically motivated.

The key legal reason the Texas law fails is the way it misapplies the First Amendment to different types of content providers. So-called “common carriers” of content, such as telecommunications companies and postal services, exist solely to transmit information and are regulated without First Amendment concerns. These companies do not present content to customers; they simply “carry” it to them, usually without ever seeing the content themselves, and are legally barred from screening content or refusing to carry it because they disagree with it.

This is in contrast to an editorial platform that curates content designed to attract and appeal to users. Whether user posts remain on the platform is entirely at the host’s discretion, primarily because the host is providing a service to users and readers and seeks to profit by giving customers what they want. A content feed clogged with spam and violence would immediately turn away customers, but finding just the right mix of content is harder than it sounds.

The COVID-19 pandemic has created an important debate about the science of viruses and the efficacy of vaccines.  YouTube initially suspended the Ron Paul Institute’s channel due to concerns about misinformation regarding the virus but later reinstated it.  This suggests an obvious internal struggle at YouTube to keep up with developing science, uphold its internal standard of information quality, respond to complaints from other users, and be fair to individual accounts on appeal.

Social media companies face a complex task deserving of flexibility, the kind that comes with trying to figure out what customers want, how to provide it, and how to make a profit. The gloomy alternative is a “chilling” effect whereby social media companies would forgo any kind of controversial content in order to avoid liability. This would be a tragedy given the diversity of conversations occurring on the internet and would reduce total business output in the United States.

Beyond pleasing customers and making a profit, it is also a private business’s First Amendment right to promote any kind of speech it wants without government coercion. In Miami Herald Publishing Co. v. Tornillo, the Supreme Court struck down a “right of reply” statute that forced newspapers to print a political candidate’s response whenever they criticized the candidate’s record. The law would have forced privately owned newspapers to print speech against their will.

Similarly, in Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, the Supreme Court overturned a state court ruling that required a private parade host to include a gay rights group in its parade, because it resulted in the government forcing people to host speech against their will. A unanimous Court held that forcing the parade to include the group would “violate[] the fundamental First Amendment rule that a speaker has the autonomy to choose the content of his own message.” Whether the forum is a parade or a Facebook post, the constitutional principle is that users have no right to force platform providers to host their speech.

Social media companies can certainly improve their policies and transparency around content moderation, but market forces are already creating this pressure, raising doubt about the government’s compelling interest. 

Long before the internet, politicians leveled similar complaints about content in newspapers as “the dominant features of a press that has become noncompetitive and enormously powerful… in its capacity to manipulate popular opinion and change the course of events.”

This sounds exactly like the complaints levied today against big tech, and thankfully the courts are seeing it that way, once again ruling against political interference in free speech. Prohibiting social media companies from engaging in content moderation on their own platforms would harm consumers, clutter the internet, or ultimately destroy the platforms themselves; courts should continue to hold such regulations unconstitutional.