The Supreme Court is considering a case of great consequence for the future of the internet and digital platforms, from search to social media. As SCOTUSblog puts it, Gonzalez v. Google asks:
Whether Section 230(c)(1) of the Communications Decency Act immunizes interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limits the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information.
Reason Foundation submitted an amicus brief in the Gonzalez v. Google case in which we argue that Section 230 is functioning as Congress intended, with great benefits to the users of digital platforms, and that the change in interpretation of the law that plaintiffs are asking for would have devastatingly negative consequences.
The following segments are pulled from that amicus brief and quoted at length to convey our arguments.
Section 230 and congressional intent
Reason Foundation’s brief argues that the plain text of Section 230 precludes considering a digital platform to be a publisher just because they use an algorithm to organize and present the content its users provide to others who might be interested in it. Indeed, when Congress passed the Communications Decency Act, it included congressional findings and purposes that make this clear. Reason’s amicus brief states:
What Congress did know is that, for the Internet to grow, it had to be left alone without fear of the “litigation minefield,” … that would cripple its expansion in its infancy if the providers and users of interactive computer services could be found liable for the content created by others.
Congress intended the government, including the judiciary, to get out of the way of the Internet’s growth. Congress explained that the goal of Section 230 is to “promote the continued development of the Internet” by, among other things, “encourag[ing] the development of technologies that maximize user control over what information is received by” those “who use the Internet and other interactive computer services.” Id. (b)(1), (3). And Congress expressed its goal of “preserv[ing] the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation.”
Crucial to this intent is understanding that using algorithms to organize, present, or prioritize content created by users does not equate to publishing or to accepting liability for the content shared. Technically, even a simple chronological timeline of user content is an algorithm. That services use other algorithms to present content does not fundamentally change their legal standing under Section 230:
Like every other court to decide this issue, this Court should recognize that “[m]erely arranging and displaying others’ content to users of [YouTube] through… algorithms—even if the content is not actively sought by those users—is not enough to hold [YouTube] responsible as the ‘developer’ or ‘creator’ of that content.”
The benefits of broad Section 230 protections
Section 230 has provided immunity from liability for content posted by others, which has greatly leveled the playing field for access to information. Interactive services make it easier than ever to reach others online:
The technological innovations made possible by Section 230 have also greatly increased the ability of the average American to spread ideas. Section 230 protections … allow interactive computer services to provide their users, rich and poor alike, with a reach that, historically, would not have been available even to the most privileged classes with access to the gatekeepers of the institutional press. Indeed, because of its ability to expand the reach of speech, Section 230 has been described as “the internet’s First Amendment—possibly better.”
Section 230 has allowed for a proliferation of information sources that consumers have utilized for everything from general news to the details of specific products. Analysis of how consumers use the internet and digital platforms shows how crucial it is that Section 230 enables user-submitted reviews of products, something platforms can share without fear of liability. Those reviews make it easier for consumers to find a product that meets their needs. One Internet Association (IA) survey of how consumers use online reviews found:
- 67% of respondents said they check online reviews either most of the time or every time before buying products in person or online
- 72% said it is highly important for a business to have positive online reviews before they buy
- 85% either strongly agreed or somewhat agreed that they would be less likely to purchase products online that did not have any reviews
- 65% responded with a seven or above when asked how much they trust online reviews on a scale of one to 10
As we explain in the amicus brief:
[R]esearch shows that “[b]uyers are looking to their peers to understand which products and services will benefit them, as peers can provide unbiased, individualized information.” Neither consumer reviews—nor the purchases they lead to—would be possible without Section 230 protections, particularly if platforms were potentially responsible for any reviews they hosted or organized in a manner useful to other shoppers.
Beyond allowing platforms to share user reviews without liability for them, Section 230 is crucial to the flourishing of small businesses in a world where the digital side of doing business is unavoidable.
After all, a “single small provider may use multiple large providers to operate their own service or forum” by, among other things, “maintain[ing] accounts and advertis[ing] across multiple social media and other services, in addition to relying on ISPs, domain name registrars, and hosting providers.” If larger providers lacked protections for content posted by smaller providers, it is unlikely that they would make their services available to them. Loss of Section 230 protections could thus harm not only digital platforms that have dominant market share, but everyone down to the atomized worker in the gig economy merely trying to get word out about her services and to be matched with users most likely to be interested in such services.
We can’t all be publishers
The narrow interpretation of Section 230 sought in this case would make everyone who likes or shares content on a digital platform a publisher and liable for that content, which would be patently absurd. There are many reasons to like and share content besides endorsing it. Again from the brief:
If Petitioners’ theory is correct, and Google truly is liable for recommending the content created by its users via an algorithm or otherwise, then every time a user of an interactive computer service shares a video, blog, or tweet created by another, then that user would become a developer of the underlying content and face potential liability for such content. Indeed, under Petitioners’ approach, by sharing another user’s content, the sharing user becomes the means by which that content reaches a broader audience. This is no different than what an algorithm does.
And, as with algorithms, even though retweeting or sharing the content of another may not be an endorsement of a particular message, both are—at the very least—recommendations that the retweeted or shared content be viewed. Put differently, if suggesting content via an algorithm somehow falls outside of Section 230’s ambit by magically transforming one party’s speech into the speech of the platform, so too does retweeting it.
None of this amounts to an argument that digital platforms are never publishers. If a platform affirmatively endorses an idea or piece of content, that would constitute publishing and be subject to liability. But given that some type of algorithm, chronological or otherwise, is required to present information to users, the mere presentation of content cannot and should not constitute publication.
Section 230 is clearly meeting the intent Congress had when it created the law. If there are real problems with the current rules governing the sharing of digital content, Congress can fix them with new legislation. That is how such problems should be solved, not by the Supreme Court reinterpreting what Congress quite clearly intended. We would argue that Congress should leave Section 230 alone and avoid trying to fix what is not broken.
Who is more responsible?
The recent so-called Twitter Files and Facebook Files revealed how, in recent years, the federal government has pressured and implicitly threatened digital platforms to suppress speech the government did not like. However unhappy you may be with the content moderation decisions of any digital platform, giving the government the power to regulate those decisions has a clear outcome: more of what we saw done to Twitter and Facebook, even when the government lacked clear legal authority to do so. Weakening Section 230 would ensure that whichever political party is in power at a given time could steer the speech allowed online, and the online speech we see would be even more partisan than it is today. No one wants that.