No. 21-1333
In the Supreme Court of the United States
REYNALDO GONZALEZ, ET AL., Petitioners,
v.
GOOGLE LLC, Respondent.
On Writ of Certiorari to the United States Court of Appeals for the Ninth Circuit
Brief for Reason Foundation as Amicus Curiae Supporting Respondent
I. For nearly three decades, Section 230 has served as the backbone of the Internet, precisely as Congress anticipated and intended. The legislatively enacted congressional findings and purposes favor an expansive reading of Section 230’s protections in the event of any uncertainty or perceived ambiguity in the language of Section 230(c)(1).
A. Section 230’s benefits were by design, even if Congress could not have predicted every detail—or challenge—of a growing Internet. What Congress did know is that, for the Internet to grow, it had to be left alone without fear of the “litigation minefield,” Resp. Br. 19, that would cripple its expansion in its infancy if the providers and users of interactive computer services could be found liable for the content created by others. Congress thus enacted Section 230 with a list of policy statements that show what it intended and expected the statute to do: protect platforms and users from liability for the speech of others and promote the growth and use of interactive computer services.
Congress explained that the goal of Section 230 is to “promote the continued development of the Internet” by, among other things, “encourag[ing] the development of technologies which maximize user control over what information is received by” those “who use the Internet and other interactive computer services.” 47 U.S.C. § 230(b)(1), (3). Section 230 has done that. Congress also expressed the importance of “preserv[ing] the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation.” Id. (b)(2) (emphasis added). Section 230 has created that world, too.
Those policy statements are not mere pieces of legislative history entered into the Congressional Record by opportunistic politicians or their staffers—to the contrary, they are the product of bicameralism and presentment just like any other duly enacted legislation. And such statements are entitled to controlling weight regarding what policy considerations may inform the interpretation of Section 230. Whether Section 230 creates good policy is not a question for this Court to decide. That question remains where it was in 1996—with Congress.
B. In the decades since Congress enacted those legislative findings and purposes, Section 230 has overwhelmingly fulfilled Congress’s predictions and goals. By providing immunity from liability for the content posted by others, it has allowed for the development of new technologies that make it easier for everyone to find information online, to organize and to let others help organize the information they receive, and to associate both directly and indirectly with people around the world sharing common interests. These advances in technology have also led to the development of all manner of social media sites, including video-based platforms, dating apps, and even improved traditional chatrooms providing users many of the same organization tools as providers themselves.
The improved ability to find and organize information online is only one of the many benefits of Section 230. It also has led to an exponential growth in the amount of speech online. As providers have innovated and users have enthusiastically participated in online speech free from the “specter of liability,” Zeran v. America Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997), interactive computer services have made it easier for ideas to spread than ever before in human history. Through retweets and other user engagements, the views and content created by even the poorest Americans can spread around the country and world in a way that would not have been possible just twenty years ago.
Other benefits from Section 230 abound. The economic benefits to innovators, providers, users, and the economy as a whole have been tremendous. It has facilitated the gig economy by allowing individuals and small businesses to flourish on websites provided by bigger platforms. It has also allowed consumers to directly review products and other services, make those reviews readily available online for the next consumer, and pass along or comment upon reviews by others, thus democratizing the marketplace of products and services as well as the marketplace of ideas. Thus, insofar as such practical considerations matter to the interpretation of Section 230(c)(1), the findings and purposes of Congress are not only controlling, they are right.
II. The language of Section 230 both reflects such congressional policy and confirms that Respondent should prevail in this case.
A. An “interactive computer service” “provides or enables computer access by multiple users to a computer server.” 47 U.S.C. § 230(f)(2). “Interactive computer services” expressly include “access software providers,” which—as relevant here—are providers of software or tools that can “pick, choose, analyze, or digest,” “transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content.” Id. (f)(2), (4)(B), (C). The providers of such services and their users can both create their own information content and can organize, transmit, and provide access to information content provided by others.
B. YouTube’s algorithm, which organizes and reorganizes the content uploaded to YouTube by others, thus performs a function that Congress expressly included in the definition of an interactive computer service. Indeed, as both a provider and user of such software, Respondent falls squarely within the class protected by Section 230(c)(1). Insofar as Petitioners seek to hold Google liable for the consequences of having presented or organized the “information provided by another,” rather than for creating and publishing Google’s own information content, Section 230(c)(1) bars such liability.
To the extent any given algorithm or other organizational policy or choice might be said to create Google’s own “content,” the further question becomes the precise parameters of such content as distinguished from the content of others. That distinction helps clarify that even where an algorithm or other organizational action or policy itself might create some information content (appending a warning label, for example), a user or provider may be held responsible only for that information, and not for the underlying information “provided by another.” Alternatively, if YouTube or any user of its service were to expressly adopt or endorse the information content of another as its own, such adopted content may well fall outside of Section 230’s protection.
But merely identifying, organizing, or even recommending the content of another is a far cry from adopting it as your own. YouTube’s algorithm, for example, analyzes each user’s activity and viewing behavior to predict what that user might find interesting and to organize further information content provided by others according to those predictions. Though the algorithm’s analysis and predictions are more automated and sophisticated than manual efforts to organize or recommend content in a manner appealing to users, it remains fundamentally the same as the manual choices exercised by chatroom moderators, bloggers, and, indeed, any individual user who selects, reposts, “likes,” or otherwise passes along the information content of others in a way such user believes might be interesting or appealing to her followers and potential followers. Such organizational effort by both providers and users of interactive computer services is precisely what Congress anticipated and intended to encourage via Section 230, and the text provides broad protection reflecting that purpose.