As lawmakers continue to investigate and hold hearings on the Jan. 6 riots, some claim social media companies like Facebook should be getting a bigger share of the blame. One of Facebook’s fiercest critics on the topic is Frances Haugen, a former Facebook data scientist and product manager turned whistleblower who shared company documents with Congress and the media. CNN reported on Haugen’s claims and the documents she leaked last year:
One of Haugen’s central allegations about the company focuses on the attack on the Capitol. In an SEC disclosure she alleges, “Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection.”
Leaked documents from Haugen first began appearing in The Wall Street Journal earlier this year. Revelations in the newspaper’s ongoing series of reports, The Facebook Files, captured the attention of lawmakers around the world. Facebook denies the premise of Haugen’s conclusions and says Haugen has cherry-picked documents to present an unfair portrayal of the company.
“The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them. We took steps to limit content that sought to delegitimize the election, including labeling candidates’ posts with the latest vote count after Mr. Trump prematurely declared victory, pausing new political advertising, and removing the original #StopTheSteal Group in November,” Facebook spokesperson Andy Stone told CNN Friday.
As the Jan. 6 committee continues its private interviews and public hearings, the focus should be on the actions of rioters and government officials. Politico reports that “more than 855 members of that crowd are facing charges that range from trespassing on restricted grounds to seditious conspiracy,” and “325 defendants have pleaded guilty to crimes stemming from the breach of the Capitol, the vast majority to misdemeanor crimes” so far.
Facebook is not ultimately accountable for what the rioters did or what President Donald Trump posted to his social media accounts after the November 2020 elections or leading up to Jan. 6. In the Internet’s infancy, Congress passed the Communications Decency Act of 1996, which includes a clause known as Section 230. This clause provides crucial protections for social media and content platforms by not holding companies liable for the speech their users post. Essentially, just as a local water utility is not to blame for whatever a customer flushes into the pipes, Section 230 says that technology platforms like Facebook are not to blame for users’ posts.
Drawing on years of customer data, most social media companies have concluded that a moderated feed makes for happier users and, ultimately, more profit; limiting the amount of undesirable content in customers’ feeds is simply good for business. Today, most large-scale online platforms monitor and control what their users post to some degree.
Companies like Facebook use algorithms to automatically flag and remove inappropriate content, and they also give users ways to flag content for review. But Harvard University legal scholar Jeffrey Hermes summarizes the potential impact on social media companies like Facebook and YouTube if Section 230 were repealed:
Think about YouTube. [Without Section 230 protections,] Google today would need to hire people with sophisticated legal backgrounds to review every single piece of content on that site. There would not be enough hours in the day. You would need to have literally millions of lawyers whose only responsibility would be reviewing user videos.
Hermes is right. And if Section 230 were repealed, Facebook and Google would be among the few companies with the money and resources to attempt any form of content moderation at that scale. Most smaller competitors couldn’t afford to defend themselves against every frivolous lawsuit over every user’s posts, and many would likely stop letting people post to their platforms at all to avoid legal liability. In a future without Section 230, very few new competitors would have the financial strength to enter the market and challenge massive social media companies. Big tech companies like Facebook would thus be further entrenched, and users would have fewer choices.
In recent years, prominent lawmakers in both major political parties have called for repealing Section 230. But Section 230 helps protect free speech online by rightly holding that companies are not responsible for the actions of their users. News publishers, websites, and social media platforms all remain fully liable for the content they themselves create.
Jan. 6 was a terrible day, and lawmakers should continue investigating and pursuing accountability. But Section 230 and social media companies aren’t to blame. If Section 230 were weakened or repealed, it would not produce a better Internet. It would launch a slew of frivolous lawsuits and unleash massive attempts to censor online content.