
Repealing Section 230 is Not Going to End Big Tech Censorship

In fact, it’s likely to make it even worse


In August of 2019, I co-founded a new social media site, spinster.xyz, with my partner Alex Gleason. We created Spinster in response to the growing trend of censorship on platforms like Twitter and the abuses of “Big Tech” platforms, which manipulate users, abuse data, and control the public narrative. For a year and a half now, we have been dealing with the day-to-day reality of running and moderating a social network with thousands of users—users who are largely disillusioned with Big Tech and mainstream social media platforms because of their censorship.

As someone deeply concerned about the unfettered power of sites like Google, Facebook, and Twitter, I have learned one thing from actually running an alternative social platform: repealing Section 230 is not the solution to ending Big Tech’s reign of terror. In fact, doing so would only make the situation worse. Without the liability shield afforded by Section 230, mainstream platforms would be motivated to engage in even more censorship, and small platforms such as Spinster would be crushed under the burden of new legal liabilities.

What is Section 230?

Section 230 is a part of the Communications Decency Act of 1996, a federal law that attempted to regulate underage access to “obscene or indecent materials” and the transmission of child pornography. Section 230, specifically, provides immunity from liability to internet services that host user-generated content by stating that the service provider shall not be “treated as the publisher or speaker of any information provided by another information content provider.”

In the United States, offline publishers of content, such as newspapers and magazines, can be held liable for damages caused by republishing the statements of others. This means, for example, that if a newspaper directly quotes someone making defamatory statements, the victim of the defamation may choose to hold the newspaper liable for publishing those statements (in addition to the original speaker of the defamatory statement). This also extends to content in advertisements and letters to the editor.

Prior to Section 230, online platforms that engaged in any moderation of content risked being treated as “publishers”—meaning they could be held liable for the content posted by their users.

The law was created in response to the case of Stratton Oakmont, Inc. v. Prodigy Services Co. in 1995. In this case, a user posted defamatory statements against Stratton Oakmont on an internet forum hosted by Prodigy. The New York Supreme Court found that because Prodigy engaged in some moderation of content, it should be considered a publisher and therefore was liable for the defamatory content.

The court’s determination that Prodigy was a “publisher” rested on four factors:

  1. Prodigy had published “content guidelines” preventing things such as “insulting” notes.
  2. The forum was using software to pre-screen posts for offensive language.
  3. The forum had moderators whose duties included enforcing the content guidelines.
  4. Moderators were able to remove content and inform users of this action.

Meanwhile, in the earlier, similar case of Cubby, Inc. v. CompuServe Inc., an internet forum was found not liable for defamatory statements because it was considered not a publisher but merely a “distributor,” since it did not actively engage in content moderation.

Section 230 resolved this perverse incentive against moderation by stating that sites hosting user-generated content will not be considered publishers and therefore are not liable for that content. This is known as the “Good Samaritan” protection. Section 230 has been called the “most important law protecting internet speech” and is considered instrumental in building the internet as we know it today.

The protections provided by Section 230 are not without limits. For example, sites are still responsible for removing certain illegal content, such as copyright-infringing material or child pornography. Additionally, in 2018, FOSTA-SESTA amended the Communications Decency Act to strip liability protections from online services involved in sex trafficking.

How would platforms respond to Section 230 repeal?

Politicians on both the right and left have expressed a desire to repeal Section 230. However, this is not a case of bipartisan agreement. Conservatives and liberals want to revoke 230 for different reasons, and in hopes of very different outcomes.

Donald Trump and some Republicans have expressed outrage at censorship on mainstream platforms, alleging that sites like Twitter are unfairly biased against conservatives. Democrats like Joe Biden, on the other hand, have expressed frustration at how easily misinformation spreads across platforms like Facebook and at their inability to hold those platforms accountable.

In short, Republicans want to repeal Section 230 because they want less censorship on social media. Democrats want to repeal Section 230 because they want more power to rein in Big Tech. Unfortunately, both are likely to be disappointed if Section 230 were fully repealed.

In a world without Section 230, online hosts of user-generated content would have two options to avoid liability:

  1. Aggressively moderate and censor anything that could pose even a remote liability risk.
  2. Moderate nothing at all, removing content only as required by law.

I don’t believe anyone can reasonably expect platforms like Twitter and Facebook to take the free speech option. These sites have demonstrated, time and time again, a desire to censor beyond what the law requires. Large platforms like Twitter, Google, and Facebook know that their users want at least some moderation. Allowing their sites to become a total free-speech free-for-all would create a bad experience for the average user and drive their money-makers, including advertisers, away.

Rather, Big Tech platforms are likely to double down on moderation and censorship in a post-230 internet. Any content that could possibly put the site at risk would be removed with the goal of limiting liability.

That costs money, though. Platforms would need to massively expand their moderation infrastructure to meet the new demand, and responding to the lawsuits that would arise would become a major expense for any site hosting user-generated content, even with robust moderation.

But not all social media or websites are Big Tech. In fact, most aren’t. Small platforms like mine, Spinster, would crumble. If we chose to maintain moderation, keeping the culture of the site in line with the desires of our users, we could not possibly stand up financially to the lawsuits with which we would likely be hit. If we chose to end moderation and become a “distributor” of content, the site would deteriorate in a matter of days.

Spinster, a site whose core audience is feminist women, has rules against hate speech, harassment, and any form of pornography. Much of this content is not necessarily illegal to host in the United States, so we don’t have an obligation to remove it. We choose to remove this sort of content because it provides a bad user experience for our regular users and core audience who have come to expect a (mostly) civil, welcoming, and woman-first culture on the site.

However, if Section 230 were repealed, moderating this content would result in Spinster being classified as a “publisher” of content. We would be legally liable for every potentially defamatory statement posted on the platform. This would open us up to lawsuits from every person who has been criticized on the platform (of whom there are many). Even if the statements were ultimately found to be neither false nor defamatory, the cases could go far enough to cause us significant legal costs.

Sites that aren’t focused on user-generated content, such as blogs that allow comments, may choose to simply stop hosting third-party content altogether to avoid liability.  

Platforms like Twitter and Facebook can afford to take on this cost. Of course, they don’t want to, which is why they are fighting Section 230 repeal, but they would likely survive the change. Small sites and alternative social networks would likely crumble or become unusable under the repeal of Section 230. The net effect is that there would be more censorship on Big Tech platforms, and fewer alternatives to them.

Some proposals do not call for the full repeal of Section 230 but instead attempt to reform it: targeting platforms of a certain size, focusing on areas of concern like child pornography or harassment, forcing political neutrality, or adding requirements, such as transparency disclosures, that platforms must meet to keep their liability protection. There are pros and cons to many of these proposals, and examining each of them in detail is beyond the scope of this article.

So how do we fight back against Big Tech, then?

Just because Facebook and I agree on not repealing Section 230 doesn’t mean we are on the same side. The threat posed by Big Tech and other proprietary software cannot be overstated. Technological freedom must be considered the civil rights fight of the 21st century. There are a variety of ways we can resist Big Tech without undermining the efforts of alternative platforms and the free software movement.

We can, and should, use every tool known to the social justice movements of the past to resist the power and abuses of Big Tech. But, most importantly, we must support the growth of alternative platforms. Big Tech will fall only when users have other viable spaces where they can meet their needs for connection, community organizing, information gathering, and everything else the internet can offer us.

Repealing Section 230 will likely only make every problem with Big Tech worse—monopolies, censorship, and abuse of users. Instead, you can help end their death grip on society by creating an account on the Fediverse (a distributed social network), moving your email hosting away from Gmail, making little swaps in the day-to-day software you use, or by supporting organizations like the Free Software Foundation fighting to protect our technological freedoms.


