Section 230 of the Communications Decency Act (the “CDA” or “Section 230”), popularly known as “the 26 words that created the internet,” remains the subject of ongoing controversy. As extensively reported on this blog, the world of social media, user-generated content, and e-commerce has been consistently bolstered by Section 230 protections. The same law, however, has also famously been a factor in the proliferation of harmful content. Despite bipartisan support for amending the CDA and a renewed appetite of late to regulate technology, the myriad attempts to reform the CDA in Congress have largely sputtered, outside of one Congressional carveout in 2018 (and most litigants that have attempted artful pleadings to bypass CDA immunity have failed in court, outside of a few narrow exceptions).
Yet, the bipartisan critics of the CDA have not given up. More recently, those critics have become more vocal in blaming Section 230 for enabling the worst of the internet (e.g., hate speech, revenge porn, disinformation and other objectionable content) and for giving large tech platforms too much leeway in moderating or presenting content. This past year, Congress examined additional measures to expand regulation of “Big Tech,” including: passing a law with a provision that could ban the social media platform TikTok; engaging in serious discussions about a comprehensive data privacy bill that would minimize online data collection; holding hearings about social media use by minors; and drafting amendments to update the federal children’s privacy law (“COPPA”). In particular, Congress has focused on AI regulation, introducing a bill that would expressly remove most immunity under the CDA for a provider of an interactive computer service if the conduct underlying the claim or charge “involves the use or provision of generative artificial intelligence.” Legislation has also been considered to regulate the use of AI deepfakes and prohibit the sharing of non-consensual deepfake pornographic images.
In May 2024, members of the Congressional Subcommittee on Communications and Technology held a hearing to present the latest attempt at taming the CDA: an official discussion draft of a bill that would sunset Section 230 by December 1, 2025. At that hearing, sponsors of the bill commented that “Big Tech” must be reined in, and that ending Section 230 protection is a critical first step in doing so. According to the subcommittee meeting summary, the intent of the legislation is “not to have Section 230 actually sunset,” but to prompt Congress “to advance a long-term reform solution to Section 230.” The drafters propose a collaboration with Big Tech and others to “enact a new legal framework that will allow for free speech and innovation while also encouraging these companies to be good stewards of their platforms.” At that same hearing, House Energy and Commerce Committee Ranking Member Frank Pallone, Jr. (D-NJ) urged Congress to act now, stating that “[s]ection 230 has outlived its usefulness and has played an outsized role in creating today’s ‘profits over people’ internet.” Pallone observed that “it is only a matter of time before more courts chip away at Section 230, or the Supreme Court or technological progress upends it entirely.” Critically, Pallone believes that the prioritization of AI-generated search results by major search engines over results provided by third parties is “an intentional step outside of the shelter of Section 230’s liability shield” and should be swiftly addressed.
How We Got There
Congress passed Section 230 in 1996 to incentivize the expansion of the internet, then a relatively new technological frontier. It enacted the provision in response to case law that raised the specter of liability for any online service provider that attempted to moderate its platform, a result that discouraged the screening out and blocking of offensive material.
The key provision of the CDA, Section 230(c)(1), states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Congress hoped that new platforms, mostly protected from the nuisance of multiple suits related to third-party content or content moderation efforts, would have room to innovate and that the law would promote the growth of e-commerce and the social web. The CDA was passed in the early days of e-commerce and was written broadly enough to cover not only the online bulletin boards and basic websites that were common then, but also more modern online services, web 2.0 offerings, and today’s platforms that might use algorithms to recommend user-generated content.
Section 230 has been widely acknowledged as instrumental in allowing the modern web and e-commerce to flourish. However, many critics in Congress and elsewhere have assailed certain online content promotion and moderation practices that have allowed undesirable and fraudulent content to maintain a presence on the web, and have pointed to some well-reported instances of harmful online behavior. In addition, some critics of the CDA claim that courts are construing the CDA too broadly, particularly with respect to provider liability when users are harmed in connection with third-party content. CDA detractors have recently argued that the line between service provider and content creator has blurred in cases where providers use algorithms that recommend or repackage content for users, conflating third-party content and first-party contribution. Still, such arguments have been rejected by federal appellate courts, and in what we labeled a “close shave,” the Supreme Court declined to take up this CDA issue in a closely watched appeal involving claims against social media platforms back in May 2023. Through it all, the CDA continues to provide broad immunity for providers and users of interactive computer services against claims related to the hosting or publishing of third-party content, or where the online provider is exercising “traditional editorial functions” over third-party content.
In tandem with such cases, legislative efforts have been made to narrow the scope of Section 230, but no CDA reform bill has crossed the finish line since Congress passed the FOSTA carveout to the CDA in 2018. That legislation, the Allow States and Victims to Fight Online Sex Trafficking Act (“FOSTA”), which incorporated the Stop Enabling Sex Traffickers Act (“SESTA”), carved out claims involving sex trafficking from CDA Section 230 protections. Congress’s aim in passing FOSTA was to encourage internet companies to exercise greater responsibility over sex trafficking-related content and to give law enforcement additional tools to go after the worst offenders. And now, the sunset proposal is the latest attempt to spur Congress to come together to amend the CDA.
Final Thoughts
In an op-ed piece (subscription required; summary available here), the drafters of the CDA Section 230 sunset proposal, House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-WA), along with Ranking Member Pallone, opined that “lawmakers have tried to no avail to address these concerns….” Despite efforts to guide providers toward better practices, a group of House members, surprisingly unified on this issue (to a point), has therefore proffered that the only remaining option is to sunset Section 230 altogether.
Under the threat of repeal, the drafters of the sunset proposal, as stated in their op-ed, have now given online platforms a choice: “Work with Congress to ensure the internet is a safe, healthy place for good, or lose Section 230 protections entirely.” The likelihood of passage of this bill, particularly in an election year, is slim. Still, calls to repeal or replace the CDA have only grown louder in the last few years, and some recent court decisions have further narrowed Section 230 protections. Moreover, given the somewhat unexpected bipartisan support for the recent TikTok law and past statements in favor of curtailing the CDA made by President Biden, passage of such a drastic measure, while improbable, remains possible. Businesses and investors should therefore follow developments closely: internet users and organizations that rely on Section 230 immunities may not see the demise of Section 230 in the coming months, but they will likely witness momentum building in support of some future reform of this important law.