The limits of one of the most fundamental laws of the internet, Section 230 of the Communications Decency Act, are once again being tested. As a reminder, Section 230 is what makes freedom of speech on the internet possible: it grants website owners and operators immunity from liability for content posted by users. Without Section 230, the internet as we know it would not exist, because any site built on user-generated content (which is, basically, all of them) would be too risky to operate. As important as Section 230 has been to the growth of the internet, it is not without its faults. In recent years it has been at the center of much debate, and reform has come to seem all but inevitable. A recent ruling by the Ninth Circuit Court of Appeals in Lemmon v. Snap may be an early indication of courts narrowing the broad immunity that Section 230 has historically provided.
Social media has experienced unprecedented growth in popularity and usage since its inception, owed in large part to Section 230 of the Communications Decency Act. Unlike their print counterparts, internet publishers enjoy a heightened level of freedom and immunity under Section 230 for the content they publish. It is Section 230 that gives social media companies, large and small, the ability to manage or host third-party content without fear of lawsuits. Section 230 ensures that these companies, as intermediaries, will not be liable for their filtering decisions; it allows them to set their own standards and delete or modify content they consider obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, regardless of constitutional protections. But Section 230 also protects them if they decide not to filter such content. Recent events, however, suggest that Section 230's until-now all-encompassing shield could soon be less so.