Intermediary liability protections make the best of the internet possible. The key legal protections in the U.S. that allow people to share content and connect with each other are Section 230 of the Communications Decency Act (CDA) and Section 512 of the Digital Millennium Copyright Act (DMCA). These laws provide limited protections to online platforms for content that third parties create and distribute through their services. But make no mistake: website owners and app developers are still liable if they create illegal content, knowingly host illegal content, or otherwise engage in conduct that violates federal criminal law.
CDA 230 also enables online services to restrict access to objectionable content without fear that the decision to restrict access will itself lead to legal liability. This protection for “Good Samaritans” empowers companies to do things like prevent children from being exposed to adult-themed content and take steps to address legal, but nevertheless unwanted or abusive, behavior. Internet content regulations outside the U.S. typically do not shield companies from liability when they proactively moderate in this way; this Good Samaritan clause is one of the key reasons U.S. internet companies have been so successful compared to their foreign counterparts.
What Is Section 230?
This law was enacted as part of the Communications Decency Act of 1996 to “promote the continued development of the Internet and other interactive computer services and other interactive media.”
It states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
This protects service providers and users who host or share content created by someone else.
It also states, “No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”
This enables companies, without fear of liability, to remove illegal content as well as lawful content that violates their rules.
These protections are limited. They don’t apply to:
- content created by the service provider itself
- violations of federal criminal law
- violations of intellectual property law
- violations of federal communications privacy law (the Electronic Communications Privacy Act)