IA Calls On Court To Affirm Platforms Can Enforce Community Standards Thanks To CDA 230
“This case demonstrates why Section 230 is so important in empowering sites to remove harmful content while fostering open discourse. The law exists precisely so that services like Vimeo can set community guidelines to protect their users without an endless stream of lawsuits litigating those decisions.”
Elizabeth Banker, IA Deputy General Counsel
Washington, DC — Today, Internet Association (IA) filed an amicus brief in Domen v. Vimeo, Inc. supporting Vimeo’s decision to enforce its community guidelines under Section 230, which empowers online platforms to moderate content. In this case, Vimeo removed Mr. Domen’s video after warning him that the video violated the platform’s policy against promoting sexual orientation change efforts. While Mr. Domen claims Vimeo’s actions constitute unlawful censorship, Section 230 empowers platforms like Vimeo to create and enforce community guidelines and to engage in robust content moderation without fear of facing liability for doing so. IA’s brief calls on the U.S. Court of Appeals for the Second Circuit to uphold the district court’s dismissal of the case under Section 230.
“Section 230 enables online platforms to create safe and enjoyable online experiences for their users through robust codes of conduct and community guidelines,” said Elizabeth Banker, IA Deputy General Counsel. “This case demonstrates why Section 230 is so important in empowering sites to remove harmful content while fostering open discourse. The law exists precisely so that services like Vimeo can set community guidelines to protect their users without an endless stream of lawsuits litigating those decisions.”
In support of defendant Vimeo, Inc., the brief outlines how Section 230 ensures that internet sites can develop and deploy tools for proactive content moderation without creating additional liability. Highlights from the brief include:
- The district court correctly held that Section 230 protects platforms’ ability to enforce their codes of conduct without becoming the “publisher” of the removed content. From the filing:
- “Here, the district court correctly ruled that Vimeo satisfies the statutory elements of both Sections 230(c)(1) and (c)(2)(A) and is thus immune under both provisions from appellants’ claims challenging its decision to remove their content and suspend their accounts.”
- Because of Section 230, platforms can craft community guidelines that keep harmful content, such as discrimination, off their sites. From the filing:
- “When deciding whether Section 230(c)(1) immunity applies, what matters is not how plaintiffs choose to characterize their claims—e.g., as being ‘based on Vimeo’s discriminatory ban,’ Appellants Br. 14—but simply whether the cause of action seeks to impose liability for ‘publishing conduct.’”
- Ruling for Domen in this case would put “the entire enterprise of the internet at risk.” From the filing:
- “If providers faced the threat that each content moderation decision might, at any point, subject the provider to the immense costs of discovery, the cost of running their platforms could grow exponentially, putting the entire enterprise of the internet at risk.”
To read the full brief, click here.
###