Lawmakers in at least 22 states have introduced misguided legislation that purportedly seeks to protect their residents from being censored. Most of these bills, if enacted, would prevent providers of online services from taking critical actions to protect those same residents from clearly harmful content.
Though the bills vary in their particulars, they share a common thread: restricting or punishing providers for moderating content on their platforms. Some would limit the types of content that providers are able to remove; others would create private rights of action enabling users to sue providers for removing content.
Despite the differences, these bills would all have similar results if they were to succeed: they would expose online users in these states to a range of harmful content by prohibiting or punishing service providers who adopt and enforce safety rules. Under some of the bills, a provider could no longer carry out safety measures so commonplace that they are taken for granted, like filtering out massive amounts of spam, phishing, scams, malware, and other potentially harmful material that could leave consumers victims of fraud, identity theft, or other tangible harms from compromised devices or personal data. Literally billions of messages and posts that previously would have been blocked would litter the services we use every day. Some bills would even restrict or disincentivize providers from addressing dangerous activity like attempts to sell opioids or to promote eating disorders and self-harm to teens. These are things providers work diligently to identify and remove today.
The lawmakers behind these bills claim these measures are necessary to ensure diverse online discourse. But they forget that before online services gave the general public the ability to publish their own thoughts and opinions, ordinary people had few options for sharing their views broadly, and most of those were out of reach unless they could get a letter to the editor printed or a book published. Online services have, without question, empowered MORE voices to be heard, MORE diverse opinions to be shared, and MORE information to be available at our fingertips than any other communication tool in human history. The measures lawmakers are championing in the name of diverse discourse target the very online services that make this discourse possible and threaten to upend essential elements of their success.
This seems like cutting off your nose to spite your face. Even as the pandemic batters our economy, local officials across these states are trying to attract new jobs and investment to emerging tech and startup hubs. Job creators are likely to think twice about the long-term business climate of states seeking to create a private right of action to punish the very services they are trying to recruit.
It is unfortunate that at this critical point in the pandemic, with Americans in desperate need of help securing basic necessities, state lawmakers are spending their energy on proposals that are purely rhetorical vehicles perpetuating the false narrative that individuals have the right to say whatever they want, wherever they want, online. These measures are a distraction from other priorities, and if passed, they will face inevitable lawsuits challenging their legality. Lest there be any doubt, these measures face an insurmountable hurdle in the form of federal law, which preempts inconsistent state laws in this area, and the First Amendment, which prevents state legislatures from passing laws that interfere with service providers' right to do exactly what these bills seek to take away: set and enforce the rules for what content is and is not allowed on their services.
IA member companies share lawmakers' interest in promoting diverse online discourse, but believe it is imperative that online safety for all users not suffer as a result.