On March 11, Internet Association (IA) Deputy General Counsel Elizabeth Banker testified before the Senate Judiciary Committee on internet companies’ efforts to proactively identify and remove child sexual abuse material (CSAM) and how the EARN IT Act could impede those efforts.
In a hearing on the EARN IT Act, Banker outlined what internet companies are doing to protect children online and discussed the best ways to further those efforts through collaboration with law enforcement and the National Center for Missing and Exploited Children (NCMEC). Banker also outlined IA’s concerns that the EARN IT Act would undermine IA member companies’ efforts to detect and prevent crimes against children.
In her opening statement, Banker highlighted IA members’ strong commitment to protecting children and outlined IA’s concerns about the EARN IT Act.
“Thousands of people working on these issues at IA member companies are extraordinarily committed to fighting this content. …Today IA member companies, alongside governments, civil society and other stakeholders, continually work to stop the spread of child exploitation.”
“IA has concerns the bill would hinder existing efforts by internet companies to keep their platforms safe. Or worse, it would undermine the efforts of law enforcement to hold bad actors to account. Our chief concern is the bill would exacerbate a growing trend among criminal defendants charged with child exploitation in which they attempt to suppress evidence by arguing that providers who proactively detected child exploitation and reported it to NCMEC acted as agents of the government for Fourth Amendment purposes…this would create a worst-case scenario in which criminal defendants could not be prosecuted using the evidence found through provider detection efforts…”
Banker elaborated on this worst-case scenario in an exchange with Sen. Lee (Banker’s full remarks can be found here).
Key Excerpts:
IA Member Companies Are Committed To Substantial Efforts To Combat Child Sexual Abuse Material
“…IA and its member companies share the goal of eradicating child exploitation online and offline, and our member companies strive to end child exploitation online. They take a variety of actions, including dedicating engineering resources to the development of tools like PhotoDNA and Google’s CSAI Match, assisting in the modernization of the CyberTipline through donations of engineering resources or funds, and engaging with law enforcement agencies.
“Many companies proactively detect and then report instances of CSAM to NCMEC. IA supported the CyberTipline Modernization Act of 2018 to strengthen engagement between NCMEC, the public, and the internet sector and to improve law enforcement’s capabilities in the fight to combat child exploitation online and offline. These are just a fraction of the steps that IA companies take to make the online and offline world a safer place.”
The EARN IT Act Would Create Numerous Problems And Hinder Efforts To Combat Child Exploitation Online
“IA is concerned that the EARN IT Act would burden, discourage, or even prevent, ongoing efforts by internet companies to keep their platforms safe and to identify and remove abusive content. It also would undermine the efforts of law enforcement, and nongovernmental organizations like NCMEC, to hold bad actors to account and combat Child Sexual Abuse Material online.”
Strong Encryption Is Key To Protecting National Interests
“…Requiring companies to engineer vulnerabilities into their services would make us all less secure. Encryption technology stands between billions of internet users around the globe and innumerable threats—from attacks on sensitive infrastructure, including our highly automated financial systems, to attempts by repressive governments to censor dissent and violate human rights. Strong encryption is key to protecting our national interests because encryption technology is an essential proactive defense against bad actors.”
Section 230 Has Important Benefits And Allows Punishment of Bad Actors
“Section 230 empowers internet companies to identify and remove CSAM and other illegal or harmful material. Section 230 was enacted in response to a court decision that exposed internet companies to liability based on their efforts to block objectionable third-party content. Congress feared that if internet companies could be liable due to their monitoring and moderating objectionable content, they would be discouraged from performing even basic safety removal to avoid incurring liability. Section 230 removes this disincentive to self-regulation by shielding service providers from claims that would hold them liable due to their attempts to moderate certain content.”
Further reading:
- Banker’s full written testimony.
- IA’s statement on the EARN IT Act.
- More information on the importance of Section 230 and content moderation.