Author: Max Mallett

  • Pros and Cons of the EARN IT Act of 2023-2024

    What is the EARN IT Act?

    The Eliminating Abusive and Rampant Neglect of Interactive Technologies Act, or the EARN IT Act, seeks to combat the online exploitation of children. The bill replaces various statutory references to “child pornography” with “child sexual abuse material,” also known as CSAM. Activists and advocates argue that child sexual abuse material is a more fitting term than child pornography, as the inability of children to consent makes any explicit imagery containing them evidence of sexual abuse, which should be described as such.

    Combatting CSAM is an issue of particular importance. Reports of CSAM are on the rise, with the National Center for Missing and Exploited Children’s CyberTipline experiencing a 329% increase in reports of child sexual abuse material over the last 5 years. While this problem is not unique to America, our government is uniquely well-positioned to combat it, as more CSAM is hosted in the United States than in any other individual country, accounting for 30% of the world’s total. While it is often assumed that victims of sexual abuse are older children and teenagers, over 60% of CSAM involves prepubescent children, including toddlers and infants. Moreover, a large percentage of those arrested for possession of CSAM are dual offenders who have also sexually abused children. Unfortunately but unsurprisingly, victims of CSAM experience many long-term negative health impacts, from brain damage to physical health problems to elevated rates of mental disorders.

    The bill seeks to combat CSAM in a myriad of ways. First, the EARN IT Act establishes a National Commission on Online Child Sexual Exploitation Prevention, composed of various stakeholders. The Attorney General would serve as chairperson, with the Secretary of Homeland Security and the Chairman of the FTC also serving on the commission. In addition, the Senate majority leader, the Senate minority leader, the Speaker of the House, and the House minority leader would each appoint one person in each of four categories, for a total of 16 additional commission members: someone with experience criminally investigating CSAM; a survivor of CSAM or a person with experience providing victim services to survivors; someone with expertise in consumer protection, privacy, data security, or cryptography; and someone with experience working for an interactive computer service company on child safety and exploitation issues.

    The EARN IT Act would also change the reporting requirements for instances of CSAM reported to the National Center for Missing and Exploited Children, increasing the amount of information that must be collected and extending the amount of time that providers must retain the contents of the report from 90 days to one year.

    Finally, the EARN IT Act would hold the providers of interactive computer services liable for child sexual abuse material distributed through their sites. Historically, Section 230 of the Communications Act of 1934, which provides limited federal immunity to providers and users of interactive computer services, has been interpreted by courts as protecting social media providers from being held legally liable for failing to take down user-generated content. The EARN IT Act, if passed, would remove these Section 230 protections and allow interactive computer service providers to be held civilly and criminally liable “regarding the intentional, knowing, or reckless advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material.”

    Arguments for the EARN IT Act

    Proponents of the EARN IT Act argue it will help combat the distribution and availability of CSAM, and in doing so protect minors online. By removing the liability protections of Section 230, supporters believe, the bill would force technology companies to proactively seek out and remove CSAM on their sites, lest they face legal repercussions. Indeed, proponents observe that technology companies have unique resources at their disposal that even law enforcement agencies do not, and argue that they should be required to use those resources to combat CSAM. Supporters also argue that requiring online platforms to retain reported content for up to a year will give law enforcement the time it needs to investigate cases of CSAM more thoroughly.

    Furthermore, the commission would establish best practices to prevent, reduce, and respond to the online sexual exploitation of children. Proponents of the EARN IT Act argue that establishing the commission is an effective first step toward collaboration among government, industry, advocates, and victims. However, the commission is not in the House version of the bill, and the best practices are merely recommendations, not requirements. Nevertheless, supporters believe any conversation about how best to protect children against CSAM is potentially positive, whether it results in required actions or mere suggestions.

    Arguments against the EARN IT Act

    Opponents of the EARN IT Act argue that the removal of liability protections will force online platforms to abandon end-to-end encryption and actively police formerly private correspondence. With end-to-end encryption, data is encrypted on the device of the sender and decrypted on the device of the receiver; in transit the data cannot be read by intermediate entities, not even the service providers themselves. By holding service providers liable for the CSAM distributed on their platforms, opponents believe, the bill would force organizations to get rid of end-to-end encryption or suffer the legal consequences. For this reason, organizations like the ACLU and the Electronic Frontier Foundation oppose the EARN IT Act, arguing that it undermines current privacy protections.
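
    To make the mechanism concrete, below is a minimal sketch of end-to-end encryption in Python, assuming the PyNaCl library (a binding to libsodium); real messaging services layer more elaborate protocols on top, but the core property is the same: only the two endpoints hold the keys needed to read a message, so the provider relaying the ciphertext cannot inspect it.

        # Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
        # Keys are generated and kept on each user's device; the service that
        # relays `ciphertext` never has what it needs to read the plaintext.
        from nacl.public import PrivateKey, Box

        # Each party generates a keypair on their own device.
        alice_private = PrivateKey.generate()
        bob_private = PrivateKey.generate()

        # Alice encrypts for Bob using her private key and Bob's public key.
        sending_box = Box(alice_private, bob_private.public_key)
        ciphertext = sending_box.encrypt(b"a private message")

        # The provider stores and forwards only the ciphertext.

        # Bob decrypts with his private key and Alice's public key.
        receiving_box = Box(bob_private, alice_private.public_key)
        assert receiving_box.decrypt(ciphertext) == b"a private message"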

    While proponents of the EARN IT Act argue that weakening end-to-end encryption is worth it if it helps catch nefarious actors, opponents counter that determined criminals will simply encrypt their messages by other means, so the EARN IT Act would strip privacy protections from law-abiding citizens while having little effect on criminals. Moreover, some argue that the EARN IT Act could counterintuitively make prosecuting criminals more difficult: if platforms search users’ content at the government’s direction, courts may treat them as government agents and suppress the resulting evidence on Fourth Amendment grounds. They also note that similar bills that removed limited liability protections were used in only a handful of prosecutions after their passage.

    Conclusion

    The EARN IT Act replaces statutory references to “child pornography” with “child sexual abuse material” (CSAM), establishes a commission of relevant stakeholders, extends the period for which information relevant to CSAM reports must be retained from 90 days to one year, and specifies that Section 230 protections do not shield online service providers from legal recourse regarding the “intentional, knowing, or reckless advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material.” Proponents of the EARN IT Act argue it will help decrease CSAM and prosecute its perpetrators. Opponents argue the EARN IT Act will undermine end-to-end encryption and existing privacy protections for little benefit.

  • Understanding the SHIELD Act of 2023: Definitions, Impacts, and Legal Debates

    What is nonconsensual pornography?

    Nonconsensual pornography, image-based sexual abuse, and revenge porn (the most common term) all refer to the nonconsensual distribution of sexually explicit imagery. Advocates often argue that the term “revenge porn” implies victim shaming, and therefore prefer to use the term “nonconsensual pornography” exclusively. One in 25 Americans has been a victim of nonconsensual pornography; women are more likely to be victims than men, with teenagers and young adults particularly susceptible. Nonconsensual pornography is an issue of growing concern, as the global COVID-19 pandemic resulted in an increase in cyber-based intimate partner violence, including nonconsensual pornography.

    Victims of nonconsensual pornography face various harms, such as PTSD, anxiety, depression, suicidal thoughts, and other mental health issues, resulting in a decline in overall well-being. Laws addressing nonconsensual pornography vary: 46 states and D.C. have criminal statutes, and a federal statute provides a civil cause of action.

    The Stopping Harmful Image Exploitation and Limiting Distribution, or SHIELD, Act of 2023 seeks “to provide that it is unlawful to knowingly distribute private intimate visual depictions with reckless disregard for the individual’s lack of consent to the distribution, and for other purposes.” In other words, the SHIELD Act seeks to federally criminalize the distribution of nonconsensual pornography.

    Arguments for SHIELD

    Some argue that current state laws provide inadequate protections to victims of nonconsensual pornography. Not only do some states still fail to criminalize nonconsensual pornography, but the state laws that do exist criminalize it in different ways and to varying degrees. Some states require that the victim be identifiable in the distributed imagery, while others require that the perpetrator intended to cause harm, which makes it nearly impossible to prosecute perpetrators who seek financial gain or social status rather than harm. And while conduct that crosses state lines is a perpetual challenge for state laws, the fact that nonconsensual pornography is a cybercrime makes this jurisdictional issue particularly acute.

    Regardless of the efficacy of state laws, supporters of SHIELD argue federally criminalizing nonconsensual pornography would reduce incidents of nonconsensual pornography, providing victims with much-needed protections against a harmful and damaging act. In one survey, 82% of Americans who admitted to distributing nonconsensual pornography said they would not have done so had they known it was a federal felony. Indeed, deterrence theory suggests the federal criminalization of nonconsensual pornography would reduce incidents of the crime.

    Arguments Against SHIELD

    Opponents of SHIELD argue that state laws do enough to protect victims of nonconsensual pornography, even if their coverage is uneven. After all, nearly all 50 states already have laws in place, and a federal cause of action enables civil suits against perpetrators. And while 82% of perpetrators said they would not have distributed nonconsensual pornography had they known it was a federal felony, 81% said the same of a state felony, suggesting that state laws could be nearly as effective a deterrent as a federal law.

    Others, like the American Civil Liberties Union, argue that SHIELD is unconstitutional. They further argue that victims of nonconsensual pornography who share the images out of surprise or shock might unintentionally commit criminal acts, and that the law could restrain journalists from publishing images. While some of the language of the bill has been edited since the ACLU’s initial complaint, much of the specific text that initially raised concern remains. Various state laws that seek to criminalize nonconsensual pornography have been challenged in court on First Amendment grounds. Other concerns, such as the prospect of “duplicative prosecutions” under a federal law that “largely overlaps” with state laws, remain regardless of whether First Amendment challenges materialize.

    Conclusion

    Perpetrators of nonconsensual pornography can cause significant harm to their victims. Proponents of the SHIELD Act claim that making nonconsensual pornography a federal crime will decrease its incidence and shore up the patchwork of state laws currently in place. Opponents, however, argue that the SHIELD Act is both unnecessary and unconstitutional, potentially creating new problems by criminalizing an act that state laws by and large already address.