
US Survivors of Online Sexual Abuse Face Legal Gaps and Inaction by Tech Companies, New Report Finds
NEW YORK, Jan. 28, 2026 /PRNewswire/ -- A new report featuring accounts from survivors of online sexual exploitation and abuse (OSEA) in the United States exposes critical gaps in legal protections and widespread failures by the criminal justice system, regulators, and major US-based tech companies to respond effectively. Online sexual exploitation and abuse in the United States: An analysis of policy gaps, system response and prevention mechanisms through survivor-lived experiences, by Equality Now and the Sexual Violence Prevention Association (SVPA), draws on survivor insights and expert legal analysis to identify harms arising from OSEA and provide recommendations for legal and policy reform.
Survivors described navigating a confusing patchwork of federal and state laws that make access to justice and support difficult, while tech companies face minimal legal obligations to act decisively. Online platforms are often slow and ineffective in responding to requests to remove abusive content, leaving non-consensual sexual material to circulate indefinitely, causing ongoing harm and retraumatization.
Every survivor interviewed suffered repeated reposting of abusive material across multiple platforms, and none succeeded in having content removed entirely. No perpetrators were held fully accountable, and for those who faced any consequences, these were limited to the initial posting of material and failed to reflect OSEA's ongoing nature and the revictimization it causes.
Gaps and inconsistencies in laws on online sexual exploitation and abuse
Legal protections against OSEA in the US are split between federal and state systems, each with its own laws, courts, and areas of authority. Federal laws on tech-facilitated abuse are not comprehensive, and while states provide additional protections, the jumble of laws across jurisdictions complicates cases and creates legal loopholes. Concerningly, there are virtually no laws spanning international borders that address OSEA involving adult victims.
State laws are often unclear and vary widely, resulting in inconsistent responses and protections that depend on where someone lives. Poor coordination and communication between states and across levels of government further undermine survivors' access to timely, meaningful assistance.
OSEA often involves multiple platforms and offenders in different jurisdictions. This creates confusion about which laws apply and which authorities have the power to act, and survivors lack knowledge about how to report violations and preserve evidence.
Only 45 states have updated their laws to cover AI-generated child sexual abuse material (CSAM), while protections for adults lag even further behind. State-level coverage remains patchy, with penalties for nonconsensual AI-generated or computer-edited sexually explicit materials - often called deepfakes - ranging from misdemeanors to felonies.
Research and reporting focus primarily on CSAM, leaving adult survivors largely invisible and contributing to inadequate responses from lawmakers. Better data collection and reporting are urgently needed to ensure research and legislation keep pace with technological developments.
Holding tech companies accountable for abuse on their platforms
No US statute expressly requires US-based tech companies to maintain user safety and transparency, and only limited duties are placed on them to prevent and protect against tech-facilitated abuse. Policies must be created and enacted to hold tech companies accountable for the nonconsensual publication and distribution of OSEA content on their platforms.
The "Big Five" US-based corporations - Alphabet (Google), Amazon, Apple, Meta, and Microsoft - dominate global digital markets and control the primary platforms through which OSEA occurs. In 2020 alone, their combined market value reached $7.5 trillion, giving them unprecedented power to shape global standards for digital safety, content moderation, and transparency.
Survivors described multiple challenges when dealing with tech platforms. They found it difficult to locate reporting systems, responses were inconsistent and inadequate, and moderation or content removal erased critical evidence, undermining legal investigations.
Interviewees developed informal routines to monitor digital spaces and submit "takedown requests." These must often be filed repeatedly, sometimes daily, and survivors can wait months or years for a response. In several cases, platforms eventually said no action could be taken.
One interviewee, Izzy, and her partner sent intimate images to each other via Snapchat. Her account was hacked, and her images, name, and address were sold to pornography websites. Izzy's family was sent the content and threatened that it would be circulated further unless they paid. Izzy recalled Snapchat's response: "Within their community guidelines, they say you're not supposed to take any sexually explicit pictures of yourself, so if anything does happen to you, that's your fault. It genuinely made me sick to my stomach how dismissive they were!"
Equality Now's Anastasia Law explains, "US laws have failed to keep pace with the realities of tech-facilitated sexual abuse, and survivors are paying the price. With no US federal statute requiring tech companies to ensure user safety or transparent reporting systems, survivors must navigate outdated laws, inconsistent responses, and repeated obstacles when trying to take down abusive material or hold perpetrators accountable."
"Lawmakers must act to strengthen state and federal laws, with clear policies governing consent and the online distribution of sexual material in an increasingly borderless digital world. US-based tech companies need to be held fully accountable for the non-consensual publication and spread of sexually explicit content on their platforms."
Survivors face systemic failures when reporting tech-facilitated sexual abuse
Every survivor in the study who formally reported their abuse found the experience overwhelmingly negative. They had to educate themselves about complex legal systems, sort evidence, and coordinate between platforms and agencies. Law enforcement officials were uncertain about "takedown request" procedures, handling digital evidence, and evidence-collection protocols, including obtaining warrants, issuing subpoenas, and determining the admissibility of online materials.
This extended to prosecutors, attorneys, judges, and even victim advocates. Criminal justice professionals typically didn't know the relevant statutes, with survivors often required to identify remedies and coordinate between the criminal and civil systems.
Survivors frequently encountered victim-blaming, and their experiences were often dismissed or minimized. Many were questioned about their consent and credibility, portrayed as overly emotional or unreliable, and excluded from key decisions affecting their cases.
Online sexual exploitation and abuse harm survivors in multiple ways
OSEA survivors face an increased risk of physical harm, including domestic abuse, sexual violence, and human trafficking. Research participants also experienced significant emotional harm, including hopelessness and depression, with four having suicidal thoughts.
Samantha had a video posted online of her being raped by a police officer. She explained, "It's one thing that the attack happened, but then, when it was shared to be rewatched over and over again, and I had no control over how far it was reaching, or how many people were viewing it, or who had access to it… It emotionally was just horrifying."
Interviewees described living in constant fear of meeting people who've seen the abusive content, or that more material could surface. They also reported profound feelings of betrayal and trust issues with partners and others.
Another burden was the financial cost. Many survivors paid for legal services, but for some, it was unaffordable, leaving them to navigate complex legal processes alone. Several spent thousands on mental health counseling, and some incurred costs from relocating due to safety concerns.
Three interviewees lost their jobs, while four others lost career opportunities. Izzy was harassed on LinkedIn by people who'd seen her on pornographic sites, and she now pays $1,000 monthly to a private company to remove OSEA content.
Katie Knick from SVPA concludes, "Online sexual exploitation and abuse is a form of systemic sexual violence rooted in misogyny, racism, and other intersecting oppressions. While technology shapes how the harm occurs, prevention depends on dismantling rape culture and reducing power imbalances through education, policy reform, and institutional accountability."
"Our research underscores the need for survivor-centered systems, including free legal representation, trauma-informed mental health care, specialized professional training, and clear pathways for reporting and removal of abusive material. Sustainable prevention requires accountability and policies informed by the voices and leadership of survivors with lived experience."
Contact:
Tara Carey, Equality Now
+44(0)7971556340
[email protected]
SOURCE Equality Now