Image-Based Abuse


Image-based abuse (IBA) is a rapidly growing problem that profoundly impacts victims. It involves the creation or sharing of intimate images of a person, real or synthetic, without their consent.

Continuing our work to shift the way the public views and responds to gender-based violence, Joyful Heart is mounting a nationwide advocacy effort to address this pressing issue. We believe that everyone deserves to live a life free from degradation, dehumanization, and abuse.

Our Approach

Joyful Heart Foundation aims to create a world where people live free from violence and abuse. Everyone deserves control over their own body and image. Our response to online exploitation includes creating tools legislators can use to update state laws, providing guidance on federal legislative approaches, developing awareness and prevention materials, and ensuring that those with lived experience are heard and afforded rights and services.

If you would like to learn more about how your community or state can address this rapidly spreading problem, email policy@joyfulheartfoundation.org.

Scripps Family Impact Fund Awards Joyful Heart Foundation Over $500,000 to Develop Image Based Abuse Initiative 

Prevalence and Impact on Survivors

Image-based abuse is a widespread global problem that is growing rapidly. In the US, 1 in 12 adults report being victims of image-based abuse, and real rates could be much higher: 1 in 3 people report being victims in the UK and Australia. The FBI recently warned that young boys are often the victims of sextortion. Survivors of image-based abuse report feeling violated, humiliated, and unsafe. They often suffer consequences ranging from depression, fear, and anxiety to reputational damage, job loss, and social isolation. Some survivors have died by suicide after discovering deepfake videos made with their likeness. Whatever form the abuse takes, its effects on survivors are devastating and can last for years.

My Image My Choice reports that, as of January 2024, there were 276,149 deepfake videos online with a total of 4,219,974,115 views, a 1,780% increase compared with 2019.

According to a 2023 report by Home Security Heroes, 98% of deepfake visuals are pornographic in nature, and 99% target women. Many of these altered images target underage girls.  

A 2024 report from Panorama Global found that individuals between the ages of 20 and 29 were twice as likely as those over 40 to be victims of image-based sexual abuse.

Professor Danielle Citron, of the University of Virginia School of Law, asserts there are more than 9,500 sites devoted to this type of offense. 

Forms of Image-Based Abuse

  • Non-Consensual Distribution of Intimate Images (NDII): NDII happens when someone shares private photos or videos of another person without their consent. The images may have been taken consensually in the first place, or they may have been obtained and used nonconsensually. It’s a serious violation of privacy that can cause severe distress and have a major impact on a victim’s mental health, relationships, and career. Unfortunately, beyond the initial harm, these images can lurk online indefinitely and sustain that harm for years.
  • Synthetic Intimate Images (a.k.a. deepfakes, fake nude apps, digital forgeries): Synthetic intimate images are created when someone uses AI to insert a person’s face into explicit content without their consent. Even though they are not “real,” they are a major personal violation and can have a very real impact on victims, triggering emotional distress, isolation, professional setbacks, relationship challenges, and even threats to personal safety.
  • Sextortion: Sextortion happens when someone threatens to share real or synthetic private and sensitive material, such as photos or videos, unless the victim gives in to their demands, for example for money or sexual favors. It’s a form of blackmail that can be used by intimate partners, exes, or online scammers. Victims can become overwhelmed by fear, shame, isolation, anxiety, depression, and trust issues. Sadly, some young men have died by suicide after being sextorted.
  • Cyberflashing: Cyberflashing is when someone sends unsolicited and unwanted explicit images via digital devices, often as a form of harassment. Like physical flashing, cyberflashing can cause immense discomfort, fear, anger, and shame, especially when the perpetrator hides behind anonymity. If you have been violated before, it can also trigger difficult memories of past trauma.

Resources for Survivors

  • Cyber Civil Rights Initiative (CCRI) Helpline supports victims of nonconsensual pornography, sextortion, deepfakes, and other forms of image-based sexual abuse. Call 1-844-878-2274 or visit cybercivilrights.org for free advice 24/7. 
  • Revenge Porn Helpline is a UK service that supports adults who are experiencing intimate image abuse.
  • EndTAB provides training for organizations to help address online abuse. Visit endtab.org to learn more and schedule a consultation call. 
  • Stop NCII is a free tool designed to support victims of nonconsensual intimate image abuse. It creates a hash (a digital fingerprint) of your intimate image or video on your own device, and participating companies use that hash to detect and remove matching images; the image itself is never uploaded (see the sketch after this list). Visit stopncii.org for help.
  • C.A. Goldberg, PLLC is a law firm representing victims of online harassment, cyberstalking, and revenge porn.
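
For readers curious about how hash-matching tools like Stop NCII work under the hood, here is a minimal illustrative sketch in Python. It is not Stop NCII’s actual implementation (Stop NCII uses the PDQ perceptual hash, which is not shown here); it assumes the third-party Pillow and imagehash packages, and the usage names at the end are hypothetical.

    # Illustrative sketch of hash-based image matching.
    # Requires: pip install Pillow imagehash
    from PIL import Image
    import imagehash

    def fingerprint(path: str) -> imagehash.ImageHash:
        # Compute a perceptual hash: a short digital fingerprint.
        # Only this fingerprint would ever be shared with a matching
        # service; the image itself never leaves the device.
        return imagehash.phash(Image.open(path))

    def is_match(hash_a: imagehash.ImageHash,
                 hash_b: imagehash.ImageHash,
                 threshold: int = 8) -> bool:
        # Perceptual hashes tolerate resizing and re-encoding, so a
        # re-uploaded copy of an image produces a nearby hash. The "-"
        # operator returns the Hamming distance (number of differing bits).
        return (hash_a - hash_b) <= threshold

    # Hypothetical usage: a participating platform compares each upload's
    # fingerprint against the list of victim-submitted hashes.
    # if any(is_match(fingerprint("upload.jpg"), h) for h in submitted_hashes):
    #     flag_for_review("upload.jpg")

The design point worth noting is privacy: the matching service stores only short fingerprints, never the images themselves, yet near-duplicate copies of an image can still be detected.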

Resources for Child Exploitation Survivors 

State of the Law

  • Currently, the laws on the books at the state level are inadequate to fight this abuse, and there is no comprehensive law at the federal level. It's clear the U.S. needs a comprehensive legislative approach to image-based abuse.
  • Almost every state has a law banning the dissemination of an intimate image of someone without their consent.
  • Only a handful of states have laws that address deepfakes.
  • There is no federal criminal law prohibiting nonconsensual disclosure of intimate images.
  • There is a federal civil cause of action for NDII, but it does not cover deepfakes.

Follow legislative priorities here.
