
Her classmate used AI to make deepfake nude images of her. Experts say it's not uncommon.



It was the morning after Elliston Berry’s homecoming celebration in October of 2023 when the North Texas high school student got a text: nude images of her were circulating around the school.

But the images weren’t real. A classmate had used artificial intelligence to create deepfakes, photoshopping a nude body onto photos from Berry’s Instagram. She was 14 at the time.

“Waking up that morning, it was like my whole life turned upside down,” the now-15-year-old says. “I'm more than just those pictures. But in those moments, I felt like that's all people saw.”

Mental health and cybersecurity experts say what Berry went through is increasingly part of the teen experience. 

One in eight teens ages 13 to 17 personally know someone who has been victimized by deepfake nudes, according to a new report from Thorn, a nonprofit focused on children’s online safety. The report found that deepfake nude technology is becoming increasingly accessible to teens. The AI content is often indistinguishable from real images and is being used as a tool of abuse and harassment that disproportionately targets teenage girls and women.

Bullying with deepfakes is still bullying

The initial image was sent around on Monday, and by Tuesday there were more. A male student created deepfake images of eight other girls and spread them through an anonymous Snapchat alias, Berry says.

It happened off school property, and there was no existing playbook for officers to follow, leaving the girls and their families without answers for months about who the perpetrator was. Three of the affected girls transferred schools, she recalls.

Each day, Berry walked into school with anxiety about running into the unknown bully before class or on the way to volleyball practice. She was usually the kind of student who high-fived friends in the hallway and caught up with people during passing periods, but in the months after the incident, she didn’t want to leave her bedroom.

“The mental anguish on her and her personality changing was very disturbing,” says Berry’s mom, Anna McAdams. She watched her daughter change from a bubbly extrovert to an introvert. 

The impact of tech-enabled abuse mirrors that of traditional sexual assault cases — shame, guilt, fear of repercussions and potential long-term mental health effects, says Jennifer Simmons Kaleba, vice president of communications at the Rape, Abuse and Incest National Network (RAINN).

But the tech-oriented nature of the exploitation makes the abuse feel pervasive. There’s no telling how many times the images of Berry and her friends were shared, or whether the sharing ever really stopped.

“The harm is continuous, because each time the image comes up and a friend calls and says, ‘I've seen you online,’ it's a new transgression, it's a new piece of shame,” says RAINN’s Vice President of Public Policy Stefan Turkheimer.

A significant number of teens know someone victimized by AI nudes

Thorn’s survey of 1,200 young people found that one in eight teens ages 13 to 17 know someone directly targeted by deepfake nudes, while one in 17 said they were a direct victim.

“One of the big takeaways from the report is this urgency to align societally that this is not a permissible behavior, and it is harmful for the person that you're targeting,” says Melissa Stroebel, vice president of research and insights at Thorn. “We should be focusing on the non-consensual re-sharing of an explicit image, and the consequences that result from that, whether AI technology was used in its creation or not.”

While Berry told her parents about what happened, many teens suffer in silence. In Thorn’s survey, roughly half of respondents sought some form of offline support, with 34% of victims reporting they told a parent or caregiver.

“I was just a 14-year-old girl, and everyone is seeing my body, and even though it's not my body, it has the same shame and the same guilt as a real photo would have," Berry says.

The findings are part of a larger trend of generative AI’s role in creating child sexual abuse material. The National Center for Missing and Exploited Children (NCMEC) started internally tracking generative AI in the spring of 2023, according to Lauren Coffren, executive director of the Exploited Children Division at NCMEC. Reports of generative AI being used to create abusive sexual content rose from 4,700 in 2023 to 67,000 in 2024.

Teenage years are a key period of development for body image, self-esteem

Students’ middle school and high school years come during a period when they develop greater independence and abstract decision-making, and undergo significant physical changes through puberty.

Teens look to their families and friendships to start to determine who they are in relation to their larger peer group, according to Shadeen Francis, a licensed marriage and family psychotherapist who specializes in sex therapy and emotional intelligence.

“To have a trauma at this point in time, especially considering that trauma disrupts our sense of trust in ourselves and in others, it can be really disruptive to a young person's sense of self,”  Francis says.

Early experiences of abuse have long-term effects on teenagers’ ability to build healthy relationships and establish trust with significant others later in life. Victims may develop anxiety, depression and post-traumatic stress disorder.

Berry was concerned people wouldn’t know the images were fake and worries they could resurface in the future when she applies to college or jobs. Worst of all, she says, is that the images defamed her character and cast her in a sexual light without her consent. She grew up a devout Christian and says her parents taught her to never send nude images.

“I didn't want people to think, ‘Oh, this is the kind of person she was,’” Berry says. “He stripped me of my innocence and sent these images out for everyone to see me in that vulnerable, weak state.”

She wore sweatshirts and sweatpants to school to cover up her figure, because she didn’t want anyone to look at her in real life the same way they looked at those photos.

As deepfakes surge, resources and legislation are lagging

Deepfakes first came into the public eye in February 2018 when Reddit suspended r/deepfakes, a subreddit with 90,000 followers where users created fake porn photos and videos. At that point, it often took thousands of images to create a deepfake, making public figures the primary target. Today, it’s possible to create them with only a few images.

In January 2024, users created fake sexually explicit images depicting Taylor Swift at a football game. These images spread like "wildfire" despite clearly violating X’s platform policies, according to the Sexual Violence Resource Center.

McAdams says the victimization of her daughter came as a shock. “The things I heard were of movie stars or somebody in Hollywood, not a child,” she says. “I was not clued into the severity of what could be done."

The parents of victimized girls at Berry’s school filed a Title IX complaint in January 2024, and the perpetrator was charged with a class A misdemeanor for harmful distribution of material. There is currently no federal law requiring online platforms to remove non-consensual intimate images, leaving parents and schools without resources on how to handle deepfake cases, according to RAINN’s Turkheimer.

For more than eight months, McAdams couldn’t get Snapchat to take down the images. It was only after the office of Sen. Ted Cruz, R-Texas, personally reached out to Snapchat that the photo was removed, she says.

Berry is one of the advocates pushing lawmakers to pass the Take It Down Act, which would criminalize the publication of “non-consensual, sexually exploitative images, including AI-generated deepfakes.” The bill covers “digital forgeries” created with AI software and would require technology platforms to remove reported content within 48 hours of receiving a valid request.

First lady Melania Trump championed the bill, which the Senate passed unanimously, on March 3. Berry, who sat next to the first lady, shared her story with lawmakers.

"It’s heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content, like deepfakes," says Trump.


Here’s what teens, parents should know

Coffren says predators creating sexualized imagery is nothing new, but a "wild, wild west" of new deepfake technology makes the content easier to create and harder to detect.

If you are a victim of deepfake nudes, alert authorities by using NCMEC's CyberTipline at report.cybertip.org or by calling 1-800-843-5678. Experts say it’s important not to delete the apps or accounts where the abuse occurred, because they can provide a helpful trail of evidence.

Parents should open up conversations about deepfakes – both the seriousness of creating the images and the potential for being harmed by them – in the same way they speak to their children about consent and traditional sexual assault.

McAdams says she was “behind the ball” on knowing about deepfake images but is thankful her daughter felt comfortable enough to come to her. She urges other teens to do the same if it happens to them.

“Talk to your kids about this stuff. They've got to know that they can come to you and not be fearful,” McAdams says.

Rachel Hale’s role covering Youth Mental Health at Paste BN is supported by a partnership with Pivotal Ventures and Journalism Funding Partners. Funders do not provide editorial input. Reach her at rhale@usatoday.com and @rachelleighhale on X.

Swapna Venugopal Ramaswamy contributed reporting.