Teens Share Nightmare Created by AI-Generated Explicit Images

Sen. Ted Cruz (R-Texas) introduced a bipartisan bill that would make it a crime to post nonconsensual intimate images, including AI-generated images, on websites.

DALLAS—Texas high school freshman Elliston Berry remembers waking up to a barrage of phone messages on a Monday morning this past October.

Her friends in the small town of Aledo, Texas, were sounding the alarm, and the news was not good.

Photos taken from Instagram had been altered by an artificial intelligence (AI) app to depict her and eight friends nude, and the images were being shared through Snapchat. Eventually, most teens at Aledo High School saw them.

Elliston, now 15, recalled her mind racing with fear, shock, and disgust. She was ashamed to tell her mother.

That same month, almost 1,600 miles away at Westfield High School in New Jersey, then 14-year-old Francesca Mani and several other high school girls discovered AI-generated nude images of themselves circulating online.

Both girls told their stories during a Senate field hearing arranged by Sen. Ted Cruz (R-Texas) on June 26 concerning the growing problem of nonconsensual intimate imagery, also called “revenge porn,” online.

“I initially felt shocked, then powerless and angered by the absence of laws and school policies to protect us,” Francesca testified. Sitting next to her at the hearing, Elliston added, “I was left speechless as I tried to wrap my head around the fact that this was occurring.”

Both had to fight to get the images removed, while little to nothing happened to the high school boys responsible for creating the images, the girls said.

Mr. Cruz was part of a bipartisan group that introduced the Take It Down Act to combat the use of sexual images, including computer-generated ones, to malign and even blackmail targets.

The bill would make it a federal crime to knowingly publish, or threaten to publish, nonconsensual intimate imagery, including fake pornographic content, on social media or elsewhere online in interstate commerce.

Penalties include jail time.

The bill would also require websites to remove fake images or videos within 48 hours of being notified by a victim and to make “reasonable efforts” to remove copies of the material.

That component is seen as vital because tech companies are often slow to respond, or fail to respond at all, to parents and teens trying to get images removed.

Teenagers Elliston Berry (L) of Aledo, Texas, and Francesca Mani of Westfield, N.J., testified that they were victims of nonconsensual intimate imagery generated by artificial intelligence at a hearing in Dallas on June 26, 2024. (Darlene McCormick Sanchez/The Epoch Times)

Testimony of Aftermath

Elliston and her mother, Anna McAdams, tried for eight months to get Snapchat to take the images down, but nothing happened until the family contacted Mr. Cruz for help.

“It shouldn’t take a senator having to intervene to get them down,” Mr. Cruz said.

The Federal Trade Commission (FTC) would have the power to enforce penalties on websites that do not comply, he said.

The bill is tailored to criminalize knowingly published pornographic images without chilling lawful speech.

Mr. Cruz said the bill is needed to address the sexual exploitation of minors through technology, emphasizing the importance of creating a solid deterrent to stop the spread of nonconsensual images of minors.

Both teens testified that the perpetrators, who were male classmates, were protected by the system even as the girls were being traumatized.

Francesca, now 15, said she decided to speak out publicly to stop the online abuse that she and others experienced.

She said her school and others need to update their policies on AI-generated images. Only one of the boys responsible for posting fake images of Francesca was punished, receiving a one-day suspension.

The boys were even allowed to attend the same classes as she did, she said.

Social media companies could easily take sexualized content offline but refuse to do so in many cases and need to be held accountable, she said.

“We girls are on our own, and considering that 96 percent of fake AI victims are women and children, we’re also seriously vulnerable, and we need your help,” Francesca said.

“Without Senator Cruz’s bill, we’ll continue to have teens making AI deepfake images of girls.”

The Epoch Times contacted Snapchat for comment.

In May, the social media company updated its community guidelines to specifically address sexual content.

“We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or the sexualization of children,” the policy states. It goes on to say that child sexual exploitation is reported to police.

A Parent’s Perspective

Ms. McAdams, who also spoke at the hearing, said it took persistent effort to get the school to act against the male classmate who created the images that harmed her daughter. While pop star Taylor Swift was able to get social media giants to quickly remove sexualized images of her, it was a different story for these high school girls, she said.

The male classmate told others he decided to “ruin the girls” and “go out with a bang,” which made the situation more frightening given today’s environment of school shootings, she said.

Her daughter and the other girls stayed home for most of that week but returned to school the Friday after the incident that October. The school went into lockdown after word spread that “someone was after them.”

The boy responsible for the “deepfake” images went online during the lockdown to “continue to terrorize” the girls, which allowed the school’s tech team to catch him, she said.

“These girls were 14 years old. Their innocence was taken that day in just 24 hours,” Ms. McAdams said.

Because the perpetrator was a minor, neither the school nor the police would give his name to the families of the girls.

They only learned his identity after asking the school to investigate the incident as a violation of Title IX, which covers discrimination based on sex at schools receiving federal dollars.

The school administrators suspended the perpetrator, though Ms. McAdams said she and her husband pleaded with the school board to expel him.

Eventually, the boy withdrew from Elliston’s school.

He was charged with a Class A misdemeanor but given probation. Once he turns 18, his record will be deleted, Ms. McAdams said.

“He will walk away unscathed. However, our girls will forever live in fear that when they apply for a job or go to college, these pictures might resurface,” she said.

Mr. Cruz later told reporters that people need to understand that these are not victimless crimes and that the harm stays with them their entire lives.

The bill’s enforcement mechanism for Big Tech is modeled after copyright laws, he said, allowing the FTC to impose “significant penalties” on companies that don’t comply.

The Texas senator said he was confident the bill would receive support from both sides of the aisle if it gets out of committee. He hopes it will become law by the end of the year.

The law would give parents tools to combat harm befalling their children on social media, he said.

Social media can trigger depression and anxiety in children, he said.

“There’s content that promotes self-harm, that promotes suicidal ideation, there’s content that promotes substance abuse,” he said.

Andrea Powell, an advocate and expert on sexual exploitation, testified that she has met with more than 50 survivors of internet sexual violence, including minors.

The bill’s requirement of quick content removal would fix one of the most significant problems victims face, she said.

Girls aren’t the only ones being victimized.

A recent report showed “sextortion” schemes targeting teen boys are on the rise.

The National Center for Missing & Exploited Children reported receiving an average of 812 sextortion reports per week between August 2022 and August 2023. More than two-thirds of those reports were “financially motivated,” according to the report.

Sextortion occurs when someone threatens to expose sexual images of another person unless he or she yields to demands.

Stefan Turkheimer, vice president for public policy at the Rape, Abuse & Incest National Network, testified that sextortion is a growing problem affecting children.

Most callers to his group’s hotline are underage victims, and their stories sometimes involve sexual assault that occurred after sexualized photos were published, he said.

Children are often tricked into creating sexual images online by predators posing as children themselves.

Some children end up taking their own lives after getting caught up in the schemes, he said.

“A lot of these kids don’t realize that there are ways back from this, and so they’re taking their own lives,” he said.

Aldgra Fredly contributed to this report.

