Congress considers bill that would make sharing deepfake porn illegal

Last year, there were more than 21,000 deepfake pornographic videos online, up more than 460% over the year prior. But Congress could soon make it illegal to share the doctored images.

Leading the charge are New Hampshire Sen. Maggie Hassan, a Democrat, and Texas Sen. John Cornyn, a Republican, who co-authored bipartisan legislation aimed at cracking down on people who share non-consensual intimate deepfake images online. The legislation proposes criminal penalties that include a fine and up to two years in prison, as well as civil penalties of up to $150,000.

“It’s outrageous,” Hassan said. “And we need to make sure that our laws keep up with this new technology and that we protect individuals.”

Breeze Liu said she was shocked when a friend discovered her face superimposed on pornographic images.

“And I really feel like my whole world fell apart at that moment,” said Liu. “You have to look at how many views are there, and how many people have violated you. I just didn’t want to live anymore, because the shame was too, too much for me to bear.”

Liu, who said she knew who the perpetrator was, decided to take her case to police.

“The police did not really do anything about it,” said Liu. “The police actually called me a prostitute. They slut shamed me.”

Liu said that when law enforcement didn't pursue the issue, the perpetrator created more deepfakes of her, which spread to more than 800 links across the internet. Liu said the FBI is now investigating her case, and she is also part of a class-action lawsuit against Pornhub.

Pornhub told CBS News it swiftly removes any non-consensual material on its platform, including deepfakes. The site also said it has protocols in place to prevent non-consensual material from being uploaded.

People have also created artificially generated intimate images of celebrities like Taylor Swift. In January, the social media site X disabled searches related to the singer in an effort to remove and stop the circulation of deepfake pornographic images of the pop superstar.

Teens across the country are also grappling with the increasingly common problem. Some students are creating deepfake pornographic images of fellow students and spreading them among friends and family members, sometimes even extorting the victims. In New Jersey earlier this year, a teenager sued a classmate, alleging that the classmate created and shared AI-generated pornographic images of the teen and others.

Hassan said Congress is working toward criminalizing the creation of non-consensual intimate images.

“There is work going on in Congress right now about how to set up this kind of guardrail, but what we know is that most people don’t know about the deepfake that exists until somebody tries to distribute it, right? So we wanted to really attack this problem at the point where it becomes obvious and somebody is likely to take action,” Hassan said.

Cornyn said that while it could take months to get the bill through the Senate, he’s confident it will pass with bipartisan support.

“We’re not going to take our foot off the gas pedal,” Cornyn said. “We’re going to continue to press this issue, because then, as long as the bill is not out, there are people taking advantage of the absence of this sort of punishment to exploit people using these deepfakes.”

In the meantime, Liu created a startup called Alecto AI to help others quickly identify and remove deepfakes they find of themselves online.

“I came to the conclusion that unless I change the system, unless I change the world, justice wouldn’t even be an option for me,” she said.
