
Sen. Ted Cruz details his proposed, first-of-its-kind bill to protect victims of deepfake pornography

In a sit-down interview, Cruz discusses how his proposed 'TAKE IT DOWN Act' would require social media platforms to remove deepfake images at victims' request.

DALLAS — AI technology is expanding so rapidly that our laws can’t keep up. 

And one of the largest gaps in protection involves intimate deepfake imagery -- both still images and video. 

"You’ve always had the ability to do, say, a photoshop,"  U.S. Senator Ted Cruz told Inside Texas Politics. "But the past versions of it have been pretty technologically clunky. What AI enables you to do, what deepfakes can do, is create either images or videos that anyone watching would believe was real."

Inspired by a teenage victim from Aledo, the Texas Republican has introduced bipartisan legislation that would protect victims of non-consensual intimate imagery (NCII), both real and deepfake.

In October 2023, a fellow student at Aledo High School took Elliston Berry’s Instagram photos and plugged them into artificial intelligence, creating fake nude images of the then-14-year-old girl.  

For more than eight months, Berry's family couldn't get Snapchat to take down the images.  

After Senator Cruz made a call to Snapchat, however, the family says the images were removed within 24 hours. 

It should be said: This wasn’t simply attaching a head to someone else’s body in a photo. AI technology can now take someone’s actual body, strip away the clothes and create an approximation of a naked victim. 

"That’s a violation when you have a computer basically saying, 'This is what this child looks like naked,'" said Senator Cruz. "That is designed to embarrass, to violate, to exploit the victim." 

Berry and her mother Anna McAdams recently took part in Senator Cruz’s first-ever field hearing at the University of North Texas at Dallas in support of legislation the Republican just introduced on this subject. It's called the "TAKE IT DOWN Act," which stands for "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks". 

 If it is signed into law, it would: 

  • Criminalize the publication of NCII or the threat to publish NCII in interstate commerce 
  • Require platforms and websites to take down NCII within 48 hours of notice from the victim, in addition to making reasonable efforts to remove copies of the images 
  • Protect good faith efforts to assist victims 

"We know that what we’re seeing is the tip of the iceberg," Cruz told us. "We have seen that the victim advocacy groups are reporting thousands upon thousands of instances."

The Republican is working closely with Senator Amy Klobuchar, D-Minnesota, to get the legislation across the finish line.  

In total, seven Republicans and seven Democrats are co-sponsors of the bill, which Cruz hopes will help it gain momentum to get signed into law by the end of the year. 

While the legislation would hold Big Tech accountable, the Senator says the industry has not been lobbying against it so far. 

"I don’t know if they will push back or not," he said. "I hope they don’t. I mean, this is the right thing to do. If there’s any modicum of corporate responsibility and integrity, the right thing for the tech companies to say is, 'We want this bill because it’s a rule that makes sense.'"

If the TAKE IT DOWN Act becomes reality, our laws would finally start to catch up to the technology, offering more protection to victims of this kind of exploitation. 

“In this instance, the exploitation of actual people using deepfakes is something that falls between the cracks of the law, that there’s not currently a federal law that covers it, and in many states there’s not a state law that covers it either,” Cruz said.

In the same Inside Texas Politics sit-down, Senator Cruz also discusses another piece of legislation he recently introduced -- a bill called the “No Tax on Tips Act,” which would exempt cash tips from income tax. Watch the full interview below to learn more. 

    
