Senator Ted Cruz, along with a bipartisan group of lawmakers including Sen. Todd Young, has introduced the Take It Down Act to combat the growing problem of AI-generated deepfake pornography and revenge porn. The legislation requires social media companies to remove nonconsensual intimate images, both real and AI-generated, within 48 hours of a victim's request, with enforcement by the Federal Trade Commission (FTC). The bill also criminalizes publishing or threatening to publish such images, making it a felony punishable by up to two years in prison. The measure aims to protect victims, particularly young women, from exploitation facilitated by generative AI. It was inspired by cases like that of Elliston Berry, a 14-year-old Texas student who struggled for eight months to get deepfake images of herself removed from Snapchat.
Bipartisan group of senators target deepfake revenge porn with new legislation https://t.co/zxaBJg1m13
Sen. Cruz Introduces Bill to Combat AI-Generated Deepfake Nonconsensual Explicit Images. https://t.co/PtfSFZ2UUF
I am leading a bipartisan group of senators in a bill to ban the distribution of AI-generated nonconsensual explicit imagery. Tragically, these fake images create real victims, including a student from Texas whose mom contacted my office for help. https://t.co/KONvxDhz8k https://t.co/JnHfvdOpfz
I introduced the Take It Down Act to make publishing deepfake nonconsensual intimate images illegal. My bill also requires Big Tech companies to take action on these grotesque images within 48 hours. This bill will protect families across the country. https://t.co/KONvxDh1iM https://t.co/crDh70tNvS
Sen. #TedCruz (R-Texas) introduced a new bill to protect and empower victims of #DeepfakeVideo technology. The bill also aims to assist victims of non-consensual intimate image abuse, commonly known as “revenge pornography.” https://t.co/zWw5voyx4w
When Elliston Berry was just a freshman in high school, a classmate victimized her by posting AI-generated nonconsensual intimate images of her online. Her family went through eight months of hell as they tried to get these images taken down. My bipartisan bill, the Take It… https://t.co/G1rEz8nZXp
If you post copyrighted material on social media, it will be taken down within minutes. The Digital Millennium Copyright Act ensures that. If passed into law, our TAKE IT DOWN Act would require social media companies and websites to similarly take down non-consensual deepfake… https://t.co/DlaVfY2sFM
🤖🇺🇸 Texas teen faces AI nightmare with deepfake nudes circulating social media. Her chilling experience highlights the urgent need for AI regulation. 🚨 https://t.co/7bdbHM1CY4
When Elliston Berry was 14 years old, her classmate used AI to create deepfake nonconsensual intimate images of her. For eight months, those images remained on Snapchat despite Elliston's repeated requests for them to be taken down. I had my team call Snapchat, and within… https://t.co/X5Issxw6We
What happened to Elliston Berry when she was 14 years old is tragic. One of her classmates used AI to create deepfake nonconsensual explicit imagery. I introduced legislation to ensure this never happens again. Elliston is a hero for telling her story. Listen 👇 https://t.co/PXvlQrk5hA
A Texas mom and her 14-year-old daughter tried to get AI-generated deepfake nonconsensual images taken off Snapchat for eight months. Snapchat did not respond until my office stepped in. I introduced the Take It Down Act to ensure that no other victims have to wait for these… https://t.co/zwCwlBTw7f
Elliston Berry is a high school freshman from Ft. Worth, Texas. She woke up one morning to learn that a classmate had manipulated an innocent photo of her into an explicit image using AI. The TAKE IT DOWN Act will ensure that non-consensual deepfake images like this that are… https://t.co/jRh7GDHPBD
When she was 14 years old, Elliston Berry was the victim of AI-generated deepfake nonconsensual intimate images. It took eight months of her mom contacting Snapchat repeatedly before the images were finally removed. I introduced the Take It Down Act to require Big Tech… https://t.co/MwXo3hhooO
Elliston Berry is a 14-year-old Texas student. A classmate used AI to create deepfake nonconsensual intimate images of her. For eight months, those images remained on Snapchat despite Elliston's repeated requests for them to be taken down. I had my team call Snapchat, and… https://t.co/HXnlg5xPvv
The Take It Down Act, a bipartisan Senate bill led by Sen. Ted Cruz, would criminalize the publication or threat of publication of nonconsensual intimate imagery. https://t.co/GEdGeW2ipU
Aledo High victim of deepfake nonconsensual explicit imagery joins federal effort to address simulated images. U.S. Sen. Ted Cruz unveiled bipartisan legislation aimed at speedier online removal of nonconsensual intimate photos and videos. https://t.co/zdjxnhocaO
WATCH: Yesterday, a group of bipartisan Senators and I introduced the TAKE IT DOWN Act — legislation that will protect the victims of explicit AI deepfakes. Our bill will require social media companies to have in place processes to remove the images and will criminalize the… https://t.co/Rc1Pn3sL1M
Sen. Cruz is championing victims of revenge porn. Discover how AI is being used maliciously, and learn about the bipartisan effort to stop it. Join the conversation and stand against this harmful practice. #AI #Technology #Cybersecurity @tedcruz https://t.co/YfoafVZB2z
Cruz on Fox: We rolled out bipartisan legislation called the Take It Down Act. And it does two things. It says, number one, if you post or share non-consensual intimate images, either real images or deep fakes, that it's a crime. It's a felony punishable by up to two years in…
The Biden Administration is like "deepfake revenge porn" on America. Cruz announces bill to crack down on ‘deepfake’ revenge porn https://t.co/0ZMZmXsdil
A bipartisan group of senators introduced legislation that would require social media sites to take down deepfake "revenge porn" and make publishing it a federal crime. https://t.co/3x0jAY08ds
The “TAKE IT DOWN Act,” if passed, would treat real photos and photos created using artificial intelligence the same way. https://t.co/1hflrqaAbM
"@ddale8, take it away" https://t.co/0hlVh2HuM7
Bipartisan Group of Senators Introduce 'Take It Down Act' Targeting Revenge and Deepfake Pornography https://t.co/6jKgwRS4Zm
A new bill seeks to hold social media companies accountable for policing and removing nonconsensual sexually explicit deepfake images on their sites. The measure would criminalize publishing or threatening to publish deepfake pornography. https://t.co/6bmr0eMuzI
New AI deepfake bill would require Big Tech to police and remove nonconsensual explicit images. https://t.co/g3r4IBLqzh
Deepfake sexual images on social media are a very real problem. I can’t imagine the humiliation of being a teenager and waking up to learn that someone has created explicit AI-generated images of you. The TAKE IT DOWN Act will protect victims of this heinous crime by deterring… https://t.co/3UIYbwfxh8
.@SenTedCruz introduced on Tuesday the Take It Down Act, which would criminalize the publication of real or fake non-consensual intimate images and give victims the legal right to force big #tech companies to take them off their platforms within 48 hours.… https://t.co/MTcfD05zTb
.@tedcruz and a bipartisan group of lawmakers have introduced the TAKE IT DOWN Act in an effort to combat the growing issue of AI-generated deepfake "revenge porn." https://t.co/MUI78MZ7eP
Bipartisan senators target deepfake revenge porn with new legislation, requiring websites to remove these explicit videos and photos. https://t.co/kluRM4ZKaw
Today, I joined with a group of senators to introduce the TAKE IT DOWN Act, legislation that will protect victims – particularly young women – from harmful deepfakes. https://t.co/1digBMyAkA
We are increasingly seeing situations where generative AI is used to create abhorrent sexual material. I joined with a group of Senators to introduce the TAKE IT DOWN Act, legislation that will protect victims, particularly young women, from harmful deepfakes.…
Sen. Cruz Holds Press Conference Unveiling Bipartisan Bill to Protect and Empower Victims of Deepfake Pornography https://t.co/QoY3J0ENNx
In recent years, we’ve witnessed a stunning increase in exploitative sexual material online, largely due to bad actors taking advantage of newer technologies like generative artificial intelligence. Many women and girls are forever harmed by these crimes, having to live with…
A bipartisan bill is seeking to hold tech companies accountable for the growing trend of creating deep fake "revenge porn." @emrwilkins breaks it down. https://t.co/dfQCFVxPtX
CONGRESS HAMMERS DEEPFAKES: Plus Generative AI Drives You: The "Take It Down Act," sponsored by Sen. Ted Cruz, mandates social media companies remove deepfake porn within 48 hours of a victim's request. The FTC will enforce this rule to protect victims. https://t.co/enIdrNqi6t
🤖🇺🇸 New Senate Bill Targets AI Deepfake Porn! Lawmakers propose new legislation to make Big Tech responsible for quickly removing nonconsensual deepfake images. This could be a game-changer in protecting victims, from celebs to students. More info: https://t.co/ifOBkUAxZC https://t.co/KacyL5umfu
New AI deepfake porn bill would require big tech to police and remove images https://t.co/DUVIWHKlWn
Government working on ways to tackle deepfake menace. According to sources, government is working on Digital India Bill. @AishPaliwal with more details. #news #Deepfake #Technology @snehamordani https://t.co/GyJFA1sopA