A recent investigation has highlighted the alarming spread of deepfake pornography, identifying nearly 4,000 celebrities, including high-profile women such as Taylor Swift and journalist Cathy Newman, as victims. The investigation, reported extensively by Channel 4 News, exposed how easily deepfake nudes can now be created and distributed using AI: around 200 apps have been identified that let anyone upload an image and instantly generate a fake nude, contributing to a surge of some 600,000 deepfake images identified in 2023 alone. The findings have sparked calls for more effective action from both digital platforms and governments. In Canada, the Online Harms Act (Bill C-63), defended by Attorney General Arif Virani, aims to make the internet safer, especially for young people, by holding social media companies accountable for the content hosted on their platforms. The urgency of the situation is underscored by the tragic loss of young lives, which has prompted officials like Virani to advocate for the act as a means of protecting children online.
"The latest research shows there're now around 200 apps where anyone can upload an image, and instantly create a fake nude. AI technology has advanced at such a rapid rate that it's cheap and easy for apps to make these images on a huge scale. In 2023, 600,000 deepfake images… https://t.co/OkGnwuaGe1
Another young life tragically lost to preventable online harms. I want to assure Harry’s family, and all Canadian parents, that keeping young people safe is our top priority. That’s why we tabled the Online Harms Act. We will do everything in our power to #ProtectKidsOnline. https://t.co/sfSW57Xts6
This is heartbreaking - and preventable. How many times do we have to say it? Social media companies MUST be held accountable for their unsafe products & how they facilitate sextortion of our youth. @viraniarif, when will Canada protect children online? https://t.co/kLqerlvbiP
The internet "frankly terrifies me," Attorney General @ViraniArif tells MPs in defence of censorship bill #C63: "We need to make the internet safe." https://t.co/FHQUqVNt8u #cdnpoli @MarcoMendicino @JusticeCanadaEN https://t.co/lZNPpYeOFV
A shocking new investigation has revealed just how prevalent deepfake pornography has become https://t.co/WLiSFXl0hC
Attorney General @viraniarif says the internet “frankly terrifies me.” @MinJusticeEn is defending Bill #C63 An Act To Enact The Online Harms Act. https://t.co/z8zHLG1t6H #cdnpoli https://t.co/GlB4Gu5JrV
Investigation reveals nearly 4,000 celebrities victimized by deepfake pornography https://t.co/QM6VbkScCU https://t.co/OOtPNZuCfF
#GeorgiaMeloni's Deep-Fake Porn Come Out | Father-Son Duo Made Multiple Videos | Details https://t.co/HihzgREsps
This is horrendous and absolutely why we need tough legislation on the creation of deepfake pornography. We sadly know all too well it will be women in the public eye and young people who will most likely be victims. UK law is far too slow on these types of developments. https://t.co/d2GzlNWiLc
Nearly 4,000 celebrities found to be victims of deepfake pornography https://t.co/P77lfZBgIH
Florist Sophie Parrish, thousands of high-profile women and me: all victims of #deepfake porn. Watch our exclusive investigation here, with big thanks to top team @jobrabkin @guybasnett @SophieBraybrook @EmilyGraceRoe : https://t.co/1WZABi1IMH
Deepfake porn: the UK celebrity victims – including me. Watch @cathynewman report. https://t.co/YH51ReGhcF
Amazingly courageous of @cathynewman for her report on @Channel4News this evening on deep fake porn. Very powerful.
A scarcely believable story presented with incredible bravery by @cathynewman. Deep fake porn and its creators are just vermin, and Apple and Google should be held accountable for hosting the software. #C4News
Huge kudos to @Channel4News and @cathynewman for covering the deepfake porn epidemic. I hope it prompts more effective action by both platforms and governments
Tonight on @Channel4News EXCLUSIVE on #deepfake porn & the millions of women targeted. We identify 1000s of victims in the public eye. And during our investigation the brilliant @EmilyGraceRoe @SophieBraybrook @jobrabkin & @guybasnett discovered I was one of them. Watch at 7
Can anybody tell me if @PierrePoilievre made a statement about the insanity that is Bill c-63 / Online Harms Act yet? https://t.co/bAa9Qwg7lo
Taylor Swift isn't the only victim of AI porn. Can the spread of deepfake nudes be stopped? https://t.co/hwTdcIIGuY #AI #MachineLearning #DeepLearning #LLMs #DataScience https://t.co/IpWjXeF6gi