New research by the Center for Countering Digital Hate (CCDH) has highlighted the alarming ease with which audio deepfakes of major politicians can be created. The report found that six popular AI audio cloning tools were able to generate convincing election disinformation in the voices of leading US, UK, and EU politicians in 80% of tests. This raises significant concerns about upcoming elections, with experts warning that AI-generated deepfakes could mislead voters and threaten democracy. Specific examples include fabricated audio clips of Joe Biden warning of a bomb threat and Donald Trump admitting to lying, both produced easily with online AI tools. The report underscores the urgent need for measures to safeguard the integrity of democratic processes against such technological threats. Devin Coldewey and FastCompany have also covered the findings, and NPR noted how cheap and easy such deepfakes are to produce.
As AI deepfakes cause havoc during other elections, experts warn the UK's politicians should be prepared https://t.co/G4ibVW6mZ4
Warning to UK politicians over risk of audio deepfakes that could derail the general election https://t.co/Y0sMMCf3LO
Warning to UK politicians! Be cautious of audio deepfakes that could disrupt the upcoming election. Experts urge preparedness for potential tech interference. What steps should be taken to safeguard democracy? #AICyberSecurity #TechInPolitics @SkyNews https://t.co/DsSc00pfi9
Warning to UK politicians over risk of audio deepfakes that could derail the general election - https://t.co/fcQNXjchkw
Warning issued vs deepfake campaigning in 2025 polls - https://t.co/mmAi4yKjL3 https://t.co/EpNpCpjVyc
AI voice cloning tools manipulated to imitate political leaders | Evening Standard https://t.co/MwPbtbpeuA
Convincing AI deepfakes of politicians are getting easier, report warns - Global News https://t.co/uzJlSR0UJO
As high-stakes elections approach in the U.S. and European Union, publicly available artificial intelligence tools can be easily weaponized to churn out convincing election lies in the voices of leading political figures, a digital civil rights group... https://t.co/6RZZAlHzNg
These audio clips appear to show Joe Biden warning of a bomb threat and Donald Trump admitting to lying — and they were easily created with online AI voice cloning tools. READ MORE: https://t.co/3rAwqEUe5Y
AI voice cloning tools manipulated to imitate political leaders - Tech & Science Daily https://t.co/dvaUXaLdRg
Artificial intelligence has been used to create convincing voice clones of Rishi Sunak, Sir Keir Starmer and other politicians, heightening fears of election interference ⬇️ https://t.co/ZZ2ObsIYT1
🧠AI voice cloning tools manipulated to imitate political leaders @CCDHate 🔍Google defends AI search results after bizarre suggestions went viral 🤯Are AI-generated emojis coming to iOS 18? Plus more ✨ 🎙️Tech & Science Daily 👉 https://t.co/Fnr7zbAXHV https://t.co/2grr139Asm
Chilling warning over UK’s enemies using AI deepfakes to try to rig election – experts say it’s a ‘recipe for a disaster’ https://t.co/HNvkMA3yy1
Audio deepfakes of politicians are cheap and easy to make - NPR https://t.co/9g6PxVa65I
AI-generated political deep-fakes disseminate disinformation and are a present danger to our democracies ahead of crucial global elections Check out the new @CCDHate report 👇 https://t.co/Uw8X7qb0BT
A new research paper by @CCDHate suggests it's alarmingly easy to create audio deepfakes of major politicians. My latest for @FastCompany https://t.co/put7YdGuAY
🚨NEW: CCDH found that 6 popular AI audio cloning tools can generate convincing election disinformation in the voice of leading US, UK & EU politicians in 80% of our tests. Election disinfo deepfakes can mislead voters & threaten democracy 🧵 https://t.co/QVYV0vq84z https://t.co/tHWss7fpxt
New research by the Center for Countering Digital Hate sounds the alarm on audio deepfakes ahead of a blockbuster political season. https://t.co/aHxroCisno
Wow, it's 2024 and voice cloning of political figures is still a walk in the park. Check out this eye-opening article by Devin Coldewey on how faked audio and video could play a major role in the upcoming election. Get the scoop here: https://t.co/VM7KIwvTWs