Nothing Is Safe in the Age of Generative AI Deepfakes, Not Even Live Calls or Face ID Verification
In just over a year since generative AI went mainstream, the world has become a scarier place: fake explicit images generated with the click of a button, bypassed identity verifications, and live video calls that deceive people into transferring millions. It's all too real. Deepfakes, once just a tool for harmless pranks, now pose serious threats.
Ghost in the Machine: From static images and audio to live video, deepfakes have grown more sophisticated, and they're causing headaches for companies struggling to decipher what's real and what's fake. A recent KPMG report found that 92% of companies fear deepfake technology's potential for fraud, extortion, and reputational damage. And worry they should: deepfake-driven digital identity fraud has surged 10x, according to Sumsub, and the cases get worse by the day:
- According to 404 Media, fake IDs created on underground sites like OnlyFake have successfully tricked platforms such as the cryptocurrency exchange OKX.
- Last week, fraudsters impersonated a company's CFO and other coworkers on a conference call, duping a finance clerk into transferring a whopping $25M.
Fighting Bad AI with Good AI
Concerns extend beyond corporate boardrooms to political arenas, with elections coming up in the US, UK, and India. Tech giants are trying to get ahead of possible disruptions by cracking down on the spread of AI-generated content.
- Meta (NASDAQ:META) is labeling AI-generated images on its platforms ahead of elections, while TikTok and YouTube already require creators to flag such content.
- Microsoft (NASDAQ:MSFT) has developed tools to help politicians authenticate their media in an effort to curtail deepfakes.
GenAI vs. politicians: There's no putting deepfakes back in the box, but while companies aim to blunt AI's impact, US politicians are expected to ban the use of generative AI in robocalls. They're also considering regulating the technology more broadly, which could make it harder for bad actors to exploit it for evil.