In January 2024, an employee transferred twenty-five million dollars to scammers in the middle of a video call. The employee saw his boss and colleagues on screen. Everyone looked real. But every one of them was AI-generated.
Verification used to be cheap. You saw somebody's face, heard their voice, and you knew it was them. That check cost essentially nothing. Now? The same check is worthless. To actually verify identity you need multiple channels, code phrases, and callback protocols, all of which cost time and money. The cost of trust just went up, for everyone.
Video calls will become obsolete as a method of verification. There are other ways to verify identity that don't rely on video and audio: verification with public and private keys, for example. I keep thinking that AI will make ID verification obsolete as well. How easy would it be for AI to create a fake ID card with a picture of some non-existent person? What about AI-generated selfies of someone holding an ID card? Will third-party ID verification services like Jumio and Onfido be able to catch such fake AI-generated ID cards and selfies? Maybe the best method is to use one AI technology to battle another. I'm sure AI can detect fake AI-generated video and audio.
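The public/private-key idea above usually takes the form of a challenge–response: the verifier sends a random challenge, the person being verified signs it with a private key only they hold, and the verifier checks the signature against a public key exchanged beforehand. No amount of face- or voice-cloning lets an impostor produce that signature. Here is a toy sketch using textbook RSA with deliberately tiny primes (the key values are made up for illustration; a real system would use a vetted library and keys thousands of bits long):

```python
import hashlib
import secrets

# Toy RSA key pair (textbook RSA, tiny primes -- illustration only).
# p = 61, q = 53  =>  n = 3233, phi = 3120, e = 17, d = 2753
N, E, D = 3233, 17, 2753

def digest(message: bytes) -> int:
    """Hash the message down to an integer smaller than the modulus."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % N

def sign(message: bytes, d: int = D) -> int:
    """The holder of the private exponent signs the hashed message."""
    return pow(digest(message), d, N)

def verify(message: bytes, signature: int, e: int = E) -> bool:
    """Anyone with the public exponent can check the signature."""
    return pow(signature, e, N) == digest(message)

# Challenge-response over a video call: the verifier sends a random
# challenge, the claimed identity signs it, the verifier checks it.
challenge = secrets.token_bytes(16)
sig = sign(challenge)
assert verify(challenge, sig)        # genuine party: signature checks out
assert not verify(b"tampered", sig)  # a deepfake cannot forge the signature
```

The deepfake on screen can mimic a face and a voice, but it cannot produce a valid signature without the private key, which is exactly why this kind of check survives where audiovisual trust fails.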
It might not be possible to reliably detect fake AI-generated video and audio: while developers work on making AI detectors and authentication tools better, bad actors are also getting better at hiding the telltale signs of fake video and audio.
The best thing some tech companies are currently doing is embedding watermarks in the output of their AI models, in the hope that undetectable fake video and audio can be kept to a minimum.