When you hear the word phishing, you probably think of email. And that’s exactly what the scammers want you to think so you won’t pay attention to their latest delivery mechanism: voicemail.
Phishing is generally an email that looks real, but isn’t, in an effort to get you to do something you shouldn’t. Now, fraudsters are using deepfake technology to generate audio that sounds real, but isn’t, in an effort to get you to do something you shouldn’t. And that’s exactly what scammers did to the CEO of a German energy company.
According to an article on HotHardware, “a company CEO was tricked by scammers who faked the voice of the parent company CEO to get the executive to transfer $243,000 to an external account. The unwitting company CEO says that he suspected nothing once he heard the German dialect and voice patterns of his assumed boss.”
Deepfake technology is normally used to combine and superimpose existing images and videos onto source images or videos using machine learning to create a believable fake video. In this case, deepfake technology was used to combine and superimpose existing audio onto source audio to create a believable conversation.
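Real voice deepfakes are produced by trained neural voice-cloning models, not simple editing. But the mechanical "combine and superimpose" step the paragraph describes can be illustrated with a toy sketch: splicing one audio clip onto another with a short crossfade so the seam is hard to hear. Everything here (the sine-wave "clips," the function names, the 16 kHz rate) is invented for illustration only.

```python
import numpy as np

SAMPLE_RATE = 16_000  # 16 kHz, a common sample rate for speech audio

def tone(freq_hz: float, seconds: float) -> np.ndarray:
    """Generate a sine wave as a stand-in for a recorded speech clip."""
    t = np.linspace(0.0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    return np.sin(2 * np.pi * freq_hz * t)

def crossfade(a: np.ndarray, b: np.ndarray, overlap_s: float = 0.05) -> np.ndarray:
    """Splice clip b onto clip a with a linear crossfade over the
    overlap window -- a crude 'combine and superimpose' of two clips."""
    n = int(SAMPLE_RATE * overlap_s)
    fade_out = np.linspace(1.0, 0.0, n)
    fade_in = 1.0 - fade_out
    seam = a[-n:] * fade_out + b[:n] * fade_in
    return np.concatenate([a[:-n], seam, b[n:]])

clip_a = tone(220.0, 1.0)   # stands in for "existing" audio
clip_b = tone(330.0, 1.0)   # stands in for "source" audio
spliced = crossfade(clip_a, clip_b)
```

A crossfade alone won’t fool anyone; the point is that the fraudster’s tooling automates this kind of blending at the level of learned voice characteristics (pitch, timbre, dialect) rather than raw samples.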
The article adds: “It’s unclear what software was used to fake the German CEO’s voice. However, with the scam actually succeeding and with such advanced software readily available, this sort of security threat vector will likely become more common.” Yikes.
Is there any technology available today to combat this sort of scam? Not yet. The only defense today is a heightened suspicion whenever any business communication requests money. Now, instead of being suspicious only of emails requesting money, you also have to be suspicious of phone calls and voicemails requesting money. If it isn’t one thing, it’s another.
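If "be suspicious whenever a communication requests money" is the rule, the crudest possible automation of it is a keyword screen over a message or call transcript. This is a hedged illustration only: the pattern list and function name are invented, and real fraud controls rely on process (call-back verification on a known number, dual approval for transfers), not keyword matching.

```python
import re

# Naive, illustrative patterns that suggest a request to move money.
PAYMENT_PATTERNS = [
    r"\btransfer\b",
    r"\bwire\b",
    r"\bpayment\b",
    r"[$€£]\s?\d",          # a currency symbol followed by digits
    r"\burgent\b.*\bfunds?\b",
]

def requests_money(text: str) -> bool:
    """Return True if the text looks like it asks for a money movement."""
    lowered = text.lower()
    return any(re.search(p, lowered) for p in PAYMENT_PATTERNS)

print(requests_money("Please wire $243,000 to the new supplier account today."))  # True
print(requests_money("See you at the quarterly review on Friday."))               # False
```

A filter like this flags messages for extra scrutiny; it cannot tell a genuine request from a deepfaked one, which is exactly why the human verification step still matters.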