Why Deepfake Audio Technology Is a Real Threat to Enterprise Security
Just when you thought you had enough to keep you up at night, there’s another threat to add to the list of enterprise security nightmares lurking under the bed. The deepfake, once a danger mainly to celebrities, has now entered the realm of enterprise risk.


According to Axios, deepfake audio technology has already begun wreaking havoc on the business world, as threat actors use the tech to impersonate CEOs. Symantec has reported three successful audio attacks on private companies that involved a call from the “CEO” to a senior financial officer requesting an urgent money transfer. Just imagine how an incident like this would affect your company.


Make no mistake: The threat is real, especially because we don’t yet have tools reliable enough to distinguish deepfake audio from the genuine article. So what can the enterprise do? Are there any steps we can take to mitigate the risk?


Taking Social Engineering to the Next Level


Independent cybersecurity expert Rod Soto views deepfakes as the next level of social engineering attacks.


“Deepfakes, either in video or audio form, go far beyond the simple email link, well-crafted SMS/text, or a phone call that many criminals use to abuse people’s trust and mislead them into harmful actions,” Soto said. “They can indeed extend the way social engineering techniques are employed.”


Simulated leaked audio may happen sooner rather than later, possib ..
