Beware of deep-fake voice verification on financial accounts

Voice Deepfakes Are Coming for Your Bank Balance

Artificial intelligence tools have given scammers a potent weapon for trying to trick people into sending them money.

By Emily Flitter and Stacy Cowley, The New York Times, Aug. 30, 2023

A cutting-edge scam attempt has grabbed the attention of cybersecurity experts: the use of artificial intelligence to generate voice deepfakes, or vocal renditions that mimic real people’s voices…

Customer data like bank account details that have been stolen by hackers — and are widely available on underground markets — help scammers pull off these attacks. They become even easier with wealthy clients, whose public appearances, including speeches, are often widely available on the internet. Finding audio samples for everyday customers can also be as easy as conducting an online search — say, on social media apps like TikTok and Instagram — for the name of someone whose bank account information the scammers already have… [end quote]

Although text-to-speech programs are common, deepfake voice mimicry is still quite rare. It will almost certainly become far more common as the tools improve and spread.

It’s probably wise to opt out of voice verification on any financial account and rely on two-factor authentication instead. It may also be wise to limit public recordings of one’s voice, and to establish a shared password with any relative whose voice might be spoofed.



Good idea, but it’s much easier to immediately ask whoever calls a question that only the real person would know the answer to. Impromptu stuff like that kills deepfakes, because the AI has no way to find the answer to an unexpected question.