New Alert: Criminals use AI and voice cloning to trick you out of your money.
Earlier this year, Microsoft unveiled a new AI system that can replicate a person’s voice from just three seconds of speech, demonstrating how quickly a core part of someone’s identity can now be copied.
In March, security concerns were raised when Australian journalist Nick Evershed revealed that an AI version of his voice could grant access to his Centrelink self-service account.
Voice cloning is only one of the ways scammers are exploiting AI; experts have observed a number of other methods as well.
Centrelink and the Australian Tax Office (ATO) use “voiceprint” security systems that can potentially be tricked. According to the investigation, both systems rely on the phrase, “In Australia, my voice identifies me.”
Services Australia reported in its 2021-22 annual report that voice biometrics were used to verify more than 56,000 calls a day, about 39% of calls to Centrelink’s main business numbers. The report also described a voiceprint as being as secure as a fingerprint.
The ATO stated that it is not easy for someone to impersonate your voiceprint and gain access to your personal information.
Dr Lisa Given, a professor of information sciences at RMIT University, says AI-generated voices can convince people they are speaking with someone they know.
“If a system can accurately replicate my voice tone and emotions, scammers may start using voice messages instead of text to mimic someone’s voice and make a convincing message,” Given said.
Last month, the US Federal Trade Commission warned consumers about fake family-emergency calls made with AI-generated voice clones. The FBI has also issued warnings about virtual kidnapping scams.
Mark Gorrie, Asia Pacific managing director at cyber security software company Gen Digital, says AI voice generators will only get better at deceiving both people and security systems.
It is essential to be aware of the risks posed by AI-generated voices and other scams, and to take steps to protect yourself. Verifying information received over the phone with a trusted source, being careful about giving out personal information and using strong passwords are just some of the ways to stay one step ahead of scammers.