The rapid development of artificial intelligence (AI) brings both benefits and risks.
One worrying trend is the misuse of voice cloning. In a matter of seconds, a scammer can clone a voice and trick people into thinking a friend or family member urgently needs money.
News outlets, including CNN, have warned that this type of scam has the potential to affect millions of people.
As technology makes it easier for criminals to invade our personal spaces, it is more important than ever to be mindful of our use of technology.
What is voice cloning?
The advent of AI has created new possibilities for image, text and voice generation, as well as machine learning.
AI offers many benefits, but it also gives fraudsters new ways to exploit individuals for money.
You may have heard of “deepfakes”, where AI is used to generate fake images, videos and even audio, often of a famous person or a politician.
Voice cloning, a type of deepfake technology, creates a digital replica of a person’s voice by capturing their vocal patterns, intonation and breathing from short audio samples.
Once the voice pattern is captured, an AI voice generator can convert text input into highly realistic speech that resembles the targeted person’s voice.
As the technology has advanced, voice cloning can now be done with as little as a three-second audio sample.
Even a simple phrase such as “Hello, is anyone there?” can be enough to enable a voice cloning scam, and longer conversations give scammers more vocal detail to capture. It is therefore best to keep calls brief until you are sure of the caller’s identity.
Voice cloning has valuable applications in entertainment and healthcare, enabling artists to do voice work remotely (even after death) and assisting people with speech impairments.
However, this raises serious privacy and security issues, highlighting the need for safeguards.
How criminals are exploiting this
Cybercriminals use voice cloning technology to impersonate celebrities, authority figures, or ordinary people to commit fraud.
They create a sense of urgency, gain victims’ trust, and demand money through gift cards, wire transfers, or cryptocurrency.
The process begins with collecting audio samples from sources such as YouTube and TikTok.
Next, the technology analyzes the audio and uses it to generate new recordings in the cloned voice.
Once the voice is cloned, it can be used in fraudulent communications, often spoofing the caller ID to appear trustworthy.
Many voice cloning scam cases have made headlines.
For example, criminals cloned the voice of a company director in the United Arab Emirates to organize a $51 million heist.
A businessman in Mumbai fell victim to a voice cloning scam involving a fake call from the Indian Embassy in Dubai.
Recently in Australia, scammers used a voice clone of Queensland Premier Steven Miles to try to trick people into investing in Bitcoin.
Young people and children are not spared either. In one kidnapping scam in the United States, a teenager’s voice was cloned and her parents were manipulated into complying with the scammers’ demands.
How widespread is it?
A recent survey found that 28% of adults in the United Kingdom faced voice cloning scams last year, while 46% were unaware this type of scam existed.
This highlights a significant knowledge gap, putting millions of people at risk of fraud.
In 2022, almost 240,000 Australians reported falling victim to voice cloning scams, with losses of A$568 million.
How people and organizations can prevent this
The risks posed by voice cloning call for a multidisciplinary response.
People and organizations can implement a number of measures to prevent misuse of voice cloning technology.
First, public awareness campaigns and education can help protect people and organizations and mitigate this type of fraud.
Public-private partnerships can provide clear information and consent options for voice cloning.
Second, people and organizations should look to use biometric security with liveness detection, an emerging technology that can recognize and verify a live voice rather than a fake one. Organizations that use voice recognition should also consider adopting multi-factor authentication.
Third, strengthening investigative capabilities against voice cloning is another crucial measure for law enforcement.
Finally, countries need accurate and up-to-date regulations to manage the associated risks.
Australian law enforcement agencies are recognizing the potential benefits of AI.
But concerns about the “dark side” of the technology have prompted calls for research into the criminal use of “artificial intelligence for victim targeting”.
There is also a need for intervention strategies that law enforcement can use to address this problem.
Such efforts must connect with the broader National Plan to Combat Cybercrime, which focuses on proactive, reactive and restorative strategies.
The national plan sets out a duty of care for service providers, which is reflected in the Australian government’s new legislation to protect the public and small businesses.
The legislation introduces new obligations to prevent, detect, report and disrupt scams. It will apply to regulated organizations such as telcos, banks and digital platform providers, with the aim of protecting customers by preventing, detecting, reporting and disrupting cyber scams involving deception.
Reducing the risk
With cybercrime estimated to cost the Australian economy A$42 billion, public awareness and strong safeguards are essential.
Countries like Australia are recognizing the growing risk. The effectiveness of measures against voice cloning and other fraud will depend on adaptability, cost, feasibility and compliance.
All stakeholders – governments, citizens, law enforcement – must remain vigilant and raise public awareness to reduce the risk of harm.
Leo SF Lin, Senior Lecturer in Policing Studies, Charles Sturt University; Duane Aslett, Senior Lecturer in Policing Studies, Charles Sturt University; Gebereu Tulu Mekonnen, Police Academy Instructor, Charles Sturt University; and Mladen Djecevic, Police Academy Instructor, Charles Sturt University
This article is republished from The Conversation under a Creative Commons license. Read the original article.