This is how artificial intelligence is being used in phone scams

Criminals replicate the voice of a relative to rob victims over the phone. (Unsplash)

Using artificial intelligence, criminals are promoting a form of theft carried out through a phone call that simulates someone else's voice.

The method builds on the traditional scam in which a person hears a relative who is in trouble and urgently needs money to resolve the situation, usually a traffic accident or an arrest.



Platforms such as ElevenLabs allow anyone to create dialogue simulating a given voice from as little as 30 seconds of recording, so criminals can make a prior call to the relative to capture that sample and then go after the victim.

What the artificial intelligence says is generated from a text written by the attackers, who take advantage of the fact that the software analyzes every aspect of the voice, such as gender, age and accent, to find patterns and produce the copy.

“Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice. Now, if you have a Facebook page or if you recorded a TikTok and your voice is there for 30 seconds, people can clone your voice,” Hany Farid, a professor of digital forensics at the University of California at Berkeley, told the Washington Post.


The newspaper also reports that phone scams imitating other people's voices caused losses of 11 million dollars in the United States during 2022, showing that the method continues to pay off.

ElevenLabs has already drawn criticism over the problems its tool can cause: some time ago, an audio clip circulated that copied the voice of actress Emma Watson reciting passages from Adolf Hitler's "Mein Kampf", which opens the door to other celebrities' voices being used for more sensitive material.



Although artificial intelligence raises this type of scam to a very high level of deception, such situations should always be treated with suspicion.

First, if a caller asks for immediate help, never volunteer a relative's name, because that gives the criminal the opening to create a bond and start impersonating that person. It is best to wait until the caller reveals their own identity.

If you give the name by mistake, or the other party guesses it, the best option is to call that person directly and confirm that the situation is real. Criminals will always call from unknown numbers and invent an excuse tied to the problem they claim to be facing, such as a phone that does not work or has been confiscated.

Another key point is never to give out personal information such as addresses, telephone numbers, names, account numbers or email addresses; even if no money is stolen at that moment, that data will be useful to the criminal later.
