*Text originally published in Letícia Piccolotto's weekly column on UOL/Tilt.*
Can you imagine waking up one day and discovering that your face is circulating in a video you know nothing about? Or hearing audio in your own voice, asking for information you never asked for? It may sound like fiction, parody, or a horror movie, but the truth is that this is very close to us, and it is possible thanks to a technology called "deepfake".
Deepfake (a term that could be translated into Portuguese as "deep lie") is a technique for creating forged, untrue content, in either audio or video format, produced with artificial intelligence (AI). In practice, these are synthetic media built from a large volume of real recordings of an individual, using machine learning algorithms; with them, the AI merges, combines, replaces, or superimposes audio and images to create fake content that places people in any situation, taking positions they never took, in a very convincing way.
In an age of disinformation — as I've discussed here — deepfakes are a powerful threat that could intensify the infodemic, an epidemic of misinformation. With lies and inconsistencies becoming ever more refined, distorting the very essence of what we see and understand as reality, it will become even harder to distinguish truth from falsehood. Another offshoot of these editing mechanisms is the shallowfake (or "shallow lie", in Portuguese), which also refers to manipulated content, but whose creation does not involve artificial intelligence. Generally, a shallowfake is a simple edit in which a video or audio clip is taken out of context.
A famous case of such edits occurred during the 2020 United States elections, when then-President Donald Trump shared manipulated videos on his social media on more than one occasion to attack his main opponent, Democrat Joe Biden. According to Agência Pública, Trump published an edited video in which his Democratic opponent appeared to support his reelection. The report notes that, despite being an edited video, it is not a deepfake, because no artificial intelligence was used in its creation. The material, however, was enough to confuse voters. The use of deepfakes is not yet widespread, but it has gained momentum, mainly because of applications that make this type of content easy to create.
Jornal Nacional anchors were among the victims
Cases were reported in Brazil during the 2022 elections. The two incidents in question targeted journalists and presenters. The first deepfake identified altered the speech of Jornal Nacional presenter Renata Vasconcellos: the video very convincingly used the journalist's voice to present data from a fake voting-intention poll. In another episode, in August of this year, the fact-checking site Comprova showed that a video posted on TikTok used the technique. In it, Jornal Nacional anchor William Bonner appears to call presidential candidates "bandits".
In this case, a technique called text-to-speech (TTS) was used, which is capable of generating artificial audio from written text using a database containing dozens of audio samples. TTS tools for creating such audio are already available to the general public on free websites. We need to be aware of this phenomenon because, although still incipient, it poses a growing risk to communication as these tools spread. What's more, although the main targets are public figures, anyone can be exposed and suffer the consequences of fake news spread in their own name.
Is deepfake a crime?
Although there is still no specific regulation for this type of resource (which allows these programs to keep circulating), an individual who makes such alterations with the aim of harming or defaming someone is indeed committing a crime, which can be classified as slander, defamation, or libel, as provided for in the Penal Code. The conduct can also constitute crimes such as forgery of a private document, ideological falsehood, and invasion of a computer device, among others. Another point of attention around deepfakes is their tendency to victimize women in particular, through sexually explicit montages.
Actress Gal Gadot and singer Anitta, for example, are among the women who have fallen victim to this crime. Unfortunately, this becomes yet another artifice that can haunt the already challenging participation of women in spaces of power and politics. Despite these setbacks, we see the justice system advancing in the protection of candidates and public officials, with the 2022 election being the first in which political gender violence was treated as a crime. Under Law 14.192/21, the punishment for those who commit the crime can be up to 4 years in prison plus a fine. If the violence occurs over the internet, the penalty is harsher: up to 6 years.
How do you know if there has been manipulation?
There are some strategies for assessing whether media has been manipulated, such as checking whether the speech is in sync with the lips, whether the skin tone looks unnatural, or whether the gaze seems vacant. But experts point out that the task is getting harder: as the technology advances, the details that could reveal manipulation become increasingly subtle.
On the corporate side (companies have also been victims of cybercrimes involving deepfakes), the authorities' guidance is to fight fire with fire, that is, to adopt technologies for protection. Examples include blockchain, which can be used to authenticate identity in applications ranging from legal records to voting, as well as artificial intelligence, multi-factor authentication, and digital signatures. However, one of the key factors in combating deepfake attacks remains cybersecurity awareness training: organizations need to train their employees to better protect themselves and the company against the many types of cyberattacks plaguing us.
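To make the digital-signature idea above concrete, here is a minimal sketch in Python. It is an illustration only: the key and media bytes are hypothetical, and it uses a shared-key HMAC for simplicity, whereas a real publisher would use an asymmetric signature scheme (such as Ed25519) so that anyone can verify without holding the secret. The point it demonstrates is that any alteration to the media, however small, invalidates the published tag.

```python
import hashlib
import hmac

# Hypothetical secret key held by the publisher (illustrative only;
# real systems would use an asymmetric key pair, not a shared secret).
SECRET_KEY = b"publisher-signing-key"

def sign_media(media_bytes: bytes) -> str:
    """Compute an HMAC-SHA256 tag over the raw media bytes."""
    return hmac.new(SECRET_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify_media(media_bytes: bytes, tag: str) -> bool:
    """Return True only if the media matches the published tag."""
    expected = sign_media(media_bytes)
    # compare_digest avoids timing side channels during comparison.
    return hmac.compare_digest(expected, tag)

original = b"frames of the original broadcast"  # stand-in for a video file
tag = sign_media(original)

print(verify_media(original, tag))            # True: untouched media
print(verify_media(b"tampered frames", tag))  # False: any edit breaks the tag
```

The same principle underlies emerging provenance standards, in which cameras and newsrooms cryptographically sign content at capture time so downstream viewers can check whether it was altered.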
We must also bet on education and the development of critical thinking, that is, on questioning and checking the messages we receive before taking them as truth. Above all, each and every one of us must remain attentive and in search of the truth, however difficult that may be.
Check out the full text on UOL/Tilt.