The war of fake news waged with deepfakes has already begun in Brazil's 2022 election.
The excerpt uses part of the August 15 edition of Globo's newscast, according to UOL columnist Cristina Tardáguila. This is the first time a deepfake of an electoral nature has gained notoriety in the presidential race, but one of the presidential candidates, Ciro Gomes (PDT), had already used the same device to criticize Bolsonaro. The PDT candidate, however, did not spread false news; he only mocked the president over the scandals at the MEC (Ministry of Education).
Deepfakes are nothing new in elections in other countries. In the United States, the last presidential election was marked by the use of this method of spreading false news. Brazil is expected to face the same problem in 2022, according to Nina da Hora, a researcher and computer scientist.
“Today, with the ease of creating deepfakes and sharing them on social networks, which are currently the main means of sharing news, it is dangerous for political campaigns to use them to delegitimize elections and candidates,” Nina told UOL before the start of the election period.
To better understand this type of technology, Tilt explains what deepfakes are and how to identify them.
What is deepfake?
The term first emerged in December 2017, when a Reddit user by that name began posting fake sex videos of celebrities such as Emma Watson and Emma Stone. With the help of artificial intelligence tools, he superimposed any face he wanted onto existing footage.
Many deepfake materials are created with open-source programs — free for anyone to use — geared toward machine learning. The programmer provides hundreds of photos and videos of the people involved, which are processed automatically by the neural network. In this way, the computer learns what a face looks like, how it moves and how it reacts to light and shadow.
The software needs to learn these characteristics both for the face in the original video and for the face to be inserted, because only then can the program find a common point between the two. Once this is done, the system performs a kind of trick in which the image of person B's face is placed on the body of person A.
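The "common point" described above is, in many open-source face-swap tools, a shared latent representation: one encoder learns a generic description of a face, and each person gets their own decoder that reconstructs that description in their likeness. The sketch below is a hypothetical, untrained stand-in for that architecture, with simple linear maps in place of the deep convolutional networks real tools train for days; all names and dimensions here are illustrative assumptions, not any specific tool's API.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 64 * 64   # a flattened 64x64 grayscale face image (illustrative size)
LATENT = 128  # dimension of the shared latent "face description"

# One shared encoder learns what a face looks like in general...
encoder = rng.standard_normal((LATENT, D)) * 0.01
# ...while each person gets a private decoder that renders the latent
# description in that person's likeness. (Untrained random weights here;
# real tools fit these on hundreds of photos of each person.)
decoder_a = rng.standard_normal((D, LATENT)) * 0.01
decoder_b = rng.standard_normal((D, LATENT)) * 0.01

def encode(face):
    """Compress a face into the shared latent description."""
    return encoder @ face

def decode(latent, decoder):
    """Render a latent description with one person's decoder."""
    return decoder @ latent

face_b = rng.standard_normal(D)  # a frame showing person B

# The swap itself: encode person B's expression and pose, then decode it
# with person A's decoder -> person A's face wearing B's expression.
latent = encode(face_b)
swapped = decode(latent, decoder_a)

print(swapped.shape)  # (4096,) -- same size as the input frame
```

Training drives both reconstructions (A through `decoder_a`, B through `decoder_b`) toward their originals; because the encoder is shared, swapping decoders at inference time produces the face transfer.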
Deepfake uses and risks
In Brazil, the most famous example is the work of Bruno Sartori, who became known in 2019 on social media by creating parody videos with President Jair Bolsonaro. But while Sartori’s productions can still be identified as fake with some ease, the possibility of using this technology in cinema has already been raised by a major studio.
In 2019, in an interview with the YouTube channel of the website E-farsas, Bruno Sartori talked about the technique. “About a year ago, I needed to train a face for about 30 days. Today, the technology has evolved and in about three to four days, I can get the same result,” said the journalist.
Computer-generated special effects that create faces and scenes in film are nothing new. For example, actor James Dean, who died in 1955, was set to be digitally recreated to star in the 2020 movie “Finding Jack.”
The problem lies in how easily deepfakes can be produced today. Unlike before, anyone with access to the algorithms, some knowledge of machine learning and a good graphics processor can create a fairly convincing fake video. A Chinese mobile app called Zao has even started experimenting with the technique.
In addition, deepfake has been used in revenge porn. A current concern is the growth of this type of material as a political weapon. In the United States, a video of former President Barack Obama hurling curses at then-President Donald Trump went viral in 2018. There are also fake videos of Trump talking about how algorithms helped him get to the White House.
How to recognize deepfakes?
Some tips can help you recognize whether the video you are watching has been manipulated with deepfake technology.
- Pay attention to awkward movements and to faces that look disproportionate;
- Videos with poor image quality are more deceptive;
- If the video has audio, check whether the sound is synchronized with the mouth;
- Manipulation is easier to spot in videos watched in full screen and at higher quality;
- Research the origin of the information given in the video. If it did not appear in reliable news outlets, it is very likely false.
*With information from Juliana Arreguy, from UOL; and Vinícius Oliveira and Ellen Alves, in collaboration with Tilt.