‘Deepfake’: ‘They put my face in a porn video’

Browsing Twitter one night, Kate Isaacs, 30, came across a disturbing video among her notifications.

“Panic took hold of me,” recalls Kate, speaking publicly about what happened for the first time. “Someone took my face, put it in a porn video and made it look like it was me.”

Kate had been the victim of a deepfake, in which someone uses artificial intelligence to digitally manipulate an image. In her case, her face had been superimposed onto the body of a porn star.

The deepfake video she came across on Twitter — featuring Kate, who was already campaigning against non-consensual pornography — was made using footage from TV interviews she gave as an activist.

“I was devastated. I couldn’t think clearly,” she says. “I remember thinking this video would be seen everywhere. It was awful.”

In the past, celebrities and politicians were the most common targets of deepfakes, and not always pornographic ones; some were made purely for comedy. Over the years that has changed: according to cybersecurity firm Deeptrace, 96% of all deepfakes are pornographic and non-consensual.

Like revenge pornography, deepfake pornography is a form of image-based sexual abuse, an umbrella term that includes capturing, creating, and/or sharing intimate images without consent.

In Scotland, it is already a crime to share images of a person in an intimate situation without their consent. In other parts of the UK, however, this counts as an offence only if it can be proved that the perpetrator intended to cause the victim distress, a loophole that often results in impunity.

Meanwhile, the UK’s long-awaited nationwide Online Safety Bill has been through seemingly endless revisions and delays.

Kate Isaacs founded the #NotYourPorn campaign in 2019. A year later, her activism contributed to the adult entertainment site Pornhub taking down all videos uploaded by unverified users, which made up the majority of the content on its platform.

Kate Isaacs has campaigned against revenge porn since 2019. — Photo: BBC

Kate concluded that she had been made the victim of a deepfake by someone who objected to her activism.

But she had no idea who this person was, or who could have seen the video. She was also concerned that the video was convincing enough that people actually thought it was her.

“It was a violation: my identity was used in a way that I did not consent to.”

Some users also began leaving abusive comments and making threats against her.

“You start thinking about your family,” she says, holding back tears. “How would they feel if they saw this content?”

The threat intensified when Kate’s home and work addresses were posted in the comments, a practice known as doxing.

“I thought, ‘I’m in real trouble here. It’s not just some people on the internet talking… there is real danger.’”

From her experience supporting people in similar situations, Kate knew exactly what to do if someone became a victim — but in that moment, she froze.

“I didn’t follow any of my own advice,” she recalls. “Kate the activist was very strong and didn’t show any vulnerability. And then there was me, Kate, who was really scared.”

A colleague reported the video, the cruel comments and the doxing to Twitter, all of which were taken down from the platform. But once any deepfake has been published and shared online, it is difficult to completely remove it from circulation.

“I just wanted that video off the internet,” says Kate, “but there was nothing I could do about it.”

There is a market for deepfakes on online forums. People post requests for videos to be made of their wives, neighbors and co-workers, and even of their mothers, daughters and cousins.

Content creators respond with step-by-step instructions: what kind of footage they’ll need, advice on shooting angles, and payment methods.

Gorkem, a deepfake creator operating in the south east of England, spoke to the BBC anonymously. He started out making celebrity deepfakes for his own amusement. For him, these artificial images allow people to “fulfill their fantasies in ways that weren’t really possible before.”

Gorkem later moved on to deepfaking women he was attracted to, including coworkers he barely knew.

“One was married, the other was in a relationship,” he says. “Going to work in the office after I had faked images of these women was weird, but I managed it. I managed to act like there was nothing wrong.”

Realizing he could make money from what he calls a hobby, Gorkem began taking orders for custom deepfakes. Images of women on social media profiles give him plenty of material to start with, and he says he recently deepfaked a woman using a recording of an online meeting.

“With a good amount of video of the person looking directly at the camera, there’s good material. The algorithm just works from that and does the face reconstruction on the target video.”

He acknowledges that “some women” are psychologically harmed by being the victims of deepfakes, but seems oblivious to the impact of treating them as objects.

“They can just say: it’s not me, this was faked. They should just accept that and get on with their lives.”

“From a moral point of view, I don’t think there’s anything stopping me,” he continues. “If I’m going to make money with an order, I’m in. I don’t care about it.”

In online forums, there are people who order ‘deepfakes’ using images of family members and acquaintances. — Photo: Getty Images via BBC

The quality of deepfakes varies widely, depending both on the skill of the person making the video and on the sophistication of the technology used.

But the man behind the biggest deepfake porn site admits it is no longer easy to tell whether you are looking at manipulated images or not. His website attracts around 13 million visitors a month and hosts around 20,000 videos at any one time. He lives in the United States and rarely speaks to the media, but he agreed to talk to the BBC anonymously.

He says he draws the line at deepfaking “ordinary” women, but considers hosting pornographic videos of celebrities, social media influencers and politicians justifiable.

“They are used to negative media, and their content is already out there. They are different from ordinary citizens,” he argues.

“The way I see it, they’re able to handle it in a different way: they can just ignore it. I don’t really feel like consent is necessary. It’s a fantasy, it’s not real.”

Does he think he’s doing something wrong? Part of him is “in denial about the impact on women,” he admits, adding that his wife is unaware of his activity.

“I didn’t tell my wife. I’m afraid of how it might affect her.”

Until recently, deepfaking software was neither accessible nor easy for the average person to use. Now, anyone over the age of 12 can legally download any of dozens of apps and produce convincing deepfakes with just a few clicks.

For Kate, this is worrying and “really scary”.

“It’s not the dark web, it’s in the app stores. Right in front of us.”

She also fears that the UK’s long-awaited Online Safety Bill won’t keep pace with the technology. Three years ago, when the bill was introduced, creating deepfakes was seen as something that required professional skill; it wasn’t something you could do just by downloading an app.

For deepfake creator Gorkem, criminalizing the activity would change things.

“If I could be tracked online, I’d stop there and probably find another hobby,” he says.

Being the target of deepfaking and doxing took a toll on Kate’s health and her ability to trust other people. She believes that those behind the attacks were not only trying to intimidate and humiliate her, but also to silence her. For a while, she walked away from her activism, questioning whether she could continue talking about misogyny.

But now she is more determined than ever, having realized that she cares too much to abandon the fight.

“I won’t let them win.”

Kate says deepfakes can be used to control women, and that tech companies, including those that make the apps used to manipulate images, should be pushed to impose restrictions and penalties on misuse.

“Any app must be able to detect sexual content.”

“If companies don’t invest the money, resources and time to ensure their app is not used to create sexually abusive content, they are being willfully irresponsible. They are to blame.”

*To the best of our knowledge, neither Gorkem nor the deepfake site owner interviewed had anything to do with the deepfake of Kate Isaacs.

– This text was published at https://www.bbc.com/portuguese/geral-63339308
