BBC News World
A fake video of Kate was made: someone used artificial intelligence to merge her face with another woman's body
Imagine your face being digitally edited onto a porn video and shared on the internet. One woman has revealed how this happened to her.
While scrolling through her Twitter account one day, Kate Isaacs was shocked to see a disturbing video in her notifications.
Speaking publicly for the first time, Kate said: "I was horrified. Someone had taken my face, put it on a porn video and made it look like it was me."
A fake video of Kate had been made. Someone had used artificial intelligence to digitally superimpose her face onto the body of a porn actress.
The video was made using footage from her television interviews, and showed her appearing to have sex.
"My heart sank," she says. "I couldn't take it in. I just remember feeling that this video was going to be everywhere. It was terrifying."
In the past, high-profile celebrities and politicians were the most common targets of deepfakes. The videos were not always obscene; some were created for comic effect. But the situation has changed in recent years: according to cyber-security company DeepTrace, 96 percent of deepfakes are sexually explicit content created without the subject's consent.
Like revenge porn, deepfake pornography is a form of what is known as image-based sexual abuse, a term that covers the taking, creating or sharing of intimate images without consent.
In Scotland it is already an offense to share images or videos that show another person in an intimate situation without their consent, but in other parts of the UK it is only an offense if it can be proven that the material was intended to cause the victim distress.
It is a legal loophole that means the creators of such videos often face no legal consequences.
After a long wait, the government's plans for a UK-wide Online Safety Bill have stalled in endless revision stages. The new rules would give the regulator Ofcom the power to act against any website causing harm to UK users, wherever in the world it is based. Earlier this month, Culture Secretary Michelle Donelan said she and her team were working to ensure the bill is brought forward.
Kate, 30, founded the #NotYourPorn campaign in 2019. A year later, the campaign contributed to the adult entertainment website PornHub removing all videos uploaded by unverified users, which made up the majority of the site's content.
Kate therefore believes that whoever was behind her deepfake was angered by her campaigning and was trying to intimidate her.
But she had no idea who had made the video or who might have seen it. Because her face had been so convincingly superimposed onto the footage of the pornographic actress, the video looked real, and she worried that others might be deceived.
"This was a violation. My identity was used in a way I did not consent to."
Underneath the video, people began leaving vile comments, saying they would follow Kate home, rape her, film the attack and post the footage on the internet.
"You start thinking about your family."
Holding back tears, she said: "If they saw this material, how would they feel?"
The threat escalated when both Kate's home and work addresses were posted below the video.
"I was so frightened. Who knows my address? Is it someone I know who has done this?"
"I thought I was in real danger. This wasn't just a few people on the internet saying nasty things; it was a real threat."
Because of her experience, Kate now helps others and knows what to do when this happens, but at that moment she was on her own.
She says: "At the time, I did not follow the advice I give today."
A colleague of hers reported the video and the threatening comments to Twitter, but once deepfake content is published, its spread online is almost impossible to stop.
"I wanted the video taken off the internet, but there was nothing I could do about it," says Kate.
What to do if you are deepfaked
Collect evidence: it may seem counterintuitive, as you want everything erased, but it is important to collect screenshots, downloaded copies of the videos, usernames and URLs. Keep them in a secure, password-protected folder.
Report the accounts: once you have collected the evidence, report the accounts and the content to the platform.
Contact the police: tell the police what happened and when, and share any evidence you have collected.
There is a market for deepfakes on online forums. People post requests for videos of their wives, neighbors and co-workers and, incredibly, even their mothers, daughters and cousins.
Content creators respond with step-by-step instructions on the source material they need, advice on which camera angles work best, and a quote for the job.
Gorkem, a deepfake content creator based in south-east England, spoke to the BBC. He started out making deepfakes of celebrities for his own gratification.
Deepfakes let people "realize their fantasies in ways that weren't possible before," he says.
Gorkem then turned to women he found attractive, including work colleagues he barely knew.
"One of them was married, the other was in a relationship," he says. "After making fake content of these women it felt awkward going into work, but I kept my nerve. I can act as if nothing has happened, and no one will suspect."
Realizing he could make money from his hobby, Gorkem started taking commissions for custom deepfakes. By collecting footage from a woman's social media profiles, he would have enough material to create one. He says he recently deepfaked a woman using the recording of a Zoom call.
He admits that deepfaking can cause women psychological harm, but he seems indifferent to the possible effects.
"From a moral standpoint, I don't think there's anything stopping me. If I'm going to make money on commission, I'll do it. It's easy," he says.
The quality of deepfakes can vary wildly and depends on the skill of the person making the video and the sophistication of the technology used.
But the operator of the biggest deepfake porn website admits it is no longer easy to be sure whether the pornographic images you are looking at are real.
About 1.3 million people visit his site each month, and it hosts around 20,000 videos. He is based in the US and rarely speaks to the media, but agreed to talk to the BBC anonymously.
He says he draws the line at fake content of ordinary women, but considers it acceptable to make pornographic videos of celebrities, social media influencers and politicians.
Their (celebrities') content is already in the mainstream, he said; they are different from ordinary citizens.
"I think they deal with it a little differently; they forget about it. I don't need their consent. This is fictional content, not reality."
Does he think what he is doing is wrong? He seems somewhat in denial about the impact on the women. He revealed that his wife does not know how he earns his income.
"I haven't told my wife. I'm afraid of how it might affect her."
Not long ago, deepfake software was not readily available and the average person lacked the skills to fake content. Now, anyone over the age of 12 can legally download dozens of apps and create a convincing fake in a few clicks.
For Kate, this is worrying and "really scary". She also fears that the Online Safety Bill will not keep pace with the technology.
"Three years ago, when the bill was first drafted, deepfaking was seen as a specialist skill that you had to be trained in, not something you could do by just downloading an app."
"We are years behind, and the bill's contents are already outdated; not much is covered," she says.
But for creator Gorkem, criminalizing deepfaking would change things.
"If I could be identified online, I'd stop and maybe find another hobby."
The whole affair took a toll on Kate's health and on her ability to trust other people. She says those behind the attacks were trying not only to intimidate and humiliate her but also to silence her.
But she is now more determined than ever, even though she admits there were times she thought about walking away from it all.
"I won't let them win," she says.
She says deepfakes can be used to control women, and that tech firms, including those behind face-swapping apps, should be pushed to build in safeguards.
"Any app should be able to detect sexual content."
"If companies don't invest the money, resources and time to prevent their apps from hosting sexually explicit content, they are being deliberately irresponsible. They are criminals."