Tech horror: Deep fakes mean ANYONE could be star of ‘non-consensual pornography’

The quality of synthetic media (videos and images produced by artificial intelligence) has increased dramatically over the past few years. Computer algorithms can be used to produce increasingly realistic videos which appear to show people saying and doing things they never actually did.

According to political commentator Nina Schick, author of ‘Deepfakes: The Coming Infocalypse’, the technology is already being used to produce non-consensual pornography, potentially without the subject’s knowledge.

Speaking about the increasing power of deep fakes, Ms Schick said: “A deep fake is essentially a piece of synthetic media, that’s to say an image, a piece of audio, a piece of video or even text, that’s generated either wholly or partially by AI.

“And the ability of AI to generate actual fake synthetic media is something that’s only been around for about two and a half years and can be traced back to the revolution in deep learning over the last five years.

“So this is something that’s really, really cutting edge, really new, essentially a fake piece of media made by AI.

“We are increasingly facing a future where media and content production is going to be made by AI. So the future is completely synthetic.

“If I have a few images of your digital likeness, I can train the AI to look and act and sound like you in video and audio.”

Synthetic media could have a range of positive uses, such as allowing people to insert their avatars into customised films and video games.

However, when used maliciously as deep fakes, it could provide powerful tools for political misinformation and non-consensual pornography.

Ms Schick explained: “Non-consensual deep fake pornography has already spawned its own ecosystem online which is very interconnected.

“It’s becoming increasingly commodified; there are specialist deep fake porn creators who you can commission for your bespoke creations.

“There are several websites on which most of the deep fake pornography online is hosted.


“But increasingly you do have entrepreneurial individuals who are teaching themselves how to use the deep fake software available, and there will be more and more of that as the generation side improves.

“You see them advertising their services on LinkedIn and selling deep fake training tutorials on Instagram.

“They see that aside from porn there are many people who want to put their face in a film, make themselves the protagonist of a film.”

Much of the non-consensual pornography that has already been produced is targeted at celebrities, inserting them into professionally produced pornographic videos.

Targets have included British and American actresses such as Emma Watson, Daisy Ridley and Kristen Bell.

Speaking to Vox, Ms Bell said she felt “exploited” after discovering she had been featured in non-consensual deep fake pornography.

Ms Schick also warned deep fakes could be used to spread political misinformation.

She asserted: “It is going to be the most sophisticated tool of disinformation known to humanity thus far.

“When you look at the political application of this of course it’s going to be used as a tool of political propaganda.

“The other thing about deep fakes that’s really important to politics is that in a world where anything can be faked, including video of you saying and doing things that you never did, everything can be denied.

“That’s the first adverse impact I think we’re going to see in politics before deep fakes and synthetic media even become ubiquitous.

“You’re already starting to see that where authentic evidence is emerging of people having said or done something and it’s being dismissed as a deep fake.

“If you have no objective sense of what’s real and what’s not it’s very difficult to see how that isn’t an existential threat to liberal democracy. Great for authoritarians, very bad for democracy.”
