Deepfakes porn service: Don’t worry, we’ll only use “consenting adults”

At the beginning of the year, much of the internet shuddered at the then-new concept of deepfakes: artificial intelligence (AI)-generated videos that stitched people’s heads (mostly famous ones, with an odd proclivity for actor Nicolas Cage) into porn videos, or made them appear to say things they’d never utter in public.

Platforms banned them: Reddit, Twitter, Pornhub, and Gfycat, to name a few.

At least one blackmailer stole images and cast the unwilling in nonconsensual, fabricated adult videos before being arrested for it.

The US Defense Advanced Research Projects Agency (DARPA) focused its AI forgery detection work on the issue, citing the potential use of fake images by the country’s adversaries in propaganda or misinformation campaigns. One security researcher brought up the potential for police bodycam footage to be tampered with.

Then again, on the more gleeful, “ka-CHING!!!” side of the deepfakes coin, we have Naughty America.

The porn company last week launched a service that lets customers pay to customize their own deepfakes.

TL;DR: yes, that means you can buy your head stitched onto a porn actor’s body. Or, for that matter, maybe that of your cat, if the cat gives consent. Or, then again, maybe your head stitched onto the bodies of both/all parties in a given video, as suggested by one of our Naked Security staffers who prefers to stay anonymous … while Mark Stockley suggested that next year’s headline will be “Woman sued by cats.”

As well as switching faces with performers, deepfakery can also change the background in any given video. Of course, that background swapping means that Naughty America can put your favorite porn star anywhere, according to CEO Andreas Hronopoulos:

I can put people in your bedroom.

The Verge reports that simple edits to porn videos will cost just a few hundred dollars, while longer, more complicated changes could run into the thousands.

According to Variety, Naughty America has a script that will ask customers for footage of themselves. The script includes specific instructions for the facial expressions necessary for getting an optimal likeness. (Here’s a hint: don’t forget to blink. DARPA found that the lack of blinking, at least at this stage of the technology’s evolution, is a giveaway.)
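The blink cue DARPA's researchers flagged can be approximated with the "eye aspect ratio" (EAR) heuristic from facial-landmark research: the ratio of an eye's vertical to horizontal landmark distances collapses when the eye closes, so a video in which the EAR never dips is suspicious. The sketch below is purely illustrative (it is not DARPA's or Naughty America's code, and the threshold of 0.2 is a commonly cited rule of thumb, not a standard); a real detector would feed it per-frame landmark coordinates from a library such as dlib or MediaPipe.

```python
import math

def eye_aspect_ratio(landmarks):
    """Compute the eye aspect ratio from six (x, y) eye landmarks
    ordered p1..p6: p1/p4 are the horizontal eye corners, p2/p3 sit on
    the upper lid, p6/p5 on the lower lid. EAR falls toward zero when
    the eye closes."""
    p1, p2, p3, p4, p5, p6 = landmarks

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count blinks in a per-frame EAR series: each run of at least
    `min_frames` consecutive frames below `threshold` counts as one
    blink. A long clip of a talking head with zero blinks is a red
    flag for synthesis."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks
```

For example, an open eye yields an EAR well above the threshold, a closed one well below it, so `count_blinks` only fires on sustained dips rather than single-frame landmark jitter.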

What’s less clear is how the company plans to ensure that the video footage submitted by customers comes from consenting adults.

Variety reports that Naughty America will use its compliance department to make sure that “any and all footage used comes from consenting adults.”

Well, that’s pretty darn vague. It ignores the problem of how to determine whether the submitted content has been acquired with the consent of the person depicted. How will Naughty America figure out whether a given clip has been spearphished out of a victim, à la Celebgate?

Let’s hope that it doesn’t boil down to “because the customer said it was them.” Let’s pray that they know full well that on the internet, we are all dogs until proven otherwise.

Let’s also hope that the legal department of a porn company knows, by now, how to properly vet the age of a customer. Kids have enough to worry about when it comes to nonconsensual porn and the extortion demands that come with it – please, Naughty America, don’t give the ranks of underage tormentors another tool to make their demands even more threatening.

The Verge sent more questions about all this on over to Naughty America, but the company hadn’t responded as of Tuesday.


Source: Naked Security

Founder and Editor-in-Chief of 'Professional Hackers India'. Technology Evangelist, Security Analyst, Cyber Security Expert, PHP Developer and Part time hacker.
