The naked truth about deepnudes

The stripping of dignity and democracy

A shoddily created deepnude of model Tyra Banks

Imagine this: One day in the near future, you innocently upload a picture to Facebook that shows you enjoying a summer picnic. A former friend with a grudge, or perhaps an ex-lover, copies the image, and uploads it to a software solution called DeepNude. The program strips the clothes from your fully clothed picture, and creates a new picture of you, naked.

The picture of you in your birthday suit gets uploaded to social media, and then distributed to friends and family, as well as enemies and complete strangers. Your life is now one of instant humiliation, embarrassment and shame.

The recently released app DeepNude is the latest shot against privacy from the digital Wild West. Its developer, after being widely criticized for creating a product that could be used to harass and humiliate women, has taken the app down — but not before others had copied his software and distributed it around the web, where it is now widely available.

DeepNude is the latest variant of what is more generally known as the deepfake phenomenon, in which readily available apps let anyone create a convincing fake of almost anything about anybody else. Like so much digital technology, this new revolution has been driven in part by the porn industry.

It was porn that decided VHS was a better format than Betamax back when video recorders were a thing. It was porn that helped build the cable TV industry. Today, porn is estimated to account for around 30 percent of all data transferred across the internet. Porn has also been a driver for the computer graphics industry. But it was mainstream media that created what may have been the first deepfake video.

In Rogue One: A Star Wars Story (2016), a digital likeness of the late actor Peter Cushing, reprising the role of Grand Moff Tarkin, was inserted into the movie. The scenes were created using computer graphics and were largely indistinguishable from the real thing.

In the three years since then, the deeply fake world has moved on a long way. Today, it is possible to take a short sample of anyone’s recorded voice and manufacture entire sentences that sound exactly like the real thing, so convincingly that it is now extremely difficult to prove a recording is fake.

It is also possible to take anyone’s image and, using artificial intelligence, make them appear to say anything you want. Researchers at the University of Washington demonstrated just how effective this can be by creating a fake video of President Obama that looked and sounded exactly like the real thing.

To put this in stark terms, and stark naked terms, it is now possible to create a video of anyone you know, or anyone you dislike, saying anything you want, and doing it in the nude. This opens a new era of psychological warfare between countries. It gives new meaning to the phrase ‘revenge porn’. And it makes it difficult, if not impossible, to sift truth from fiction in the digital world.

The pace of these changes is truly extraordinary and difficult for any society, never mind its legislators, to understand and control. No country has yet introduced legislation to manage the deepfake phenomenon. Even technology companies such as Google and Facebook are struggling to develop tools that can reliably distinguish the real from the fake.

President Trump may have created a popular mantra around the idea of ‘fake news’, but in the months to come he, along with many other politicians, is destined to become a victim of real deepfakes, a new oxymoron for a new predicament. Should Russia choose to interfere in the 2020 presidential election, expect deepfake videos of Trump’s rivals saying or doing anything the Russian imagination can create. Denials of ‘fake news’ after the event will do little to undo the damage already done.

Alternatively, activists on the left might want to truly humiliate Trump by creating deepnude images of the US president. That might elicit an epic tantrum from Trump, one that could well repel many among the electorate.

Aside from the potential damage to reputations and the democratic process, digital fakery and the future of the AI revolution raise serious issues. Currently, there are no standards for deciding which technologies benefit society and which should not be allowed to proceed. There is no agreement of any kind on an acceptable set of ‘rules of the road’, either within countries or between them. Instead, we have an anarchic free-for-all, in which any developer can do anything he or she wants, whether or not the result poses an unacceptable risk to society as a whole.