
Should I Worry About… Deepfakes?


It started as a joke about a film star, but it is now mired in involuntary pornography and political machination - and it will forever change the way we consume media...


What’s the problem?

In a world accustomed to Photoshopped imagery, the old saying "the camera never lies" lost its credibility for still photographs long ago - but the same suspicion doesn't yet extend to moving video. Deepfakes could soon change that.

Deepfakes use artificial intelligence to superimpose imagery, usually replacing or manipulating human faces. At its most benign, this can merely be amusing: one of its most popular early applications was to artificially insert Nicolas Cage - whose character in the 1997 film Face/Off underwent a facial transplant - into famous movie scenes from Indiana Jones to Forrest Gump. 
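For readers curious about the mechanics: the original face-swap tools were built on a surprisingly simple idea - a shared "encoder" network learns what faces have in common, while a separate "decoder" is trained to reconstruct each person. Swapping is then just encoding one person's face and decoding it as the other. The sketch below is a toy illustration of that technique in PyTorch, not the code of any real app; the network sizes and stand-in data are invented for demonstration.

```python
# Toy sketch of the shared-encoder / two-decoder autoencoder behind early
# face-swap deepfakes. One encoder learns a common face representation;
# each decoder learns to rebuild one person's face from it.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

# Training step (sketch): each person's faces are reconstructed through
# the *shared* encoder, so it learns identity-independent face structure.
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()
faces_a = torch.rand(8, 3, 64, 64)  # stand-in for aligned crops of person A
faces_b = torch.rand(8, 3, 64, 64)  # ...and of person B
opt.zero_grad()
loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
     + loss_fn(decoder_b(encoder(faces_b)), faces_b)
loss.backward()
opt.step()

# The "swap": encode a frame of person A, decode it as person B.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

Real tools add face detection, alignment and blending around this core, and train for far longer - but the swap itself is no more exotic than the handful of lines above.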

However, the technology has been used for much darker purposes, usually covertly rather than overtly. By late 2017, several female celebrities had been faked into pornography. By early 2018, deepfake apps had appeared and the technology was suddenly available to a mass market. 

“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” said actress Scarlett Johansson, a frequent deepfake victim. “Trying to protect yourself from the internet and its depravity is basically a lost cause... The internet is a vast wormhole of darkness that eats itself.”


What’s the worst that could happen?

Deepfakes have been used for revenge porn and blackmail - and as the technology improves, fakers need only your widely available digital footprint. One forum user wrote "I made a pretty good vid of a girl I went to high school with using only 380 pics scraped from insta & FB," referring to Instagram and Facebook. The Guardian's Arwa Mahdawi has written that the motivation behind deepfake pornography comes from a desire "to control and humiliate women."

There have also been political applications. Again, some were obvious, if offensive - such as replacing Argentine President Mauricio Macri’s face with Adolf Hitler’s, or Angela Merkel’s with Donald Trump’s. In April 2018, BuzzFeed CEO Jonah Peretti and actor Jordan Peele released a “public service announcement” in which Barack Obama (actually voiced by Peele) appeared to say a string of offensive statements before warning about the dangers of deepfakes. 

Not every political forgery will be as explicit - in either sense of the word. Now that it is relatively easy to put words in the mouths of leaders, from subtle misquotes to outrageous falsehoods, video footage can no longer be automatically regarded as trustworthy - and the echo chambers of social media make it easier than ever to spread a malicious misrepresentation.


What do the experts say?

“Right now it’s not so easy that anyone can create a really well-done deepfake that’s going to fool a lot of people,” says Shuman Ghosemajumder, Shape Security’s chief technology officer and former click-fraud czar at Google. But deepfakes can still be propagated by social-media bot accounts - between October 2018 and March 2019, Facebook removed more than three billion fake accounts, and it estimates that around 120 million of its monthly active users are still fake.

Ghosemajumder thinks deepfake technology will soon be added to social media platforms “for amusement, much the same way that Snapchat filters exist.” In the long run, it will be “an AI vs AI arms race”: “Once they reach a level of sophistication that they’re going to be able to fool most human eyes, the only way to detect them is also going to be machine learning-based.”
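To make the "AI vs AI" point concrete: a machine-learning detector is, at heart, just a classifier trained to tell real face images from faked ones. The sketch below is a hypothetical illustration in PyTorch - the dataset path, the real/fake folder layout, and the choice of a pretrained ResNet-18 backbone are all assumptions for demonstration, not a description of any production system.

```python
# Sketch of an ML-based deepfake detector: fine-tune a pretrained image
# classifier to label face crops as real or fake.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Standard ImageNet preprocessing for the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical layout: data/train/real/*.jpg and data/train/fake/*.jpg
train_set = datasets.ImageFolder("data/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # new head: real vs. fake
model = model.to(device)

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one epoch, for brevity
    images, labels = images.to(device), labels.to(device)
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```

The arms-race problem is that the same recipe works in reverse: generators can be trained against whatever a detector learns to spot, which is why Ghosemajumder expects neither side to hold an advantage for long.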

In the meantime, he urges constant skepticism. “If a celebrity or a politician is in a video doing things that they claim not to have done, that they claim is a fake video, how do we actually verify whether or not their claim is true or if the video is true? I think that’s something that we’re still figuring out.”

Originally published by CGTN Europe, 15 Oct 2019