The ability to make video content of any person or event has long been a dream for storytellers and Hollywood. Now AI is making it a reality.
“The powers that be no longer have to stifle information. They can now overload us with so much of it, there’s no way to know what’s factual or not. The ability to be an informed public is only going to worsen with advancing deep fake technology.
Incriminating audio and video will hold even less weight than it already does. A government doesn’t have to lie to its people or censor its enemies when no one believes a thing to begin with.
We’re entering the Post-Information Age.”
― J. Andrew Schrecker
In the deep web world of deepfakes, an emerging AI technique that seamlessly fuses one person’s face onto another’s body in a video clip, there is no shortage of disturbing pornography made to look as if it features celebrities.
There are also plenty of fake videos made with deepfake technology that depict politicians and other newsworthy officials – including Vladimir Putin, Donald Trump, Barack Obama, Hillary Clinton, Emmanuel Macron, and Angela Merkel.
But deepfake makers have not only been creating fake celebrity porn for deepfake-obsessed Reddit users or sharing their creations with fellow deepfake artists on Discord channels. They have also used the tool to target journalists who report on deepfakes.
Freedom of speech is in jeopardy
AI researchers say malicious deepfakes could be used to threaten freedom of speech, especially in countries like Russia and China where deepfakes are already being weaponized for political propaganda.
“Putting fake news into deepfakes is an interesting idea,” says Fursy Teyssier, the developer of deepfake-generating AI tool FakeApp. “But it’s dangerous.”
One mischievous way to put fake news into deepfakes is to generate fabricated scenes and overlay text on them.
Imagine a simulated president appearing to utter a racial slur or declare war on another country. Then imagine how quickly those videos would spread on social media.
A short history of deepfakes
The term deepfake was coined in 2017 by a Reddit user who posted a video clip with actor Nicolas Cage’s face superimposed onto actor Emma Watson’s body. The process has since become easier for people with no coding skills to do themselves thanks to FakeApp, which launched at the end of January.
Since deepfakes first started spreading online, the deepfake community has been self-regulating to keep deepfakes benign and mostly satirical.
Deepfakes are being used for extortion
“There is a code of ethics in deepfakes,” says Teyssier. “You don’t want to ruin the fun for everyone.”
This includes no blackmailing or selling deepfake clips that could be used for extortion purposes.
Teyssier, who lives in France, recently received an email from someone claiming to have deepfakes of Nicolas Cage engaging in sex acts with his wife, and offering to sell them for $500 in bitcoin. While deepfake makers are trying to police their own communities, it is much harder to stop makers who operate outside the community’s forums and channels.
“The deepfake community is doing a pretty good job self-regulating,” says Kevin Eykholt, CEO of deepfake news company Mad Network. “But deepfakes made by anyone with the technical know-how, then distributed online – or worse, offline – pose a serious threat to society.”
The use of deepfakes as weapons
Eykholt warns that deepfake technology could be used to cause serious harm to its subject.
In China, deepfakes have been weaponized against political dissidents in an attempt by the Chinese government to silence them. In Russia, deepfakery was recently used in an attempt to discredit journalist Yulia Latynina, who says she was targeted with fake videos made to look as if they came from her ex-boyfriend.
A deepfake of Latynina engaging in sex acts with an alien creature was posted on her website after she denounced Putin for his alleged ties to organized crime.
“Deepfakes are now being used by authoritarian regimes as a new method of censorship,” says Eykholt. “It’s pretty scary.”
Deepfakes and ethical concerns
Deepfake makers now worry that their community is putting innocent people at risk by releasing deepfakes into the wild before good detection technology, laws, and policies exist to protect subjects from malicious deepfakes.
“There are deepfakery sites popping up all the time,” says Teyssier. “But deepfake makers cannot control deepfakery. It’s like unleashing a plague.”
The deepfake community is discussing the porn deepfakes made of public officials, but deepfakes of everyday people are the true danger. Teyssier, who has deepfaked celebrities including Lupita Nyong’o and Megan Fox, says he won’t deepfake an average person, in part because it would be unethical.
“I won’t do ordinary people,” says Teyssier. “There needs to be some kind of ethical rules around deepfakes so that we don’t harm innocent people.”
Meanwhile, FakeApp is becoming more accessible to non-technical users with no scruples, which deepfake makers say is the real issue.
“The deepfake community knows deepfakes can be used as a weapon and is trying to stop deepfakery,” says Eykholt. “But deepfakers don’t want the technology limited to them. If we try to limit deepfake technology, people will just use it for evil.”
FakeApp, an AI face-swapping tool built on deep autoencoder networks, was released at the end of January by former Uber software engineer Andrew Rabinovich, who wanted the general public to have access to deepfake video-editing tools, though not for malicious use.
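Although deepfake tools are sometimes described as GANs, the core of early face-swapping software was a shared-encoder autoencoder: one encoder learns features common to all faces, while a separate decoder is trained per identity, and at inference time the decoders are swapped. Below is a toy sketch of that idea in NumPy; the linear layers, shapes, and names are illustrative assumptions, not FakeApp’s actual code.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, LATENT = 64, 16  # toy "image" vector and latent sizes (illustrative)

# One shared encoder plus a separate decoder per identity,
# all reduced to single linear layers for clarity.
enc = rng.normal(scale=0.1, size=(LATENT, DIM))
dec_a = rng.normal(scale=0.1, size=(DIM, LATENT))
dec_b = rng.normal(scale=0.1, size=(DIM, LATENT))

def train_step(x, dec, lr=0.005):
    """One gradient step on the reconstruction loss ||dec(enc(x)) - x||^2."""
    global enc
    z = enc @ x        # encode
    err = dec @ z - x  # reconstruction error
    grad_enc = np.outer(dec.T @ err, x)  # dL/d(enc) = dec^T err x^T
    dec -= lr * np.outer(err, z)         # dL/d(dec) = err z^T (updated in place)
    enc -= lr * grad_enc
    return float(err @ err)

# Train both decoders against the shared encoder on toy "faces".
face_a = rng.normal(size=DIM)
face_b = rng.normal(size=DIM)
losses = []
for _ in range(300):
    losses.append(train_step(face_a, dec_a))
    train_step(face_b, dec_b)

# The swap: encode person A's face, decode it with person B's decoder.
swapped = dec_b @ (enc @ face_a)
print(swapped.shape)  # prints (64,)
```

Because both identities pass through the same encoder, the latent code captures pose and expression that either decoder can render, which is what makes the swap look coherent frame to frame.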
“I am starting this thread for requests from members of the deep learning community,” wrote Rabinovich on a deepfake message board. “If you have deep learning projects that need massive amounts of data to work on, I will add them.”
Within 48 hours, FakeApp had been downloaded one thousand times. One week later it was downloaded 35,000 times and the deepfake community was buzzing about what deepfakes were possible with FakeApp.
There are already deepfake porn videos from FakeApp being shared across deepfake channels. In fact, deepfakers were sharing links to their latest creations via Reddit private messages, hoping fellow deepfakers would get a thrill from the videos without giving away where the original source footage could be downloaded.
This is exactly why Teyssier set up his channel on deepfakes.club: to give deepfakers a secure space to share their creations without exposing the source video. It is also one of the reasons deepfake makers are beginning to think about encrypting deepfake media before sharing it online, to protect subjects from malicious reuse.
Deepfakes are a new frontier in the realm of digital technology.
They have been used for extortion, but also for good causes, such as creating footage from never-before-seen angles to show what war looks like or what it’s like inside an MRI scanner.
The videos can be created with very little effort, and they can be manipulated into fake news stories that spread across social media networks with little scrutiny.
With the ability to alter audio and video, anyone could be a victim or perpetrator at any moment. This should concern everyone who values their right to express themselves freely while not being censored by others.
If freedom of speech is already under attack thanks to trolls, deepfakes will make things worse: once there is no way to know whether the images we see online are real, everything becomes unreliable.
This phenomenon should worry anyone who cares about democracy and free speech at large.
There are ethical concerns about using this technology, but we need only look back at the history of attacks using photoshopped images and doctored videos before deciding whether these new developments will help society progress technologically or destroy our freedoms altogether.