If you've seen Carrie Fisher in Star Wars: The Rise of Skywalker, a film released after she died, you know how convincing this kind of digital resurrection can be. Now imagine someone decided to put YOUR face on Carrie Fisher's body. Suddenly you're a Hollywood film star! That's a deepfake.
Deepfakes are manipulated media produced by advanced AI, and they can be created by anyone with an internet connection and a little time. Josh Clark and Chuck Bryant of the podcast "Stuff You Should Know" explain where deepfakes came from and how they endanger both our personal lives and the broader political climate.
Deepfakes rely on artificial neural networks, typically arranged in a design called a generative adversarial network (GAN). A GAN's two parts, the generator and the discriminator, learn by competing with each other.
How does this work? Chuck and Josh illustrate it with cats. Tell the software to produce cats, and the generator creates candidate cat images within the parameters you set. At first they look obviously fake, and the discriminator, comparing them against real cat photos, flags them. Each round, the generator adjusts its parameters to fool the discriminator, and the discriminator adjusts to catch the improved fakes, until eventually it can no longer tell the generator's cats from the real thing. Imagine a piece of technology able to fool itself!
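To make that generator-versus-discriminator loop concrete, here is a minimal sketch of a GAN in PyTorch. It learns to mimic a toy 2-D point cloud rather than cat photos, and the network sizes, learning rates, and training length are illustrative assumptions of mine, not anything from the episode:

```python
# Minimal GAN sketch: the generator learns to mimic a toy "real data"
# distribution while the discriminator learns to tell real from fake.
import torch
import torch.nn as nn

def real_batch(n=128):
    # Stand-in for "real cat photos": noisy points on the unit circle.
    theta = torch.rand(n, 1) * 6.2832
    return torch.cat([theta.cos(), theta.sin()], dim=1) + 0.05 * torch.randn(n, 2)

generator = nn.Sequential(                     # random noise -> fake sample
    nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
discriminator = nn.Sequential(                 # sample -> real/fake logit
    nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    # 1) Train the discriminator to label real samples 1 and fakes 0.
    real = real_batch()
    fake = generator(torch.randn(128, 8)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(128, 1)) +
              loss_fn(discriminator(fake), torch.zeros(128, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to make the discriminator call its fakes real.
    fake = generator(torch.randn(128, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(128, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

    if step % 500 == 0:
        print(f"step {step}: d_loss={d_loss.item():.3f} g_loss={g_loss.item():.3f}")
```

In principle, training is done when the discriminator's accuracy drops to a coin flip: it can no longer distinguish the generator's fakes from the real data, which is exactly the stopping point the cats example describes.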
Here's where things get a little scary: people don't just use deepfakes to generate cats. There are far more destructive uses for this AI.
Senator Marco Rubio has warned that deepfake videos pose a national security threat, one that could trigger war or even nuclear disaster. Imagine a foreign power seeing a video of the U.S. President declaring war on its country, or footage of a massive explosion over one of its cities. Would it react before anyone could verify the video, perhaps with a military response?
Deepfakes are disrupting courtrooms too. In 2019, a woman in a custody battle told the court she had a recording proving how dangerous her ex-spouse sounded. It turned out she didn't: she had doctored the recording with widely available software to make it sound like he was making threats. Courts are now grappling with how to verify digital evidence and hold fair trials.
I believe a clean chain of custody for digital files will become increasingly crucial. If we can trace a recording from the moment it was captured through every later copy, transfer, conversion, and enhancement, courts will stand a far better chance of keeping deepfakes out of evidence.
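One minimal sketch of what that could look like, assuming a simple hash-chained custody log (the function names like append_entry and verify_chain, and the entry fields, are my own hypothetical choices, not any court standard):

```python
# Sketch of a hash-chained custody log for a digital file.
# Each entry records an action plus the SHA-256 of the file and of the
# previous entry, so tampering with any step breaks the whole chain.
import hashlib
import json
import time

def file_sha256(path):
    # Hash the file in chunks so large recordings don't exhaust memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def append_entry(log, path, action, actor):
    prev = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "action": action,            # e.g. "captured", "copied", "converted"
        "actor": actor,
        "time": time.time(),
        "file_hash": file_sha256(path),
        "prev_hash": prev,
    }
    # Seal the entry by hashing its own contents.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)
    return log

def verify_chain(log):
    # Recompute every hash; any edit to a file or an entry shows up here.
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev_hash"] != prev or digest != entry["entry_hash"]:
            return False
        prev = entry["entry_hash"]
    return True
```

A court-ready system would need digital signatures and trusted timestamps on top of this, but the idea is the same: every copy, transfer, or conversion appends an entry, and verify_chain exposes any break in custody.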
The cat-and-mouse game between fakers and detectors will inevitably continue. The forensic analysts, programmers, and scientists working to detect deepfakes have their hands full.