“Deepfake” Technology Can Make Convincing Fake Videos of Real People

Unnatural Eye and Jaw Movements Are Ways to Identify a Deepfake

KARL WINTER: What if we told you these scenes of Carrie Fisher in Star Wars: The Rise of Skywalker are not real? Fisher’s body is created digitally around old clips of her face. This Richard Nixon speech about a failed moon landing is also not real. These are deepfakes — or digital recreations of people made with artificial intelligence. Syracuse University Master’s film student Sara Brinsfield says this type of technology could be dangerous.

SARA BRINSFIELD: I don’t want my deepfake to say something that I personally would not say, especially if it’s a bigger celebrity or a bigger actor or actress.

WINTER: Deepfakes today can allow actors like Fisher to appear in movies after their death, create fake models for advertisements, and even age or de-age people. Brinsfield says there are ways to spot the differences between a fake clip and a real one.

BRINSFIELD: Whenever you put the deepfake over someone or something, it loses that human quality to it. That’s the best way I can explain it.

WINTER: In a normal human conversation, you probably wouldn’t be looking at my eye movement, or my jaw movement, or my facial expressions. But if you keep an eye on some of those things in a video you see on the Internet, you may be able to catch a convincing deepfake.

WINTER: Jason Webb teaches visual effects at Syracuse University. Webb agrees there are telltale signs that you’re being tricked.

JASON WEBB: I’m usually looking at the jawline, to see how their jawline moves. The second thing I’m looking at, if I have any question about it: one of the most infamous ones is a deepfake of President Barack Obama that showed how lifelike it could be, but if you look at the jawline, it was separating in a couple of spots, because they did not motion track properly.

WINTER: Unnatural eye movement is another signal that a video may not be what it seems. That face swap, and this one, were cheap, quick, and not very convincing. But as machine learning improves and deepfakes get better, Webb says the counter-technology is struggling to keep up.

WEBB: We have a technology that’s blown through our expectations of what we can do. What we used to dream, the technology could not do. And now we can’t dream fast enough what the technology can do.

WINTER: For Brinsfield, the moral of the story is simple: don’t always trust your eyes.

BRINSFIELD: Media literacy is the biggest thing because I’m not believing everything that I see on the internet.

WINTER: That way, media consumers can make sure the Age of Information does not become the Age of Disinformation. Reporting in Syracuse, Karl Winter, NCC News.

SYRACUSE, N.Y. (NCC News) – Fake videos of real people saying things they have never said are becoming more prevalent and convincing.

At the height of the Russia-Ukraine conflict, the Russian government produced a manipulated video of Ukrainian President Volodymyr Zelenskyy appearing to call for a ceasefire.

In April 2018, BuzzFeed released a fake video that appeared to show President Barack Obama delivering a disclaimer about internet hoaxes.

These videos, known as “deepfakes,” can be dangerous.

“I don’t want my deepfake to say something that I personally would not say, especially if it’s a bigger celebrity or a bigger actor, actress,” said Sara Brinsfield, a Master’s student studying Television, Radio and Film at Syracuse University. “So a lot of ethical rights and issues come into play there.”

The term “deepfake” refers to videos that depict digital recreations of people, generated with artificial intelligence. The concept is fairly simple: an AI model analyzes photos and video footage of a real person, then creates its own video versions of that person. A second AI model eliminates new versions that do not appear legitimate, until the creator is left with a somewhat real-looking video.
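That pairing of a generating model and an eliminating model is essentially the structure of a generative adversarial network. Below is a minimal sketch of the idea in PyTorch, using invented layer sizes and random placeholder tensors in place of real footage; it illustrates the training loop described above, not actual deepfake software.

```python
# Minimal sketch of the two-model (generator vs. discriminator) idea behind
# deepfakes. The sizes, data and hyperparameters are invented for
# illustration; real deepfake pipelines are far more elaborate.
import torch
import torch.nn as nn

IMG_SIZE = 16 * 16   # stand-in for a tiny cropped "face" image
NOISE_DIM = 32

# Generator: turns random noise into a fake face image.
generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 128), nn.ReLU(),
    nn.Linear(128, IMG_SIZE), nn.Tanh(),
)

# Discriminator: scores an image as real (1) or fake (0).
discriminator = nn.Sequential(
    nn.Linear(IMG_SIZE, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

real_faces = torch.rand(256, IMG_SIZE) * 2 - 1   # placeholder "real footage"

for step in range(500):
    batch = real_faces[torch.randint(0, 256, (64,))]
    fakes = generator(torch.randn(64, NOISE_DIM))

    # 1) Train the discriminator to separate real footage from generated frames.
    opt_d.zero_grad()
    d_loss = (loss_fn(discriminator(batch), torch.ones(64, 1)) +
              loss_fn(discriminator(fakes.detach()), torch.zeros(64, 1)))
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to produce frames the discriminator accepts as real.
    opt_g.zero_grad()
    g_loss = loss_fn(discriminator(fakes), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()
```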

Today’s deepfakes can also use voice cloning technology to create mock speeches, like the one of President Richard Nixon announcing a failed moon landing.

Not all deepfakes are nefarious; one brought Carrie Fisher back to the screen in Star Wars: The Rise of Skywalker. However, as the artificial intelligence behind them becomes more sophisticated, the technology may need more regulation, Syracuse University adjunct professor Jason Webb said.

“We have a technology that’s blown through our expectations of what we can do,” said Webb, who teaches visual effects classes. “What we used to dream, the technology could not do. And now we can’t dream fast enough what the technology can do. So we’ve got to figure out how to best handle these situations, and we still have a lot of learning to do.”

Brinsfield and Webb provided a few tips for spotting deepfakes; a rough version of the eye check is sketched in code after the list:

  • Look for dead eyes: Natural eye movement is difficult to replicate, and some deepfakes’ eyes do not move at all
  • Watch the jawline: When a face is superimposed onto a different body, the jawline may separate unnaturally
  • Do not trust everything you see on the Internet: Especially when viewing a video about a politician or celebrity, make sure it comes from a reputable source
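As an illustration of that eye check, the sketch below computes a per-frame “eye aspect ratio,” a common blink measure, and flags clips in which the eyes never appear to close. It assumes six eye landmark points per frame have already been extracted by some face-landmark detector, and the 0.2 threshold is an illustrative assumption rather than a validated detection rule.

```python
# Rough sketch of the "dead eyes" heuristic: flag clips whose eyes never blink.
# Assumes each frame supplies six (x, y) eye landmark points, ordered around
# the eye, from some face-landmark detector; the threshold is a guess.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: array of shape (6, 2). The ratio drops sharply when the eyelid closes."""
    vertical = (np.linalg.norm(eye[1] - eye[5]) +
                np.linalg.norm(eye[2] - eye[4]))
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def never_blinks(frames_of_eye_landmarks, closed_threshold: float = 0.2) -> bool:
    """True if the eye aspect ratio never falls below the threshold in the clip,
    i.e. the eyes never appear to close, a possible sign of a deepfake."""
    ratios = [eye_aspect_ratio(np.asarray(eye)) for eye in frames_of_eye_landmarks]
    return min(ratios) > closed_threshold
```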

