Fake video could break your trust in the news entirely

Deep Video Portraits are already beating out deepfakes for creepy cultural dominance.

You're looking at a screen right now. Imagine, if you will, that a FaceTime or Skype call from a loved one pops up in the right-hand corner. You see their familiar face, and instantly your heart races just a little because you know it's them.

But they don't look happy. They look sad. They're in trouble, they say, and they want you to wire them some money. They need it right now, they say, so you grab your credit card and fire off a money transfer through Western Union. They smile, thank you, and the call ends. Then you get a call from your loved one asking what you want for dinner. You ask them about the trouble they were in. "What trouble?" they say. And then it hits you: you've been duped by what's being called a deepfake, a video likeness designed to mimic them exactly.

Deepfakes are becoming more and more common, and the technology is improving dramatically year after year. In 2017, a deepfake video of Obama made waves when it was revealed to have been made by A.I. In April of this year, actor Jordan Peele and BuzzFeed created an even more realistic deepfake video in which Obama (as voiced by Peele) calls President Trump a 'dipshit'. The whole thing was wrapped warmly as a warning against fake news.

But that's perhaps not the scariest part. The videos are getting better and better. 

Deep Video Portraits, a system developed by researchers at Stanford, the Technical University of Munich, the University of Bath and others, needs only a single minute-long video clip (or about 2,000 photographs) of a person to draw from to create an almost indistinguishable fake video. It wouldn't be hard at all to get a couple of voice actors together with Deep Video Portraits technology and create a video of Donald Trump and Vladimir Putin arguing for the mass extermination of a race of people. Deep Video Portraits are the much, much scarier older sibling of deepfakes: harder to detect and easier to make. Even Michael Zollhöfer, the visiting Stanford professor who helped create Deep Video Portraits, argues for better digital forensics once this technology becomes more mainstream:

For example, the combination of photo-real synthesis of facial imagery with a voice impersonator or a voice synthesis system, would enable the generation of made-up video content that could potentially be used to defame people or to spread so-called ‘fake-news’. Currently, the modified videos still exhibit many artifacts, which makes most forgeries easy to spot. It is hard to predict at what point in time such ‘fake’ videos will be indistinguishable from real content for our human eyes. 

The recently presented systems demonstrate the need for sophisticated fraud detection and watermarking algorithms. We believe that the field of digital forensics will receive a lot of attention in the future. We believe that more funding for research projects that aim at forgery detection is a first good step to tackle these challenges. In my personal opinion, most important is that the general public has to be aware of the capabilities of modern technology for video generation and editing. This will enable them to think more critically about the video content they consume every day, especially if there is no proof of origin.

So as you can see, even the people who made the technology are aware of its dangers. The full paper is here, should you want to read the whole thing. 

And I hate to point it out, or even give credence to it, but deepfakes are already wildly rampant in pornography. Whole websites are dedicated to fake celebrity pornography (easily googleable, but absolutely 100% NSFW), and the results really and truly are uncannily accurate. Again, it's easy to imagine this being done to anyone's spouse and used for blackmail. Not that I'm giving anyone ideas that haven't been actualized already; even Pornhub has blocked deepfakes.

What does this mean for you? Perhaps invest in a digital video forensics lab. And, for what it's worth, perhaps trust more mainstream news sources, even if that means reaching across the aisle and dabbling in news from different bubbles. Live in a liberal bubble? Maybe check out the Daily Caller once in a while. Love Fox News? Watch CNN. Somewhere there's a middle ground that everyone is fighting to control. It might sound crazy, but fringe elements have far less to lose and more to gain from these fakes. 


NYTimes exposé reveals how Facebook handled scandals

Delay, deny and deflect were the strategies Facebook used to navigate the scandals it has faced in recent years, according to The New York Times.

  • The exhaustive report is based on interviews with more than 50 people with ties to the company.
  • It outlines how senior executives misled the public and lawmakers about what the company had discovered regarding privacy breaches and Russian interference in U.S. politics.
  • On Thursday, Facebook cut ties with Definers Public Relations, one of the companies named in the report.

Russian reporters discover 101 'tortured' whales jammed in offshore pens

Protected animals are feared to be headed for the black market.

  • A Russian news network discovers 101 black-market whales.
  • Orcas and belugas are seen crammed into tiny pens.
  • Marine parks continue to create high-price demand for illegal captures.

Unraveling the mystery behind dogs' floppy ears

Dogs' floppy ears may be part of why they and other domesticated animals love humans so much.

  • Nearly all domestic animals share several key traits in addition to friendliness to humans, traits such as floppy ears, a spotted coat, a shorter snout, and so on.
  • Researchers have been puzzled as to why these traits keep showing up in disparate species, even when they aren't being bred for those qualities. This is known as "domestication syndrome."
  • Now, researchers are pointing to a group of cells called neural crest cells as the key to understanding domestication syndrome.