Is it convincing? Well, you be the judge:
Our take: the lips and voice aren't quite in sync (though it improves after the 20-second mark or so), and it lacks the expressive range of a normal human voice.
Still, the director of a Belgian political party, sp.a, said he needed only a few hours in Adobe After Effects to put together a prank video of U.S. President Donald Trump upbraiding the Belgian government, in a passive-aggressive way, for not doing more about climate change.
Could it pass as real? With a bit more polish, perhaps.
WATCH: U.S. President Donald Trump was close to apologizing Friday for sharing three unverified anti-Muslim videos from a British far-right group last November but stopped short of actually saying sorry.
In principle, with enough recordings of a person’s voice, you can chop the recordings into individual syllables and reassemble them to say whatever you want. Public figures, like presidents, have more recordings of their statements than the rest of us, so there’s lots of material to work from.
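To make the chop-and-reassemble idea concrete, here is a toy sketch, not a real speech synthesizer: each "syllable" is stored as a short list of audio samples supposedly cut from real recordings (the syllable names and sample values below are invented placeholders), and a new phrase is built by splicing them in a fresh order.

```python
def splice(bank, sequence):
    """Concatenate stored syllable clips in the requested order."""
    out = []
    for syllable in sequence:
        # Fails if the speaker was never recorded saying this syllable.
        out.extend(bank[syllable])
    return out


# Placeholder clips standing in for audio cut from real recordings.
bank = {
    "cli": [0.1, 0.2],
    "mate": [0.3],
    "change": [0.4, 0.5],
}

# Rearranged into a phrase the speaker may never have actually said.
audio = splice(bank, ["cli", "mate", "change"])
```

Real systems face the hard part this sketch skips: smoothing the joins between clips so pitch and timing sound natural, which is exactly where current fakes tend to fall down.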
But for now, at least, the results fall down a bit on quality. Back in February, we tried out Lyrebird, a digital voice-mimicking site, with similar results. It works well enough, in the sense that the syllables get correctly rearranged, but the result doesn’t sound quite right. (Lyrebird ignores punctuation when it turns text into voice, among other things.)
Here’s what it sounded like.
The Trump video’s also a bit blurry, but phones have got us used to shaky, in-the-moment video, which provides a bit of cover for a fake’s unavoidable flaws. When we’re expecting grainy authenticity, a fake video doesn’t have to be perfect to be convincing.
We’re accustomed to trusting video, on the basis that it’s a lot harder to fake than other ways of conveying information. Video of a person shown speaking could be edited misleadingly, or taken out of context, but not straight-up created — until now.
The technology to change that isn’t perfect yet, but you can see where it’s going. (Despite the disclaimer at the end, the Belgian video fooled several people anyway.)
Is this the future of fake news? Well, it’s hard to see why it wouldn’t be, given the temptation of making a public figure seem to say something discreditable or incriminating, and our relative lack of filters.
It would be wise to keep a wary eye out for the better-quality ones we can expect to come along.
WATCH: Several videos posted to social media which purported to show the Alaska earthquake and resulting tsunami were later proven to be misrepresentations.
© 2018 Global News, a division of Corus Entertainment Inc.