
When Seeing Isn't Enough to Believe

Dec 21, 2017 | 08:00 GMT

New artificial intelligence audio and video programs could revolutionize disinformation, enabling criminals and operatives alike to sow seeds of chaos like never before.

Advances in artificial intelligence video and audio programs are making it possible for filmmakers of all kinds to have any person, or at least a convincing digital replica, say and do whatever they want for whatever end -- be it entertainment or something more insidious.

(MF3d/iStock/Getty Images)

In 1896, cinema pioneers Auguste and Louis Lumière debuted a short film showing a train pulling into a station. Audience members reportedly fled the Paris theater in terror, afraid they would be run down. Though the train on screen could do them no harm, of course, the experience, and the danger, nevertheless felt real. Today, filmgoers are a savvier lot, well-versed in the many feats camera tricks and post-production can accomplish. But a certain threat still lurks in the deceptions of the cinema -- only now the dangers are often more real than the objects and actors depicted on screen.

On Dec. 11, Vice's Motherboard ran a story discussing how advances in artificial intelligence made it possible to digitally superimpose celebrities' faces onto other actors' bodies with an algorithm. The article focused on the technology's use for making fake celebrity porn videos, but the trick has come in handy for mainstream...
