Using technology to alter an actor’s appearance in a movie is entertaining (and generally done with the actor’s consent), but similar techniques are being used to fabricate versions of real politicians saying and doing things that never happened.
There is serious concern that false or misleading information spread through such video deepfakes will influence the 2020 elections, and experts in government and academia are working to find ways to detect them.
Simple manipulated videos and photos have already appeared in the 2020 campaign. Selective editing with basic software can readily alter or obscure a politician’s meaning, and changing or adding something to a still image is as easy as firing up a photo-editing program.
Deepfakes go a step further, using deep learning to synthesize realistic video — for example, mapping one person’s face and expressions onto another’s — which makes the resulting fakes far more convincing and far harder to detect.