What exactly is a deepfake? A deepfake is a technique that superimposes one person’s likeness onto existing images and videos using machine learning. You may have heard about it only recently, since the term itself was coined in 2017, but the underlying idea of digitally altering footage has been used by the film industry for many years.

The 1994 Academy Award-winning film Forrest Gump relied on expert video editing to insert actor Tom Hanks into archival footage of famous historical events. From meeting different U.S. presidents to standing next to Martin Luther King, Jr. during a well-known speech, the fabrication was both entertaining and poignant.

However, was it scary? Perhaps. Could we ever again trust what we saw with our own eyes?

Deepfakes may soon make that question urgent. The term “deepfake,” a combination of “deep learning” and “fake,” refers to a recorded video in which a real person’s face and voice have been altered so that the person appears to speak someone else’s words. That video can then be widely shared, and unsuspecting viewers may not know the difference.

Some of the most widely viewed deepfakes involved celebrities who appeared to be starring in adult videos they had never actually filmed. An infamous deepfake of former President Barack Obama portrays him using profanity while appearing to look directly into a camera for an interview, something he never recorded.

While those celebrity sex tapes and the Obama video got a lot of attention, the bigger concern is what happens when the subject is not famous and it is not obvious the video is fake. What happens when it is an executive within your company sending a video message over a messaging platform, telling you to change account numbers or passwords? What if it were your grandchild claiming to be kidnapped and needing ransom money right away? What if it were your face and voice, agreeing to have your account numbers changed or authorizing someone else to use your account?

When identity theft was first recognized as an increasingly widespread crime, victims discovered that law enforcement agencies’ hands were tied: no laws had been enacted to protect victims. As laws changed around the country in response to identity theft, more consumer protections were put in place.

A deepfake can be a crime depending on how it is used. Creating an altered image of someone in an embarrassing situation is becoming a crime in some places under “revenge porn” laws. If a deepfake is used to commit something that is already a crime, such as stealing money from a victim’s bank account or workplace, it may be covered under existing theft laws. But if someone merely posts a video of your image and voice saying things you do not agree with, it might not fall under existing identity theft laws.

Fortunately, the chances that video editors with this kind of skill set will target private citizens just for entertainment are small. The real concern is believability. As the old adage says, you cannot believe everything you read on the internet; now that goes for what you see and hear as well. Use caution and discernment before sharing content or making significant decisions based on video evidence, because it could be a deepfake.

Contact the Identity Theft Resource Center for toll-free, no-cost assistance at 888.400.5530. For on-the-go assistance, check out the free ID Theft Help App from ITRC.

