First-Ever AI Fraud Case Steals Money by Impersonating CEO

Date: 09/16/2019

The first case of AI fraud has been reported: a perpetrator created an audio clip mimicking a company CEO's voice and used it to instruct another employee at the company to release funds to the scammers.

In the world of artificial intelligence, a “deepfake” is a completely fabricated audio or video clip in which someone’s real voice or image is used in a situation the person was never in. With relative ease, skilled computer designers and editors can often create videos of a famous person saying or doing things they have never done.

Now being called a "vishing" (voice phishing) attack, this AI fraud case involved scammers who impersonated the head of a German company and called the CEO of one of its UK branches, requesting an urgent transfer of funds and promising reimbursement. The UK executive complied, sending around $243,000 to an account in Hungary. The callers made a total of three calls to the UK company, but the later requests were refused. Fortunately, the company carried insurance against this kind of AI fraud crime, and the loss was covered.

While the entire point of a deepfake is that it is very difficult to distinguish from the real thing, there are steps consumers and businesses alike can take to protect themselves from AI fraud.

Never comply with any kind of sensitive request without prior authorization.

It does not matter whether the request comes as an email, a text message or, now, an audio call. Simply take down the caller's name and the instructions, then verify them with the individual using a known contact phone number or in person.

Establish a company coding system for sensitive requests.

Institute a policy that all money transfers, file sharing or other sensitive activity must include the company “code word” in the instructions. The code should be changed frequently to avoid any threat from hackers.
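As one way to implement the frequent rotation the policy calls for, the code word could be derived from a shared secret so it changes automatically on a schedule. The sketch below is a hypothetical illustration, not an ITRC-prescribed system; the secret, word list, and daily rotation period are all assumptions for the example.

```python
import hmac
import hashlib
import time

# Hypothetical shared secret; in practice it would be distributed securely.
SECRET = b"company-shared-secret"
# Hypothetical pool of code words to rotate through.
WORDS = ["falcon", "harbor", "copper", "meadow",
         "tundra", "orchid", "granite", "willow"]

def current_code_word(secret: bytes = SECRET, period_seconds: int = 86400) -> str:
    """Return the code word for the current rotation period (daily by default)."""
    period = int(time.time()) // period_seconds
    digest = hmac.new(secret, str(period).encode(), hashlib.sha256).digest()
    return WORDS[digest[0] % len(WORDS)]

def request_is_authorized(stated_word: str) -> bool:
    """A sensitive request must carry the code word for the current period."""
    return hmac.compare_digest(stated_word, current_code_word())
```

Because the word is recomputed from the secret and the current date, a leaked word stops working after the rotation period ends, and employees never need to circulate a new word manually.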

Make sure that this information is shared throughout the company.

One of the best ways to pull off a successful phishing attack is to target a lower-level employee. It is important to make sure that everyone in the company knows and follows the security protocols.

If you are a victim of identity theft in need of assistance, you can receive free remediation services from ITRC. Call one of our expert advisors toll-free at 888.400.5530 or LiveChat with us. For on-the-go assistance, check out the free ID Theft Help App from ITRC.

