Deepfakes are a sophisticated form of digital manipulation that emerged several years ago and have improved rapidly with advances in artificial intelligence tools. They are generated using a type of machine learning called ‘deep learning’, which is where the name comes from.

In a deepfake, videos, photos, audio files, or even live conference calls with real people can be digitally altered to create a false reality. This means they can show people saying or doing things that never happened. Attackers could use a deepfake to create a convincing audio or video impersonation of a person and use it to trick a victim into providing money or information to the attacker. 

As a practical example, an attacker might create a deepfake voice message in which a company executive asks a manager to send funds to a supplier’s bank account. In the deepfake message, the impersonated executive claims that they recently lost their cellphone and so they’re calling from a new number. The manager thinks the executive’s voice sounds slightly different but assumes they just have a cold. After the manager goes ahead with the money transfer, they realise the message wasn’t genuine and the funds have vanished.

Deepfakes may sound like science fiction, but they are a real threat. In 2024, a finance worker in Hong Kong was tricked into transferring US$25 million to fraudsters who had invited the worker to a video call with a deepfaked CFO and some AI-generated employees. 

These kinds of events are not yet common here in New Zealand, but as the technology that enables them develops, we may see them happening more often.

Things to do
  • Be alert and trust your instincts: deepfakes may have small imperfections. A deepfaked voice may sound ‘off’, or a deepfaked video could have slightly unnatural human movement.
  • Be very cautious about messages from unknown sources. Always take time to verify through another channel that a message is genuine before taking any action.
  • Have agreed offline processes for verifying important decisions made in a digital environment. For example, if you’re unsure about whether a message is genuine, call the person back on a number you trust and ask them.
  • Consider adding digital watermarks to your images and videos. These can make it harder for criminals to alter your media.
  • Use privacy settings to restrict public access to your social media accounts as much as possible. This gives attackers fewer opportunities to obtain footage or voice samples they could use to impersonate you.