AI Deepfake Video Phishing – Multinational Financial Firm Loses $25 Million

We wanted to share our perspective on a press release issued by the Hong Kong police on February 4th, 2024, regarding the theft of $25 million from an investment firm’s Hong Kong office, carried out using real-time AI deepfake video. A deepfake video phishing attack uses AI to create sophisticated, convincing communication, in the form of real-time video and/or audio, designed to mimic the likeness and voice of real, known individuals.

In this case, an employee who had received a phishing email was tricked into paying out $25 million to scammers who impersonated his company’s Chief Financial Officer and other colleagues on a Zoom video conference call held to confirm the transfer request.

An example of a deepfake video of Bill Gates: https://www.youtube.com/watch?v=WzK1MBEpkJ0

To help mitigate deepfake attacks, please review these recommendations: 

  • Establish clear verification protocols: Define the procedures for verifying financial requests and the identity of everyone involved in the request process. For example, verify the authenticity of approvers and requestors on video calls by asking them to move their heads or to answer questions based on non-public knowledge that confirms their identity. Consider agreeing on a pre-defined private keyword or phrase between approver and requestor to further validate a transaction whenever the communication takes place over an electronic medium (see the sketch after this list).

  • Employee training: Educate your team about the risk of deepfake video and audio attacks and how to spot them. For example, watch for unnatural eye movements, odd facial expressions, or mismatched lip-syncing. Be aware that the quality of deepfake video and audio is improving.

  • Monitor your online content: Limit the amount of personal information, photos, audio, and video that could be used to create a deepfake impersonation. Adjust your privacy settings on social media and other platforms to restrict who can access your content.
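As a rough illustration of the shared-secret idea in the first recommendation, the sketch below shows how a pre-agreed phrase could support a challenge-response check without the phrase ever being spoken on the call itself. The names, phrase, and flow are purely illustrative assumptions, not part of any specific HalcyonFT product or standard protocol.

```python
# Illustrative sketch only: a shared-secret challenge-response check.
# PRE_SHARED_PHRASE, issue_challenge, and verify_response are hypothetical names.
import hmac
import hashlib
import secrets

# Agreed in person or over a separately trusted channel; never sent by email
# or spoken aloud on the video call itself.
PRE_SHARED_PHRASE = b"example-phrase-agreed-offline"


def issue_challenge() -> str:
    """The approver generates a fresh, random challenge for this request."""
    return secrets.token_hex(16)


def expected_response(challenge: str) -> str:
    """Both parties derive the same response from the phrase and the challenge."""
    return hmac.new(PRE_SHARED_PHRASE, challenge.encode(), hashlib.sha256).hexdigest()


def verify_response(challenge: str, response: str) -> bool:
    """The approver checks the requester's answer using a constant-time compare."""
    return hmac.compare_digest(expected_response(challenge), response)


if __name__ == "__main__":
    challenge = issue_challenge()          # read aloud or sent to the requester
    answer = expected_response(challenge)  # requester computes this on their own device
    print("verified:", verify_response(challenge, answer))
```

Because only a response to a one-time challenge is exchanged, an attacker recording or imitating the call never learns the phrase and cannot replay a previous answer.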

If you would like to know more about this particular scam, please click the following link: https://www.cnn.com/2024/02/04/asia/deepfake-cfo-scam-hong-kong-intl-hnk/index.html 

If you have any questions or concerns, please do not hesitate to reach out to our support team. 

Thank you for your attention to this matter.

Best Regards,

— Your HalcyonFT Team
