CEO Fraud with Voice Deepfakes

In this high-impact social engineering attack, criminals use AI voice cloning (deepfake audio) to impersonate well-known executives such as the CEO or CFO. The cloned voice calls, or leaves urgent voicemails for, employees in the Finance, Treasury, or IT department, requesting urgent wire transfers, gift card purchases, cryptocurrency payments, or remote access credentials.

These attacks skyrocketed during 2025 to 2026 and now rank among the fastest-growing and most costly forms of Business Email Compromise (BEC), with the average successful attack costing between $100,000 and several million dollars.

How an Attack Works
1. Voice cloning preparation: Attackers need only 3-60 seconds of a real executive's voice to build a clone, harvested from sources such as:
a) Public earnings calls, interviews, or podcasts featuring the executive.
b) Videos the executive has posted or published to YouTube or LinkedIn.
c) Recordings of internal company town hall meetings that have leaked or been obtained from public sources.

2. Target selection: Usually the finance department (AP/treasury team) or IT/helpdesk. Attackers use LinkedIn, the company website, and breached data to map reporting lines.
3. The call or voice message
a) Caller ID spoofed to look like the executive’s mobile / office number.
b) The cloned voice reproduces the real person's characteristics (e.g., stress, accent, rhythm).

c) Common pretexts include:
1) "I need you to expedite a $250,000 wire transfer to this new vendor today; it's short notice, I know."
2) "My phone isn't working, so can you buy Apple gift cards ASAP so I can make an emergency payment to a client?"
3) "We're being audited by an accounting firm and need to run some compliance testing. Can you share your screen with me via AnyDesk so I can perform the tests?"

4. Pressure tactics:
a) Urgency ("go now before the market closes")
b) Confidentiality ("don't tell anyone")
c) Authority ("it comes straight from me")

5. Monetizing:
a) Wires are routed to a mule account and then through a crypto mixer;
b) Gift cards are exchanged for cash almost immediately on underground markets; and
c) Remote access enables data theft and credential harvesting, which leads to further fraud.

Real-World Impact and Reported Cases (2024 to 2026)
1. February 2024, Hong Kong finance firm (copycat attacks continued into 2025): An employee joined a video call in which deepfakes of the CEO and CFO authorized a transfer of approximately HK$200 million (about US$25.6 million). This was the first publicly confirmed case of large-scale executive fraud using a deepfake.
2. Mid-sized US/EU firms: The FBI's IC3 and Europol noted dozens of cases throughout 2025 in which a cloned CEO voice requested transfers of between $100,000 and $2 million, or gift cards.
3. Small-to-mid business pattern: Attacker clones the CEO's voice from a recent earnings call → calls an AP clerk → "I'm stuck in a meeting, send $85,000 to this new vendor account." → funds gone in minutes.

Indicators of a Possible Scam
1. An unexpected voice or video call from the CEO asking for cash, gift cards, remote access, or sensitive information.
2. Urgency and secrecy: "You have to do this immediately, and don't tell anyone because I'm traveling."
3. A request to circumvent standard process: no purchase order, no approval workflow, and an unusual payment method.
4. Caller ID appears correct, but the caller refuses verification questions (e.g., "What was the code word used in last month's board meeting?").
5. The voice sounds familiar but has suspicious elements (e.g., odd pauses, irregular cadence, irregular background noise).
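To make the indicators above concrete, here is a minimal sketch of how a finance or security team might record them as a checklist during an incident triage. The field names, scoring rule, and action labels are assumptions made for illustration, not a real detection product:

```python
# Hypothetical red-flag checklist for an incoming payment/access request.
# Field names, the >= 2 threshold, and action labels are illustrative only.
from dataclasses import dataclass


@dataclass
class IncomingRequest:
    unexpected_exec_call: bool       # indicator 1: surprise call from "the CEO"
    urgency_and_secrecy: bool        # indicator 2: "do it now, tell no one"
    bypasses_standard_process: bool  # indicator 3: no PO, no approvals
    refuses_verification: bool       # indicator 4: dodges challenge questions
    voice_anomalies: bool            # indicator 5: odd pauses, cadence, noise


def red_flag_score(req: IncomingRequest) -> int:
    """Count how many of the five scam indicators this request triggers."""
    return sum([req.unexpected_exec_call, req.urgency_and_secrecy,
                req.bypasses_standard_process, req.refuses_verification,
                req.voice_anomalies])


def recommended_action(req: IncomingRequest) -> str:
    """One indicator already warrants a call-back; two or more, escalate."""
    score = red_flag_score(req)
    if score >= 2:
        return "escalate-to-security"
    if score == 1:
        return "hang-up-and-call-back"
    return "proceed-with-normal-checks"
```

The point of the sketch is that even a single indicator should trigger the hang-up-and-call-back habit described later in this post; the combination of two or more should go straight to the security team.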

Steps to Protect Yourself from a Possible Cyber Scam
1. Enforce a mandatory verbal challenge phrase: All staff learn a rotating list of challenge words/phrases each month, so that only the actual CEO can answer correctly (e.g., "If you are the CEO, tell me which restaurant we ate at after last month's board meeting.").
2. Require dual approval for all transactions above a $10,000-$50,000 threshold, verified via an out-of-band call to the executive's known mobile number (never the incoming caller's number).
3. Establish a "no gift cards, no cryptocurrency" policy: the company will never use gift cards, cryptocurrency, or similar payment methods in business transactions.
4. Train employees to disconnect and call back: on any unusual voice request, hang up and call the CEO back on the official mobile/Teams number.
5. Limit public voice exposure: executives should avoid posting clear solo audio/video on public social media, and can use background noise or music on public calls.

6. Technical controls
a) Deepfake detection tools (e.g., Hive Moderation, Resemble AI's detector, Microsoft Video Authenticator).
b) Caller-ID authentication (STIR/SHAKEN where available).
c) Email / SMS filters that flag urgent internal requests.
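The dual-approval and call-back rules in steps 2 and 4 can be expressed as a single policy check. The sketch below is a simplified illustration, assuming a directory of phone numbers kept on file (the threshold, directory contents, and function name are invented for this example):

```python
# Simplified sketch of the dual-approval + out-of-band verification policy.
# KNOWN_DIRECTORY, the threshold, and all values are assumptions for this
# example, not a real implementation.

KNOWN_DIRECTORY = {            # numbers on file from HR, never from caller ID
    "ceo": "+44-7700-900001",
}

DUAL_APPROVAL_THRESHOLD = 10_000  # lower bound of the $10,000-$50,000 range


def payment_allowed(amount: float, approvals: set[str],
                    callback_number: str, requester: str) -> bool:
    """A payment proceeds only if it is below the threshold, or it has two
    distinct approvers AND was verified by calling back the number on file
    for the requester (not the number the request came from)."""
    if amount < DUAL_APPROVAL_THRESHOLD:
        return True
    verified_out_of_band = callback_number == KNOWN_DIRECTORY.get(requester)
    return len(approvals) >= 2 and verified_out_of_band
```

The key design choice is that the call-back number comes from an internal directory, so a spoofed caller ID or a new number supplied by the attacker can never satisfy the check.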

Key Takeaways
CEO voice deepfake fraud has become, as of 2026, one of the most believable and costly types of social engineering attack. An attacker needs only a few seconds of publicly available audio to create a near-exact copy of someone's voice, one that will fool nearly anyone under duress.

The best way to stop this type of fraud is an organization-wide culture in which every urgent request is verified offline, either through a prearranged challenge phrase or a direct call-back to the requester, with absolutely no exceptions. A single question, "How do I verify this is you?", could prevent a $1,000,000 loss if the request turns out not to be legitimate.

© 2016 - 2026 Red Secure Tech Ltd. Registered in England and Wales under Company Number: 15581067