FBI warns of deepfake spear phishing campaign targeting Americans
The FBI has issued a public warning about a sophisticated spear phishing campaign that uses AI-generated content to impersonate officials from the United States government. The campaign involves emails, phone calls, and video messages designed to manipulate victims into sharing sensitive information or transferring money.
In an alert published by the Internet Crime Complaint Center (IC3), the FBI explains how cybercriminals are combining social engineering tactics with deepfake technology to deceive and pressure their targets.
AI is reshaping impersonation scams
According to the FBI, attackers are pretending to be government representatives, including federal law enforcement and regulatory officials. They are contacting victims through phone calls, SMS messages, and email.
Some of the most convincing scams involve synthetic voice and video content. In these cases, the fraudsters use AI tools to create realistic videos and audio messages that appear to come from trusted authorities. The content often includes false claims of investigations or legal charges, pushing the victim into making urgent decisions.
This approach represents a significant shift in phishing tactics. By using artificial intelligence, criminals can now launch more personalized and convincing attacks at a much larger scale. Learn how phishing has evolved and how to recognize modern attacks in our comprehensive guide to phishing threats.
Familiar tricks with a modern twist
Although impersonation scams have existed for years, the use of deepfake content adds a dangerous new layer. Previously, attackers relied on fake email addresses or official-sounding language. Now, they can create content that looks and sounds like a real person.
In one reported incident, a victim received a video call from someone who appeared to be an FBI agent. The individual used professional language and referred to specific personal information, likely gathered from public sources or social media. Believing the call was real, the victim sent a large sum of money to a foreign account.
To understand the broader implications of this evolving threat, explore how deepfakes are shaping new cyber risks for businesses.
Cybersecurity starts with people
This type of scam highlights the importance of security awareness. At Moxso, we emphasize that even the best technical tools cannot fully protect an organization if its people are not prepared.
Attackers are targeting emotions such as trust and fear. When a video or voice message seems real, the pressure can cause people to act without verifying the situation.
Organizations need to focus on training employees to question unusual or urgent requests. People should be encouraged to double-check information using trusted sources and to be cautious when dealing with messages that demand immediate action or financial transactions.
How to protect yourself
The FBI advises taking the following steps to avoid falling victim to these scams:
Do not respond to unexpected calls, messages, or videos that claim to be from government officials.
Confirm identities using official websites or publicly listed phone numbers rather than relying on the contact details provided in the message.
Watch out for messages that use fear or urgency to demand immediate action, especially when money or personal information is involved.
Report any suspicious contact through the IC3 portal or by contacting local law enforcement.
What this means for the future
This campaign is another example of how AI is changing the cybersecurity landscape. Criminals are adapting quickly and using powerful tools to increase the reach and impact of their attacks.
At Moxso, we believe that understanding how these attacks work is a key part of protecting yourself and your organization. Security is no longer just about software and systems. It is also about human behavior, critical thinking, and the ability to recognize manipulation.

Sarah Krarup
Sarah studies innovation and entrepreneurship with a deep interest in IT and how cybersecurity impacts businesses and individuals. She has extensive experience in copywriting and is dedicated to making cybersecurity information accessible and engaging for everyone.