Scammers are using AI to impersonate senior officials, warns FBI
The FBI warns that criminals are using AI-generated voice and text messages to impersonate senior US officials. The targets are primarily government personnel and their contacts, and the attackers send malicious links to steal personal information. The public is urged to stay vigilant, verify identities, and avoid clicking suspicious links.

Voice cloning for vishing

The FBI has issued a warning about an ongoing malicious text and voice messaging campaign that impersonates senior US officials.

The targets are predominantly current or former US federal or state government officials and their contacts. In the course of this campaign, the cybercriminals have used text messages as well as Artificial Intelligence (AI)-generated voice messages.

After establishing contact, the criminals often send targets a malicious link which the sender claims will take the conversation to a different platform. On this messaging platform, the attacker may push malware or introduce hyperlinks that direct targets to a site under the criminals’ control in order to steal login information, like user names and passwords.
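To illustrate why these links work, here is a minimal, hypothetical sketch (not from the FBI advisory) of how a lookalike phishing URL differs from a legitimate one: the real destination is the hostname in the link, which often merely resembles the expected domain. The domains below are invented examples for demonstration.

```python
# Minimal illustration of spotting a lookalike domain in a link.
# The "expected" domains here are hypothetical, chosen only for the example.
from urllib.parse import urlparse

EXPECTED_DOMAINS = {"login.example.gov", "mail.example.gov"}  # hypothetical allowlist

def hostname_of(url: str) -> str:
    """Return the lowercase hostname of a URL, or '' if it cannot be parsed."""
    return (urlparse(url).hostname or "").lower()

def looks_legitimate(url: str) -> bool:
    """True only if the URL's hostname exactly matches an expected domain."""
    return hostname_of(url) in EXPECTED_DOMAINS

for link in (
    "https://login.example.gov/reset",                         # genuine
    "https://login.example.gov.account-verify.net/reset",      # lookalike
):
    print(link, "->", "ok" if looks_legitimate(link) else "suspicious")
```

The second link prints "suspicious" because its real hostname ends in account-verify.net, even though it starts with the familiar-looking name; this is exactly the kind of credential-harvesting redirect the campaign relies on.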

The AI-generated audio used in the vishing campaign is designed to impersonate public figures or a target’s friends or family to increase the believability of the malicious schemes. A vishing attack is a type of phishing attack in which a threat actor uses social engineering tactics via voice communication to scam a target—the word “vishing” is a combination of “voice” and “phishing.”

Due to the rapid developments in AI, vishing attacks are becoming more common and more convincing. We have seen reports about callers pretending to be employers, family, and now government officials. What they have in common is that the callers are after information they can use to steal money or sensitive data from the victim.

How to stay safe

Because these campaigns are very sophisticated and targeted, it’s important to stay vigilant. Some recommendations:

  • Independently verify the identity of the person contacting you, via a different method.
  • Carefully examine the origin of the message. The criminals typically use software to generate phone numbers that are not attributed to a specific mobile phone or subscriber.
  • Listen closely to the caller’s tone and word choice. Do they match those of the person who is supposedly calling you? Also pay attention to any unusual lag in the call.
  • AI-generated content has advanced to the point that it is often difficult to identify. When in doubt about the authenticity of someone wishing to communicate with you, contact your relevant security officials or the FBI for help.

If you believe you have been the victim of the campaign described above, contact your relevant security officials and report the incident to your local FBI Field Office or the Internet Crime Complaint Center (IC3) at www.ic3.gov. Be sure to include as much detailed information as possible.


We don’t just report on threats – we help safeguard your entire digital identity

Cybersecurity risks should never spread beyond a headline. Protect your—and your family’s—personal information by using identity protection.


Source: https://www.malwarebytes.com/blog/news/2025/05/scammers-are-using-ai-to-impersonate-senior-officials-warns-fbi