Scam compounds hiring “AI models” to seal the deal in deepfake video calls
Published 2026-03-24 · Source: www.malwarebytes.com

Scam compounds in Southeast Asia have already become modern slave farms, trapping victims and forcing many of them to become scammers for them. Now they’ve added another type of worker to the mix: so-called AI models.

These professional scammers conduct video calls with their targets, charming them into handing over their cash. As reported in WIRED this week, recruitment ads describe roles handling around a hundred live video calls per day, promoting romance scams and crypto hustles in industrial-scale scam operations across Cambodia, Myanmar, and Laos. 

These scam farms already rely on chat operators to ensnare scam victims via messaging apps. Many of these operators are themselves victims of trafficking, forced to work long shifts under threats of violence. They develop relationships with victims over time, exploiting loneliness or financial worries. While they work to make a victim feel special, they’re actually juggling similar text sessions with dozens of people at once. Eventually, a victim may want a video call, either to meet their imagined sweetheart or to confirm an investment opportunity is legitimate (or both). 

Chat operators might not have the ability to charm victims on video, especially when they are victims themselves, forced to work long shifts under threat of physical violence. So when a victim asks for a video call, the scam bosses call in a specialist "AI model" with strong interpersonal skills to charm the victim. Despite the name, these are real people hired to appear on video calls. AI deepfake software adjusts their looks to match the fictionalized person the victim is hoping to see.

Scam operations run recruitment ads for these models, and applicants seem plentiful. Humanity Research Consultancy, an investigative research group that tracks trafficking supply chains, identified a pitch from a 24-year-old Uzbek woman calling herself Angel. She claimed to speak four languages and to have a year's experience as an AI model, and demanded $7,000 a month for her services.

The growth of scam compounds 

How do these scam compounds even exist? According to the Australian Strategic Policy Institute, Myanmar's 2021 military coup helped fuel a fraud boom. Scam centers along the Thai–Myanmar border have more than doubled as crime syndicates expand across Myanmar, Cambodia, and Laos.

These scam centers are often tolerated because they line the coffers of local militia. But there have been some countermeasures. Raids and cross-border crackdowns have led to arrests and the movement of large numbers of suspects between countries, including operations targeting compounds such as KK Park in Myawaddy. Cambodia and Myanmar have also signalled increased efforts to tackle scam operations, although the networks remain highly resilient.

This kind of activity becomes easier as technology improves. Real-time face-swapping and deepfake tools are now good enough to support live video, not just pre-recorded clips. We've already seen real-time deepfakes used for everything from job interviews to the impersonation of banking executives in multimillion-dollar scams. What's new here is the scale: individual operators handling dozens or even hundreds of calls a day for romance scams and crypto investment fraud shows that this is now a mass-scale exploit.

How to stay safe 

Here's the problem with deepfake video: the common "tells" that let you spot it are evaporating. At one time a sure sign of an AI deepfake was someone with the wrong number of fingers or an oddly rendered hairline. In live calls you can up the ante: ask the person to turn sideways, touch their nose, or wave their fingers in front of their face. That extra visual noise is harder for deepfake software to handle.

But beware: the algorithms that produce deepfakes are getting better all the time, and are increasingly able to pass such tests. We're at the point where one deepfake researcher says many more of us will be fooled by them this year.

If you can’t fully trust what you see, fall back on what you know. Be wary of unsolicited contact, especially when someone quickly builds emotional rapport or introduces an investment opportunity. Even if a profile looks well-established or a website appears legitimate, take time to dig a little deeper.

Avoid sharing personal or financial information with someone you've only met online, and be wary of anyone who pushes you toward quick decisions or asks to move conversations off established platforms. The FBI has some sound advice on its website.

The most dangerous part of this deepfake AI model trend is that it helps scam operations cross the final frontier. A live human can close a scam that a simple chat interaction can’t. That’s why people like Angel from Uzbekistan have a job, and why you need to be more on your guard than ever. 


We don’t just report on scams—we help detect them

Cybersecurity risks should never spread beyond a headline. If something looks dodgy to you, check if it’s a scam using Malwarebytes Scam Guard. Submit a screenshot, paste suspicious content, or share a link, text or phone number, and we’ll tell you if it’s a scam or legit. Available with Malwarebytes Premium Security for all your devices, and in the Malwarebytes app for iOS and Android.

About the author

Danny Bradbury has been a journalist specialising in technology since 1989 and a freelance writer since 1994. He covers a broad variety of technology issues for audiences ranging from consumers through to software developers and CIOs. He also ghostwrites articles for many C-suite business executives in the technology sector. He hails from the UK but now lives in Western Canada.


Source: https://www.malwarebytes.com/blog/news/2026/03/scam-compounds-hiring-ai-models-to-seal-deal-in-deepfake-video-calls