Scam Inc has a new weapon | The Economist April 10th, 2026

Scam Inc Has a New Weapon: How AI-Powered Malware Is Targeting Everyday Australians

The Economist has published a sobering investigation into the next phase of the global scam industry, and its new weapon is artificial intelligence. The report, titled “Scam Inc Has a New Weapon,” details how organised criminal networks are now deploying AI-powered malware at industrial scale, targeting millions of victims simultaneously with tools sophisticated enough to fool almost anyone.

At Scam Prevention Australia, we believe you need to know exactly what this looks like because it is already happening, and it is coming for people right here at home.

The Story at the Heart of the Article

The Economist centres its investigation on a real victim, a woman we will call Ms Nguyen, whose experience illustrates exactly how these new scams operate.

Ms Nguyen received a series of messages that appeared to come from a government agency. The messages were convincing. They included official-sounding agency names, reference numbers, phone numbers, and formal language. She was told she was under investigation and linked to criminal activity through someone else’s actions.

To clear her name, she was instructed to download an app.

The app appeared legitimate. It looked like something that could have been found on an official app store. But it was malware: malicious software that, once installed, handed the criminals behind it complete access to her device.

Within a short period, they had accessed her banking credentials, drained her accounts, harvested her contact list, and collected her personal photographs and data. She lost everything she had.

This Is Not an Isolated Case

What makes The Economist’s investigation so alarming is not just the story of one victim; it is the scale of what sits behind it.

Criminal organisations, the same sophisticated syndicates that The Economist has been tracking throughout its Scam Inc series, are now running malware campaigns as an industrial operation. Artificial intelligence is being used to:

•      Continuously update and evolve malware so that security systems cannot detect it

•      Generate convincing, personalised communications that mimic government agencies, banks, and other trusted institutions

•      Deploy campaigns that can target millions of people simultaneously with messages tailored to appear legitimate

•      Create fake apps and platforms that pass basic visual inspection

Indonesian authorities identified thousands of malicious apps in a single operation. That is one country, one operation. Multiply that across the region and the world, and the scope becomes almost incomprehensible.

Where These Operations Come From

The Economist’s broader Scam Inc investigation has traced these operations to large-scale criminal compounds based primarily in Cambodia, Myanmar, and Thailand. These are not small-time fraudsters working alone from a bedroom. These are sophisticated, corporate-style criminal enterprises with hierarchies, departments, quotas, and technology teams.

The United Nations estimates that hundreds of thousands of people have been trafficked into these compounds and forced to work as scammers under threat of violence. Many people targeted by these scams do not realise that the person “contacting” them may themselves be a trafficking victim.

These operations thrive because of corruption, geographic complexity, and the sheer speed at which the technology evolves. By the time authorities identify one method, the malware has already been updated.

The Scale of the Global Scam Industry

The Economist’s Scam Inc series has consistently highlighted just how large this industry has become. Consider these figures:

•      The global scam industry is estimated to steal hundreds of billions of dollars from victims every year

•      It now rivals the global illicit drug trade in size and is growing faster

•      AI-enabled scams are reported to be significantly more profitable than traditional fraud methods

•      The FBI recorded a 33% year-on-year increase in cybercrime losses in recent years, with AI-enhanced scams driving a growing share

These are not abstract statistics. Behind every figure is a person who lost their savings, their confidence, and in some cases their will to go on.

Why This Is Different from What We Have Seen Before

For years, the public has been told to watch for the obvious signs of a scam: poor spelling, suspicious email addresses, requests for gift cards. Those warnings still matter. But The Economist’s reporting makes clear that the industry has moved far beyond those early tactics.

AI allows criminals to:

•      Generate flawless, formal written communications in any language

•      Create deepfake voices and video to impersonate real people, including family members and government officials

•      Build fake apps that appear on official platforms and pass visual inspection

•      Run hundreds of “relationships” or impersonations simultaneously, with minimal human effort required

The old model of a scammer as a lone individual sending clumsy emails is gone. What we are dealing with now is an industry.

What You Can Do Right Now

Understanding the threat is the first step. Here is what The Economist’s investigation — and our own work at Scam Prevention Australia — tells us about staying safe:

•      Never download an app from a link in a message. No legitimate government agency, bank, or institution will ever ask you to install software via a text message, email, or social media post. If you receive this instruction, stop immediately.

•      Verify independently. If you receive a message claiming to be from a government body or authority, hang up or close the message. Then call the agency directly using a number from their official website — not the number provided in the message.

•      Be suspicious of urgency. Scammers create fear and time pressure deliberately. Any message that tells you to act immediately, threatens consequences, or demands secrecy is a red flag.

•      Talk about it. The biggest protection we have as a community is awareness. Scammers rely on shame and silence. Share this article. Talk to your family. Ask questions.

•      Report it. If you believe you have been targeted or have fallen victim to a scam, contact Scamwatch (run by the National Anti-Scam Centre) or your bank immediately. Reporting helps protect others.


A Final Word

Ms Nguyen did nothing wrong. She received a message that looked real, from an organisation that appeared legitimate, asking her to take what seemed like a reasonable step. She was deceived by a system built by professionals whose full-time job is deception.

Falling for a scam is not a sign of stupidity or weakness. It is a sign that the criminals behind it are very, very good at what they do.

That is exactly why education matters. That is why Scam Prevention Australia exists. And that is why we will keep talking about this loudly, clearly, and without shame.

Kylee Dennis is the founder of Scam Prevention Australia and Two Face Investigations, a private investigation firm specialising in online romance scam identification. For more information visit www.twofaceinvestigations.au

Source: The Economist, “Scam Inc Has a New Weapon,” April 2026.
