Chainalysis Reports Surge in Crypto Scams Driven by Impersonation and AI-Enabled Fraud
Chainalysis reported that cryptocurrency scams and fraud generated an estimated $17B in victim losses in 2025, the largest annual total in its tracking, with at least $14B observed on-chain and expectations that the total will rise as additional illicit addresses are identified. The report attributes the increase to the continued industrialization of scam operations and infrastructure, including phishing-as-a-service, AI-generated deepfakes, and professional money-laundering networks, alongside major scam categories such as pig-butchering/romance scams and HYIP-style schemes. Chainalysis also assessed that scam efficiency increased materially, citing a 253% YoY rise in the average scam payment (from $782 in 2024 to $2,764 in 2025) and noting that AI-enabled scams can be significantly more profitable than traditional approaches.
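The 253% figure is consistent with the two average-payment values cited. A minimal sanity check of the year-over-year arithmetic (dollar figures are those quoted from the report; the rounding to a whole percentage is assumed):

```python
def yoy_increase_pct(prev: float, curr: float) -> float:
    """Percentage increase from prev to curr."""
    return (curr - prev) / prev * 100

# Average scam payment figures cited by Chainalysis
avg_2024 = 782
avg_2025 = 2764

print(round(yoy_increase_pct(avg_2024, avg_2025)))  # → 253
```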
A key driver highlighted was the rapid growth of impersonation scams, which Chainalysis said rose roughly 1,400% YoY, with average payments to those clusters up more than 600%. One example cited was an E‑ZPass-themed smishing campaign that used fake toll-payment texts and lookalike sites to deceive victims; Chainalysis linked this activity to the Chinese-speaking group “Darcula” / “Smishing Triad,” and referenced reporting and legal action describing tooling and templates used to scale these lures. Separately, reporting on AI deepfake impersonation shows similar social-engineering dynamics outside of “crypto-only” contexts, including deepfakes impersonating religious figures to solicit donations and promote fraudulent crypto-related offers, reinforcing the report’s broader finding that AI-assisted impersonation is increasing the reach and credibility of scams.
Timeline
Jan 13, 2026
Chainalysis publishes 2026 crypto scam findings
On January 13, 2026, Chainalysis published its 2026 Crypto Crime Report section on scams, detailing 2025's estimated losses, the rise of impersonation fraud, AI-enabled scam operations, laundering trends, and links to Southeast Asian scam compounds. The report also highlighted cases including E-ZPass smishing and a Coinbase impersonation scheme.
Dec 31, 2025
Law enforcement targets Southeast Asia-linked scam infrastructure in 2025
The sources describe major 2025 enforcement actions against scam ecosystems tied to East and Southeast Asia, including U.S. actions against the Prince Group and related infrastructure, as well as sanctions and asset seizures. They also note a reported U.S. Department of Justice seizure of $15 billion in Bitcoin tied to a Cambodian pig-butchering operation using forced labor.
Dec 31, 2025
Impersonation scams and AI-enabled fraud expand in 2025
During 2025, impersonation scams grew about 1,400% year over year and the average scam transfer rose 253% to $2,764, according to Chainalysis. The report also found AI-enabled scams, including deepfakes, phishing, and AI-assisted pig-butchering, extracted significantly more per operation than non-AI-linked scams.
Dec 31, 2025
Crypto scam revenue surges across 2025
Chainalysis estimated that cryptocurrency scam losses in 2025 reached roughly $17 billion, with at least $14 billion already observed on-chain. The report said the total may rise further as more illicit addresses are identified over time.
Related Stories

AI-Enabled Fraud Scams Industrialized by Transnational Criminal Networks
**Transnational criminal networks** are increasingly industrializing online fraud with **AI-enabled social engineering**, according to reporting on scam compounds in Southeast Asia, an Interpol assessment, and policy commentary tied to a new US executive order. Fraud operations linked to *pig-butchering* and romance scams are using generative AI to improve language quality, deepfakes to impersonate trusted people, and low-cost "deepfake-as-a-service" offerings to scale deception. Interpol said AI-assisted fraud is **4.5 times more profitable** than non-AI schemes, while broader reporting describes these operations as structured, multinational enterprises that function like businesses and increasingly rely on automation, synthetic identities, and persuasive impersonation at scale. Reporting from Cambodia and the wider region shows scam operators are now recruiting "**AI face models**" to appear on high-volume deepfake video calls, including applicants from multiple countries seeking work in compounds associated with trafficking-linked fraud operations. The same ecosystem has been described as part of a broader organized-crime model involving forced labor, cryptocurrency investment scams, romance fraud, and impersonation schemes targeting victims globally. One reference on calculating AI ROI in enterprise cybersecurity is **not about this fraud campaign ecosystem**, and an EU sanctions announcement concerns separate state-linked cyber incidents rather than financially motivated AI-enabled fraud.
2 weeks ago
Surge in Deepfake-Driven Fraud and Synthetic Identity Threats
Artificial intelligence-powered scams, particularly those leveraging deepfakes and synthetic identities, escalated significantly in 2025. Experts warn that the quality and volume of deepfakes have reached a level where they are nearly indistinguishable from authentic media for most people, enabling fraudsters to deceive victims on a global scale. Voice cloning and visual deepfakes have been used to facilitate large-scale scams, while the emergence of synthetic entities has further blurred the line between real and fake identities, complicating fraud detection for financial institutions. The misuse of stablecoins and lax cryptocurrency oversight have created new avenues for cross-border fraud, with experts predicting these trends will intensify in 2026. Industry leaders emphasize the urgent need for improved data, reporting, and regulatory measures to counteract these evolving threats. The rapid proliferation of generative AI tools has enabled "pig butchering" scams and other fraud operations to target vast populations, underscoring the growing risk posed by synthetic media and AI-driven deception in the financial sector and beyond.
1 month ago
AI-Enabled Romance Scams Using Deepfakes and Fake Cryptocurrency Lures
A surge in **romance scams** is leveraging **AI-enabled impersonation** to make fraud harder to detect, combining manufactured intimacy with financial theft. Australian police warned more than 5,000 people they may have been targeted in a large-scale operation linked to overseas syndicates, where scammers used mainstream dating apps to initiate relationships and then steered victims into purchasing **fake cryptocurrency**. The playbook described includes rapidly escalating emotional commitment, isolating targets, and pushing conversations off-platform to apps like *WhatsApp* or *Telegram*, reducing victims’ access to in-app safety controls and reporting mechanisms. The fraud techniques increasingly rely on **deepfakes** and automated “AI personas,” undermining traditional verification methods such as requesting custom photos or relying on video calls as proof of identity. Reported tactics include real-time face-swapping and AI voice synthesis during video calls, long-running bot-driven conversations that build trust over months, and “celebrity” impersonation to intensify emotional leverage and extract larger payments. Despite the technology shift, the core mechanism remains psychological manipulation—using scripted narratives and social engineering to move victims from online rapport to off-platform communication and ultimately to financial transfers.
1 month ago