AI-powered scams exploiting voice cloning and deepfake technology are increasingly targeting elderly victims, with fraudulent services available for as little as 1 yuan.
In one case, a grandmother in Hunan received a call from someone mimicking her grandson's voice, claiming he had been in an accident and urgently needed money. The distressed woman withdrew 30,000 yuan in cash and handed it to a stranger posing as a "relative of the police chief." The case illustrates how AI-driven scams exploit seniors' trust in family members and their unfamiliarity with emerging technologies.
Fraudsters use AI to clone voices or manipulate videos, impersonating relatives, doctors, or even celebrities to deceive victims. Investigations reveal that deepfake services—including voice cloning and face-swapping—are widely available on online platforms, with prices starting at just 1 yuan. Some vendors even offer "one-stop" fraud packages, including pre-written scripts and tools to bypass detection.
Beyond impersonating family members, scammers also create fake endorsements from celebrities like Guo Fucheng and Gong Li to promote dubious financial products. AI-generated "expert" videos can be commissioned for as little as 80 yuan, with some sellers offering customizable digital avatars of public figures.
Legal experts warn that unauthorized use of AI to impersonate others may violate portrait rights (肖像权) and voice rights (声音权) under Chinese law, and can constitute criminal fraud. Platforms are tightening rules that require AI-generated content to be labeled, but detection remains a challenge as deepfake technology evolves faster than countermeasures.
Authorities advise seniors to verify unexpected calls by contacting family directly and to avoid sharing sensitive information. Families are urged to educate elderly relatives about digital risks while respecting their autonomy.
As AI scams grow more sophisticated, experts emphasize the need for awareness, stronger regulations, and collaborative efforts to protect vulnerable populations.