AI Deepfake Romance Impersonation Scam

Verdict: Suspicious | Risk Score: 8/10 | Severity: high

Category: Investment

How AI Deepfake Romance Impersonation Scam Works

Overview: This scam uses advanced AI to impersonate attractive individuals or celebrities on dating apps and social media, luring Indian users into fraudulent cryptocurrency investments. It is particularly dangerous because the deepfakes look and sound convincing, exploiting feelings of love and trust so that victims hand over large sums of money with little chance of recovery.

How It Works:

  1. The scammer creates a profile using convincing AI-generated images and deepfake videos or voices, sometimes imitating Bollywood stars or influencers.
  2. Emotional rapport is built through personalized AI chats, voice notes in Indian languages, and ongoing interaction.
  3. Victims are urged to join 'exclusive' crypto investment groups or schemes, shown via fake dashboards or websites with real-time market data.
  4. The scammer requests direct crypto transfers or wallet information, promising to double or triple the amount.
  5. Victims' funds are immediately siphoned off, and the scammer blocks them or disappears. Occasionally, scammers follow up posing as 'recovery agents' who demand fees to recover the lost crypto.

India Angle: This scam typically targets educated, middle-aged Indians on Facebook, Instagram, or professional networks. Deepfakes featuring Bollywood actors or NRI businesspeople make the offer seem legitimate. With India's increasing exposure to global crypto trends, even high-net-worth individuals are being targeted.

Real Examples:

  • "This is Neha Sharma from Mumbai. I have a secret crypto tip—transfer ₹50,000 in Bitcoin today and double it!"
  • AI-generated video calls with faces that look like famous actors, but the caller avoids meeting in person
  • Messages like: "Only for my closest friend, is this investment. Please don't tell anyone."
Red Flags:

  • Profiles with perfect photos but vague personal details
  • Too-good-to-be-true investment claims, especially from celebrities
  • Refusal or awkwardness around live video calls
  • Requests to provide wallet details or send crypto directly

Protective Measures:

  • Don't trust unknown profiles, even those with a celebrity likeness
  • Never share your crypto wallet information or private keys
  • Verify the identity of anyone making investment offers
  • Use reverse image search tools to check profile pictures or videos
  • If in doubt, consult BharatSecure or call 1930

If Victimised:

  • Cease all further communication
  • Gather all relevant chats, videos, and transaction records
  • Lodge a complaint at cybercrime.gov.in and call 1930
  • Notify your trading platform and bank

Related Scams:

  • Celebrity endorsement crypto fraud
  • Instagram investment DMs
  • Recovery agent scam after investment loss

Visual Intelligence:

BharatSecure's AI has identified this as a technique used in scams targeting Indian users.

Who Does AI Deepfake Romance Impersonation Scam Target?

General public across India

Red Flags — How to Identify AI Deepfake Romance Impersonation Scam

  • Flawless photos or videos from strangers wanting privacy
  • Quick escalation from chatting to investment talk
  • Promises of doubled funds via direct crypto transfers
  • Reluctance to video call live

What To Do If You Encounter AI Deepfake Romance Impersonation Scam

  1. Do not click any links or share personal information
  2. Block and report the sender immediately
  3. Report at cybercrime.gov.in or call 1930
  4. Inform your bank if financial details were shared

How to Report AI Deepfake Romance Impersonation Scam in India

  • Call 1930 — National Cyber Crime Helpline (24x7)
  • File a complaint at cybercrime.gov.in
  • Contact your bank immediately if money was lost
  • Call RBI helpline: 14440 for banking fraud

Frequently Asked Questions

What is AI Deepfake Romance Impersonation Scam?
This scam uses advanced AI to impersonate attractive individuals or celebrities on dating apps and social media, luring Indian users into fraudulent cryptocurrency investments. It is particularly dangerous because the deepfakes look and sound convincing, exploiting feelings of love and trust so that victims hand over large sums of money with little chance of recovery.
How does AI Deepfake Romance Impersonation Scam work?
The scammer creates a profile using AI-generated images and deepfake videos or voices, sometimes imitating Bollywood stars or influencers, then builds emotional rapport through personalized chats and voice notes. Victims are urged to join 'exclusive' crypto investment schemes shown via fake dashboards with real-time market data. Once the victim transfers crypto or shares wallet information, the funds are siphoned off and the scammer disappears, sometimes returning later as a fake 'recovery agent' demanding further fees.
How to protect yourself from AI Deepfake Romance Impersonation Scam?
Do not click any links or share personal information. Block and report the sender immediately. Report the incident at cybercrime.gov.in or call 1930, and inform your bank if financial details were shared.
How to report AI Deepfake Romance Impersonation Scam in India?
Report to cybercrime.gov.in or call 1930 (National Cyber Crime Helpline). You can also contact your local police station's cyber cell.

Verify Any Suspicious Message

Check any suspicious message, link, or call for free at bharatsecure.app. BharatSecure uses AI to detect scams in real-time and protect Indian users.