The "Digital Arrest Scam" exemplifies advanced AI-driven fraud: deepfakes and AI bots open "mule accounts" that slip past traditional KYC. In one notable case, a Mrs. Iyer lost ₹12 lakh to an account that had passed verification yet was fraudulent. By 2026, the 2024-era video KYC workflow, which relies on human agents, is defunct against real-time deepfake injection. VerifyeKYC counters this with AI-based "Passive Liveness Detection," which scrutinizes pixel-level data, analyzing light scattering and blood-flow patterns that deepfakes cannot perfectly mimic, and flags spoofs in milliseconds.

"Mule account detection" is equally crucial, and it is moving beyond basic PAN verification to "intent verification." In one fintech's deployment, behavioral AI flagged a user ("Rahul") as a potential mule because of unusual micro-transfers from high-risk IP addresses, blocking larger transfers before they happened.

In the gaming sector, "1:N Face Search" combats bonus abuse by identifying duplicate accounts created with the same face but different personal details.

The future points towards "Continuous Behavioral Authentication," which monitors subtle biometrics such as typing cadence throughout a session. The lesson for businesses is to evolve KYC processes to verify reality, not just documents: AI can detect deepfake inconsistencies, weak KYC that lets mule accounts through invites regulatory penalties, and 1:N search prevents bonus fraud by catching duplicate faces.
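The micro-transfer pattern described for "Rahul" can be sketched as a simple rule-based risk score. This is a minimal illustration, not VerifyeKYC's actual model: the `Transfer` record, the `ip_risk` label, and the thresholds are all assumptions for the example.

```python
# Hypothetical sketch of behavioral mule-account scoring.
# Real systems use ML over many signals; this shows only the idea of
# penalizing repeated micro-transfers from high-risk IP addresses.
from dataclasses import dataclass

@dataclass
class Transfer:
    amount: float   # transfer amount in rupees (assumed unit)
    ip_risk: str    # "low" or "high", from an assumed IP-reputation feed

def mule_risk_score(transfers, micro_threshold=500.0):
    """Score rises with each micro-transfer originating from a high-risk IP."""
    suspicious = sum(
        1 for t in transfers
        if t.amount < micro_threshold and t.ip_risk == "high"
    )
    # Simple heuristic: each suspicious micro-transfer adds 0.2, capped at 1.0.
    return min(1.0, 0.2 * suspicious)

history = [Transfer(120.0, "high"), Transfer(80.0, "high"),
           Transfer(50.0, "high"), Transfer(20000.0, "low")]
score = mule_risk_score(history)
if score >= 0.5:
    print("flag for review")  # hold larger transfers pending manual checks
```

A production system would replace the fixed 0.2 weight with a trained model, but the gating logic (score a history, hold transfers above a review threshold) stays the same shape.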
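The 1:N face search used against bonus abuse typically compares a new signup's face embedding against every enrolled embedding. A minimal sketch, assuming embeddings have already been extracted by some face model (the vectors, account IDs, and 0.9 threshold here are illustrative):

```python
# Hypothetical 1:N duplicate-face search over precomputed embeddings.
# In practice the embeddings come from a face-recognition model and the
# search uses an approximate-nearest-neighbor index; this is the brute-force idea.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_duplicates(new_embedding, enrolled, threshold=0.9):
    """Return account IDs whose stored embedding matches the new face."""
    return [acct for acct, emb in enrolled.items()
            if cosine_similarity(new_embedding, emb) >= threshold]

enrolled = {
    "acct_001": [0.9, 0.1, 0.4],
    "acct_002": [0.1, 0.8, 0.2],
}
# A new signup whose face embedding nearly matches acct_001's:
matches = find_duplicates([0.88, 0.12, 0.41], enrolled)
```

If `matches` is non-empty, the platform can deny the bonus or route the signup to manual review, even though the name, phone number, and documents all differ.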
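Continuous behavioral authentication on typing cadence can be pictured as comparing a live session's inter-keystroke intervals against the user's enrolled profile. The profile values, tolerance, and deviation metric below are assumptions chosen purely to illustrate the concept:

```python
# Hypothetical typing-cadence check: compare live inter-key intervals (ms)
# against an enrolled profile using mean absolute deviation.
def mean_abs_deviation(observed, profile):
    return sum(abs(o - p) for o, p in zip(observed, profile)) / len(profile)

def cadence_matches(observed_ms, profile_ms, tolerance_ms=40.0):
    """True if the session's typing rhythm stays close to the enrolled one."""
    return mean_abs_deviation(observed_ms, profile_ms) <= tolerance_ms

profile = [120.0, 95.0, 180.0, 110.0]   # enrolled inter-key gaps (ms)
session = [125.0, 90.0, 175.0, 118.0]   # live session measurements
ok = cadence_matches(session, profile)
```

Because the check runs continuously, an account takeover mid-session (a different human, or a bot replaying credentials) would drift outside the tolerance and trigger re-authentication.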