New AI deepfake tool cracks exchange KYC in seconds: how can it be prevented, and how will it affect crypto?

The cybersecurity company Cato Networks recently published a report stating that a new AI deepfake tool called ProKYC allows criminals to bypass the advanced KYC verification of cryptocurrency exchanges, marking a "new level" of cryptocurrency fraud.

ProKYC Tool Successfully Passes Bybit KYC

Etay Maor, Chief Security Strategist at Cato Networks, stated that this new AI tool is a significant improvement over the older methods cybercriminals used to bypass two-factor authentication and KYC. Rather than buying forged ID documents, fraudsters can use this AI-driven tool to create entirely new identities out of thin air.

The tool specifically targets cryptocurrency exchanges and financial companies, whose KYC checks require new users to provide government-issued ID documents (passports, driver's licenses) and to pass real-time facial movement detection via webcam, confirming that the user is a real person.

The report includes a demonstration video of ProKYC showing how the tool generates fake ID documents and deepfake videos that defeat an exchange's facial recognition. In the video, the user creates an AI-generated face and integrates it into an Australian passport template. The ProKYC tool then generates accompanying videos and images of the deepfake face, successfully passing Bybit's KYC verification.

Cato Networks warns that with AI-driven tools like ProKYC, criminals now have a far greater ability to create new accounts on exchanges, a practice known as "New Account Fraud" (NAF). The ProKYC website offers a suite of services including a camera, virtual emulator, facial animation, fingerprints, and verification photo generation, for an annual fee of $629. Beyond exchanges, ProKYC claims it can also bypass the KYC verification of payment platforms.

How to Prevent AI Deepfake Fraud?
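One baseline defense against pre-recorded deepfakes is randomized liveness detection: the verifier issues an unpredictable sequence of head movements and checks that the webcam feed performs them before a short deadline. The sketch below is a minimal, hypothetical illustration of that challenge-response flow; the action names, the time limit, and the idea that a separate video classifier produces `observed_actions` are all assumptions for illustration, not a description of any real exchange's KYC system.

```python
import secrets
import time

# Hypothetical set of liveness actions a verifier might request.
ACTIONS = ("turn_left", "turn_right", "nod", "blink")

def issue_challenge(n_actions: int = 3) -> dict:
    """Issue a random liveness challenge with an expiry window."""
    return {
        "actions": [secrets.choice(ACTIONS) for _ in range(n_actions)],
        "issued_at": time.time(),
        "ttl_seconds": 30,  # the user must respond quickly; replays go stale
    }

def verify_response(challenge: dict, observed_actions: list, now: float) -> bool:
    """Accept only if the observed actions match the challenge, in order,
    and the response arrived before the challenge expired.
    (In a real system observed_actions would come from a video
    classifier, which is exactly where deepfake tools attack.)"""
    if now - challenge["issued_at"] > challenge["ttl_seconds"]:
        return False
    return observed_actions == challenge["actions"]
```

Randomizing the challenge per session means a pre-rendered deepfake clip cannot anticipate the sequence; tools like ProKYC counter this by animating the synthetic face on the fly, which is why challenge randomness alone is not a sufficient defense.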
Etay Maor says that reliably detecting and preventing this new type of AI fraud is challenging: an overly strict biometric authentication system produces a high rate of false positives, while loose controls let fraud through. There are still ways to detect these AI tools, however, such as manually spotting abnormally high-quality images and videos and looking for inconsistencies in facial movements and imagery.

In the United States, penalties for identity fraud are severe, including up to 15 years in prison and heavy fines, depending on the nature and extent of the crime. In September, a report by Gen Digital, the parent company of several antivirus software brands, noted that cryptocurrency fraudsters using AI deepfake videos to lure victims into fraudulent token schemes have become increasingly active over the past 10 months.

However, this "AI against AI" approach will inevitably raise exchanges' operating and KYC costs. If the trend worsens, large exchanges that can afford advanced AI defenses will gain a growing advantage, potentially threatening the survival of smaller exchanges.
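The detection idea above, flagging footage that is abnormally high quality and unnaturally consistent, can be sketched as a simple statistical heuristic. Assume per-frame quality scores in [0, 1] have already been computed by some image-quality metric; that metric and both thresholds are assumptions for illustration. The intuition is that genuine webcam video is noisier and more variable than a synthetic render.

```python
from statistics import mean, pvariance

def looks_synthetic(frame_quality: list,
                    high_quality_threshold: float = 0.9,
                    min_variance: float = 1e-3) -> bool:
    """Heuristic: flag footage whose per-frame quality is both unusually
    high and unusually uniform. Real webcams add noise, compression
    artifacts, and lighting jitter; a near-constant, near-perfect score
    is a weak signal of generated video. Thresholds are illustrative."""
    if len(frame_quality) < 2:
        return False  # not enough frames to judge
    return (mean(frame_quality) > high_quality_threshold
            and pvariance(frame_quality) < min_variance)
```

In practice such a check would be only one signal among many (metadata analysis, device attestation, replay detection); Maor's point is that no single check is reliable, which is why tuning between false positives and missed fraud is hard.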

Disclaimer: The content above is only the author's opinion which does not represent any position of Followin, and is not intended as, and shall not be understood or construed as, investment advice from Followin.