Original

Interview | Cointelegraph Exclusive Interview with CertiK: Threats and Prevention of AI Deepfake Attacks

CertiK
09-15

Recently, CertiK engineers were interviewed by Cointelegraph about the threat that AI deepfake attacks pose to the crypto space and the preventive measures users can take. The full text of the article follows:

Security firm: AI deepfake attacks will not be limited to video and audio

As artificial intelligence (AI)-powered deepfake scams grow in popularity, the attack method could expand beyond video and audio, security firms warn.

On September 4, software company Gen Digital disclosed that criminal activity using AI deepfake technology to defraud cryptocurrency holders increased in the second quarter of 2024. The company said a fraud ring called "CryptoCore" has stolen more than $5 million in crypto assets through such schemes.

Although this amount is small compared to losses from other types of attacks in the crypto space, security experts believe that AI deepfake attacks may expand further and pose a greater threat to digital assets.

AI deepfakes threaten wallet security

Web3 security company CertiK pointed out that as the technology develops, AI-driven deepfake scams will become more sophisticated, and in the future such attacks may not be limited to video and audio.

A spokesperson for CertiK explained that wallets using facial recognition technology could be targeted, with hackers using deepfakes to trick the system into granting access:

“For example, if a wallet relies on facial recognition to protect critical information, it must assess the robustness of its solution against AI-driven threats.”

At the same time, the spokesperson stressed that members of the crypto community also need to raise awareness of such attack methods.

AI Deepfakes Will Continue to Threaten Crypto Assets

Luis Corrons, a security expert at cybersecurity company Norton, believes that AI-driven attacks will continue to threaten holders of crypto assets. He emphasized that crypto assets offer hackers high returns at low risk:

“Crypto assets are extremely attractive to cybercriminals due to their high value and the anonymity of transactions. A successful attack can bring huge financial gain, and the criminals are relatively unlikely to be identified and held accountable.”

Corrons also noted that the crypto space is currently under-regulated, which gives cybercriminals more opportunities to commit crimes with less legal exposure.

How to detect AI-powered deepfake attacks

While AI-driven attacks could pose a huge threat to crypto users, security experts believe there are steps users can take to protect themselves from such threats. A spokesperson for CertiK said education would be a good place to start.

The CertiK engineer explained that it is crucial to understand the threats you face and to master the corresponding prevention tools and services. He further pointed out that users should be highly vigilant about any requests from unfamiliar sources. He advised:

"Users should be skeptical of any requests for assets or personal information from unsure sources. In addition, enabling security measures such as multi-factor authentication (MFA) can add additional protection to sensitive accounts and effectively resist these types of fraud attacks."

In the meantime, Corrons believes that there are some “red flags” that users can try to identify to avoid falling for AI deepfake scams, including unnatural eye movements, facial expressions, and body movements.

Additionally, a lack of emotion can be a major red flag. “If a person’s expression doesn’t match the emotional content of their speech, or if you notice unusual distortions of the face or image-stitching artifacts, these could be signs of a deepfake,” Corrons explains.

He also pointed out that unnatural body shapes, misalignment, and inconsistencies in audio are important clues for identifying whether content is an AI-generated deepfake.

Disclaimer: The content above is only the author's opinion which does not represent any position of Followin, and is not intended as, and shall not be understood or construed as, investment advice from Followin.