Scammers accused of using deepfakes to dupe cryptocurrency projects

A documenter of web3 scams doubts the accuracy of the deepfake story.

Photo credit: Getty Images

Just when you thought cryptocurrency couldn’t get any dodgier, it turns out scammers have been using deepfake technology to impersonate an executive at the world’s largest crypto exchange.

Or did they?

In a blog posted to the company’s website, Patrick Hillmann, chief communications officer at Binance, said he wasn’t prepared for the onslaught of cyber attacks, phishing attacks and scams that target the community.

He said it made him understand the lengths Binance goes to – including regular training and random security checks – to ensure cybersecurity rules at the company are followed.

“However, criminals will almost always find a way to adapt to and circumvent even the most secure system,” he wrote.

“Over the past month, I’ve received several online messages thanking me for taking the time to meet with project teams regarding potential opportunities to list their assets on Binance.com. 

“This was odd because I don’t have any oversight of or insight into Binance listings, nor had I met with any of these people before.

“It turns out that a sophisticated hacking team used previous news interviews and TV appearances over the years to create a ‘deep fake’ of me.

“Other than the 15 pounds that I gained during COVID being noticeably absent, this deep fake was refined enough to fool several highly intelligent crypto community members.”

In the last week alone, website ‘Web3 is going just great’ – run by software engineer Molly White – has documented numerous scams, thefts, rug-pulls and more, costing millions of dollars.

The site tracks examples of how blockchain, cryptocurrency and other things in the web3 technology space often aren’t doing as well as proponents suggest.

White questioned Hillmann’s story, wondering if it might be a case of “trying to cover Binance’s collective ass after being caught taking listing fees for tokens they never list”.

She noted the only evidence provided was a redacted LinkedIn conversation, in which the unnamed person told Hillmann “they impersonated your hologram”.

White said it would be “remarkable” if the deepfakes were good enough to fool people.

“To date, video deepfakes have mostly been limited to robotic-sounding and grainy pre-recorded Elon Musk impersonations, rather than anything that can respond naturally and quickly to a live conversation.

“But who’s to say, really – maybe deepfakers have made a considerable breakthrough with startling implications, and Hillmann just didn’t feel it was important to elaborate on,” White concluded.
