'In car accident, need $30K': US man claims scammers used AI to mimic his voice to dupe parents
DH Web Desk

A representative image showing a fake call using AI

Credit: iStock Photo

A Florida man claimed on X that fraudsters used AI to recreate his voice and tried to dupe his parents out of $30,000 (Rs 25 lakh).


Jay Shooster, who is running for the Florida State House, said scammers told his parents he had been in a car accident and needed the money as bail to get out of jail.

"Today, my dad got a phone call no parent ever wants to get. He heard me tell him I was in a serious car accident, injured, and under arrest for a DUI and I needed $30,000 to be bailed out of jail. But it wasn't me. There was no accident. It was an AI scam," he wrote in a series of posts on the social media platform.

Shooster added that the call came days after he appeared on local television for his election campaign. "Fifteen seconds of me talking. More than enough to make a decent AI clone," he added.

The US man noted he never thought this could happen to someone he knew, since he'd warned others about similar instances in the past.

"I've literally given presentations about this exact sort of scam, posted online about it, and I've talked to my family about it, but they still almost fell for it. That's how effective these scams are. Please spread the word to your friends and family," he said, adding, "A very sad side-effect of this voice-cloning tech is that now people in *real* emergencies will have to prove their identities to their loved ones with passwords etc."

Referring to the incident, he said it was time for better AI regulation so that such scammers can be stopped.

Shooster also portrayed a dystopian scenario in which a person would not know whether the voice on the phone actually belongs to a loved one.

Several users commented on his post, saying such scams are becoming more common and harder to spot.

(Published 02 October 2024, 13:39 IST)