That panicky call from a relative? It could be a thief using a voice clone, FTC warns

2024-12-25 22:07:37 | Category: Markets

For years, a common scam has involved getting a call from someone purporting to be an authority figure, like a police officer, urgently asking you to pay money to help get a friend or family member out of trouble.

Now, federal regulators warn, such a call could come from someone who sounds just like that friend or family member — but is actually a scammer using a clone of their voice.

The Federal Trade Commission issued a consumer alert this week urging people to be vigilant for calls using voice clones generated by artificial intelligence, one of the latest techniques used by criminals hoping to swindle people out of money.

"All [the scammer] needs is a short audio clip of your family member's voice — which he could get from content posted online — and a voice-cloning program," the commission warned. "When the scammer calls you, he'll sound just like your loved one."

If you're not sure it's a friend or relative, hang up and call them

The FTC suggests that if someone who sounds like a friend or relative asks for money — particularly if they want to be paid via a wire transfer, cryptocurrency or a gift card — you should hang up and call the person directly to verify their story.

A spokesperson for the FTC said the agency couldn't provide an estimate of the number of reports of people who've been ripped off by thieves using voice-cloning technology.

But what sounds like a plot from a science fiction story is hardly made-up.

In 2019, scammers impersonating the CEO of a U.K.-based energy firm's parent company demanded $243,000. A bank manager in Hong Kong was fooled by someone using voice-cloning technology into making hefty transfers in early 2020. And at least eight senior citizens in Canada lost a combined $200,000 earlier this year in an apparent voice-cloning scam.

"Deepfake" videos purporting to show celebrities doing and saying things they haven't are getting more sophisticated, and experts say voice-cloning technology is advancing, too.

Subbarao Kambhampati, a professor of computer science at Arizona State University, told NPR that the cost of voice cloning is also dropping, making it more accessible to scammers.

"Before, it required a sophisticated operation," Kambhampati said. "Now small-time crooks can use it."