    Unmasking the deception and threat of AI voice scams

    WASHINGTON (AFP) – The voice on the phone seemed frighteningly real – an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an artificial intelligence (AI) clone and the abduction was fake.

    The biggest peril of AI, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

    In a new breed of scams that has rattled United States (US) authorities, fraudsters are using strikingly convincing AI voice cloning tools – widely available online – to steal from people by impersonating family members.

    “Help me, mom, please help me,” an Arizona-based mother, Jennifer DeStefano, heard a voice saying on the other end of the line.

    DeStefano was “100 per cent” convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

    “It was never a question of who is this? It was completely her voice… it was the way she would have cried,” DeStefano told a local television station in April. “I never doubted for one second it was her.”

    Smartphone recording in front of a voice cloning screen. PHOTO: AFP

    The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to USD1 million.

    The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

    “AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively,” Wasim Khaled, Chief Executive of Blackbird.AI, told AFP.

    A simple Internet search yields a wide array of applications, many available for free, that can create an AI voice from a small sample – sometimes only a few seconds – of a person’s real voice, which can easily be lifted from content posted online.

    “With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls,” Khaled said.

    “Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. (The technology) allows for the creation of convincing deep fakes.”

    In a global survey of 7,000 people from nine countries, including the US, one in four said they had experienced an AI voice cloning scam or knew someone who had.

    Seventy per cent of respondents said they were not confident they could tell the difference between a cloned voice and the real thing, according to the survey, published last month by US-based McAfee Labs.

    American officials have warned of a rise in what is popularly known as the grandparent scam – where an imposter poses as a grandchild in urgent need of money in a distressing situation.

    “You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in deep trouble – he wrecked the car and landed in jail. But you can help by sending money,” the US Federal Trade Commission said in a warning in March. “It sounds just like him. How could it be a scam? Voice cloning, that’s how.”

    In the comments beneath the Federal Trade Commission’s warning were multiple testimonies from elderly people who had been duped that way.

    “Because it is now easy to generate highly realistic voice clones… nearly anyone with any online presence is vulnerable to an attack,” Hany Farid, a professor at the UC Berkeley School of Information, told AFP.
