A Troubling Trend: Impersonation Scams Leveraging AI to Mimic Loved Ones’ Voices
They thought they were hearing their loved ones. It was an AI scam.
Card’s experience reflects an alarming rise in impersonation scams in the United States. Bad actors can now imitate voices easily and cheaply, using them to convince victims, often elderly people, that a loved one is in danger. According to the Federal Trade Commission, impostor scams were the second most common type of fraud in America in 2022, with more than 36,000 people swindled. FTC officials reported that over 5,100 of those incidents occurred over the telephone, accounting for more than $11 million in losses.
Artificial intelligence has added a new layer of terror, allowing criminals to mimic a voice using only a short audio sample. A number of AI-powered online tools can turn an audio file into a voice replica, letting scammers “speak” anything they type.
Experts say that federal regulators, law enforcement agencies, and the courts lack the tools to combat the growing scam. Most victims cannot identify their perpetrators, and police have difficulty tracing calls and funds routed by scammers operating from other countries. Courts, meanwhile, have little legal precedent for holding the companies that make these tools responsible for how they are used.