The voice sounds real.
And it’s an emergency — send money to bail out a grandchild.
Turns out, a scammer has generated the voice using artificial intelligence in a targeted attempt to deceive a grandparent into handing over thousands of dollars in cash.
In the past few weeks, Lethbridge police say they have received reports of AI being used to mimic the voices of victims’ relatives, usually grandchildren.
“The fraudsters call the victim and, using their grandchild’s voice, state that they are in jail and in need of $9,000 to $10,000 for bail,” said Lethbridge police in a Wednesday news release.
Imitating legitimate local law offices, the scammers then arranged to have the money picked up at the victim’s home, police say.
The typical grandparent, or emergency, scam has been around for years. In that scam, criminals create a sense of urgency that pressures the victim, usually a grandparent, to act, said Sgt. Kevin Talbot of the Lethbridge Police Service’s economic crime unit. For example, the caller will tell a grandparent that a grandchild has supposedly been arrested, been in a motor vehicle collision or is in hospital, and that money is urgently needed to help.
But in some cases now, the scam has become more sophisticated. Scammers will follow a grandchild on social media, collecting information about that individual and taking a voice recording if they can find audio or video online. Using that recording to mimic a grandchild’s voice, the scammer’s phone call to the victim makes the emergency seem more legitimate, Talbot said.
“Victims are being convinced quicker and don’t bother doing any follow-up, because it sounds just like their grandchild,” said Talbot, adding that victims have told police the voice sounded exactly like their grandchild.
In one such case, two southern Alberta men have been charged for their alleged roles in “a number of” recent grandparent scams in which AI was used.
Claresholm resident Jordon Ian Henderson, 31, and Stirling resident Johan Wiebe Klassen, 31, have both been charged with fraud over $5,000. After bail hearings, they were each released from custody and both are scheduled to appear in court on Oct. 23.
In this case, the two men responded to an advertisement placed on the classified ad website Kijiji to be a courier “without really knowing what they were getting themselves into,” said Talbot.
“The ad simply was, ‘Go pick up money, keep a little bit of money for yourself and deposit the money into a bank account,’” said Talbot, adding police want to warn people about responding to these kinds of job ads.
The scammers could be anywhere in the world, and typically the couriers don’t even know to whom they’re sending the money.
With the technology available to criminals, it’s difficult to identify who made the phone call to the victim, Talbot acknowledged. And figuring that out can tax law enforcement resources.
But this use of AI is becoming more common, say police.
“We’ve reached out to other jurisdictions in southern Alberta and there’s been many, many cases that are very, very similar to this one,” said Talbot, adding there could be other cases linked to the two men charged.
To avoid being scammed, Talbot recommended that anyone targeted check with their family to verify that the caller’s story is actually true.
“What scammers tell victims is to lie and to keep it a secret, and they do that because they don’t want you to check and see,” he said.
“Most of the time these kinds of technologies are sold on the cheap,” he said. “There are people who for the good of the community make available certain AI tools for people to use, but there are people who would take advantage of those and use (them) for nefarious purposes.”
While AI creations are sometimes poorly done, there are fake versions of songs so close to the real ones that it’s impossible to tell the difference between the two, said Boakye-Boateng.
“It will definitely get difficult to determine (the difference between real and fake), but I’m very sure that technology is also being developed to actually determine whether this is real or this is fake,” he said.