
Rise of AI-related financial scams

Lately my phone has been ringing constantly with scam calls. Sometimes it’s the “CRA” threatening to throw me in jail if I don’t pay up fast. Sometimes it’s a special on duct cleaning. Sometimes it’s a “problem with a package delivery.”

According to a TransUnion survey, the number of suspected digital fraud attempts in Canada increased by 202% between 2019 and 2023. Sixty percent of Canadians surveyed said they’d been targeted by a fraud attempt, and 10% fell victim, the report says.

Data from the Canadian Anti-Fraud Centre (CAFC) shows that over 40,000 Canadians lost a total of $554 million to fraud in 2023 – and experts worry the problem is about to get much worse as artificial intelligence makes it easier for scammers to trick their victims.

The rise of AI

Artificial intelligence, or AI for short, has been in use for years, but primarily in the background and out of reach. 

In law enforcement, AI-driven facial recognition technology has revolutionized investigations, helping to track and identify suspects more efficiently. On social media and e-commerce platforms, personalized AI-driven recommendations have been used to make our online experiences more enjoyable and efficient. And in healthcare, AI has excelled at analyzing medical images like X-rays and MRI scans, supporting healthcare professionals in making more accurate diagnoses.

In the past few years, however, AI has found its way out of proprietary spaces and become much more commonly accessible. Just about every video calling app now allows you to change your appearance with face filters. Tools like Speechify can learn and mimic your voice. And the language model ChatGPT is so effective it stymied the academic community by successfully writing an academic paper about itself.

If it exists, scammers will use it

If you’re old enough to remember when email first gained popularity in the late 1990s, you probably remember getting one from “Bill Gates” promising to share his infinite wealth with you on the condition you forward his message to your entire contact list without delay. Of course this was a hoax, but so many people believed it that Gates and Microsoft both issued a public response debunking the claim. 

That particular hoax was relatively harmless, but its ubiquity proves an important point: people are keen to exploit your ignorance of new technologies. Commonly accessible AI tools make this trend even more dangerous as they can be used by scammers to impersonate people you trust. And if you think you’re smart enough to spot the difference, tell me which of the six preceding paragraphs was written not by me, but by AI.

 

How are people getting scammed with AI?

The scams are much the same at heart, but AI is putting a new twist on old flimflams, making them feel more real and harder to spot. Some scams that are becoming more common include:

The call from a "loved one"

The scariest AI scams to date use voice-cloning technology to imitate someone you know. This bit of social deception involves taking recordings of the person’s voice from videos posted on social media, feeding them to voice-cloning software, and then having the AI pose as that person. They may claim to have been in a car crash, in jail, or even kidnapped, but the remedy is always a hurried infusion of cash.

In previous iterations of this scam, the call would come from someone claiming to be an involved third-party, like a police officer or lawyer. Now, even a call that sounds like someone you know and trust can potentially be an AI impersonation.

Scammers can also use this grift to impersonate you on calls to customer service. With enough personal information and your voice model, fraudsters can potentially withdraw money from your account or charge large purchases to you by having the AI give consent on your behalf.

Old-fashioned scams with fewer spelling errors

A time-tested way to spot a fake email is to proofread it carefully – the newest email in my spam folder claims to be from the UPS “International Shipping and Logissics Service” and is obviously fake. They want me to click a link and schedule my delivery by providing my credit card information to cover the $1.99 shipping cost. I suspect that if I were to provide that information, a lot more than that would be charged to my card. 

With AI tools, scammers can write more realistic and believable emails to fool you. I gave the free version of ChatGPT the prompt, “Write a brief email to a customer explaining their delivery is being held pending payment of import fees totalling $34.87 and asking for prompt payment,” and it gave me this professional, eloquent, error-free message to copy and paste for my victim:

Subject: Action Required: Import Fees for Your Delivery

Dear [Customer's Name],

I hope this email finds you well. We want to thank you for choosing our services and placing an order with us. We are excited to deliver your package to you as soon as possible.

However, I regret to inform you that your delivery is currently being held at customs pending payment of import fees. The total amount of import fees comes to $34.87. These fees are charged by the customs authorities and are beyond our control.

To expedite the release of your package, we kindly request that you make the payment for the import fees at your earliest convenience. Once the payment is made, we will immediately process the customs clearance, and your delivery will be on its way to you without any further delay.

Please find below the payment instructions:

Amount: $34.87

Payment Method: [Provide the accepted payment method or a link to the payment portal]

Reference Number: [If applicable, provide any reference number required for the payment]

If you encounter any issues or have any questions regarding the payment process or your delivery, please don't hesitate to contact our customer support team at [Customer Support Phone/Email]. We are here to assist you and ensure a smooth delivery experience.

Your satisfaction is our top priority, and we appreciate your understanding and cooperation in resolving this matter promptly. Once again, we apologize for any inconvenience caused.

Thank you for your prompt attention to this matter. We look forward to delivering your package to you as soon as possible.

Best regards,

[Your Name]

[Your Title/Position]

[Company Name]

[Contact Information]

Phoney business emails

A friend of mine told me about a time his business was almost duped by a clever case of social deception. A co-worker stopped by his office asking to borrow the company credit card. Their boss had sent an urgent email and needed him to buy a few thousand dollars worth of gift cards right away. Fortunately, they figured out the email wasn’t really from their boss and the gift cards were destined for use by scammers.

These phoney business emails are common, but AI is giving them a new twist. Scammers can use AI to learn about the inside workings of your business from various online sources. They can then use the software to write more convincing messages and even reference personal details like your working relationship and recent projects you might have been involved in.

What’s more, scammers can use voice-cloning technology to impersonate your boss and potentially have you complete a transaction for them before you realize you’re being tricked.

Text and chat scams

My primary relationship with chatbots is trying to make them go away on every website I visit, but they’re also used to commit fraud. In this con, scammers use a chatbot to build a relationship with the victim before eventually asking for personal information, money, or investment in a get-rich-quick scheme. More often than not, they find their victims using fake profiles on online dating platforms.

This one has been around a long time, but AI is making it faster and easier for fraudsters to gain your trust. Rather than doing the dirty work by hand, they program a chatbot to automate the process of pretending to fall in love with you and taking your money.

 

How to recognize and avoid AI-driven financial scams

AI-driven financial scams are mostly the same as their old-school predecessors, with the added twist that they can look and feel a lot more real – especially if a scammer has used AI to copy someone’s voice. Avoid becoming a victim by remaining skeptical of any unusual requests, especially those involving money. Follow these tips to protect yourself:

1. Be on the lookout for common red flags

Whether or not AI is involved, the most common scams share a few red flags. The Canadian Anti-Fraud Centre says fraudsters commonly employ these tactics to fool their victims:

  • Impersonation and spoofing: Pretending to be a person or company you know and trust, and altering things like caller ID numbers and email addresses to make their communications appear legitimate.
  • Threats and urgency: Threatening severe consequences for you or someone you know unless you comply with their time-sensitive demands. 
  • Emotional manipulation: Playing on your emotions to lower your guard, often by pretending to be a loved one, a romantic prospect, and/or by telling you a sob story.

Remember that emergency situations almost never require large sums of cash or gift cards; lawyers, police, and government agencies will never demand cash; and she doesn’t really love you if you’ve never met in person. (Sorry).

2. Verify unusual requests before taking action

If someone you know contacts you by phone or email with an urgent request for money, take the time to verify they’re who they say they are. The easiest way to do this is to ask a personal question only that person would know the answer to. Better yet, hang up and call them back using a number you know is theirs.

If a call claims to be from the authorities, such as the police or the CRA, hang up and call the agency directly to verify the request. No legitimate government agency will penalize you for taking the time to verify a request. Remember that the CRA primarily communicates by mail, and its messages will appear in your CRA My Account.

3. Use your own information to verify requests

When verifying unusual requests for money, don’t fall into the trap of using information provided by scammers. The links, phone numbers, and email addresses provided are there to give you a false sense of security and make their claim look legitimate. Even worse, they can be used to complete a fraudulent transaction.

Instead, use your own information to verify their identity and claim. If a loved one asks for money, call them back using the number saved in your contact list. If a credit card company wants their minimum payment, log in to your online banking portal to see what you owe. If the CRA wants something, look up their customer service number and call to double check.

4. Report fraud attempts

You can help stop scammers by reporting their attempts, whether or not they succeed in taking your money. The CAFC provides helpful resources for why, when, and how to report scams.

 

The bottom line

Financial fraud and scams are all too common, and new AI tools are making it even easier for scammers to fool their victims. Be suspicious of unusual requests for money or personal information and take your time to independently verify them, even if they appear to come from people you know and trust.

 
