Meet ‘Daisy’, an AI bot that wastes phone scammers’ time

Scammers beware, there’s a new AI ready to waste your time on the phone — and it’s intentionally programmed to sound like someone’s proud grandma.

Developed by UK provider Virgin Media O2 and announced on Thursday, “Daisy” is an AI-fuelled call answering service that aims to keep scam callers on the line as long as possible, meaning less time spent with potential human victims. It’s the same idea we’ve seen in a fair few time-wasting bots in the past — and it’s the signature strategy adopted by scam fighter Scamalot aka James Veitch.

O2 worked on the AI with YouTuber Jim Browning, whose scambusting work has seen him track and expose many a fraudulent scheme using various strategies.

Described by the company as “head of scammer relations”, the Daisy AI is programmed to give rambling stories to callers — and I’m not going to lie, the details sound a little bit like age-based stereotyping of elderly women, but who am I to say what a scammer will believe? According to O2, Daisy has told “meandering stories of her family, talked at length about her passion for knitting and provided exasperated callers with false personal information including made-up bank details.” The company claims Daisy “has successfully kept numerous fraudsters on calls for 40 minutes at a time.”

According to O2, Daisy is the result of multiple AI models working together: one listens to the caller and produces a live transcription, then the program generates an appropriate response from its language model, delivered in a human-like voice embedded with Daisy’s personality.
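
O2 hasn’t published how Daisy is built beyond that description, but it maps onto a familiar loop: speech-to-text, a persona-prompted language model, then text-to-speech. The sketch below is a hypothetical illustration of that loop in Python; the function names, the persona prompt, and the canned reply are all stand-in stubs, not O2’s actual implementation.

```python
# Hypothetical sketch of a scam-baiting voice bot: transcribe the caller,
# generate an in-character reply, synthesise speech. Every function below is
# a stand-in stub, not O2's actual models.

from dataclasses import dataclass, field

# Assumed persona prompt; O2 has not published Daisy's real prompt.
PERSONA_PROMPT = (
    "You are 'Daisy', a chatty grandmother. Never share real personal data, "
    "ramble about knitting and family, and keep the caller talking as long "
    "as possible."
)


@dataclass
class Conversation:
    """Alternating (speaker, text) turns, fed back to the language model."""
    history: list = field(default_factory=list)


def transcribe(audio_chunk: bytes) -> str:
    """Stand-in for a streaming speech-to-text model."""
    return audio_chunk.decode("utf-8", errors="ignore")


def generate_reply(conversation: Conversation, caller_text: str) -> str:
    """Stand-in for a language-model call conditioned on PERSONA_PROMPT and
    the conversation history; here it just returns a canned ramble."""
    conversation.history.append(("caller", caller_text))
    reply = ("Oh, bank account, you say? That reminds me of my grandson and "
             "a lovely cardigan I've been knitting...")
    conversation.history.append(("daisy", reply))
    return reply


def synthesise_speech(text: str) -> bytes:
    """Stand-in for a text-to-speech model with Daisy's voice."""
    return text.encode("utf-8")


def handle_call(audio_chunks):
    """Core loop: for each chunk of caller audio, transcribe, reply, speak."""
    conversation = Conversation()
    for chunk in audio_chunks:
        caller_text = transcribe(chunk)
        reply_text = generate_reply(conversation, caller_text)
        yield synthesise_speech(reply_text)


if __name__ == "__main__":
    fake_call = [b"Hello madam, I'm calling about a problem with your account."]
    for audio_out in handle_call(fake_call):
        print(audio_out.decode("utf-8"))
```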

Why make the AI sound like the social stereotype of an elderly woman? Fraud in the UK is particularly felt by the elderly, with scammers targeting people over the age of 75, and a huge majority of these scams are conducted by phone. Nor is this unique to the UK: according to the FBI Internet Crime Complaint Center (IC3), impersonation scams caused over $1.3 billion in losses for people in the U.S. in 2023, with call centres targeting older adults. Per the IC3, “almost half the complainants report to be over 60 (40 percent), and experience 58 percent of the losses (over $770 million).”

Scam callers use social engineering techniques to try to convince people to hand over personal information like banking credentials, social security numbers, and other ID details, so having an AI chat right back at them, with no real money at risk, seems like a pretty smart technique.

You won’t be able to interact with Daisy yourself (unless you’re a scammer). When I reached out to the company for further information, an O2 spokesperson told me, “The purpose of creating Daisy was to both waste scammers’ time and to create a campaign to educate the public on the danger of scam calls. The tool was purpose-built to interact with scammers and so is optimised to do that rather than have general conversations. Opening the tool up to everyone would also require a huge amount of computing power, so right now this isn’t something Daisy is able to do.”

If a scammer does make it through to you instead of Daisy, you can forward suspected scam calls and text messages to O2’s existing blocking service at 7726.

Unfortunately, the sheer, sinister creativity of scammers knows no limit, so companies need to think just as creatively, if not more so, to combat them.

In the meantime, go call your actual grandma and don’t dismiss her stories as “waffle”. She’s not Abe Simpson or an AI bot.

