Police warn Alberta parents, kids of rise in AI-generated ‘deepfakes’

Posted June 17, 2025 1:34 pm.
Last Updated June 18, 2025 3:42 pm.
Police in Alberta are warning parents about the emerging online trend of creating and sharing deceptive media made with artificial intelligence.
“Deepfakes” are hyper-realistic AI-generated images, video or audio that depict someone doing or saying something they never actually did or said.
The technology is becoming cheaper and easier to use, and is now within reach of anyone with an internet connection.
That has ALERT’s Internet Child Exploitation (ICE) unit warning Alberta parents about deepfakes, especially with summer break around the corner and children spending more time online.
“Our team is hearing more stories about the negative effects of AI, especially when it’s used by someone with ill intent, each time we’re out in the community,” said Cpl. Heather Bangle with ICE. “It is imperative that parents are aware that this technology exists, especially with kids home this summer.”
Citing a non-profit that builds technology to defend children from sexual abuse, ICE says one in 10 kids know of cases where their friends and classmates have created “deepfake nudes” of other kids.
“I’m not going to lie, it’s very difficult,” said Bangle. “Just the sheer volume and the sheer number of sextortion reports — and now with this AI coming out and these reports that we’re starting to see — it’s overwhelming.”
According to Cybertip.ca, some 4,000 sexually explicit deepfake images and videos of children and youth were processed last year.
“In Canada, AI-generated child sexual abuse images still meet the definition of child pornography under the Criminal Code of Canada,” said Bangle.
Police also warn that generative AI deepfakes are increasingly being used in frauds and scams, such as impersonating the voice of a loved one to trick grandparents out of money.
“Don’t accept random requests, if they’re asking too many personal questions, and of course — if they’re asking the child to send intimate images — those are all red flags,” said Bangle.
Several U.S. states have been enacting legislation surrounding the creation of deepfakes.
–With files from The Associated Press