Thursday, December 4, 2025

Calgary-area teen accused of using AI to create child sex abuse material

Alberta’s Law Enforcement Response Team (ALERT) has charged a Calgary-area teen accused of using artificial intelligence to create material related to child sex abuse and exploitation.

Investigators allege AI technology was used to sexualize photos of teen girls who attended several high schools in the Calgary area.

However, ALERT is not identifying the schools in order to protect the identities of the victims.

Staff Sgt. Mark Auger of ALERT’s Internet Child Exploitation unit (ICE) says the investigation began in October 2025 after ICE received a tip concerning child sexual abuse materials being uploaded to a social media platform.

On Nov. 13, ICE officers, with the assistance of Calgary police, executed a search warrant on a Calgary home. During the search, they seized two cellphones, a tablet and a laptop as possible evidence.


A 17-year-old, who can’t be identified under provisions of the Youth Criminal Justice Act (YCJA), now faces charges of making, possessing and distributing child sexual abuse and exploitation materials along with criminal harassment.

When asked how the images had been altered, or “sexualized” as he described them, Auger responded: “If I was the offender, I would capture the picture I want, whether it’s on TikTok, Instagram, any website, pull it off and then I can use software to nudify (the image), which AI will then give a very accurate assessment of your body type, your skin color, and make it near impossible to distinguish the nude image with just my face attached.”


Staff Sgt. Mark Auger described the alleged crimes as “the most extreme weaponized version of bullying” of a young, developing child.

Global News

“Our biggest takeaway is that we need people to understand that this is not a joke, it’s not a prank, this is the most extreme form of bullying and a criminal offence,” added Auger. “We will take steps to stop this behavior.”


He said such actions can have a “horrible impact” on the victims.


“Teenagers are going through probably the most changes in their life with self-image, body image, social networks, and this is, as I said, the most extreme weaponized version of bullying to a young developing adult child. That is why we are very supportive on the onset and at the back end. Our investigators are now in touch with all the identified persons and their families to offer that support.”

The accused, who appeared in court Wednesday morning, has been released on numerous court-ordered conditions, including no contact with anyone under the age of 16 unless incidental to work or school, and a ban on possessing any internet-capable electronics except for work or school purposes.

His next scheduled court appearance is Jan. 8.

Police are also asking members of the community to help support the victims by not sharing such images, refusing to condone such behaviour and reporting these types of images or this type of behaviour to police.

Video: Alberta girl’s football coach accused of making child pornography with AI


Provinces including British Columbia, Manitoba and Quebec have laws in place that criminalize the posting or sharing of AI-generated deepfake pornographic or intimate images online without consent.


Alberta’s legislation prohibiting the non-consensual posting and sharing of intimate images, passed in 2017, does not mention AI-generated or altered images.

Rising warnings of AI crimes

The Alberta case comes amid increasing warnings from law enforcement about the dangers posed by artificial intelligence.

The RCMP said last year that a “wave of AI-generated child sexual abuse material is coming” as the technology swiftly improves and criminals gain access to AI-generating tools.

After a 12-year-old boy in B.C. who fell victim to online sextortion died by suicide in 2023, experts told Global News that AI was further compounding a mental health “epidemic” caused by similar cases involving minors.

That same year, the U.K.-based Internet Watch Foundation warned that AI-generated deepfake images will overwhelm child exploitation investigators without government action.


In 2023, a 61-year-old Quebec man was jailed for using artificial intelligence to produce deepfake videos of child pornography. No actual children were depicted, but Steven Larouche had broken the law banning any visual representation of someone depicted as being under the age of 18 engaged in explicit sexual activity.

Provincial court judge Benoit Gagnon wrote in his ruling that he believed it was the first case in the country involving deepfakes of child sexual exploitation.

This past summer, a junior girls’ football coach in Lethbridge was accused by ALERT of using AI to create child pornography.

Video: Taylor Swift deepfake images: Why people are concerned over pornographic AI photos


Both the RCMP’s National Cyber Crime Coordination Centre and the Canadian Centre for Cyber Security, in its latest national threat assessment, have reported a sharp rise in AI-facilitated crimes that have caused harm or “near-harm” since the technology exploded into the mainstream in 2022.


Evan Solomon, Canada’s first minister responsible for artificial intelligence, is expected to introduce a new bill dealing in part with online harms.

Solomon said in late October his upcoming privacy bill could include age restrictions on access to AI chatbots to protect children. His spokesperson said the bill will be introduced in early December.

U.S. lawmakers have also been seeking to crack down on AI-related harms to children after cases involving minors who were allegedly encouraged by chatbots to die by suicide, or who did so after engaging in sexually charged conversations with so-called “companion” apps like Character.AI.

U.S. President Donald Trump this spring signed legislation into law that criminalizes non-consensual deepfake pornography and requires online platforms to remove such material within 48 hours of a report. Several states have enacted similar laws.

—With files from the Canadian Press

© 2025 Global News, a division of Corus Entertainment Inc.
