Alberta falling behind on deepfake image protection, experts say

By Dione Wearmouth, Alejandro Melgar

Alberta could be dropping the ball on protective legislation, experts warn, as new advancements in AI technology make it easier than ever to create extremely realistic, digitally altered pornographic images.

Recent examples include the targeting of a group of high school girls in Winnipeg and even Taylor Swift.

Deepfake pornography consists of digitally altered pornographic images that can be used to attack and even extort victims.

Alberta passed its Protecting Victims of Non-Consensual Distribution of Intimate Images Act in 2017; however, Kristine Cassie, CEO of the Chinook Sexual Assault Centre, says the government should be urgently working to expand the act to include instances of digitally altered images, similar to what B.C. did on Jan. 29.

“This is highly damaging to people and we need to hold people to greater account and we need to see it as a crime,” she told CityNews.

“It’s great that we have the action in Alberta, but from my reading of it, it really has more to do with some of those restitution pieces and not enough to go with the criminalization.”


Watch: Privacy concerns grow over artificial intelligence following ‘deepfake’ photos of Taylor Swift


Cassie says people can experience great distress and anxiety, which can include suicidal ideation, shame, and social isolation.

In addition to wanting changes in provincial legislation to criminalize the act of making deepfakes, she says social media sites need to be held accountable and required to detect and delete those images from their platforms. Her concerns also extend to the use of pseudonyms online.

“This whole, being able to hide behind pseudonyms … is actually causing a lot of damage,” she explained.

“Free speech and what people may see as art is one thing, but it’s another thing when it’s causing damage and when it’s hurtful to other people.”

So far, eight provinces have enacted intimate image laws, but only half of them refer to altered images.

The Liberal government says sexually explicit deepfakes will be addressed in upcoming legislation on online harms.

If passed, the long-delayed bill would establish new rules to govern certain categories of online content, including the non-consensual sharing of intimate images.

Geoffrey Rockwell, the Canadian Institute for Advanced Research (CIFAR) AI chair, says altering images has been done online for years, but new AI technology is making it cheap and accessible, and attracting people with malicious intent.

“What’s disturbing about it — deepfake porn — is that it’s being used by younger people to harass mostly women, almost entirely women,” he said.

The professor of philosophy and digital humanities at the University of Alberta also fears the trauma this can cause women and teenage girls, who are predominantly the victims of this type of crime.

“People going through a period when they are becoming very aware of the male gaze or just the way women are represented, and then to be confronted with that personally where … the worst possible form of representation is all sudden being passed around by your schoolmates who are sneering about it,” he said.



Watch: Teachers discuss challenges around artificial intelligence in Alberta classrooms


Benjamin Tan, a software engineering assistant professor at the University of Calgary, says education is the best remedy at this time, and that people should be aware of what they post online.

“The way we should go about this is really just to increase awareness that this kind of technology is out there, that it can be used. So maybe that’ll make you think twice a little bit about what kind of stuff that you’re sharing.”

He also says people should scrutinize what they consume online, and note where it comes from.

“If we hold high standards to say, where we get our news, where we get our images, what kind of things we share amongst our various social groups, I think they can go a long way,” he said.

He adds the images tend to be shared among groups of people who may know the victim personally. In these instances, he says, people should neither share nor consume them.

“Call people out when they do these kinds of things that they shouldn’t be,” Tan said. “Maybe in a couple of years, we’ll have a whole lot of other stuff, both technological solutions as well as regulatory solutions, to maybe help us out.”

Both Cassie and Tan say education and awareness are also helpful in combating these forms of harassment.

In a statement to CityNews, the Ministry of Justice says no new intimate images legislation is planned at this time.

“Sharing intimate images without consent is an offence under the Criminal Code,” the statement reads.

Minister of Technology and Innovation Nate Glubish provided a statement Sunday in response to the publishing of this story.

“The government of Alberta is aware of the threats posed by deepfake technology and is working to address them, we will have more to say in the near future, as we work to modernize and strengthen privacy protections for Albertans,” his statement reads.

Alberta’s One Line for sexual violence is toll-free at 1-866-403-8000, and the Central Alberta Sexual Assault Centre 24-hour phone, text and chat support is at 1-866-956-1099.

-With files from The Canadian Press
