
Italy’s PM is suing over deepfake porn. Could it happen in Canada?

Video: Taylor Swift deepfake images: Why people are concerned over pornographic AI photos. Fake pornographic images of pop superstar Taylor Swift spread across social media this week, sounding the alarm about the rise of advanced artificial intelligence (AI) technology. Mike Drolet reports – Jan 27, 2024

The Italian prime minister is suing two men for allegedly creating pornographic deepfakes of her and uploading them to a U.S. website, where they were reportedly viewed millions of times.

Deepfakes, realistic simulations of people commonly used to make fake pornographic videos of female celebrities or politicians, are a growing concern in countries including Canada.

One expert said deepfakes are becoming more common because generative artificial intelligence (AI) makes them easier to create, and that they contribute to the misogyny, dehumanization and silencing of women.

“People (with deepfakes made of them) face significant reputational harm. They lose their sense of trust in other people,” Dalhousie Schulich School of Law Prof. Suzie Dunn told Global News from Halifax.

It’s a problem Canada is trying to help solve through the proposed Online Harms Act.


What happened in Italy

Italian Prime Minister Giorgia Meloni is suing two men who allegedly made pornographic deepfakes of her.

As reported by the BBC, she’s seeking 100,000 euros (about C$150,000) in damages for defamation. The charges haven’t been tested in court.

Video: Implications of AI Deepfakes

The indictment claims a 40-year-old man created and posted the videos to a U.S. porn site where they were viewed millions of times over several months, according to the article.


The prime minister’s legal team told the British broadcaster she would donate the money to support women who have been victims of male violence.

Italian police are investigating the 40-year-old man and his 73-year-old father. Italian law permits some defamation cases to be tried in criminal, not civil, courts and those found guilty can serve jail time, the BBC report states.

What do Canadian laws say?

Dunn said current Canadian laws on the matter usually vary by province.

The country introduced criminal provisions against sharing intimate images without someone's consent in 2015, before deepfake technology was publicly available, Dunn pointed out.

And while deepfakes could potentially be prosecuted under extortion or harassment criminal laws, she said that has never been tested, adding that criminal law for now “is limited to actual, real images of people” and not AI-created content.


Some provinces, such as Saskatchewan, New Brunswick and Prince Edward Island, have introduced civil statutes since 2015 that cover altered images, including deepfakes, allowing victims to ask a judge for an injunction (an order) to have the images removed.

“What most people are looking for is to get the content taken down. And so that would be (done by) seeking an injunction,” Dunn said.

Manitoba introduced updated legislation this week, and British Columbia has a “fast-track” option that lets people request that intimate (and altered intimate) images of them be taken down quickly, instead of waiting through the typical weeks or months of court processing.

The Online Harms Act, which the federal Liberal government tabled last month, calls on social media platforms to continuously assess and remove harmful content, including content that incites violence or terrorism, content that could push a child to harm themselves, and intimate images shared without consent, including deepfakes.

Video: Critics concerned about broadness of online harms bill

Platforms would need to remove content they or users flag within 24 hours.


A social media service, as defined by the bill, includes “an adult content service” that is “focused on enabling its users to access and share pornographic content.”

Dunn said she supported placing the onus on platforms because “they’re the people that we can go to for the swiftest results.”

“I think having this type of legislation that requires (the platforms) to assess and mitigate the risks on their platforms in the world that we live in today is an important measure for governments to take.”

A 2021 study called “Image-Based Sexual Abuse,” by a group of U.K. and Australia-based researchers, found that many women targeted by deepfakes “have experienced significant, often all-encompassing harms – in large part because of the social stigma and shame around women’s sexuality.”

“Victim-survivors, for example, talked about how the harms are often constant, leaving them feeling isolated and under threat,” it said.

Dunn told Global News those creating deepfakes typically show “real aspects of misogyny, dehumanization (and) of objectifying (women).”

“Female leaders, whether they be celebrities, politicians, journalists, activists – if you’re in the public sphere, chances are there’s a sexual deepfake made of you and publicly available,” Dunn said.

“If (deepfakes) become a normal part of a female politician’s job, a lot of young politicians are going to think, ‘I don’t really want to have to face that.'”


She called the Italian prime minister’s case “courageous,” and said that if Meloni’s case is successful, it could offer Canadian lawmakers and lawyers a template for pursuing people who release deepfakes.

— with files from Global News’ David Baxter, Reuters’ Anna Tong and The Canadian Press’s Stephanie Taylor and Marie Woolf
