A Victoria-based lawyer is calling B.C.’s new laws the strongest in the country for protecting victims whose intimate photos are used to create A.I.-generated nude images without their consent.
On Jan. 29, the Intimate Images Protection Act (IIPA) will come into effect in B.C., less than a year after it was introduced. The new set of laws makes it easier for victims who have had their intimate images shared without their consent to seek damages and obtain orders to have those images removed.
A growing problem
“These A.I. tools being so powerful and widely distributed, I imagine it’s going to be a growing problem,” said Erik Magraken, managing partner at MacIsaac & Company.
Magraken, who specializes in injury law, says the new laws in the IIPA give victims an easier path to seeking damages.
“It lets people take ownership of their intimate images even if they consented to them being shared before,” he said.
“You can use this legislation to have them removed… anybody who is distributing your images without your consent can be targeted by it. Internet companies, internet intermediaries, even Google, the way it comes up in searches, even that gets targeted by it.”
The act also lets victims file a claim with the Civil Resolution Tribunal (CRT) to have their images removed from platforms, which also carries a penalty if the orders aren’t followed.
According to the CRT, an online reporting portal will be available on Jan. 29.
“Across North America, this may be the most powerful tool victims of this kind of image abuse have,” said Magraken.
Victims include Taylor Swift
Earlier this week, a scourge of pornographic deepfake images generated by artificial intelligence, sexualizing people without their consent, hit its most famous victim, singer Taylor Swift, drawing attention to a problem that tech platforms and anti-abuse groups have struggled to solve.
Sexually explicit and abusive fake images of Swift began circulating widely this week on the social media platform X.
“It’s often very private, [it] can be emotionally triggering if their picture is released,” said Brandon Laur, CEO of the White Hatter.
The Victoria-based White Hatter specializes in digital privacy, and Laur says A.I.-generated nude images can be created in minutes, sometimes for free and requiring only an email address.
Laur says his company experimented with 24 A.I. image apps, and seven of them were considered “user-friendly.”
“It can lead to harassment, intimidation, relationship challenges, so this is an extremely concerning topic, and it’s super easy to make them,” said Laur.
In Dec. 2023, RCMP in Winnipeg opened an investigation after A.I.-generated nude photos of several Grade 7-12 female students began circulating on social media.
“A.I. is a nuanced, complicated new part of our world,” said RCMP Const. Dani McKinnon. “It’s certainly a newer form of surreptitious behaviour. We’re in uncharted territory.”
Broader use of act
Magraken says there’s an argument to be made that the new IIPA could allow non-British Columbians to have their explicit images removed from the web.
“I think it’s written so broadly that even non-British Columbians can try and take advantage of it,” said the lawyer.
Magraken believes that anyone — including Taylor Swift — may be able to use the CRT’s upcoming reporting portal to have their explicit images removed.
However, he says that would ultimately be up to the CRT to decide.
“Even though she’s not from B.C., even she could take advantage of the way this is written, because these tech companies are distributing her images without her consent, and that’s accessible within the provincial borders,” said Magraken.
While no victims of A.I.-generated nude images have come forward to his firm, Magraken worries about a rise in these cases in the near future.
With files from CBC’s Darren Bernhardt