Personal identifiers must be stripped in AI use, experts urge

Telecom & Tech Reporter

Experts urge that personal information be stripped of identifiers before it is used in artificial intelligence, and they welcome the Office of the Privacy Commissioner's (OPC) recommendations on how the country should regulate the fair use of AI.

Some recommendations from the report released last week include allowing the use of personal information for AI where there is a societal benefit; ensuring there is a “rights based framework” that puts privacy as a human right first, before allowing the use of AI; ensuring companies are held accountable when using AI and are able to demonstrate its use if asked to do so; and regulating organizations to protect privacy and personal information when designing a system that uses AI.

The OPC confirmed in an email that any personal information used for societal benefit should be “de-identified” data, which would be used for research and statistical purposes, without consent, to facilitate the training of AI.

“This can allow AI algorithms to learn and be based on data that are more representative, thereby increasing its problem-solving potential and accuracy,” the OPC said in an email. AI can help analyze patterns in medical images to help diagnose illnesses, or improve energy efficiency by forecasting power grid demand, the OPC said.

Ann Cavoukian, former information and privacy commissioner of Ontario, said in an interview that it is critical that no identifiable personal information be collected by companies, but that the use of de-identified data without consent is standard and key in AI.

“De-identified data is when you strip all personal identifiers from the data, then the data you have remaining is no longer considered personal information,” she said. “There is no privacy risk because the privacy risk is attached to the identifiable data.”
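The process Cavoukian describes can be sketched in a few lines. This is a hypothetical illustration, not anything from the OPC report: the field names and the list of identifiers are assumptions chosen for the example, and real de-identification would also need to address quasi-identifiers (fields like postal code or birth date that can re-identify people in combination).

```python
# Hypothetical sketch: strip direct personal identifiers from each
# record at collection time, keeping only non-identifying fields.
# Field names are illustrative assumptions, not from the OPC report.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "address", "health_card_no"}

def de_identify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

raw = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "age_band": "40-49",
    "diagnosis_code": "E11",
}

clean = de_identify(raw)
print(clean)  # {'age_band': '40-49', 'diagnosis_code': 'E11'}
```

Under this sketch, the remaining record retains the statistically useful fields for training while dropping the fields that tie it to a person, which is the distinction the privacy risk hinges on.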

She said that if any company is thinking of using any kind of AI, regardless of the circumstances, it should use de-identified data.

“If I was commissioner I would say fine, you want to use this data you’re going to have to strip it of personal identifiers first, right at the time of collection,” she said. “Then you’d be my guest to use it for AI.”

The set of recommendations came after a public consultation launched earlier this year, the release noted.

“Artificial intelligence has immense promise, but it must be implemented in ways that respect privacy, equality and other human rights,” Daniel Therrien, the privacy commissioner, said in the release. “A rights-based approach will support innovation and the responsible development of artificial intelligence.”

The release warned that if AI is based on personal information it can “have serious consequences for privacy.”

“AI systems can use such insights to make automated decisions about people, including whether they get a job offer, qualify for a loan, pay a higher insurance premium, or are suspected of unlawful behaviour,” the release said. “Such decisions have a real impact on lives, and raise concerns about how they are reached, as well as issues of fairness, accuracy, bias, and discrimination.”

Stephanie Carvin, a security expert and associate professor at Carleton University, agreed in an interview that using de-identified data for AI has the potential to benefit society; however, she said the recommendations need to be explicit about what those benefits are.

“Are you using it for policing apps? You can make an argument that that is a societal benefit,” she said, adding that a policing app for surveillance should not be considered beneficial to society.

She also noted it is important to introduce AI legislation as part of amending the Privacy Act, because it will give more direction to Canadian companies that want to use the technology.

“One of the things that has inhibited innovation in Canada is that Canada has not created the regulatory space for companies to be able to try things out and they’re risk-averse,” she said. “A lot of people may feel that companies don’t want regulation when it’s actually the opposite, they want the space, they want to know what the boundaries are so that they don’t cross it and it goes badly for them.”

