ML and Canadian Bill C-11

Canadians, and those who do business with us, might be interested in Bill C-11. The primary purpose of the bill is to create the Consumer Privacy Protection Act, moving consumer privacy out of PIPEDA and into its own Act.

The Bill includes new transparency requirements that apply to automated decision-making systems, which would include AI/ML. While the Bill is only at first reading, the apparent intent is to require businesses to be transparent about how they use algorithms to make recommendations or decisions about consumers. The Bill also proposes a right to request information about an automated decision-making system, including the source of the information used:

If the organization has used an automated decision system to make a prediction, recommendation or decision about the individual, the organization must, on request by the individual, provide them with an explanation of the prediction, recommendation or decision and of how the personal information that was used to make the prediction, recommendation or decision was obtained.

Another interesting point is that the Bill includes exceptions to consent requirements.

The Bill hasn’t been to committee yet, so changes should be expected, but I thought it might interest the community. Here’s a link:

Is the expectation here that this information will be provided in the manual, or in some sort of agreement that the user approves at the start of device usage? Just trying to understand how this may be put into practice.

It’s early in the legislative process, so we don’t have a lot of details. However, my understanding is that the intent is to address algorithmic decision-making that affects consumers (for example, a credit decision by a financial institution) rather than to regulate products sold to individuals.

While the legislation is aimed at protecting consumers, the government has said that they also wish to enable “responsible innovation,” and that includes simplifying consent requirements and understanding that data sharing can sometimes be a good thing – for example by allowing businesses to disclose de-identified data to the public sector under certain circumstances.

I think this one will be interesting to watch at committee. I expect that social media and search engine companies will have a lot to say.