Conservative private member’s Bill C-412 “An Act to enact the Protection of Minors in the Digital Age Act and to amend the Criminal Code” is an attempt to thread the needle between online safety for children and everyone else’s fundamental privacy rights. The bill certainly avoids the worst abuses of the Liberal government’s own Bill C-63. But I don’t believe that, in its current form, C-412 is workable.
Because issues with the proposed legislation are mostly technical in nature, I’m going to see if I can locate my old IT admin’s hat and toss it on for this quick and dirty analysis.
Based on the current proposed legislation, I can see a number of potential problems. For instance:
The law would require platforms providing apps and services to identify whenever a child is trying to access their service:
(3) The operator must
(a) in restricting access to its platform or any of its content that is inappropriate for children, use computer algorithms that ensure reliable age verification and that preserve privacy;
It’s certainly true that code implemented within dynamic websites and phone apps will often track users’ behaviour using sophisticated (and scary) tools. But I’m not aware of any code that can reliably detect a user’s age based only on the actions of a single session. I’ve seen nothing to suggest that, say, mouse movements or keystroke cadence are anywhere near as useful as you’d need for this kind of identification.
Sure, a laptop’s browsing history can tell you a lot about the laptop’s owner, but how do you know that the person whose hands are on the keyboard for the current session is the same person who built up that history? Has a 12-year-old kid never gained occasional access to a parent’s computer or phone?
In fact, I believe that this technology is still mostly science fiction.
But even if I’m wrong and the ability to identify the user behind a single session does exist, it would have cost a great deal of money to develop and it’ll surely enjoy military-grade intellectual property protections. There’s no way it’ll be available to smaller companies on the open market.
The legislation will also require invasive monitoring of all platform activity:
by taking reasonable steps in the design and operation of its products and services to prevent or to mitigate the effects of the following:
(a) physical harm or incitement of such harm and online bullying and harassment of minors;
The only way that “online bullying and harassment” can be detected is if all platform-based communication is recorded and parsed. Leaving aside the privacy issues involved here, detecting such events is a logistical nightmare. After all, “bullying” can be very subtle and can easily happen using language (or foreign languages) that monitoring software just won’t pick up. And this level of speech policing is a legal death trap for US platforms that are desperate to protect their Section 230 liability shield.
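To make that point concrete, here’s a toy sketch of the kind of keyword filter automated monitoring tends to rely on. The keyword list and messages are my own invented examples, not any platform’s actual implementation:

```python
# A minimal, hypothetical sketch of keyword-based harassment detection.
# It illustrates the core problem: blunt insults are easy to catch, but
# subtle or foreign-language bullying sails right through.

ABUSIVE_KEYWORDS = {"idiot", "loser", "stupid"}

def flags_message(message: str) -> bool:
    """Return True if the message contains an obviously abusive keyword."""
    words = message.lower().split()
    return any(word.strip(".,!?") in ABUSIVE_KEYWORDS for word in words)

# A blunt insult is caught...
print(flags_message("You're such an idiot"))  # True
# ...but subtle bullying is not:
print(flags_message("Nobody at school would notice if you stopped showing up"))  # False
# ...and neither is the same insult in another language:
print(flags_message("Quel imbécile"))  # False
```

More sophisticated machine-learning classifiers improve on this, but they inherit the same fundamental limits: they still require recording and parsing every message, and they still miss context-dependent cruelty that any human would recognize instantly.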
Parental notification requirements are also impractical:
the operation of an account by a user whom it knows or should reasonably know is a minor without first verifying the contact information for any of the user’s parents through, for example, the appropriate Internet service provider
This places a HUGE burden on ISPs and on app developers or smaller platforms who might have only a tiny percentage of their users in Canada. More to the point, foreign companies are going to have no connection with Canadian ISPs and no way to legally access user information.
Let me put it this way: if my ISP shared my contact information and detailed family identification with foreign (or even domestic) entities, they’d be hearing from my lawyers.
Or let me put it this way: no Canadian ISP will ever do this.
The requirement for independent audits is a nice idea, but it goes way beyond what’s practical:
Independent review
13 Every two years, every operator must cause an independent review of its platform to be conducted, including in respect of the risks and harms it poses to minors and the cumulative effects the use of the platform has on minors. The operator must make the findings publicly available.
Perhaps Meta (Facebook) and Alphabet (Google) can pull this off. But the platforms we should probably be worrying about are early to mid-stage startups. Those startups will have a lot less money to play with, will not be based in Canada, and will have zero incentive to commit thousands of hours and millions of dollars to executing external audits. Audits, I might add, that would satisfy legislators representing only a tiny fraction of the company’s customer base.
I’d be curious to see how this legislation might have come out differently if anyone even loosely connected to the IT world had been involved. Or just perhaps there aren’t effective legislative answers to online threats. Perhaps we just need to do a better job preparing our kids to face a dark world through better parenting and education.
I reached out to the sponsor of Bill C-412 (Michelle Rempel Garner) for comment but didn’t hear back.
Every computer, phone, streaming device and smart TV has a feature that I wish politicians would review before introducing these "save the kids" bills; namely Parental Controls. This feature gives parents a lot of control over what kinds of content kids can access on their devices. Parental Controls aren't perfect, but they are better and far less intrusive than some age verification vapourware.
Porn is bad for kids according to many politicians, but kids 16 and older can consent to sex with anyone, 14- and 15-year-olds can consent to sex as long as the age difference is less than 5 years, and 12- and 13-year-olds can consent to sex as long as the age difference is less than 2 years. So I guess the message to teenagers is: stop watching porn on the internet; instead, go out and find someone you can legally have sex with.