Europe’s CSAM scanning plan looks unlawful, per leaked legal advice
A legal opinion on a controversial European Union legislative plan, set out last May when the Commission proposed countering child sexual abuse online by imposing obligations on platforms to scan for abuse and grooming, suggests the planned approach is incompatible with existing EU laws that prohibit the general and indiscriminate monitoring of people’s communications.
The advice by the Council’s legal service on the proposed Child Sexual Abuse Regulation (also sometimes referred to as “Chat control”), which leaked online this week and was covered by The Guardian yesterday, finds the regulation as drafted to be on a collision course with fundamental European rights, including privacy and data protection, freedom of expression, and the right to respect for private and family life, as critics have warned from the get-go.
The Commission countered these objections by claiming the plan is lawful since it will only apply what it couches as “targeted” and “proportionate” measures to platforms where there is a risk of online child sexual abuse taking place, along with “robust conditions and safeguards”.
The legal opinion essentially blasts that defence to smithereens. It suggests, on the contrary, that it’s “highly probable” a judicial review of the regulation’s detection orders, which require platforms to scan for child sexual abuse material (CSAM) and other related activity (like grooming), would conclude the screening obligations constitute “general and indiscriminate” monitoring, rather than the targeted (and proportionate) measures EU law demands.
On this, the legal advice to the Council points out that the Commission’s claimed “targeting” of orders at risky platforms is not a meaningful limit, since it does not entail any targeting of specific users of a given platform and would therefore still require “general screening” of all of a service’s users.
The opinion also warns that the net effect of such an approach risks leading to the general and indiscriminate monitoring of all users’ communications.