Signal CEO Meredith Whittaker has sharply criticised a legislative initiative from the European Union that would scan private communications for child sexual abuse material (CSAM). Whittaker argues that such measures would fundamentally undermine the integrity of end-to-end encrypted (E2EE) systems.
The European Commission introduced the proposal in May 2022 as a way to combat the spread of CSAM by scanning personal messages. The European Parliament has largely resisted it, pushing for E2EE applications to be excluded. The Council of the EU, by contrast, has not budged: its most recent draft would require messaging services to deploy scanning technology, framed as "upload moderation", to detect CSAM.
Whittaker Criticises Implications for Encryption and Safety
Whittaker has raised serious objections, arguing that the Council's plan simply repackages client-side scanning, an approach much of the cybersecurity community considers incompatible with robust encryption. She stressed that mandating such scanning would fundamentally weaken encryption and create exploitable vulnerabilities.
📣Official statement: the new EU chat controls proposal for mass scanning is the same old surveillance with new branding.
Whether you call it a backdoor, a front door, or “upload moderation” it undermines encryption & creates significant vulnerabilities https://t.co/g0xNNKqquA pic.twitter.com/3L1hqbBRgq
— Meredith Whittaker (@mer__edith) June 17, 2024
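To make the objection concrete: client-side scanning means the client inspects content against some detection list before it is encrypted, so whatever the scanner sees or reports sits outside the end-to-end guarantee. The sketch below is purely illustrative and assumes a simplified exact-hash check (real proposals typically envisage perceptual hashing or classifiers, and the function and hash-list names here are hypothetical); it is not drawn from any actual product or the EU draft text.

```python
# Conceptual sketch of client-side scanning, not any real product's code.
# The key point: the scan runs on the *plaintext* before encryption.

import hashlib
from typing import Optional

# Hypothetical database of hashes of known prohibited content.
KNOWN_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def scan_before_encrypt(plaintext: bytes) -> Optional[str]:
    """Return a 'report' string if the content matches the hash list, else None."""
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in KNOWN_HASHES:
        return f"match:{digest}"  # in a real system this would go to a reporting authority
    return None

def encrypt_for_recipient(plaintext: bytes) -> bytes:
    # Stand-in for a real E2EE implementation (e.g. a Signal-protocol session).
    return b"ciphertext:" + plaintext[::-1]

def send_message(plaintext: bytes) -> bytes:
    report = scan_before_encrypt(plaintext)
    if report is not None:
        # Critics' core objection is that this branch exists at all: the client
        # now contains logic that inspects, blocks, or reports content the user
        # believed only the recipient would ever see, and the detection list
        # could later be broadened to other categories of content.
        raise PermissionError("content flagged before encryption: " + report)
    return encrypt_for_recipient(plaintext)  # encryption happens only after the scan

if __name__ == "__main__":
    print(send_message(b"hello"))
```

Because the check precedes encryption, the scheme can be rebranded without changing this structure, which is why Whittaker treats "upload moderation" and client-side scanning as the same mechanism.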
Patrick Breyer, a Member of the European Parliament for the Pirate Party, has also denounced the Belgian presidency's proposal, arguing that it effectively revives the Commission's original draft. Breyer contends that cutting off messaging features for users who opt out of scanning is unacceptable in today's digital environment.
🇬🇧The number of false reports following the voluntary #ChatControl 1.0 scanning has risen massively, reports Der SPIEGEL (German): https://t.co/fKVsl5N8C0
Still EU governments want to make mass #ChatControl scanning mandatory for ALL providers on Wednesday – #StopScanningMe!
— Patrick Breyer #JoinMastodon (@echo_pbreyer) June 16, 2024
Data Protection Considerations
The European Data Protection Supervisor has warned that the proposal endangers the democratic values of an open society. Law enforcement bodies, however, are pressing for E2EE apps to adopt scanning: European police chiefs have urged platforms to find technical ways to identify illegal content while keeping communications encrypted.
The Council's draft text attempts to address the security concerns, stating that the regulation would neither ban E2EE nor mandate any specific technology. The document, made public by civil society organisations, is not the most recent version of the Council's negotiating text; once the Council settles on a final position, that text will be published and will form the basis of further talks with the newly elected European Parliament.
While the Council acknowledges that E2EE is essential to protecting fundamental rights, it insists that encrypted services must not become safe havens for unchecked CSAM distribution. Under the draft, users who decline upload moderation could still use the service, but would be unable to share visual media or URLs.
In the United Kingdom, the Online Safety Act contains a comparable mandate: messaging platforms can be required to use accredited technology to detect child abuse content if directed to do so by the communications regulator, Ofcom. To date, no technology has been accredited that meets these criteria.