First off, congratulations. We've won. Well, kind of. Apple has officially delayed the implementation of its CSAM detection initiative. The company's statement to the press was worded in its usual diplomatically nebulous fashion, but we dare say that CSAM detection, in the form it was introduced to us, has been canceled.
"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
How much "additional time"? Not specified (though certainly not less than "several months"). What "improvements"? Not specified either. And we won't speculate here: Apple had described the CSAM detection algorithm as all but perfect and flawless, so who knows what they are going to improve.
Everybody was against it: consumers, activists, human rights advocates, technology and privacy experts, including us. Now we know that we are not just some privacy fetishists; we had a point: CSAM detection was loaded with risks, threats, and potential for abuse.
Of course, it helped a lot that Apple had a big event coming up this September. A new iPhone was to be announced, and it would have been no good to announce it stuffed with a collection of child abuse imagery, even if that collection looked like a bunch of hashes.
But it's not only commercial companies that try to strip us of privacy by appealing to child protection. Governments and international regulatory bodies do it too, and they are not bound by the need to think about profits, or to balance gathering more information and control over people against scaring customers away.
This summer the European Parliament approved a "temporary regulation" that allows commercial companies hosting web-based services to scan users' communications for signs of child abuse without becoming privacy law violators (specifically, without risking a breach of EU privacy rules).
537 members of the European Parliament (MEPs) voted in favor of the bill, with 133 against and 24 abstaining. Despite the result, some European lawmakers warned that the rules are "legally flawed" and might not survive a court challenge.
MEPs also decried the pressure they were under to approve the bill, calling it "moral blackmail", the press reported.
"Whenever we asked critical questions about the legislative proposals, immediately the suggestion was created that I wasn't sufficiently committed to fighting child sexual abuse," Dutch MEP Sophia in ‘t Veld said a day before the vote.
So it won't be EU officials monitoring EU users' messages and emails for illegal content (audio messages were also covered in the first draft of the bill, but were dropped from the final one). It will be commercial companies, the service providers that just can't resist the urge to protect kids. The initiative came from the European Commission, and the Parliament passed the bill unusually quickly.
The Parliament tried hard to deliver its position via the media. "Service providers should use the least privacy-intrusive technologies possible", they assure us. Do we believe them? Hm.
Not much is explained about how exactly the monitoring is designed and implemented: "Online material linked to child sexual abuse is detected using specific technologies that scan content, such as images and text, or traffic data. While hashing technology helps with images and videos, classifiers and artificial intelligence are used to analyze text or traffic data to detect cyber grooming." Clearly, the specifics are left up to the companies, and the implementations may vary significantly.
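To illustrate the general idea behind hash-based image scanning: the provider keeps a database of digital fingerprints of known illegal images and checks whether the fingerprint of a user's photo is close enough to any of them. Real perceptual hashes (Microsoft's PhotoDNA, Apple's NeuralHash) are far more sophisticated and largely proprietary; the toy "average hash" below is our own simplification, shown only to convey the principle.

```python
# Toy sketch of hash-based image matching. NOT any vendor's actual
# algorithm; a deliberately naive "average hash" over a grayscale grid.

def average_hash(pixels):
    """Turn a grid of grayscale values (0-255) into a bit string:
    1 if the pixel is brighter than the image's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A known "banned" image, a slightly brightened copy, and an unrelated one.
banned = [[10, 200], [30, 220]]
copy = [[12, 205], [28, 225]]
unrelated = [[255, 0], [255, 0]]

db = {average_hash(banned)}  # database of hashes of known material
THRESHOLD = 1  # max Hamming distance still counted as a match

def is_match(pixels):
    h = average_hash(pixels)
    return any(hamming(h, known) <= THRESHOLD for known in db)

print(is_match(copy))       # True: the hash survives small edits
print(is_match(unrelated))  # False
```

The point of the perceptual (rather than cryptographic) hash is that near-duplicates still match; the point of the threshold is that "near" is a judgment call, and every judgment call is a source of false positives.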
The new approach to child protection threatens the very existence of encrypted messaging. Back in May, when Facebook announced plans to add encryption to its Messenger app, the European Commission warned that the move would turn the social network into "a haven for the pedophiles".
The new rules will be in force for three years. And the permanent legislation now being developed to replace them raises even more concerns. Firstly, it demands that encryption technologies allow the scanning of text, images, and videos in messages, chats, and emails. Secondly, it implies monitoring not only for pornography and abuse, but also for grooming, the process of building a relationship with a child in order to exploit them. This is quite a vague definition, and questions are being asked about whether algorithms would be able to detect it correctly.
Thirdly, while companies currently scan for child abuse material voluntarily, the new laws will make it mandatory.
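The doubts about automated grooming detection are easy to demonstrate. Real systems use machine-learning classifiers; the keyword-based stand-in below is our own deliberately crude invention, not any provider's actual method, but it shows how easily innocent conversations can trigger a match.

```python
# Toy sketch of why automated grooming detection produces false positives.
# A real classifier is statistical, not keyword-based, but it faces the
# same underlying problem: innocent messages can look "suspicious".

SUSPICIOUS = {"secret", "meet", "don't tell", "alone"}

def looks_like_grooming(message):
    """Flag a message if it contains two or more 'suspicious' phrases."""
    text = message.lower()
    return sum(phrase in text for phrase in SUSPICIOUS) >= 2

# A message fitting a predatory pattern is flagged...
print(looks_like_grooming("Let's meet, it'll be our secret"))  # True
# ...but so is a parent planning a surprise party:
print(looks_like_grooming("Don't tell mom, we'll meet to plan her surprise"))  # True
```

Smarter models reduce this problem but cannot eliminate it, and at the scale of billions of messages even a tiny false-positive rate means enormous numbers of innocent people flagged for human review.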
The good news is that officials all over the world have been trying to undermine encrypted messaging for quite a long time now, but haven't yet found a way to rob us of it. They know all too well it would result in massive outrage and in the migration of actual criminal activity to the darknet or to lesser-known platforms that fly under regulators' radar. They do not want to kill WhatsApp or even Telegram.
Besides, a united Europe doesn't look so united when it comes to opinions on encryption. Privacy is advertised as one of the core values of European culture and politics. But so are child protection and the fight against terrorism! No wonder there are signs of some regulatory schizophrenia. Just a few examples:
2017: "A European Parliament committee is proposing that end-to-end encryption be enforced on all forms of digital communications to protect citizens", BBC reports.
2020: "The terrorist attack in Vienna is being used in the EU Council of Ministers to push through a ban on secure encryption for services such as WhatsApp, Signal and many others in a fast-track procedure. This emerges from an internal document dated November 6th from the German Council Presidency to the delegations of the member states in the Council", directly contradicting the previous statement.
2021: A new proposal called for a "balance" between "security through encryption and despite encryption", urging EU member states to "join forces with the tech industry" to jointly define and establish a regulatory framework, as well as innovative approaches and best practices, to respond to these challenges.
The last sentence of the quote above sounds like a plan for the next 30 years or so. EU institutions usually don't work very fast, especially when there is no consensus either among themselves or with the member states.
What beautiful wording. Of course, we can understand the desire of the EU's rulers to squeeze between Scylla and Charybdis unscathed. But the wording is just a desperate oxymoron, to borrow another word from Greek, the tongue of Europe's mother culture.
There can be no partial encryption. All backdoors and exclusive, only-for-the-state-that-wants-only-good-for-you access to sensitive personal data will be abused and will fall into the wrong hands sooner or later (it happens all the time; why are we even wasting letters on this again?).
Even if you are (and especially if you are) an absolutely law-abiding citizen who has never blown up a plane or harmed a child, there is no reason for you to become an object of intrusive surveillance, often performed with the help of contractors who are not employed by the company and not bound by its data management standards. You should not fall victim to the honest mistakes and false positives of people or algorithms; your security, safety, wellbeing, and reputation are not to be put at risk.
So do not think that if you live in London or New Delhi, you shouldn't be curious about Apple's initiatives to scan the photos of US citizens. It is a global trend. Every government intends to keep an eye on its people. To watch its little brothers. To correct them when they are wrong and to punish them when they misbehave. Developing technologies create a lot of new benefits and opportunities, risks and threats, but even their creators can't always tell these apart: the technologies develop too fast.
And do not count on the future discovery of some balance or golden mean between "security through encryption and despite encryption". When in need of security and privacy, choose apps that treat full-fledged encryption as an imperative and user protection as a priority.