Apple may have paused its plans to scan devices for CSAM, but the European Commission has put messaging services back in the spotlight with a move to force them to start monitoring for such material.
CSAM is emerging as a privacy test
When it comes to child protection, this is a good thing. Child sexual abuse material (CSAM) is a much bigger problem than many people realize; the victims of this horrific trade end up living shattered lives.
What’s happening: according to Euractiv, the European Commission plans to require messaging services to scan for CSAM. However, Europe seems to understand some of the arguments raised by privacy advocates against Apple’s original proposal and is emphasizing some restrictions, in particular:
- Scanning technology must be ‘effective’.
- It must be ‘reasonably reliable’.
- And it must avoid collecting “any information other than the information strictly necessary” for identification.
Of course, ensuring that the system is “reliable” is a challenge.
Just what is reliable?
When Apple announced its own system for on-device CSAM scanning, researchers at Imperial College London soon warned that the technology behind it was easy to fool and not ready for deployment.
Apple subsequently backtracked on its plans, and later launched a system to monitor for such content in its Messages app. It has not expanded this into the on-device analysis of users’ photo libraries, as it originally intended. Like other image hosting companies, it can still scan photos stored on iCloud.
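The Imperial College criticism concerns perceptual hashing, the family of techniques such scanning systems rely on. As a hedged illustration (a minimal average-hash sketch, not Apple’s actual NeuralHash, and with hypothetical 8x8 grayscale inputs), the following shows both why these hashes tolerate benign changes and why small, targeted pixel edits can flip hash bits and evade matching:

```python
# Minimal "average hash" (aHash) sketch: a simplified stand-in for the
# learned perceptual hashes used in real scanning systems. The toy images
# below are hypothetical 8x8 grayscale grids, values 0-255.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values; returns a 64-bit int hash."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: is it brighter than the image's mean?
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A toy "image": a smooth gradient.
img = [[(r * 8 + c) * 3 for c in range(8)] for r in range(8)]

# A uniform brightness change shifts every pixel AND the mean equally,
# so the hash is unchanged -- the robustness such hashes aim for...
brighter = [[p + 10 for p in row] for row in img]

# ...but nudging one pixel from just above the mean to just below it
# flips a bit, illustrating how a targeted perturbation evades matching.
tweaked = [row[:] for row in img]
tweaked[4][0] = 90  # was 96; the grid's mean is roughly 94.5

print(hamming(average_hash(img), average_hash(brighter)))  # 0
print(hamming(average_hash(img), average_hash(tweaked)))   # 1
```

Real systems hash higher-dimensional learned features rather than raw pixels, but the researchers’ point stands: any threshold-based matcher invites both adversarial evasion and crafted false positives.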
When it comes to Europe’s proposals, it is to be hoped that the bar of creating “reasonably reliable” systems will ultimately be grounded in evidence. While these restrictions will not completely put people’s minds at rest, as the threat of abuse of such technology by repressive or authoritarian governments remains, they at least set out measurable criteria around which an understanding of what online privacy rights should be can be built.
At the same time, the EC’s proposals appear to be a threat to the use of end-to-end encryption, which Apple continues to argue must be protected.
Towards a digital bill of privacy rights
The lack of a clear and consensual set of rights to protect online privacy is becoming increasingly critical as the world becomes more connected. At the same time, Europe is pushing for regulations, such as mandatory sideloading, that could compromise privacy and security on devices. These two strands seem philosophically antagonistic, but it is possible that regulators and lawmakers, given the complexity of these issues, will begin to see the light.
I think Apple is working to encourage this, because it seems increasingly important (even the World Economic Forum agrees) that an international standard be created to define digital rights. And the need for one is growing.
Europe understands this; it introduced a declaration on digital rights and principles for EU residents in early 2022.
When that happened, Margrethe Vestager, executive vice president for a Europe Fit for the Digital Age, said in a statement: “We want safe technologies that work for people, and that respect our rights and values. Also when we are online. And we want everyone to be empowered to take an active part in our digitalized societies. This declaration gives us a clear reference point to the rights and principles for the online world.”
What should those rights be?
Apple executives have been actively lobbying for a framework around such rights for some time. Since Apple CEO Tim Cook’s powerful 2018 speech on digital surveillance, the company has constantly and (mostly) consistently lobbied for agreements on personal privacy. Cook’s company works to provide such rights on a unilateral basis, but also calls for universality in such protections. Apple argues for the following four pillars:
- Users should have the right to have their personal data minimized.
- Users should have the right to know what information is being collected on them.
- Users should have the right to access that data.
- Users should have the right to the security of that data.
While we are all aware that some business models will be forced to change as a result of such a set of rights, the introduction of some digital certainty will, at the very least, help promote a level playing field in technology.
And the need for a well-considered balance between personal rights and collective responsibility seems stronger today than ever before.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
Copyright © 2022 IDG Communications, Inc.