How would you feel if a stranger pawed through the photos on your iPhone? What if they read your iMessages?
The answer is probably, “Not great.” That is, however, exactly what Apple has announced it will begin to do. Any time a user uploads photos to the iCloud service, Apple will check the files against a blacklist to make sure they are not child pornography.
Let’s get one thing out of the way up front: child pornography is a despicable blight and should be treated as such. It is possible, however, to believe both that child pornography is abhorrent and that Apple has made a terrible mistake in implementing this feature. It’s a bad idea for multiple reasons.
The thing about access to individuals’ data is that it isn’t actually possible to make it accessible to only one party. If Apple can screen your data, regardless of its intentions, so can malicious actors and the surveillance state apparatus. All access points to a person’s data are weak points for unauthorized intrusion. The more access points there are, the weaker the overall security is. An access point explicitly designed to be used without the data owner’s knowledge or permission isn’t simply like leaving the doors to your house unlocked. It’s like leaving the doors open before leaving on vacation and then publicly posting about it on social media.
If Apple can implement a program that computes a digital fingerprint, called a hash, of every photo and checks it against a naughty list, it can check against any list of hashes. Today, it’s child pornography. Tomorrow, it could very well be anything the government or the public or even just Apple management deems objectionable.
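As a rough illustration of why this machinery is content-agnostic, here is a minimal sketch of hash-list screening. It assumes an ordinary cryptographic hash and uses hypothetical file and list names; Apple’s actual system reportedly relies on a perceptual hash so that resized or recompressed copies still match, but the structure is the same: compute a fingerprint of each file and compare it to a list someone else controls.

```python
import hashlib
from pathlib import Path

# Hypothetical blacklist: fingerprints of known prohibited files.
# In a real deployment this list is supplied by a third party and the
# device owner never sees its contents.
BLACKLISTED_HASHES = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00f1e2d3c4b5a69788796a5b4c3d2e1f0",  # placeholder value
}

def file_hash(path: Path) -> str:
    """Compute a digital fingerprint (SHA-256 hash) of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_uploads(photo_dir: Path) -> list[Path]:
    """Return every photo whose fingerprint appears on the blacklist."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in BLACKLISTED_HASHES]
```

Nothing in that loop knows or cares what the blacklisted fingerprints represent. Swap in a different set of hashes and the identical code screens for leaked documents, protest imagery, or anything else the list’s maintainer wants found.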
Apple is a global company with global reach. While, in the past, the company has gone to great lengths to protect the privacy of its customers, both domestically and internationally, this about-face sends a strong signal that it will no longer do so. It opens the door for gross human rights violations by those looking to stifle dissent or simply control behavior.
It isn’t only foreign governments that use surveillance to target people for reasons other than what was originally stated. The enormous US surveillance apparatus was developed to “fight terrorism” after 9/11. These days, when it produces charges at all, they are mostly for crimes entirely unrelated to terrorism, turned up by rummaging through the collected data. It’s not a stretch to think Apple’s iCloud screening program could be used by government actors to check whether a user has violated any of the innumerable federal laws on the books.
While the number and scope of those laws is a separate issue, giving government easier access to private data to prosecute one category of crime (and it never stays just one) is never good for individuals’ constitutionally protected right to be secure in their papers and effects.
Yes, Apple is a private company, and its user agreements can, within some limits, be whatever it wants them to be. Customers are free to either put up with them or take their business elsewhere. The fact that Apple is using this screening program to check for actual criminal activity blurs the line between public and private actor, but customers are still able to stop using Apple products and services if they choose.
This entire situation does highlight the need for consumers to understand the risks they run with their data before entrusting it to a third party, as well as the need for robust guardrails on government agencies that want to access that data through the third party.
Thankfully, Utah is taking digital privacy protection seriously. Libertas proposed model legislation on protecting privacy that became this year’s HB 243, sponsored by Rep. Francis Gibson, which sets up a review process for government use of technology that could potentially violate individuals’ privacy. This enables the Legislature to set limits on, and provide oversight of, how government may use surveillance technology.
So much of our vital information is digital. Without robust protections for that data, we all run the risk of being constantly and comprehensively watched, measured against an unknowable list of what is and is not allowed.
It’s a shame that Apple has discarded its principles regarding privacy, but that doesn’t mean the concept should be abandoned altogether.