Update 9/4: Apple has temporarily paused, but not abandoned, the implementation of these features due to the widespread criticism (read: potential lost sales), and will be announcing changes at a later date. It is somewhat refreshing to see backlash this widespread at a time when mass surveillance has largely become the norm.

Original Post: Earlier this month Apple revealed two major new technologies that will be included in the next major release of iOS and macOS, both of which Apple says are intended to protect minors. Unfortunately, serious privacy concerns have been raised about how these features could be abused, and about what they mean for the future of privacy on Apple devices. A letter to Apple urging them to drop these features has been signed by 90 policy groups around the world, including the ACLU, EFF, and CDT.

  1. On-device scanning for CSAM (child sexual abuse material) at the time of upload to iCloud.

Apple states that, to combat CSAM, images uploaded to iCloud will be checked on the device at upload time, through cryptographic means, against a database of known images. Microsoft and Google similarly check photos stored in their cloud environments against a CSAM database, so the end result is not new. Once a certain threshold of matches is reached, the matching images will be manually reviewed and then reported to the relevant authorities if they are, in fact, CSAM.
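
To make the mechanics concrete, here is a minimal sketch of threshold-based matching against a database of known image hashes. Everything here is an illustrative assumption: the function names, the plain cryptographic hash, and the threshold value are invented, and Apple's published design (NeuralHash perceptual hashing combined with private set intersection and threshold secret sharing) is considerably more involved.

```python
import hashlib

# Illustrative stand-ins only: Apple's system uses a perceptual hash
# (NeuralHash) and private set intersection, not a plain SHA-256 lookup.
KNOWN_CSAM_HASHES = set()  # hypothetical database of known hashes
REVIEW_THRESHOLD = 30      # hypothetical match threshold

def image_hash(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash of an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_for_review(uploaded_images: list) -> bool:
    """Count matches against the known-hash database; only once the
    count crosses the threshold is anything surfaced for human review."""
    matches = sum(
        1 for img in uploaded_images
        if image_hash(img) in KNOWN_CSAM_HASHES
    )
    return matches >= REVIEW_THRESHOLD
```

The point of contention is not this logic itself, which could just as easily run server-side in iCloud, but the choice to run it on the device.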

  2. On-device scanning of iMessage content for sexually explicit imagery, and the ability for parental controls to send alerts if a child account is sending or receiving such content.

Apple has built machine learning technology that will be able to detect new explicit imagery, instead of simply matching against a database of known images. There are obvious benefits here for parental controls, but they come at a cost: namely, the potential for abuse.
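
As a rough sketch of the decision logic, assuming a classifier that returns a confidence score, it might look like the following; the function names, threshold, and alert behavior are invented for illustration and are not Apple's actual implementation.

```python
EXPLICIT_THRESHOLD = 0.9  # hypothetical confidence cutoff

def classifier_score(image_bytes: bytes) -> float:
    """Stand-in for the on-device ML model; returns a fake score here.
    The real system would run a trained image classifier."""
    return 0.0  # placeholder so the sketch runs

def blur_and_warn(image_bytes: bytes) -> None:
    print("Image blurred; warning shown to the user.")

def notify_parent() -> None:
    print("Alert sent to the parent account.")

def handle_incoming_image(image_bytes: bytes, is_child_account: bool,
                          parental_alerts_enabled: bool) -> None:
    """Scan on the device, warn the recipient if the image looks
    explicit, and optionally alert a parent on a child account."""
    if classifier_score(image_bytes) < EXPLICIT_THRESHOLD:
        return  # deliver normally
    blur_and_warn(image_bytes)
    if is_child_account and parental_alerts_enabled:
        notify_parent()
```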

While Apple says these changes are being made in the name of protecting the children, consider that both features are of limited effectiveness for their stated purpose and ripe for grave abuse. If implemented as described, the CSAM scanning will likely be useful only for catching lower-level offenders who upload known CSAM to iCloud. It will do nothing to catch producers of CSAM, nor those distributing it, nor anyone who simply un-checks “sync to iCloud”. Apple could have chosen to perform this check in their datacenters, but instead has intentionally chosen to build the check into the devices themselves. The iMessage scanning, while arguably more useful for its intended purpose, will be easily avoided by using any other app to send explicit content, and it too operates on the device.

What Apple has effectively done is create a massive surveillance machine and tell the world they will only allow it to run in “safe mode”, protecting the children and nothing else, they swear. Once this technology is built, Apple controls the definitions of the kinds of content it searches for and can adjust the scope of what is scanned. It is very difficult to see how governments would not quickly attempt to order Apple to use its new powers to search device contents for all manner of objectionable material, particularly in countries less free than the United States. And even here, if this technology had existed in 2002, would the government not have attempted to order Apple to begin scanning devices for evidence of terrorism? They created a multi-billion-dollar illegal phone and internet surveillance program, after all. The Fourth Amendment is supposed to protect us from warrantless searches by the government, but not from private companies. Apple had previously argued they could not search their devices because they had not created the technology to do so. If these changes are made, Apple will only be able to say they choose not to. And this does not even touch on the technical problems that may exist, such as hash collisions being possible.
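
On that last point, here is a toy illustration of why perceptual-hash collisions are a real concern. It uses a deliberately simple average hash rather than NeuralHash, but the underlying property is the same: distinct images can map to the same value by design.

```python
def average_hash(pixels: list) -> str:
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean. Real perceptual hashes are far
    more elaborate, but share the property that visually or
    numerically different images can hash identically."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# Two clearly different 4x4 grayscale "images"...
image_a = [[ 10, 200,  10, 200],
           [200,  10, 200,  10],
           [ 10, 200,  10, 200],
           [200,  10, 200,  10]]

image_b = [[ 90, 250,  90, 250],
           [250,  90, 250,  90],
           [ 90, 250,  90, 250],
           [250,  90, 250,  90]]

# ...that nonetheless collide under the perceptual hash.
assert average_hash(image_a) == average_hash(image_b)
print(average_hash(image_a))  # same bit string for both images
```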

I will be frank. I do not believe Apple truly cares about the privacy of its users. I believe they care about the perception of privacy insofar as it can be used to market their products to customers. Apple products are, for the time being, still more private than the competition from Microsoft and Google, but this is a disappointing turn from a company that many believed truly cared, even after it caved to pressure from the FBI and opted not to enable end-to-end encryption for iCloud data. The day after the announcement of these changes, Apple distributed an internal memo which included a statement from NCMEC calling the widespread criticism “the screeching voices of the minority”, if that is any indication of how they view those of us who believe privacy is a fundamental human right. This screeching voice certainly does.