Update: Following Apple’s August 2021 announcement of its plans to scan devices for child sexual abuse material, the company has said it would delay implementing this system in light of criticism from customers and advocacy groups. The information below reflects the initial announcement only.
News of Apple’s plans to scan users’ photos hit headlines last week, igniting debates over privacy, “back door” surveillance, and the need to fight illegal content. We explain the controversy and how this affects you.
What did Apple announce?
Apple has announced several methods it will use to help combat the spread of child-sexual-abuse images (known as CSAM). The most controversial one works by scanning Apple devices for such images.
How exactly does Apple’s photo scanning work?
With Apple’s upcoming operating system updates, the company will install databases of known CSAM images, converted into unreadable strings of numbers (called hashes), onto Apple devices. The system will automatically compare images on your device against these databases before the images are uploaded to your iCloud Photos account.
If a certain number of photos (Apple has said about 30) match ones in multiple databases in at least two separate countries, a human will review the images and report them to the authorities. The system does not learn anything about photos that are not flagged as matches.
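To illustrate the threshold logic described above, here is a minimal sketch in Python. It is purely hypothetical: Apple's actual system uses a perceptual hash called NeuralHash combined with cryptographic techniques so the device never learns which images matched; the plain SHA-256 hashing and function names below are stand-ins for illustration only.

```python
import hashlib

MATCH_THRESHOLD = 30  # Apple's stated approximate threshold before human review

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; Apple's NeuralHash is far more complex."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(local_images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many on-device images match the known-image hash database."""
    return sum(1 for img in local_images if image_hash(img) in known_hashes)

def should_flag(local_images: list[bytes], known_hashes: set[str]) -> bool:
    """Nothing is reported until the match count crosses the threshold."""
    return count_matches(local_images, known_hashes) >= MATCH_THRESHOLD
```

The key design point the sketch captures is that individual matches alone trigger nothing; only when the count reaches the threshold does the account get flagged for human review.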
Whose devices does this affect?
For now, this will affect you if you are based in the U.S., use an Apple device, and use iCloud Photos. It will come into effect once you update to the next operating system: iOS 15 and iPadOS 15 on iPhone and iPad, macOS Monterey on MacBook, and watchOS 8 on Apple Watch.
Has this type of technology been used before?
Yes, companies such as Facebook and Google scan photos uploaded to their platforms for CSAM. The difference in this case is that Apple will scan photos on users’ devices rather than on its cloud platform.
Why are privacy advocates objecting?
Although the new feature is meant to stop the spread of CSAM, privacy advocates see the method as increased surveillance—a cost that’s too high, particularly because technologies, once established, can potentially be expanded to other uses.
The Electronic Frontier Foundation writes that Apple “has created an infrastructure that is all too easy to redirect to greater surveillance and censorship.” It notes that while there are some safeguards in place in the U.S. to prevent abuse of this program, this is not the case in every country.
For years, governments have been appealing to Big Tech for access (via so-called back doors) to encrypted communications. These demands undermine encryption, which is meant to keep messages private from everyone except the sender and recipient, including the company that hosts the service.
The EARN IT bill, also purported to fight CSAM, received similar backlash when it was introduced last year.
Why is this a big deal for Apple and its users?
This latest move has made waves, and Apple may have chosen this course because of pending rules surrounding CSAM in the U.S. and abroad, which could involve hefty fines for companies.
Another reason this is a big deal: the sheer number of users Apple has around the world. It raises questions over how much choice consumers have over privacy when our lives are so entrenched in the products of a small number of technology companies.
How can I prevent Apple from scanning my photos?
You could choose not to use Apple products. If you do use Apple products, you could disable iCloud Photos, since only photos being uploaded to iCloud Photos are scanned. See the links below for alternatives.
Other resources on staying private:
- Top cloud services to store photos online (free and paid)
- The best way to secure your Apple devices
- Ultimate guide to mobile security for iPhone and Android devices
- Explainer: iPhone’s new App Tracking Transparency features
- Big Tech alternative: Install LineageOS on an Android phone