When Apple announced a test of a new tool aimed at combating child exploitation last month, it made headlines, and not in a good way. Critics quickly decried the feature's potential privacy implications, and Apple is now putting the rollout on hold.
The company announced on Friday that it is pausing testing of the tool so it can gather more feedback and make improvements.
The plan revolves around a new system that, if implemented, will scan iOS devices and iCloud photos for images of child abuse. It also includes a new opt-in feature that will blur sexually explicit image attachments in iMessage and warn minors and their parents about them.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Last month, Apple held a series of press calls to explain the planned tool, emphasizing that consumers’ privacy would be protected because the tool would convert photos on iPhones and iPads into unreadable hashes, or long strings of numbers, stored on user devices. Once the photos were uploaded to Apple’s iCloud storage service, those hashes would be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC). (Apple later stated that other organizations, in addition to NCMEC, would be involved.)
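To illustrate the general idea, here is a minimal, hypothetical sketch of on-device hashing and matching. It uses a SHA-256 digest from CryptoKit as a stand-in; Apple's actual system relies on a proprietary perceptual hash ("NeuralHash"), and the type and property names below are invented for illustration.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch only: a cryptographic SHA-256 digest stands in for
// Apple's perceptual hash, and the names here are illustrative.
struct PhotoHashMatcher {
    // Hashes of known abuse images, as they might be supplied by NCMEC.
    let knownHashes: Set<String>

    // Convert a photo's raw bytes into an unreadable, fixed-length digest.
    func hash(of photoData: Data) -> String {
        SHA256.hash(data: photoData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // True if the photo's hash appears in the database of known hashes.
    func isMatch(_ photoData: Data) -> Bool {
        knownHashes.contains(hash(of: photoData))
    }
}
```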
Apple’s review team would be notified only after a certain number of a user’s photo hashes matched images in the NCMEC database; the team could then decrypt the information, disable the user’s account, and alert NCMEC, which could inform law enforcement about the potentially abusive images.
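The threshold step could be modeled along the following lines. This is a sketch under assumed names and an illustrative threshold value, not Apple's implementation.

```swift
// Hypothetical sketch of the threshold rule: matched photos are surfaced
// for human review only once their count crosses a set limit. The type
// name and the threshold value are illustrative, not Apple's.
struct MatchThresholdTracker {
    let threshold: Int
    var matchCount = 0

    // Record one matched photo; returns true once review should be triggered.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= threshold
    }
}

var tracker = MatchThresholdTracker(threshold: 30)  // illustrative value
if tracker.recordMatch() {
    // Only at this point would reviewers be notified and able to inspect
    // the flagged matches.
}
```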
Many child safety and security experts applauded the plan’s goal, recognizing the ethical responsibility a company bears for the products and services it creates. However, they warned that the effort could jeopardize people’s privacy.
“When people hear that Apple is ‘searching’ for child sexual abuse materials (CSAM) on end user phones they immediately jump to thoughts of Big Brother and ‘1984,’” Ryan O’Leary, research manager of privacy and legal technology at market research firm IDC, told CNN Business last month. “This is a very nuanced issue and one that on its face can seem quite scary or intrusive.”