After fielding a torrent of backlash over privacy issues, Apple officials on Friday announced they’ve put the company’s new anti-child exploitation safety tool on hold.
“Last month, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” reads a statement released by Apple. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements.”
The plan actually comprised two separate features. The first would scan photos uploaded to iCloud from iOS devices, checking them against a database of known child sexual abuse material. The second, an opt-in parental control for Messages, would blur sexually explicit photos sent to or received by a child's device and could notify parents of younger children, according to Apple.
Is this any different from giving a new phone app permission to access your data?