Apple was developing a system that scanned iPhones for images and videos depicting child abuse.
However, how it worked and how it would be used were never entirely clear.
Apple has removed all references to the scanner it was developing to identify files related to child abuse on iPhones.
Those in Cupertino removed the mentions with the iOS 15.2 update, a patch that also brought new protections for children, including a warning about explicit content in iMessage and safeguards for searches through Siri, Spotlight and Safari.
What was the child abuse scanner about?
In August, several engineers found traces of an iPhone scanner that would search for illegal material such as child pornography.
Using hashing algorithms, the system would search for matches between the images on the phone and material already catalogued by police investigations into child pornography, which would serve as evidence.
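The matching described above can be sketched as a fingerprint lookup. Note the assumptions: Apple's actual system reportedly used NeuralHash, a perceptual hash designed to survive resizing and re-encoding, whereas this illustrative sketch uses a plain cryptographic hash (SHA-256) only to keep the example self-contained; the database contents and function names are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal material.
# In the reported design, these hashes would come from child-safety
# organizations; here we seed it with the SHA-256 of b"test" purely
# for illustration.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_fingerprint(data: bytes) -> str:
    """Return the hex digest used as this file's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    """True if the file's fingerprint appears in the known database."""
    return file_fingerprint(data) in KNOWN_HASHES
```

A real perceptual-hash scheme would instead compare fingerprints by distance (tolerating small image edits) rather than by exact set membership, which is the key design difference between this sketch and the reported system.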
Apple had received a great deal of criticism for its announcement that it would check iCloud photos for child sexual abuse material. People were clearly more concerned about the invasion of their privacy than about the protection the system might offer children.
For this reason, Apple suspended, but did not cancel, the update.
“Previously, we announced plans for features designed to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take more time over the next several months to gather information and make improvements before launching these critically important child safety features,” the company noted.