Apple plans to scan iPhones and iPads for child abuse imagery
Apple is planning to test a new iOS and iPadOS technology that would allow it to detect images of child abuse and block the sharing and distribution of such content. The company recently announced this on its website.
The company claims that the technology was developed with user privacy in mind. Images on iCloud will not be scanned directly; instead, the technology will use hash sums (digital fingerprints) of images and match them against a database of already known abuse imagery.
This database has already been handed over to Apple by the NCMEC (National Center for Missing & Exploited Children) and other child-protection organizations.
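The matching scheme described above can be sketched roughly as follows. This is an illustrative simplification, not Apple's actual NeuralHash system: the real technology uses perceptual hashes, so visually similar images produce the same fingerprint, whereas the stand-in below uses an exact SHA-256 digest. The database contents and function names are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse imagery,
# as supplied by an organization such as the NCMEC.
KNOWN_HASHES = {
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
}

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: an exact SHA-256 digest.
    The real system tolerates resizing, cropping, and recompression."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_image(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the database."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The key privacy argument is that only fingerprints, never the images themselves, are compared against the database.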
Apple has stated that it will not be able to interpret the data until an account reaches a threshold number of matched images. The company claims the new technology has only a one-in-a-trillion chance of making a mistake.
Once the threshold is reached, Apple will manually review each report to verify the match, block the account, and hand the data over to the NCMEC. Users who believe their account was banned wrongfully will be able to contest the decision.
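The threshold step described above can be sketched as a simple per-account counter. The threshold value and all names here are hypothetical, since Apple has not published those details in this announcement; this is only a sketch of the described behavior.

```python
from collections import defaultdict

# Hypothetical threshold: the real value is not stated in the announcement.
MATCH_THRESHOLD = 30

match_counts: dict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Count one fingerprint match for an account. Return True once the
    account crosses the threshold and becomes eligible for manual review;
    until then, Apple claims it cannot interpret the matches."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```

Holding back interpretation until a threshold is crossed is what drives the claimed one-in-a-trillion false-flag rate: a single accidental hash collision is not enough to expose an account to review.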
On August 5, the Financial Times published an investigation reporting that Apple plans to install software called neuralMatch on the devices of its US users, allowing the company to scan on-device images for sexual abuse imagery.
A day earlier, Matthew Green, a professor at the Johns Hopkins Information Security Institute and an expert in online security, had commented on Apple's plans, calling them a "really bad idea".
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
The Financial Times does note that Apple frames its main goal as contributing to the safety and well-being of society:
“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM)”
Still, many people concerned with online safety see this as a gross invasion of privacy, and as Big Tech once again abusing its power over users.