Apple Reportedly Plans To Start Scanning US iPhones For Child Abuse Images

Apple is reportedly planning an update that will allow iPhones to scan for child sexual abuse images. According to the Financial Times, the company has briefed security researchers on neuralMatch, a system that “continuously scans photos that are stored on a US user’s iPhone and are also uploaded to their iCloud backup system.”

The system will “proactively alert the reviewer group if it believes illegal images have been found,” and those reviewers will contact law enforcement if the material is verified. The report says the neuralMatch system, which has been trained on the database of the National Center for Missing and Exploited Children, will only run on iPhones in the United States.

The move would be something of an about-face for Apple, which has previously pushed back against law enforcement to protect user privacy. In 2016, the company clashed with the FBI after it refused to unlock an iPhone belonging to one of the shooters in the San Bernardino terrorist attack. CEO Tim Cook said at the time that complying with the government’s request would have far-reaching implications, effectively creating a backdoor that could enable broader government surveillance. (The FBI eventually turned to an outside security firm to unlock the phone.)

Now, security researchers are raising similar concerns. While there is broad support for stepping up efforts to combat child abuse, researchers who spoke with the FT warned that the system could open the door for authoritarian regimes to spy on their citizens, since a system designed to detect one type of image could be extended to other kinds of content, such as terrorism or anything else a government deems “anti-government.”

At the same time, Apple and others have faced growing pressure to work with law enforcement. Social media platforms and cloud storage services such as iCloud already scan for child sexual abuse images, the report points out, but extending those efforts to images stored on a device would be a significant shift for the company.

Apple declined to comment to the FT, but the company may release more details on its plans “as early as this week.”

Update 08/05/21 at 4:00 PM ET: Apple has confirmed plans to start testing a system that can detect child sexual abuse images stored in iCloud Photos in the United States. “Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations,” the company wrote in a statement. “Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
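
Apple’s statement describes a device-side lookup of image fingerprints against a preloaded database of known hashes. The sketch below is a heavily simplified, hypothetical illustration of that general pattern only: it uses an ordinary SHA-256 digest as a stand-in for the perceptual NeuralHash Apple describes, and it omits the blinded database, threshold reporting, and private set intersection in Apple’s actual design. All function names and data here are invented for illustration.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of hash-list matching, not Apple's implementation.
// A real perceptual hash tolerates resizing/recompression; SHA-256 does not.

/// Hex-encoded SHA-256 digest of raw image bytes (stand-in for a perceptual hash).
func imageDigest(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if the image's digest appears in the on-device set of known hashes.
func matchesKnownHashes(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(imageDigest(imageData))
}

// Example usage with placeholder data.
let knownHashes: Set<String> = []       // the hash database shipped to the device
let photo = Data([0x01, 0x02, 0x03])    // placeholder image bytes
print(matchesKnownHashes(photo, knownHashes: knownHashes)) // prints "false"
```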

The update will roll out at a later date alongside several other child safety features, including new parental controls that can detect sexually explicit photos in children’s Messages.
