
Apple to scan all iPhones and iCloud accounts for child abuse images

Hot potato: Apple has announced plans to scan all iPhones and iCloud accounts in the US for child sexual abuse material (CSAM). While the system could be useful for criminal investigations and has been praised by child protection groups, there are concerns about its implications for security and privacy.

The neuralMatch system will scan each image before it is uploaded to iCloud in the US, using an on-device matching process. If the system finds a match against known illegal images, a team of human reviewers is alerted. If child abuse material is confirmed, the user's account will be disabled and the US National Center for Missing and Exploited Children (NCMEC) will be notified.

NeuralMatch was trained using 200,000 images from the National Center for Missing and Exploited Children. It only flags images whose hashes match hashes in that database, which means it should not flag innocent photos.

“Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices,” the company said in a statement on its website. It notes that users can appeal to have their account reinstated if they believe it has been flagged in error.
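To make the flow concrete, here is a minimal sketch of the "compare local image hashes against a list of known hashes before upload" idea described above. It is not Apple's implementation: the real system uses a perceptual NeuralHash and a blinded, unreadable hash database, while this sketch substitutes SHA-256 and a plain in-memory set purely for illustration; the hash value, file path, and function names are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known-image hashes (hex strings), standing in
// for the opaque hash set that would be shipped to the device.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a hash for an image file. The real system derives a perceptual
// hash that survives resizing and recompression; SHA-256 does not, and is
// used here only to keep the example self-contained.
func imageHash(of url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Before "uploading" an image, check it against the known-hash set.
func shouldFlagBeforeUpload(_ url: URL) -> Bool {
    guard let hash = try? imageHash(of: url) else { return false }
    return knownHashes.contains(hash)
}

// Example usage with a hypothetical local photo path.
let photo = URL(fileURLWithPath: "/tmp/example.jpg")
print(shouldFlagBeforeUpload(photo) ? "match: hold for human review" : "no match: proceed with upload")
```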

Apple already checks iCloud files against known child abuse images, but extending this to local storage has troubling implications. Matthew Green, a cryptography researcher at Johns Hopkins University, warns that the system could be repurposed to scan for other files, such as those that identify government dissidents. “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for’?” he asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Additionally, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure. The Messages app will add new tools to warn children and their parents when they receive or send sexually explicit photos. But Green also said that someone could trick the system into believing that a harmless image is CSAM. “Researchers have been able to do this pretty easily,” he said. This could allow an attacker to frame someone by sending them a seemingly normal image that triggers Apple’s system.

“Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Green added. “Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone.”

The new features are coming to iOS 15, iPadOS 15, macOS Monterey, and watchOS 8 this fall.

Masthead credit: NYC Russ



