
Apple Announces New iPhone Features to Detect Child Sexual Abuse

Following a report on the work the company has been doing on a tool that scans iPhones for child abuse imagery, Apple published a post providing more details on its child safety efforts. With the release of iOS 15, watchOS 8, and macOS Monterey later this year, the company says it will introduce a host of child safety features across Messages, Photos, and Siri.

For starters, the Messages app will include new notifications that alert children and their parents when they send or receive sexually explicit photos. When someone sends an inappropriate image to a child, the app blurs it and displays multiple warnings. “It’s not your fault, but sensitive photos and videos can be used to harm you,” says one of the notices in a screenshot posted by Apple.

As an added precaution, the company says Messages can also notify parents if their child decides to go ahead and view a sensitive photo. “Similar protections are available if a child attempts to send sexually explicit photos,” says Apple. The company notes that the feature uses on-device machine learning to determine whether a photo is explicit, and that Apple itself does not have access to the messages. The feature will be available for family iCloud accounts.

Apple will also introduce new software tools in iOS and iPadOS that allow the company to detect when someone uploads content to iCloud that shows children engaged in sexually explicit activities. The company says it will use this technology to notify the National Center for Missing and Exploited Children (NCMEC), which in turn will work with US law enforcement. “Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind,” the company says.

Instead of scanning photos as they are uploaded to the cloud, the system will match against an on-device database of known images provided by NCMEC and others. The company says the database is stored as unreadable hashes, produced by image hashing, which turns photos into a sort of digital fingerprint.
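Apple has not published the details of its own hashing algorithm, but the idea of an image hash as a “digital fingerprint” can be illustrated with a toy average hash: each pixel becomes one bit depending on whether it is brighter than the image’s mean, so visually similar images produce nearly identical fingerprints. This is only an illustrative sketch, not Apple’s actual method.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per grayscale pixel,
    1 if the pixel is brighter than the image's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = ''.join('1' if p > mean else '0' for p in flat)
    return int(bits, 2)

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count('1')

# A 4x4 grayscale image and a slightly noisier copy of it
# hash to (nearly) the same fingerprint; a different image does not.
img = [[200, 200, 10, 10], [200, 200, 10, 10],
       [10, 10, 10, 10], [10, 10, 10, 10]]
noisy = [[190, 210, 15, 5], [205, 195, 12, 8],
         [5, 15, 8, 12], [10, 10, 14, 6]]
other = [[10, 10, 10, 10], [10, 10, 10, 10],
         [10, 10, 200, 200], [10, 10, 200, 200]]

print(hamming_distance(average_hash(img), average_hash(noisy)))  # small
print(hamming_distance(average_hash(img), average_hash(other)))  # large
```

A real system would use a far more robust algorithm, but the matching principle is the same: compare compact fingerprints rather than the photos themselves.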

A cryptographic technique called private set intersection allows Apple to determine whether there is a match without seeing the result of the comparison. If there is a match, the iPhone or iPad generates a cryptographic safety voucher that encrypts the upload along with additional details about it. Another technique, called threshold secret sharing, prevents the company from reading the contents of the safety vouchers unless an account crosses an unspecified threshold of known CSAM content. “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” the company said in a statement.

Developing …

All Engadget recommended products are handpicked by our editorial team, independent of our parent company. Some of our stories contain affiliate links. If you buy something from one of these links, we may receive an affiliate commission.
