
Why Apple’s iOS 15 Will Scan iPhone Photos and Messages

Apple, the company that proudly touted its user privacy bona fides in its recent iOS 15 preview, has unveiled a feature that seems to run counter to that principle: the ability to scan iPhone photos and alert authorities if any of them contain child sexual abuse material (CSAM). While fighting child sexual abuse is objectively a good thing, privacy experts aren’t happy with how Apple has chosen to do it.

Apple’s new “enhanced child protection” may not be as bad as it sounds if the company keeps its promises. But it is also another reminder that we don’t own our data or our devices, even the ones we physically possess. You can buy an iPhone for a considerable sum, take a photo with it, and put the phone in your pocket. Apple can then, figuratively speaking, reach into that pocket and into that iPhone to make sure your photo is legal.

Apple announced last week that its new photo-scanning technology for CSAM will roll out to users with the upcoming iOS 15 and macOS Monterey updates. Scanning images for CSAM isn’t new; Facebook and Google have been scanning images uploaded to their platforms for years, and Apple already has access to photos uploaded to iCloud accounts. Scanning photos uploaded to iCloud to detect CSAM would have made sense and kept Apple in line with its competitors.

But Apple is doing something a little different, something that feels more invasive, even though Apple says it should be less so. The scanning will happen on the devices themselves, not on the servers to which you upload your photos. Apple also says it will use new tools in the Messages app that scan photos sent to or from children for sexually explicit imagery, with the option to notify parents of children 12 and younger if those children view such images. Parents must opt in to these features, and all of the scanning happens on the device.

In effect, a company that took not one but two highly publicized stands against FBI demands that it create a backdoor into the phones of suspected terrorists appears to have built a backdoor of its own. It’s not immediately clear why Apple is taking this step now, but it may have something to do with pending laws abroad and potential laws in the US. Companies can currently be fined up to $300,000 if they find CSAM but don’t report it to the authorities, though they aren’t required to look for CSAM in the first place.

After the backlash that followed the initial announcement of the new features, Apple released an FAQ on Sunday with some clarifying details about how its on-device scanning technology works. Essentially, Apple will download a database of known CSAM images from the National Center for Missing and Exploited Children (NCMEC) onto its devices. The CSAM has been converted into strings of numbers, so the images themselves aren’t downloaded to your device. Apple’s technology scans the photos in your iCloud Photo Library and compares them to the database. If it finds a certain number of matches (Apple hasn’t said what that number is), a human will review the matches and then report them to NCMEC, which will take it from there. It does not analyze photos to look for signs that they might contain CSAM, as the Messages tool does; it only looks for matches against known CSAM.
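To make the threshold-based matching concrete, here is a minimal, illustrative sketch in Python. It is not Apple’s actual NeuralHash system or its cryptographic matching protocol; the hash function, database contents, and threshold below are placeholder assumptions standing in for the general idea that fingerprints of a user’s photos are compared against a list of known fingerprints, and nothing is escalated to human review until enough matches accumulate.

```python
# Illustrative sketch only -- not Apple's NeuralHash or its private matching
# protocol. The hash function, database values, and threshold are hypothetical.
import hashlib
from typing import Iterable, Set

# Hypothetical database of known-image fingerprints (in reality supplied by
# NCMEC as opaque numeric strings, never as the images themselves).
KNOWN_HASHES: Set[str] = {"a3f1...", "9bc0..."}  # placeholder values

MATCH_THRESHOLD = 30  # Apple has not disclosed the real number


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real system tolerates resizing and crops."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images: Iterable[bytes]) -> int:
    """Count how many of the uploaded images match the known-hash list."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)


def should_escalate(images: Iterable[bytes]) -> bool:
    """Escalate to human review only once the match count crosses the threshold."""
    return count_matches(images) >= MATCH_THRESHOLD
```

The key design point the sketch captures is that nothing is flagged for a single coincidental match; only an accumulation of matches against known material triggers review.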

Additionally, Apple says that only the photos you choose to upload to iCloud Photos are scanned. If you turn off iCloud Photos, your images won’t be scanned. Back in 2018, CNBC reported that there were roughly 850 million iCloud users, 170 million of whom paid for additional storage (Apple gives all iPhone users 5GB of cloud storage for free). So this could affect a lot of people.

Apple says this method has “significant privacy benefits” over simply scanning photos after they’ve been uploaded to iCloud. Nothing leaves the device or is seen by Apple unless there is a match. Apple also says it will only use the CSAM database and will refuse any government requests to add other types of content to it.

But privacy advocates believe the new feature opens the door to abuse. Now that Apple has established that it can do this for some images, it will almost certainly be asked to do it for others. The Electronic Frontier Foundation can easily see a future in which governments pressure Apple to scan user devices for content banned in their countries, both in on-device iCloud photo libraries and in users’ messages.

“This is not a slippery slope; it is a fully built system just waiting for external pressure to make the slightest change,” the EFF said. “After all, even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor.”

The Center for Democracy and Technology said in a statement to Recode that Apple’s new tools are deeply concerning and represent an alarming departure from the company’s previous stance on privacy. The organization hopes Apple will reconsider the decision.

“Apple will no longer offer fully encrypted messaging over iMessage and will undermine the privacy previously offered for storing iPhone users’ photos,” CDT said.

Will Cathcart, head of Facebook’s encrypted messaging service WhatsApp, criticized Apple’s new measures in a Twitter thread.

(Facebook and Apple have been at odds since Apple introduced its anti-tracking feature to its mobile operating system, which Apple framed as a way to protect its users’ privacy from companies that track their app activity, Facebook in particular. So you can imagine that the Facebook executive was pleased to have an opportunity to weigh in on Apple’s own privacy issues.)

And Edward Snowden expressed his thoughts in the form of a meme.

Some experts think Apple’s move could be a good one, or at least not as bad as it seems. John Gruber wondered whether this could give Apple a way to fully encrypt iCloud backups, shielding them from government surveillance, while also being able to say that it is monitoring its users’ content for CSAM.

“If these features work as described and only as described, there’s almost no cause for concern,” Gruber wrote, while acknowledging that there are still “legitimate concerns from credible experts about how these features might be misused in the future.”

Ben Thompson of Stratechery pointed out that this could be Apple’s way of getting ahead of potential laws in Europe that would require internet services to look for CSAM on their platforms. In the US, lawmakers have tried to pass legislation of their own that would presumably require internet services to monitor their platforms for CSAM or else lose their Section 230 protections. It’s possible they will reintroduce that bill, or something similar, in Congress.

Or maybe Apple’s motives are simpler. Two years ago, the New York Times criticized Apple, along with several other tech companies, for not doing as much as it could to scan its services for CSAM, and for implementing measures, such as encryption, that made such scanning impossible and CSAM harder to detect. The internet, the Times said, was now flooded with CSAM.

Apple apparently didn’t mind being accused of protecting the data of dead terrorists, but being seen as enabling child sexual abuse may have been another matter.



