    Apple’s Controversial CSAM Updates: All this Privacy Talk, Then Apple Wants to Scan Your Pictures

    Spread the word

    If all this time you’ve been thinking of Apple as the one company in a sea of others that actually gives a damn about your privacy, let us disabuse you of that notion. We ourselves have remarked numerous times that the company seems to care about preserving its users’ privacy. However, as you might know if you’ve been following us for a while, that’s been a little difficult to believe recently. And now that Apple has revealed its plans to scan iPhone users’ pictures… Well, it’s safe to say we’re not very impressed.

    Decoding Apple’s New CSAM Scanning Features

    Before we go into the whole thing, let us provide a bit of context. Apple recently announced three new features meant to prevent the spread of Child Sexual Abuse Material (CSAM). The features are as follows:

    Expanded Guidance for Siri and Search

    Users will now be pointed to resources and guidance when they make CSAM-related searches online, whether through Siri or otherwise. This covers users directly searching for resources to keep children safe, such as how to file a report, as well as intervention when users search for CSAM content itself.

    For the latter, Apple says it will inform users that “interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.” Looks good up until this point, right? This is a genuinely useful feature that will point people in the right direction.

    This fairly uncontroversial update will be coming with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey later this year. Now let’s get on to where we began to feel troubled.

    Detection of Explicit Pictures in Messages and Photos

    Now, here’s where things get a little more concerning. The other two features use on-device scanning to detect explicit photos stored on your iPhone and in the Messages app. The former, upon finding a certain number of matches against known CSAM content on your device, will alert Apple. From there, the flagged material will be reviewed by a human and, if confirmed to be CSAM, reported to the relevant authorities.

    As for Messages, Apple is introducing a feature that warns children when they’re about to view explicit pictures. They’re also notified that if they choose to view them anyway, their device will send an alert to their parents.

    Now, while these may sound like necessary precautions to some, there’s a legitimate reason why people have been so against them.

    The Swift Backlash Apple, for Some Reason, Did Not Expect

    For years, Apple has strongly opposed the idea of building a backdoor into its devices to give anyone access, even law enforcement. Naturally, this led to the belief that Apple is willing to fight for its users’ privacy. How, then, are users supposed to react to the company itself building a backdoor, however well-intentioned, into its devices?

    For one, the image scanning will happen right on your phone. Apple says that it designed this with user privacy in mind, and that the technology “determines if there is a match without revealing the result”. It “encodes the match result along with additional encrypted data about the image”, which is then uploaded to iCloud Photos as a voucher alongside the image. Only if the number of these vouchers crosses a certain threshold will the matches be reviewed by a human and reported.
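
    For the technically minded, here’s a very rough sketch in Swift of how a threshold-based voucher scheme like this might work in principle. To be clear, this is not Apple’s actual implementation (which reportedly relies on perceptual hashing and cryptographic blinding so the device itself can’t read the match results); every name, hash, and threshold value below is purely illustrative.

    // Conceptual sketch only: NOT Apple's NeuralHash / safety-voucher code.
    // All hashes, names, and the threshold value here are hypothetical.
    import Foundation

    struct SafetyVoucher {
        let imageID: String
        let encryptedMatchData: Data  // stands in for the encrypted payload the article quotes
    }

    // Hypothetical database of known-bad image hashes. In the real system these are
    // perceptual hashes, distributed to devices in a blinded form.
    let knownHashes: Set<String> = ["hashA", "hashB", "hashC"]

    // Nothing is supposed to be surfaced until this many matches accumulate.
    let matchThreshold = 3

    // Attach a voucher to an image only if its hash matches the database.
    func makeVoucher(imageID: String, imageHash: String) -> SafetyVoucher? {
        guard knownHashes.contains(imageHash) else { return nil }
        return SafetyVoucher(imageID: imageID, encryptedMatchData: Data(imageHash.utf8))
    }

    // Simulate a photo library being uploaded to iCloud Photos.
    let library = [
        ("IMG_0001", "hashX"),
        ("IMG_0002", "hashA"),
        ("IMG_0003", "hashB"),
        ("IMG_0004", "hashC"),
    ]

    let vouchers = library.compactMap { makeVoucher(imageID: $0.0, imageHash: $0.1) }

    // Only once the threshold is crossed would anything be flagged for human review.
    if vouchers.count >= matchThreshold {
        print("Threshold reached: \(vouchers.count) vouchers flagged for human review")
    } else {
        print("Below threshold: nothing is revealed")
    }

    The idea Apple is leaning on is that the device only accumulates these opaque vouchers, and nothing is meant to be readable, even by Apple, until the threshold is crossed.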

    Which brings us to our next point: Apple says the scanning only applies when iCloud Photos is enabled, so disabling it effectively opts you out of the feature. In which case, how is this helpful at all for detecting child sexual abuse offenders? Offenders can simply turn off iCloud Photos, and then all of this would be for nothing.

    The only thing this really builds is a case for those who have spent years asking for a way inside the iPhone’s walls without users’ consent. Consider arguments like: if you can do it for CSAM, then why not [insert reason]?

    So, What Now?

    Despite the backlash, and even concern from its own employees, the company seems bent on going ahead with the changes. The three features will roll out later this year with the launch of iOS 15 and iPadOS 15. All we can do now is wait and watch how this turns out in the long run.

    In the meantime, you can learn more about these features from the six-page FAQ Apple uploaded. What are your thoughts on Apple’s controversial plan to scan pictures stored in users’ phones? Tell us in the comments below.
