I love Apple as much as the next girl, and its stance on privacy is a big reason why. We all know that Apple makes a lot of noise around privacy, calling it ‘a fundamental human right’. But how much do we know (and I mean really know) about Apple’s privacy issues and practices?
Down the Rabbit Hole, We Go
To be honest, recent reports about Apple’s privacy issues have left me feeling a bit iffy. I’ve always wondered whether the tech giant really stays true to the values it so publicly flaunts, but I never actually got to the bottom of it for myself. So I decided it was time. For the next couple of days, I went down a rabbit hole of articles, videos, news clippings, and a fair share of conspiracy theories on Reddit.
But before I get into what I found, let me backtrack a little.
In the past, Apple has gone above and beyond to protect its users’ personal data. For instance, the data used for Touch ID fingerprints and Face ID is scrambled, encrypted, and stored on a dedicated chip in the device that only the user can access. Apple Pay employs similar measures, making sure that your credit card number is never transmitted to the payee or even to your bank. And when you agree to share personal data with app developers, like your contacts, photos, and notes, Apple doesn’t even see the data that’s being shared.
If we look back, we can’t deny that Apple’s history with data has been to err on the side of caution, sometimes even going so far as to put privacy before convenience. However, that doesn’t seem to be the case anymore.
In 2015, the FBI recovered an iPhone 5C belonging to one of the perpetrators of the San Bernardino attack. The phone was valuable because it could potentially name other suspects involved in the attack. The only problem was that it was locked and set to erase all data after ten failed passcode attempts. So the FBI turned to Apple, which famously refused to provide a way to circumvent its own security, reinforcing the idea that it takes privacy very seriously.
Fast forward to 2020, and it seems Apple has had a change of heart. According to a report by Reuters, six sources confirmed that Apple dropped plans to let iPhone users fully encrypt backups of their devices on iCloud after the FBI complained that it would get in the way of investigations.
What’s even more disappointing is that Apple had changed its position on this roughly two years earlier, and nobody knew about it. It goes to show how willing Apple has been to help law enforcement, despite presenting itself as a defender of its customers’ information in high-profile legal disputes with the government. So I think it’s fair to say that much of Apple’s privacy messaging has been, to a certain extent, hokum.
The Guardian broke a story last year that rocked the internet. It found that Apple had been employing contractors to listen to and ‘grade’ Siri recordings without users’ knowledge. These contractors regularly overheard plenty of confidential material, including medical information, drug deals, and even recordings of couples having sex! Coming from a company whose CEO once made a plea to end the technology industry’s collection of user data, calling it ‘surveillance’ and stating that ‘these stockpiles of personal data serve only to enrich the companies that collect them’, this change in ideology seems pretty hypocritical to me.
Even after the tech giant issued a full apology, The Guardian’s whistleblower revealed that nothing had changed and that Apple ‘keeps ignoring and violating fundamental rights and continues massive collection of data’.
So does Apple sell your data? We don’t know for sure. Is this an isolated case? Certainly not. Amazon, Google, and Facebook have all admitted to similar practices. But the reason this rubs me the wrong way, particularly when it comes to Apple, is that the company specifically trades on privacy as a selling point. For the millions of iPhone users around the world paying a premium for a device they believe protects their privacy, that’s very worrying.
Before Apple even had a chance to recover from that blow, the company was hit by a massive zero-day hack just days later, one that affected every iPhone launched since 2013. The vulnerability was remarkable in its scope, covering all versions of iPadOS as well as iOS 11, 12, 13, and 13.5, and making these devices potential targets for hackers with less charitable intentions.
What’s scary is that this is just one part of a larger picture of iOS warnings. In April, ZecOps discovered a Mail vulnerability that affects every iPhone ever made. On May 24, Motherboard revealed that hackers had had their hands on Apple’s iOS 14 code for months. This carries enormous security risks, and with Apple needing a win after several iOS 13 problems, iOS 14 risks launching with even more issues than its predecessor. Let’s just pray this doesn’t happen.
In a 2018 article, Bloomberg’s Sarah Frier asks the question, ‘Is Apple Really Your Privacy Hero?’ The reason I bring this up is that the article’s premise received some serious backlash. She argued that Apple should take responsibility for the data that customers willingly give to the developers of the apps in its store. Which, I admit, sounds a bit cuckoo, but I can also kinda see her point.
If Apple really did care about its users’ privacy, it could take any number of actions to keep those privacy violators off its platforms, far away from its customers. But it doesn’t. And stop me if I’m wrong, but hiding behind policy as a prerequisite for action while grandstanding about how urgent privacy is only makes it worse.
Well, that was my take on Apple’s privacy issues. I know I may be way off the mark here; I wouldn’t paint myself as any kind of expert on the subject. But this is what I gleaned from my research. If I missed something, feel free to take to the comments section. Just remember to be gentle.
What actions do you recommend Apple take to keep privacy violators away?