Apple's privacy play: A good start, but far from perfect

Apple’s senior vice president of Software Engineering Craig Federighi boasts of a safe and secure Siri during Apple’s 2015 Worldwide Developers Conference.

Unless you've been living under the proverbial rock, you've probably noticed how, in the last year or so, a parade of Apple executives has worked hard to bring the problem of digital privacy--and their company's solution to it--to the forefront of our attention.

It's not hard to see why this makes sense from Apple's point of view. It only takes a few minutes spent navigating the Web before you start noticing ads that seemingly follow you everywhere you go, and appear to know a bit more about you and your habits than you feel comfortable with. Add to that the endless leaks of classified documents that show how keen our governments are on keeping close tabs on their citizens, and you end up with a marketing opportunity that plays right to Apple's strengths.

Focusing on privacy allows Apple to align its marketing strategy with the needs of its customers through a value proposition that is simple and easy to understand: The folks from Cupertino sell us hardware, we pay for it, and that's the end of it--there is no need for Apple to try and make money by mining us like so many deposits of data that can be repackaged and sold to third parties, either directly or indirectly.

Speaking from a position of privilege

As much as I like Apple's position, I feel a twinge of guilt at the fact that the company's brand of privacy comes--quite literally--at a price. Not everyone can afford a Mac or an iPhone, and it's probably fair to say that billions of people in today's world are only able to access the Internet thanks to "free" services that rely on data mining to sell ads.

It's obviously not Apple's job to figure out a way for other companies to balance their need for revenue with the rights of their users, but it's also unconscionable to blindly accept the vision of a future in which only the well-off can afford to protect their lives from unwanted attention.

Digging a little deeper, Apple's marketing blitz has a more significant problem: It requires actually trusting Apple, and that's a much harder thing to do than it seems.

Can you really trust a company?

Regardless of how much effort it puts into creating a privacy-conscious environment, Apple will always need to collect some data from us. For example, using iCloud for email means that all your messages must, for practical reasons, pass through the company's servers in unencrypted form. Even where end-to-end encryption can be used, as is the case with Messages, the mere fact that a message is being exchanged between two users could, under the right conditions, be a very valuable piece of information.
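
That last point is worth making concrete. Here is a minimal, purely illustrative sketch (hypothetical names, and a toy XOR keystream standing in for real cryptography such as the protocol Messages actually uses) of why a relay server still learns plenty even when it cannot read a single message body:

```python
# Hypothetical illustration: even when the payload is end-to-end
# encrypted, the routing envelope a server must read stays visible.
from dataclasses import dataclass
import hashlib


@dataclass
class Envelope:
    sender: str        # visible to the server: who is talking
    recipient: str     # visible to the server: to whom
    timestamp: float   # visible to the server: when
    ciphertext: bytes  # opaque: the server cannot read this


def toy_encrypt(plaintext: str, key: bytes) -> bytes:
    # Stand-in for real E2E encryption; a simple XOR keystream,
    # just to make the point that the body is opaque to the relay.
    stream = hashlib.sha256(key).digest()
    data = plaintext.encode()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))


msg = Envelope("alice", "bob", 1_435_000_000.0,
               toy_encrypt("see you at 8", b"shared-key"))

# What the relay learns without ever breaking the encryption:
metadata = (msg.sender, msg.recipient, msg.timestamp, len(msg.ciphertext))
print(metadata)  # who, to whom, when, and roughly how much was said
```

Who talks to whom, and when, is exactly the kind of traffic-analysis data the paragraph above calls valuable.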

Now, it seems easy to me to trust Tim Cook and his colleagues to handle this data the right way--they have, so far, done a fine job of running one of the world's largest companies in a very ethical way, and they seem genuinely committed to making a difference in the way user information is gathered and used.

The problem is that these fine folks are not Apple: They are just Apple's management. Eventually, they will be replaced by other managers who may, or may not, share their view on how the privacy of their customers should be treated--and those new managers will still be sitting on a considerable cache of data that encompasses the digital lives of billions of people.

This may seem like splitting hairs, but recent history is rife with examples in which the marriage of technology and data has, in the wrong hands, resulted in human suffering and the subtle disenfranchisement of minorities. More mundanely, we have the recent case of Radio Shack trying to sell its customer list as a bankruptcy asset, in clear violation of its own privacy policy.

Data-hungry is not necessarily evil

In the corner opposite Apple, we find entities like Facebook and Google. The latter's very mission statement says that its goal is to "organize the world's information and make it universally accessible and useful"--a goal that can, clearly, only be achieved if the company manages to collect that information in the first place.

I'm not a huge fan of ad-driven systems, but I don't entirely hate them. As I mentioned earlier, services like Gmail and Facebook have opened the Internet to many people who would otherwise be shut out of the incredible opportunities that the digital highway has to offer. They have also pioneered much of the "big data" work that is rapidly bringing about a revolution in everything from healthcare to traffic and transportation.

The problem with these companies is not that they are "evil," but, rather, that they trivialize the importance of privacy. The average Internet user is not equipped with the expertise required to understand that posting to Facebook or doing a search on Google really means inserting new rows in a giant database that never forgets, and that these "free" services are paid for by the data unwittingly fed into their systems.
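
The "giant database that never forgets" is not just a figure of speech; it describes an append-only event log. A minimal sketch, with invented names and an in-memory list standing in for a real distributed store, of how every "free" interaction becomes a permanent row tied to a profile:

```python
# Hypothetical sketch of the append-only log the article alludes to:
# every interaction becomes a permanent row tied to a user identifier.
import time

event_log: list[dict] = []  # in reality, a distributed store that never forgets


def record(user_id: str, action: str, payload: str) -> None:
    # Rows are only ever appended; deletion is not part of the model.
    event_log.append({
        "user": user_id,
        "action": action,
        "payload": payload,
        "ts": time.time(),
    })


record("u123", "search", "cheap flights to Rome")
record("u123", "post", "Can't wait for vacation!")

# An advertiser-facing query can later join those rows into a profile:
interests = [e["payload"] for e in event_log if e["user"] == "u123"]
print(interests)
```

The asymmetry the paragraph describes is visible here: the user experiences two fleeting actions, while the system keeps two rows it can correlate forever.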

Privacy, front and center

In the long term, I don't see that much practical distinction between Apple's model and Google's or Facebook's. Both require us to trust third parties over which we have no control with varying amounts of our most intimate information, and hope that they won't, at some point in the future, either change their minds or be forced to use the data in ways that will harm us.

On the plus side, this renewed focus on privacy is forcing all of us to face the importance of our digital footprint and the consequences of unbounded access to large amounts of personal data. Hopefully, what started as a marketing war between leading technology companies will soon turn into a society-wide debate on how to tackle this problem in a way that puts control over our Internet lives where it belongs: In our hands.
