Electronic privacy? There's no such thing

You will never be secure if you labor under the delusion of privacy

Most people suffer from the delusion of privacy. They believe that privacy can somehow be guaranteed for their various electronic gadgets. It can't, and sadly, even many in the information security field don't realize that. Still, it's surprising how strong the desire to believe in privacy is, and how tech companies will sometimes try to feed the illusion.

Take the news that the encryption in Apple's iMessage can potentially be cracked. I was surprised, but not because the encryption could be cracked. That's a given, no matter the encryption algorithm. I was surprised because I didn't know that iMessage used end-to-end encryption; I had just assumed that Apple could always read my messages. Call me uninformed for having missed that news, but I think I was actually better informed than the people who saw Apple's promise that it couldn't decrypt iMessage traffic and let the delusion of privacy lull them into believing it. Believe me, we'd all be better off if we just acted on the assumption that there is likely to be a back door every time.

Don't get me wrong. The fact that iMessage uses encryption is refreshing. Such encryption will do a lot to protect most of us in most of what we do (but more on that later). What is not refreshing is that Apple at best implied and at worst misrepresented that its encryption was uncrackable. Any computer professional in this day and age who thinks that any form of electronic communications is completely secure really doesn't know his profession.

OK, I used to work at the National Security Agency, where I was taught that there is no such thing as unbreakable encryption -- just encryption that is strong enough. We used a relatively easy-to-describe formula based on how long information needed to be kept secret. For most time periods, you could come up with an encryption method and algorithm that (supposedly) couldn't be broken for that amount of time.

Using that rule of thumb, you could use relatively weak encryption for plans for a military battle that would happen within a week. Encrypting satellite communications was trickier, and a lot chancier. A satellite has a long lifespan, and meanwhile, down on Earth, exponential advances in computing power make it ever easier to break whatever algorithm was chosen. Complicating things, a satellite's hardware can't be replaced (unlike, say, the gear for naval communications). You had to use encryption well beyond what was considered state of the art. But the NSA never told itself that the advanced encryption had solved the problem and guaranteed that the communications would be secure. It knew that even state-of-the-art encryption would inevitably be broken. You just had to hope that, by the time it was broken, nobody would care about the underlying data.
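
To make that rule of thumb concrete, here is a minimal sketch in Python. Everything in it is illustrative: the year thresholds, the key sizes and the scenarios are made-up stand-ins, not the NSA's actual formula or real cryptographic guidance. The only point is that the strength you need follows from how long the data has to stay secret.

```python
# Illustrative only: the year thresholds and key sizes below are invented to
# show the idea of matching encryption strength to a secrecy lifetime. They
# are not real cryptographic guidance or anyone's actual policy.

def suggest_symmetric_key_bits(secrecy_years: float) -> int:
    """Pick a hypothetical symmetric key size for a given secrecy lifetime."""
    if secrecy_years <= 1:       # e.g., plans for next week's battle
        return 128
    if secrecy_years <= 15:      # e.g., ordinary business records
        return 192
    return 256                   # e.g., a satellite whose hardware can't be swapped out

for label, years in [("battle plan", 0.02), ("contract archive", 10.0), ("satellite uplink", 30.0)]:
    print(f"{label}: secret for ~{years} years -> {suggest_symmetric_key_bits(years)}-bit key")
```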

We would all be better off if we stripped the scales from our eyes and admitted that digital privacy is an illusion. Then we could turn our attention to the value of the data involved and do what we think is appropriate for each situation, while never forgetting that nothing is completely safe forever.

Sure, it would be nice if nobody could read our communications without our say-so, but that isn't the case, and it never has been. And for the most part, such impossible security isn't actually necessary. Yes, it is creepy to think that people can read your private communications, but it's likely that very little of it really needs to be kept completely secret. Maybe none of it. You aren't the NSA. As for your company's communications, those also vary in their need to be protected.

Just remember: All encryption can be broken, and the more widely the encryption algorithm is used, the more likely it is to be broken or otherwise compromised. If the NSA isn't the culprit, it could be the Chinese, the Russians, the French, the Koreans, the Japanese, organized crime or computer enthusiasts with too much time on their hands.

No measure of security will ever be perfect. (Repeat this as a mantra as you try to shed the delusion of privacy.) It's not just that encryption can be broken. Your phone could be stolen, someone could shoulder-surf you as you enter or read data, or your device could get infected with malware designed to steal data. Any one of those things makes encryption useless.

I got started on this topic because of something involving iPhones, but the hardware isn't the problem here; no alternative can be considered secure either. BlackBerry's messaging system is possibly the most secure commonly used texting system, but even it was compromised when several governments around the world required the company to help them crack its encryption.

Security is about risk management. Start with the assumption that your communications can be compromised. The questions then become, "How likely is it that someone will attempt to do so?", "How will they do it?" and, most importantly, "Does it matter if they are successful?" Once you have answered those questions, you can decide how much effort to put into protecting your communications.
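
Here is one way that assessment might look if you wrote it down. This is a minimal sketch, assuming made-up 1-to-5 ratings for likelihood and impact and arbitrary thresholds; real risk management is messier, but the shape of the decision is the same: the effort you spend follows from the answers to those questions, not from anyone's promise of perfect security.

```python
# A hypothetical scoring of the questions above. The 1-5 scales and the
# thresholds are invented for illustration, not a standard methodology.

def protection_effort(likelihood: int, impact: int) -> str:
    """likelihood and impact are rough 1-5 ratings; return a suggested level of effort."""
    risk = likelihood * impact
    if risk >= 16:
        return "high: strong encryption, hardened devices, strict handling rules"
    if risk >= 8:
        return "moderate: encrypt in transit, basic device hygiene"
    return "low: default protections are probably enough"

print(protection_effort(likelihood=2, impact=1))  # lunch-plan gossip -> low
print(protection_effort(likelihood=3, impact=5))  # unannounced merger details -> high
```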

I don't think highly of any vendor that would say it provides a level of security that it doesn't. But anyone who believes a promise of perfect security is a fool.

Ira Winkler is president of Internet Security Advisors Group and author of the book Spies Among Us. He can be contacted through his Web site, irawinkler.com.
