Ken van Wyk: Is Apple getting serious about security?

There have been a couple of glimmers of hope lately, but the company has a long way to go

There was a disturbance in the force this past week. Did you notice? Apple publicly made a positive security move. Seriously. (Stop laughing!)

The company invited some security folks to take a preview look at its upcoming Lion version of OS X. Could it really be that Apple has woken up and smelled the security coffee? Say it's so. It sure would make a lot of security folks happy.

And now, combine this positive step forward with the fact that Apple reportedly just hired a senior executive who will be in charge of security across the corporation, and it could be the beginning of something long needed from One Infinite Loop.

Let's hope so. After all, Apple sure has a long way to go.

Most anyone who has read my columns or my Web site knows that I am quite fond of many of Apple's products. I run my small business entirely on Apple gear, and I firmly believe my company is better off for it.

Recently, though, I've spent a considerable amount of time taking a pretty deep dive into iOS security, and some of my realizations have shaken my faith in Apple to its core.

I've always felt that if Apple really decided to take security seriously, it would do a brilliant job, since it's brilliant at so many other things. It's just that it hasn't seemed to take security seriously -- at least until now (perhaps).

To illustrate, let's explore a couple of Apple's blunders a bit.

All current iOS devices have a 256-bit hardware AES encryption module that is used to encrypt sensitive user data. Every iOS device contains a unique 256-bit key. Sounds bulletproof, doesn't it? Well, here's the rub. The key is protected by the device's PIN, and the PIN can be disabled in a matter of seconds using forensic or jailbreaking software. So, user data on a lost or stolen iPad can be trivially obtained by the new "owner" of the device.

This also sounds good: Apple provides iOS developers with an easy-to-use API for protecting sensitive files. Basically, just tell the operating system that a file requires protection, and the operating system uses its AES-256 encryption to protect that file from unauthorized disclosure. But since that encryption is only PIN-protected, Apple's security measures sound promising but come up short.
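To make that concrete, here is roughly what the file protection API looks like in use. (The sketch is written in today's Swift for readability; at the time you'd make the equivalent Objective-C call, but the mechanism is the same one I'm describing.)

    import Foundation

    // Ask iOS to store a file under its "complete" Data Protection class.
    // The OS encrypts the file with its AES-256 hardware engine, but the
    // class key is wrapped with material derived from the device passcode,
    // so the protection is only as strong as that PIN.
    func saveProtected(_ data: Data, to url: URL) throws {
        try data.write(to: url, options: [.atomic, .completeFileProtection])
    }

One option flag is all it takes, which is exactly why the feature sounds so reassuring.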

And here's another one. By default, when a user presses the home button while running an app, the operating system stores a JPG image of the user's screen, complete with any sensitive information that may be on the screen at the time. That JPG file isn't even encrypted, and it can be copied off an iOS device in seconds via a free tool like iPhone Explorer and a USB cable, even if the device is PIN-locked.
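Developers can at least defend against that particular gotcha by covering the screen before the snapshot is taken. Here's a minimal sketch of the idea, assuming a plain UIKit app delegate and, again, today's Swift syntax:

    import UIKit

    class AppDelegate: UIResponder, UIApplicationDelegate {
        var window: UIWindow?
        private let coverView = UIView()

        // Called just before iOS captures the screen image it caches
        // when the app leaves the foreground.
        func applicationWillResignActive(_ application: UIApplication) {
            coverView.backgroundColor = .white
            coverView.frame = window?.bounds ?? .zero
            window?.addSubview(coverView)   // blank out any sensitive fields
        }

        func applicationDidBecomeActive(_ application: UIApplication) {
            coverView.removeFromSuperview()
        }
    }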

Even the keychain mechanism that applications can use to store users' log-in credentials and other small pieces of sensitive data has been shown to be easy to break in most cases.
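If you use the keychain anyway, you should at least store items under the most restrictive accessibility class available. It doesn't cure the underlying weakness, but it raises the attacker's bar. A sketch of the call, once more in modern Swift (the SecItem functions and attribute constants are the real keychain API; the little helper wrapped around them is just an illustration):

    import Foundation
    import Security

    // Store a credential as a generic-password item that is readable only
    // while this device is unlocked and never migrates to another device.
    func storeCredential(_ secret: String, account: String) -> OSStatus {
        let item: [String: Any] = [
            kSecClass as String:          kSecClassGenericPassword,
            kSecAttrAccount as String:    account,
            kSecValueData as String:      Data(secret.utf8),
            kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlockedThisDeviceOnly
        ]
        // Remove any stale copy of the item, then add the new one.
        SecItemDelete([kSecClass as String: kSecClassGenericPassword,
                       kSecAttrAccount as String: account] as CFDictionary)
        return SecItemAdd(item as CFDictionary, nil)
    }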

The bottom line is that a software developer who relies solely on these Apple-provided security mechanisms in an iOS application is exposing users' sensitive data to compromise. It is up to the application developer to layer additional, non-Apple protections on top of them to keep sensitive data safe.
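What that means in practice is encrypting sensitive data at the application layer, with a key derived from a secret the user actually supplies rather than from the device PIN. The particular crypto library matters less than where the key comes from. Here's a deliberately simplified sketch using Apple's present-day CryptoKit framework; a production version would use a slow, password-based KDF such as PBKDF2 with a high iteration count instead of the single HKDF step shown here:

    import Foundation
    import CryptoKit

    // Derive an AES-256 key from a user-supplied passphrase and a stored salt.
    // Simplification: HKDF keeps the sketch short; a low-entropy passphrase
    // really needs a slow KDF such as PBKDF2 or scrypt.
    func key(fromPassphrase passphrase: String, salt: Data) -> SymmetricKey {
        HKDF<SHA256>.deriveKey(
            inputKeyMaterial: SymmetricKey(data: Data(passphrase.utf8)),
            salt: salt,
            info: Data("app-level data encryption".utf8),
            outputByteCount: 32)
    }

    // AES-GCM gives confidentiality plus tamper detection.
    func seal(_ plaintext: Data, with key: SymmetricKey) throws -> Data {
        try AES.GCM.seal(plaintext, using: key).combined!   // default nonce, so combined is non-nil
    }

    func open(_ ciphertext: Data, with key: SymmetricKey) throws -> Data {
        try AES.GCM.open(AES.GCM.SealedBox(combined: ciphertext), using: key)
    }

Data protected this way stays opaque even to someone who bypasses the PIN, because the key simply isn't sitting on the device.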

In fact, in order to write an iOS app that is even reasonably secure, iOS developers have to spend a significant amount of time learning where all the mines are in the minefield they're about to traverse. Then they have to go out of their way to write code that specifically turns off some of these "features," like the screen shots. All of that effort is spent when they should be focusing on positive, prescriptive actions to protect their users' sensitive information.

No, this is not an encouraging outlook at all. Especially since Apple's own documentation and product descriptions tout the security of the AES-256 hardware encryption, file protections, etc. I've even found no shortage of Web sites, forums and Apple documentation that recommend using these security features to protect user data. On the surface, these things all sound secure, but the reality is that they're not.

That is why, after learning of all of these blatant security fails, I knew it would be tough to continue to be an Apple fan as well as a security guy. The two notions seem incongruent.

And then I saw a few outward glimmers of hope. Hiring an executive who will be in charge of security is one good step. Engaging with the community of people who have repeatedly discovered horrific weaknesses in its products is another.

I'm not going to get my hopes up too much, though -- particularly fresh on the heels of dealing with the iOS security nightmares I've described here. It's still far too soon to tell. Let's see how things progress over the next year or so.

And in the meantime, let's continue to make double sure that we're protecting our own and our customers' information properly, even when the tools we have for the job aren't fully up to the task.

With more than 20 years in the information security field, Kenneth van Wyk has worked at Carnegie Mellon University's CERT/CC, the U.S. Department of Defense, Para-Protect and others. He has published two books on information security and is working on a third. He is the president and principal consultant at KRvW Associates LLC in Alexandria, Va.

