
Five things you should know about iOS security

Security is an extra-hot topic these days, as all sorts of government agencies short on letters but long on budgets keep getting accused of spying on their own citizens, and debates rage on whether what look like accidental bugs may actually turn out to be quite intentional.

In the midst of all the ruckus, Apple has updated its iOS Security whitepaper, a longstanding document outlining the thought processes and technologies that go into keeping its mobile platform as secure as possible. Here are just a few of the most interesting tidbits from this latest revision.

More keys than a hardware store

Remember how, back in June of last year, Apple published a statement in which it claimed that iMessage data is "protected by end-to-end encryption"? Well, it seems the company really meant it.

Like many Internet services, iMessage depends on public-key cryptography to function. This technology uses a pair of very long numbers, derived from cryptographically secure random data, each of which can decrypt information encrypted with the other. When you activate messaging on one of your devices, iOS generates such a pair and sends one half--appropriately called the public key--to Apple, tucking the other half, the private key, securely away in the device's local storage.

Now for the fun part: When someone wants to send you a message, they ask Apple for all the public keys that belong to all your devices, and then use each to encrypt a separate copy of the message, sending the encrypted messages to the company, which, in turn, forwards them to the appropriate device using iOS's push notification system.

Crucially, each device's private key, without which data encrypted using the corresponding public key cannot be decrypted, is never transmitted across the Internet, which means that the company really cannot read your messages, just as it claims.
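For the technically curious, here is a minimal Swift sketch of that fan-out scheme, built on Apple's CryptoKit framework. Apple has not published iMessage's actual code, so the key types, derivation steps, and message format below are illustrative assumptions rather than the real protocol:

```swift
import CryptoKit
import Foundation

// Illustrative sketch only -- not Apple's actual iMessage implementation.

struct Device {
    let name: String
    // The private key is generated on-device and never transmitted.
    let privateKey = Curve25519.KeyAgreement.PrivateKey()
    var publicKey: Curve25519.KeyAgreement.PublicKey { privateKey.publicKey }
}

// Encrypt one copy of the message for a single recipient device
// (ECIES-style: an ephemeral key agreement feeds an AES-GCM key).
func encryptCopy(of message: String,
                 for recipient: Curve25519.KeyAgreement.PublicKey) throws -> Data {
    let ephemeral = Curve25519.KeyAgreement.PrivateKey()
    let shared = try ephemeral.sharedSecretFromKeyAgreement(with: recipient)
    let key = shared.hkdfDerivedSymmetricKey(using: SHA256.self,
                                             salt: Data(),
                                             sharedInfo: Data(),
                                             outputByteCount: 32)
    let sealed = try AES.GCM.seal(Data(message.utf8), using: key)
    // Prepend the ephemeral public key so the recipient can derive the same secret.
    return ephemeral.publicKey.rawRepresentation + sealed.combined!
}

// The sender fetches every public key from the directory and encrypts a
// separate copy for each device, just as described above.
let recipientDevices = [Device(name: "iPhone"), Device(name: "iPad")]
do {
    for device in recipientDevices {
        let blob = try encryptCopy(of: "Hello!", for: device.publicKey)
        print("Copy for \(device.name): \(blob.count) bytes")
    }
} catch {
    print("Encryption failed: \(error)")
}
```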

Note, however, that this system is not absolutely secure. In theory, since Apple controls the directory of public keys used by all the devices, it could surreptitiously add--or be forced to add--its own set of public keys (for which it knows the corresponding private keys) and thus gain access to every last bit of information that is exchanged on its messaging network. Therefore, it would perhaps be fair to say that iMessage is secure as long as you trust Apple--just like your mail is secure as long as you trust that the mail service won't photocopy it on the way to its recipient, or that your phone company isn't rerouting your calls to someone who just pretends to be your mom.

There's fruit in them cables

As long rumored by various outlets, Apple's Lightning cables really do include a special authentication chip that iOS can use to verify that they were produced by an authorized manufacturer.

According to the security whitepaper, the chip is actually manufactured by Apple and contains a special digital certificate known only to the company. Manufacturers that participate in the Made for iPhone Program receive the chips directly from the Cupertino giant and incorporate them into their products during production.

Interestingly, the authentication process is not limited to cables: Any authorized accessory that wants to communicate with iOS--including those that use Bluetooth and Wi-Fi connections--must incorporate the chip, or risk having the operating system refuse to play nicely with it.
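Apple doesn't document how the chip proves its identity, but certificate-based schemes like this typically reduce to a challenge-response exchange. The Swift sketch below is purely hypothetical--the keys, the key-pinning shortcut, and the flow are assumptions for illustration, not the actual Made for iPhone protocol:

```swift
import CryptoKit
import Foundation

// Hypothetical challenge-response sketch -- not Apple's actual MFi protocol.

// Provisioned into the accessory's chip at manufacture; never readable.
let accessoryKey = P256.Signing.PrivateKey()
// In a real scheme iOS would verify a certificate chain rooted at Apple;
// here we simply pin the expected public key for brevity.
let trustedKey = accessoryKey.publicKey

// 1. iOS issues a fresh random challenge.
let challenge = Data((0..<32).map { _ in UInt8.random(in: .min ... .max) })

do {
    // 2. The accessory signs the challenge with its embedded private key.
    let signature = try accessoryKey.signature(for: challenge)

    // 3. iOS verifies the signature before talking to the accessory.
    if trustedKey.isValidSignature(signature, for: challenge) {
        print("Accessory authenticated")
    } else {
        print("Unrecognized accessory -- refusing to play nicely")
    }
} catch {
    print("Signing failed: \(error)")
}
```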

This process undoubtedly allows Apple to keep a tight rein on its lucrative accessory licensing business, weeding out third-party manufacturers who produce sub-par hardware or refuse to pay the royalties required for the company's official blessing. But the chip also helps keep things secure: It reduces the likelihood that malicious actors will be able to inject malware into our phones and tablets just because we want to recharge them.

Siri knows everything, but she ain't talkin'

Siri, Apple's digital assistant, also gets some time in the security spotlight. The company has laid out the steps that it takes to balance the service's effectiveness with the privacy and security of its users.

As you can imagine, Siri is complex enough that much of its work takes place on Apple's own servers rather than on each individual device. This allows the company to offload the most taxing aspects of the assistant's functionality, such as turning audio into actionable text, and makes it possible for the service to be updated outside of iOS's traditional upgrade cycle.

Clearly, this means that your device must send Apple a fair bit of information in order for Siri to work--starting with a full recording of your voice, which is transmitted alongside your name and rough geographical location whenever you request your digital assistant's services.

In order to protect your privacy as much as possible, however, Apple uses a mechanism called progressive disclosure to limit the amount of information that reaches its server. For example, if you need to find a restaurant near you, Siri's servers may ask your iPhone to send them a more accurate location; if you want the service to read your email or SMS message, the remote portion of the system will simply delegate the task to the device itself, so that your private data never has to leave the confines of your handset or tablet.
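As a rough illustration, a progressive-disclosure decision might be modeled in Swift as below. The intent names and response types are invented for this example; they are not Apple's API:

```swift
// Conceptual sketch of progressive disclosure: the server sees only as much
// data as each request needs. All types and names here are illustrative.

enum SiriIntent {
    case findRestaurant
    case readLatestMessage
}

enum ServerResponse {
    case needsPreciseLocation          // server asks the device for more data
    case performLocally(task: String)  // server delegates; data stays on device
}

func handleOnServer(_ intent: SiriIntent, roughLocation: String) -> ServerResponse {
    switch intent {
    case .findRestaurant:
        // A rough location isn't enough to rank nearby restaurants.
        return .needsPreciseLocation
    case .readLatestMessage:
        // Message contents never leave the handset; the device does the reading.
        return .performLocally(task: "readLatestMessage")
    }
}

// Back on the device:
switch handleOnServer(.readLatestMessage, roughLocation: "San Francisco") {
case .needsPreciseLocation:
    print("Device sends a more accurate location -- and only now")
case .performLocally(let task):
    print("Device performs \(task) itself; private data stays local")
}
```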

Apple also outlines what it does with your data once it gets hold of it: Information like transcriptions and locations is discarded after ten minutes, while recordings are kept for up to two years--after six months, however, they are scrubbed of any data that could be used to identify their source. Presumably, the company uses them to help improve its voice-recognition software, particularly when it comes to non-standard words like proper names and music or movie titles.

More than a pretty chip

The A7 processor that resides inside every iPhone 5s contains all sorts of technological goodies. Among them is a special co-processor, dubbed the "secure enclave," that is designed to give iOS an extra-secure area of memory.

Each enclave is primed with a unique digital identifier when it is manufactured. Not even Apple knows this number, which means that whatever information is stored in the enclave cannot be pried out of it without your explicit permission--even if a sophisticated hacker were to physically steal your device.

The enclave also gets its own secure operating system, boots separately from the rest of the device, and uses a special technology to ensure that the software it is running was officially sanctioned by Apple. All communication with the enclave takes place in a securely encrypted area of memory, which is re-encrypted with a different key every time the device on which it resides boots.

All this paranoia is a good thing, because the enclave is used to store some of the most sensitive information that makes its way onto your device, such as the digital information required to unlock your iPhone with your fingerprints when you use Touch ID.
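To get a concrete feel for this model, Apple's CryptoKit framework lets developers create keys that live inside the Secure Enclave: The key is generated inside the enclave, and the app only ever holds an opaque handle, never the key material. A minimal sketch (it requires real hardware, since the iOS simulator has no enclave):

```swift
import CryptoKit
import Foundation

if SecureEnclave.isAvailable {
    do {
        // The private key is generated inside the enclave and is not extractable.
        let key = try SecureEnclave.P256.Signing.PrivateKey()
        let message = Data("unlock".utf8)
        // Signing happens inside the enclave; the private key never leaves it.
        let signature = try key.signature(for: message)
        // Only the public half is visible outside the enclave.
        let ok = key.publicKey.isValidSignature(signature, for: message)
        print("Signature valid: \(ok)")
    } catch {
        print("Enclave operation failed: \(error)")
    }
} else {
    print("No Secure Enclave on this device")
}
```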

Keychain sync could probably withstand a nuclear attack

It seems that Apple designed iCloud Keychain so that it would be able to withstand just about everything short of nuclear winter--perhaps explaining why it took so long for the feature to return after it was discarded during the transition from MobileMe to iCloud. According to the whitepaper, you should be able to securely sync and recover your keys even if you reset your iCloud password, if your account is hacked, or if the iCloud system itself is compromised, either by an external entity or by an Apple employee.

To accomplish this feat, Apple uses a complex web of asymmetric digital keys and elliptic-curve cryptography, coupled with manual controls (like activation codes that the user must type in on a device) to ensure that the company effectively never holds enough information to decrypt the contents of a keychain stored on its servers.
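Here is a deliberately simplified Swift sketch of the underlying idea--a membership list signed by every member's device key, so a server can store the list without being able to forge a new device into it. The structure and names are illustrative assumptions, not Apple's implementation:

```swift
import CryptoKit
import Foundation

// Simplified "circle of trust" sketch -- illustrative only.

struct CircleDevice {
    let identity = P256.Signing.PrivateKey()  // never leaves the device
}

let devices = [CircleDevice(), CircleDevice()]

// Canonical encoding of the membership: every member's public key.
let membership = devices
    .map { $0.identity.publicKey.rawRepresentation }
    .reduce(Data(), +)

do {
    // Each device endorses the current membership with its own signature.
    let endorsements = try devices.map { try $0.identity.signature(for: membership) }

    // The server (or any device) can verify the endorsements, but cannot
    // mint new ones without a member's private key.
    let valid = zip(devices, endorsements).allSatisfy {
        $0.identity.publicKey.isValidSignature($1, for: membership)
    }
    print("Circle intact: \(valid)")
} catch {
    print("Signing failed: \(error)")
}
```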

Interestingly, the engineers responsible for this feature have built a degree of selectivity into it: Only data that is specially marked can actually be sent to the cloud. iOS uses this mechanism to keep device-specific information, like VPN logins, out of the synchronization process, while other information, like website credentials and passwords, is allowed to go through.
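That "special marking" is visible in the public keychain API: An item becomes eligible for iCloud Keychain sync only when the kSecAttrSynchronizable attribute is set, as in this minimal Swift example (the service and account names are placeholders):

```swift
import Foundation
import Security

// A device-specific secret, like a VPN login, would simply omit the flag.
let attributes: [String: Any] = [
    kSecClass as String: kSecClassGenericPassword,
    kSecAttrService as String: "example.com",   // placeholder service
    kSecAttrAccount as String: "alice",         // placeholder account
    kSecValueData as String: Data("s3cret".utf8),
    kSecAttrSynchronizable as String: true,     // opt this item into syncing
]

let status = SecItemAdd(attributes as CFDictionary, nil)
print(status == errSecSuccess ? "Stored; will sync via iCloud Keychain"
                              : "Keychain error: \(status)")
```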

The anti-Google

All told, Apple's whitepaper paints the picture of a company that is--at least publicly--deeply committed to the security and privacy of its customers.

Of course, the actual veracity of Apple's claims depends to a large extent on the trust that its users place in the company, since we can't just waltz into its server facilities and ask--nay, demand--that we be shown the source code. Even though practically everything that flows through iMessage and iCloud is encrypted, there is still a possibility that the folks from Cupertino could tweak their services (or even the operating systems themselves) in a way that silently compromises our every email, our every call, and our every text message.

It seems to me, however, that privacy and security are more than just a point of pride or a marketing pitch for Apple: They are among the primary selling points of its products. As the company says in the first line of its whitepaper, "Apple designed the iOS platform with security at its core." All the complex encryption mechanisms, digital key exchanges, software sandboxing, and hardware protection boil down to a simple message: Invest your money in our wares, and we'll stay out of your life.

This stance places the company in sharp contrast to its rival Google's insistence on collecting, collating, and distilling as much information about its users as possible. Any revelation that anyone, from Apple's management to its engineers, isn't taking this message to heart, even by omission, would damage its business in ways that would be very hard to recover from.