
Apple and Google harden their smartphones against hackers and governments

As techniques used by criminals and law enforcement become known, the two leading smartphone makers are adapting to protect user data.

Google and Apple regularly make changes to Android and iOS to improve the integrity of the hardware running those operating systems, making it less likely that an unauthorized party could gain access to data stored on them. Two changes, one in beta and one in a shipping device, up the ante for criminals, companies, and governments that have found ways to bypass those protections or might try to force their way past them.

Google stops trusting itself

Apple and Google both use secure components within their devices to store critical data in a manner that prevents extraction and deters physical tampering. For Apple, that’s all modern iOS devices; for Google, it’s currently only the Pixel 2 models, though Android P will allow other device makers to build this in. The secure module stores elements like credit-card numbers for payment and the characteristics derived from fingerprints that are used to validate access to a device. Apple calls its module the Secure Enclave, while Google doesn’t have a capitalized term for it.
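To make that design concrete, the sketch below shows how such a module’s interface might look in principle; it is an illustrative Python model with hypothetical names, not the Secure Enclave’s or the Pixel 2 module’s actual API. The defining property is that secrets can be enrolled and used, but never read back out of the module:

```python
import hashlib
import hmac


class SecureModule:
    """Toy model of a secure element: write-only secrets, query-only use."""

    def __init__(self) -> None:
        self._secrets: dict[str, bytes] = {}  # never exposed outside the module

    def enroll(self, name: str, secret: bytes) -> None:
        # Store a secret (e.g. a payment key or fingerprint-derived template).
        self._secrets[name] = secret

    def matches(self, name: str, candidate: bytes) -> bool:
        # Answer only yes/no, in constant time; the secret itself never leaves.
        return hmac.compare_digest(self._secrets.get(name, b""), candidate)

    def sign(self, name: str, payload: bytes) -> bytes:
        # Authorize an operation (e.g. a payment) without revealing the key.
        return hmac.new(self._secrets[name], payload, hashlib.sha256).digest()
```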

With the Pixel 2, Google recently added a measure to protect users against a significant potential threat that could lead to the theft of critical cryptographic data that Google keeps under extremely tight security. Google, like Apple and other OS and hardware makers, has cryptographic signing keys that it uses to provide a layer of validation around software and firmware updates for its devices. There’s no effective way to forge a valid signature without possession of those keys. But if someone were to obtain the keys, an unauthorized entity could create software and firmware that a device would accept as valid. Such updates could suborn the hardware and cause the device to send data to other parties or let them gain access to stored information that should otherwise be unavailable.
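To illustrate that signing model in the abstract, here is a minimal sketch of how a device might verify an update against a vendor public key baked in at manufacture; the key bytes (an RFC 8032 test vector) and the function name are placeholders, not Google’s or Apple’s actual update path:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

# Placeholder public key for illustration; a real device ships with the
# vendor's key, whose private half stays under the vendor's control.
VENDOR_PUBLIC_KEY = bytes.fromhex(
    "d75a980182b10ab7d54bfed3c964073a0ee172f3daa62325af021a68f707511a"
)


def firmware_is_authentic(image: bytes, signature: bytes) -> bool:
    """Accept a firmware image only if the vendor's signature verifies."""
    public_key = Ed25519PublicKey.from_public_bytes(VENDOR_PUBLIC_KEY)
    try:
        public_key.verify(signature, image)
        return True
    except InvalidSignature:
        return False
```

The weak point sits entirely in who holds the private key: forging a signature is infeasible, but a stolen or coerced key makes any image look authentic to a check like this.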

Google Pixel 2 (Image: Google)

This would prevent a scenario such as the one during the 2015 San Bernardino shooting investigation, when the FBI demanded that Apple create a special version of iOS that the agency could install on a locked iPhone recovered from one of the killers. The special version of iOS would let police bypass protections and delays in guessing passwords. (The FBI ultimately withdrew its request, claiming it had found another way in.) It’s unclear from Apple’s security guide and public statements whether it has removed that capability since then.

Google stated the concern in a blog post describing its new mitigation by noting that the few employees who have the ability to access the keys could be “open to attack by coercion or social engineering.” The post doesn’t mention government involvement.

Obtaining the signing keys would be many orders of magnitude more significant than having a single phone or set of phones unlocked: it would expose all Android users, including those engaged in no criminal or suspect activity, to examination and risk.

With the Pixel 2, the hardware security module the phone relies on to validate a user’s password cannot have its firmware upgraded unless the user correctly enters that password, even when the firmware update is properly signed. Previously, Google trusted that it was the only party that could present such an update; now it no longer even trusts itself.
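A minimal sketch of that policy change, using hypothetical names and placeholder stubs rather than Google’s real firmware interface, amounts to one extra condition in the update path:

```python
def vendor_signature_valid(image: bytes, signature: bytes) -> bool:
    raise NotImplementedError  # the signature check sketched earlier


def user_passcode_valid(attempt: str) -> bool:
    raise NotImplementedError  # checked inside the security module itself


def flash_firmware(image: bytes) -> None:
    raise NotImplementedError  # writes the new image to the module


def apply_module_update(image: bytes, signature: bytes, passcode: str) -> bool:
    if not vendor_signature_valid(image, signature):
        return False  # reject anything the vendor did not sign
    if not user_passcode_valid(passcode):
        return False  # the new step: even a vendor-signed update is refused
                      # unless the user consents by entering the password
    flash_firmware(image)
    return True
```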

No more “Gray” area with USB port locking

On Apple’s side, the company has pushed out a feature that would deter the use of USB-based phone cracking devices, like the GrayKey. As previously reported, the firm Grayshift makes this device available to authorized law-enforcement agencies, with no warrant required, to crack iPhones with relatively short PINs. It relies on an unknown approach that bypasses Apple’s typical lockout for excessive password retries.

GrayKey iPhone unlocker (Image: Malwarebytes)

Motherboard reports that starting in the iOS 11.3 betas, Apple included an option in the Touch ID/Face ID & Passcode settings called USB Restricted Mode. The option, when enabled, requires a phone to be unlocked before it will exchange data with a USB peripheral over the Lightning port once a set delay has passed. Motherboard notes this delay was a week in earlier versions and has been reduced to an hour in the latest. The feature hasn’t yet found its way into 11.3 or 11.4, but it is present in the 11.4.1 beta and the iOS 12 beta, where it’s turned on by default.

This feature would mean that for any party to use a hardware-based cracking device, they would have to plug the phone or tablet into the PIN-cracking hardware within an hour of its last being unlocked. After that, the Lightning port’s data connection would be disabled until the iOS device was unlocked again.
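The reported behavior comes down to a simple time check. The sketch below uses assumed names and the one-hour window from the latest betas; it is not Apple’s code, just the rule as described:

```python
import time

USB_LOCKOUT_SECONDS = 60 * 60  # one hour in the most recent betas


def usb_data_allowed(last_unlock_time: float) -> bool:
    """Permit USB data only if the device was unlocked within the last hour."""
    return time.time() - last_unlock_time < USB_LOCKOUT_SECONDS
```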

Will Strafach, CEO of Sudo Security Group and a onetime developer of iPhone jailbreaks, believes this change “has merit.” He notes this kind of data restriction mitigates a number of threats, including ones that might target developers who have enabled additional services. “A vulnerable system service would not be accessible by any adversary with physical access to the device,” Strafach says.

Protecting most users with integrity measures

It’s plausible that Google and Apple will borrow from each other, as they do with so many features, and add these options to their own operating systems over time. Google’s pursuit of a hardware security module seems overdue, but the Android ecosystem doesn’t let the company mandate much among its partners.

While the USB exploit for iOS is only reported to be in the hands of a company that sells access for what it regards as legitimate governmental purposes, its mere existence means that criminals could develop the same kind of hardware. Likewise, Google’s change to how signed firmware updates are accepted has bigger implications for the billions of people whom no government would ever target and who aren’t involved in criminal enterprises.

These measures may seem designed to thwart governments, whether their agents act with legitimate cause and adhere to a national constitution or other rules, or not. And the measures certainly will thwart them. But based on both companies’ previous actions and statements, that effect is incidental rather than the goal. With over three billion active devices worldwide across both platforms, any security hole that could affect users on a mass scale must be patched.