The source code for Apple’s iOS ‘iBoot’ secure bootloader has been leaked to GitHub.
As its name suggests, iBoot is low-level bootloader code that runs every time an iOS device is turned on, before the operating system kernel is loaded.
Its purpose is to ensure that whatever loads before iOS is what it says it is and hasn’t been tampered with or compromised.
As Apple describes the importance of this integrity checking:
This is the first step in the chain of trust where each step ensures that the next is signed by Apple.
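The idea behind that chain of trust can be sketched in a few lines. The following is a minimal illustration, not Apple’s actual implementation: the stage names, images, and the use of bare hashes (rather than Apple-signed images) are all invented for the example. The point is simply that each stage checks the next before handing over control, so one tampered link halts the whole boot.

```python
import hashlib

# Hypothetical trusted digests that each boot stage holds for the next stage.
# On a real device, each stage verifies an Apple signature, not a bare hash.
TRUSTED_DIGESTS = {
    "iboot": hashlib.sha256(b"iboot-image-bytes").hexdigest(),
    "kernel": hashlib.sha256(b"kernel-image-bytes").hexdigest(),
}

def verify_next_stage(name: str, image: bytes) -> bool:
    """Return True only if the next stage's image matches its trusted digest."""
    return hashlib.sha256(image).hexdigest() == TRUSTED_DIGESTS[name]

def boot_chain(stages):
    """Walk the chain in order; refuse to continue at the first failed check."""
    for name, image in stages:
        if not verify_next_stage(name, image):
            raise RuntimeError(f"{name} failed verification -- halting boot")
        # (On real hardware, control would now transfer to this stage.)

# An untouched image verifies; a tampered one stops the chain cold.
boot_chain([("iboot", b"iboot-image-bytes")])
try:
    boot_chain([("kernel", b"tampered-kernel-bytes")])
except RuntimeError as err:
    print(err)
```

This is also why (as discussed below) an attacker who merely edits the leaked code gains nothing: any image that deviates from what the previous stage expects simply fails the check.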
Reportedly, the leaked code relates to iOS 9, so it’s unclear how much of the code will still be present in the latest image for iOS 11.
The assumption seems to be that low-level code, by its nature, doesn’t change very often, so the fact that the files date to 2015 (with a few from 2016) shouldn’t be reassuring.
Apple’s lawyers quickly intervened to have the code taken down under the Digital Millennium Copyright Act (DMCA). The following notice now lies in place of the source code:
Repository unavailable due to DMCA takedown.
This repository is currently disabled due to a DMCA takedown notice. We have disabled public access to the repository. The notice has been publicly posted.
The notice states the following “reason” for the takedown:
Reproduction of Apple’s “iBoot” source code, which is responsible for ensuring trusted boot operation of Apple’s iOS software. The “iBoot” source code is proprietary and it includes Apple’s copyright notice. It is not open-source.
What is the significance of the leak?
There are really two concerns.
Firstly, anyone who gets hold of the code can sift it for vulnerabilities, either to jailbreak Apple devices or, in the worst-case scenario, to undermine the security it is meant to guarantee.
They can’t modify the code itself to execute a compromise because anything that deviates from Apple’s boot image will simply stop iOS from booting. But by understanding its inner workings, someone might be able to find a way around some of the protections Apple deliberately doesn’t go into a lot of detail about.
That’s still a big “if”, because Apple’s trust design intentionally minimises the harm that can be caused by a compromise of any one element.
More likely, in the short term, it will give researchers an incentive to find weaknesses and report them to Apple in the hope of landing a bug bounty, which ranges up to $200,000 for firmware flaws. If that happens (and assuming Apple tells us about it), the first sign will be a rise in payouts.
Perhaps the concern should be how this code leaked into the public domain in the first place. Even if it turns out to be of more minor significance than some have claimed, that’s still another symbolic blow for a company that has dealt with quite a few security issues lately.
Source: Naked Security