“I read the news today, oh boy,” is the classic opening refrain from the Beatles song “A Day in the Life,” a melding of two disparate song ideas from Lennon and McCartney into a five-minute orchestral epic. In this case, the “oh boy” refers to the recent security news headline about a supply chain failure that saw a Secure Boot platform test key find its way into devices from a huge number of suppliers, onto people's laptops, and even onto GitHub. There is a lot to unpack here, so we'll break it down and highlight some of the security best practices that were overlooked, despite the word "secure" featuring prominently in the technology's name.
Secure Boot
First, a high-level overview of Secure Boot. This is a technology introduced circa 15 years ago, after an attack vector was identified whereby malware could be surreptitiously inserted into the BIOS of a computer. BIOS stands for Basic Input/Output System: in essence, low-level firmware on the motherboard that loads and runs the operating system each time a computer is powered up. From there, the malware could evade detection and essentially subvert and take over the entire computer.
Enter Secure Boot, a technology delivered via UEFI (the Unified Extensible Firmware Interface), which superseded BIOS. It uses the best-practice security principles of public key cryptography to digitally sign firmware, ensuring no code can be loaded or executed on the computer unless it is authentic software from the device manufacturer. The technology uses a cryptographic key called the Platform Key (PK). This key is the root of trust for the entire software stack running on the computer; the integrity of the computer is entirely dependent on it.
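The signing model can be sketched with a toy example. The following uses insecure textbook RSA with tiny numbers purely to illustrate the principle; real Secure Boot uses X.509 certificates, PKCS#7 signatures, and full-size keys, and the firmware string here is made up:

```python
# Toy illustration of the Secure Boot signing model, NOT real UEFI code:
# a "platform key" pair signs a firmware image's hash, and the verifier
# accepts the image only if the signature checks out against the public key.
import hashlib

# Tiny textbook RSA key (insecure sizes, demonstration only).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (kept secret)

def sign(firmware: bytes) -> int:
    """'Sign' the firmware hash with the private key."""
    h = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n
    return pow(h, d, n)

def verify(firmware: bytes, signature: int) -> bool:
    """Verify using only the public key (n, e), as the boot ROM would."""
    h = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n
    return pow(signature, e, n) == h

fw = b"vendor firmware v1.2"
sig = sign(fw)
assert verify(fw, sig)                        # authentic image boots
assert not verify(b"tampered firmware", sig)  # modified image is rejected
```

The crucial property is the asymmetry: the device only ever holds the public half, so anyone who obtains the private half can sign malware that the device will happily boot.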
In modern software development, nobody builds from scratch anymore; instead, there are a relatively small number of compilers, libraries, and other development tools that are used prolifically across the industry. It turns out that the firmware and Secure Boot sector is no exception, with a relatively small number of vendors who make development kits for producing signed UEFI boot images.
The Faux Pas
One of the three main vendors of UEFI development kits, American Megatrends International (AMI), has for years distributed cryptographic test keys that device manufacturers can use to experiment with firmware signing and Secure Boot. This by itself is not yet a faux pas – developers need to develop. The faux pas is that these keys, marked “Common Name = DO NOT TRUST – AMI Test PK,” somehow found their way into the Secure Boot trust stores of countless thousands of consumer devices from hundreds of device manufacturers. The issue turned catastrophic when security research company Binarly recently uncovered that the corresponding private key had been sitting publicly exposed for several years in a GitHub repository. Fairly astonishing for what most would agree is a high-value cryptographic key that should be handled with care. Apparently it was protected by a four-character passphrase – fairly trivial to crack with today’s technology, exposing the private key. Most shocking of all is how widely this key is used in commercial devices: “Binarly researchers said their scans of firmware images uncovered 215 device [models] that use the compromised key.”
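To see just how weak a four-character passphrase is, consider some back-of-the-envelope math (the guess rate is an assumption; modern GPU crackers are far faster), plus a tiny live brute-force demo against a hypothetical three-letter passphrase:

```python
import hashlib
import itertools
import string

# Assuming the full set of 95 printable ASCII characters, a four-character
# passphrase gives a keyspace small enough to exhaust almost instantly.
keyspace = 95 ** 4
print(keyspace)                 # 81,450,625 candidates

# Even at a very modest 1 million guesses/second, the whole space falls
# in under two minutes; GPU-based crackers are orders of magnitude faster.
seconds_to_exhaust = keyspace / 1_000_000

# Live demo: recover a (hypothetical) 3-letter lowercase passphrase by
# brute force against its SHA-256 hash.
target = hashlib.sha256(b"ami").hexdigest()
found = next(
    "".join(c)
    for c in itertools.product(string.ascii_lowercase, repeat=3)
    if hashlib.sha256("".join(c).encode()).hexdigest() == target
)
print(found)    # recovers "ami" in a fraction of a second
```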
Sadly, this story is a litany of security faux pas, but it’s not entirely clear who committed them. A good craftsman doesn’t blame his tools, so you would think the fault lies with the device manufacturers who chose to put the “DO NOT TRUST” key into their devices’ trust stores. On the other hand, when a mistake is this common, you do start to wonder if the tools could have been better designed. Let’s explore a few things that could have prevented this catastrophe.
“There’s nothing more permanent than a temporary solution”
The big tech industry is famous for the mantra “Move fast and break things”: as soon as a new feature successfully demos once, you ship it. Apparently Secure Boot – a feature that literally has the word “secure” in the title – is not immune to this philosophy. This speaks to firmware developers not fully understanding the implications of the development tools they are using.
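A simple release-time sanity check could have caught this particular mistake: scan the certificates destined for the device's trust store and flag anything whose subject screams "placeholder." The entry format and marker list below are hypothetical, but the AMI key literally announced itself in its Common Name:

```python
# Sketch of a pre-ship audit: flag trust-store certificate subjects that
# look like test or placeholder keys. Subject strings are illustrative.
SUSPICIOUS_MARKERS = ("DO NOT TRUST", "TEST", "SAMPLE", "EXAMPLE")

def audit_trust_store(subjects):
    """Return the subjects that should never ship in production firmware."""
    return [s for s in subjects
            if any(m in s.upper() for m in SUSPICIOUS_MARKERS)]

store = [
    "CN=VendorCo Platform Key 2024",
    "CN=DO NOT TRUST - AMI Test PK",
]
flagged = audit_trust_store(store)
print(flagged)   # ['CN=DO NOT TRUST - AMI Test PK']
```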
Protect Your Cryptographic Keys
Cryptographic keys underpin the security of an organization – and, in this case, the root of trust for probably tens of thousands of computers. Binarly calls this out in their research paper: “In theory, given its importance, the creation and the management of this master key should be done by the device vendors following best practices for cryptographic key management (for example, by using Hardware Security Modules).” HSMs are tamper-resistant devices designed from the ground up to protect cryptographic keys and ensure they never get exposed or transferred in the clear.
Good Key Hygiene
In addition to pointing out how an HSM could have been used to protect the cryptographic keys, Binarly highlighted a couple of other fundamental mistakes in this case. First, the same PK was used to supply Secure Boot devices to multiple vendors. At a minimum, to reduce the blast radius in the event of a potential compromise, why not create a fresh key per vendor? It is basic security that you should isolate your customers from each other so that one customer’s bad decisions don’t negatively affect another.
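Per-vendor isolation is cheap to implement. The sketch below assumes a hypothetical key-vault service in which each vendor lazily gets its own freshly generated test key, so a leak of one vendor's key compromises only that vendor's devices:

```python
# Sketch of per-vendor key isolation (hypothetical vault API). A fresh
# key is generated on each vendor's first request and never shared.
import secrets

class TestKeyVault:
    def __init__(self):
        self._keys = {}

    def key_for(self, vendor: str) -> bytes:
        # One key per vendor: generated on first use, stable thereafter.
        if vendor not in self._keys:
            self._keys[vendor] = secrets.token_bytes(32)
        return self._keys[vendor]

vault = TestKeyVault()
assert vault.key_for("VendorA") != vault.key_for("VendorB")  # isolated
assert vault.key_for("VendorA") == vault.key_for("VendorA")  # stable
```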
Since these test keys are supposed to be temporary, why not rotate that key every six months or so to discourage customers from relying on them in production, and to reduce the damage in the event of a compromise?
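A six-month rotation policy is equally simple to enforce in a signing service. The sketch below assumes a hypothetical record of when each key was issued; anything older than the rotation window is refused:

```python
# Sketch of a six-month rotation policy for test keys. The issuance
# record and window are illustrative, not from any real service.
from datetime import date, timedelta

ROTATION_WINDOW = timedelta(days=182)   # roughly six months

def is_expired(issued: date, today: date) -> bool:
    """A key older than the rotation window must not be used."""
    return today - issued > ROTATION_WINDOW

assert not is_expired(date(2024, 1, 1), date(2024, 5, 1))  # still valid
assert is_expired(date(2024, 1, 1), date(2024, 12, 1))     # rotate it
```

A vendor that hard-codes such a key into shipping firmware would find it failing within months, long before the devices reach customers.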
All of the above implies that AMI should either have given customers the tools to generate their own test keys and refrained from distributing the same test keys to everyone, or else run a fully managed service for test keys, essentially treating them like a production service. Either don’t handle key material at all or do it properly; the middle ground is dangerous.
Test and Production Keys – Never the Twain Shall Meet
Test keys are exactly what they say they are: for use in test environments, by software engineers working on solutions prior to their being thoroughly tested and productized. They should never be hard-coded or stray into production environments. While HSMs are not always required for development or test environments, using them can still be prudent if you are not confident in the demarcation between your development and production systems. Equally, and especially, in operational environments, production keys should be generated in HSMs and their usage and access restricted to the specific environment. Don't fall into the trap of thinking that your trial environments, integration environments, test environments – whatever you call them, and especially if shared with customers and external partners – can afford to be at a lower security level than your production environment. To build on the adage "There's nothing more permanent than a temporary solution," we can say "Test environments have a nasty habit of becoming production environments."
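One way to enforce that demarcation is to tag every key with its environment and have the signing service refuse any cross-environment use. The service and key names below are hypothetical:

```python
# Sketch of environment demarcation: each key carries an environment tag,
# and the (hypothetical) signing service rejects a test key outside test.
from dataclasses import dataclass

@dataclass(frozen=True)
class SigningKey:
    key_id: str
    environment: str    # "test" or "production"

def sign_request(key: SigningKey, target_env: str) -> str:
    # Refuse any key whose tag doesn't match the target environment.
    if key.environment != target_env:
        raise PermissionError(
            f"key {key.key_id} ({key.environment}) cannot sign for {target_env}"
        )
    return f"signed-by-{key.key_id}"

test_pk = SigningKey("AMI-TEST-PK", "test")
assert sign_request(test_pk, "test") == "signed-by-AMI-TEST-PK"
try:
    sign_request(test_pk, "production")   # the scenario that caused this mess
    raise AssertionError("test key must not sign production firmware")
except PermissionError:
    pass
```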
How Can Entrust Help?
Entrust has been leading the charge in the design and production of hardware security modules (HSMs) for the past 25+ years. Our latest family of HSMs, nShield 5, is certified to the latest security standards, including FIPS 140-3 validation and Common Criteria EAL4+. It integrates with a wide range of business applications highlighted in the illustration below. If your company is using applications in any of these use cases, the cryptographic keys they consume should be generated and protected inside an HSM.
The Entrust CodeSafe solution can be used to execute any type of application code within the tamper-resistant boundary of an nShield HSM, providing an additional layer of security and safeguarding the code from internal and external threats.
Entrust Professional Services has delivered numerous production HSM deployments leveraging the CodeSafe solution and nShield HSMs for key injection/anti-counterfeiting solutions and numerous other use cases.
Entrust KeyControl, our enterprise key management solution available on premises or as a service, can seamlessly integrate with nShield HSMs. With a distributed vault-based architecture, it can be used for the management and visibility of cryptographic assets such as keys and secrets. With its unified dashboard, cryptographic assets can be inventoried and visualized, and their risk and compliance monitored. Keys created as test keys can be automatically restricted solely for use in test environments and cannot be used in production environments.
Entrust PKI, either on premises or as a service, manages the full lifecycle of digital certificate-based identities, enabling encryption, digital signature, and certificate authentication across a broad range of use cases.
Hopefully this blog post has given you an understanding of the Secure Boot construct and where poor security behavior in the development and test environment can have detrimental impacts on an industry-wide scale. “Oh boy” is how we reacted to reading the news about this incident. Hopefully lessons will be learned. To reference another Beatles song we all recognize, security is a “Long and Winding Road.”