Earlier this month I attended the 7th International Cryptographic Module Conference (ICMC) in Vancouver, Canada. There aren’t many conferences that have our main product in their title! I worked out from the registrant list that there were over 300 attendees from more than 120 organisations across 26 countries. That’s quite a large community interested in commercial cryptography and standardisation. There were nine main tracks, including end-user experience, open-source cryptography, embedded encryption and industry vertical application, global validation, quantum-ready crypto, cryptographic technology, random bit generators, and a crypto enterprise showcase.

What do cattle and turtles have to do with security?

Recurring industry themes at the conference included securing smart-city landscapes, IoT security standardisation, automotive security, migration to cloud services, use of open-source cryptography and post-quantum cryptography.

There was an interesting discussion on assessing random number sources for cloud-based systems. Some of the approaches being discussed felt retrofitted, relying on static, point-in-time measurements of a system. Entropy measurement for a system normally entails characterising the hardware and identifying the physical components that produce the entropy. This is challenging in a cloud environment, where users are encouraged to trust the underlying fabric and build their applications on top of it. Users don’t actually get to lay their hands on the physical systems underneath; instead, they think in terms of compute, storage and networking capability. Closely guarded servers, once treated as pets, have become anonymous, interchangeable commodities: cattle.
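To make the ‘characterisation’ idea a little more concrete, here is a minimal, illustrative Python sketch of the most-common-value min-entropy estimator from NIST SP 800-90B, one of the static statistics an assessor might compute over raw noise-source samples. The read_noise_source call is a hypothetical placeholder, and a real assessment involves far more than this single test.

```python
import math
from collections import Counter

def mcv_min_entropy(samples):
    """Most-common-value min-entropy estimate, in the spirit of
    NIST SP 800-90B (simplified for illustration only)."""
    n = len(samples)
    # Observed proportion of the most frequently occurring sample value
    p_hat = Counter(samples).most_common(1)[0][1] / n
    # 99% upper confidence bound on that proportion
    p_u = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    # Min-entropy per sample, in bits
    return -math.log2(p_u)

# Example usage with a hypothetical noise source (placeholder, not a real API):
# raw = read_noise_source(1_000_000)
# print(f"Estimated min-entropy per sample: {mcv_min_entropy(raw):.3f} bits")
```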

It seems to me that as time moves on, so must our thinking and approach to standards-based security assurance. In an illuminating talk, Alan Halachmi from AWS presented a proposal for FIPS validation in the cloud. Alan asked us to better understand the benefits of cloud environments and make use of them in our assurance endeavours. Continuously verifying correctness of behaviour would certainly be worth considering, and is perhaps more powerful than a one-off or slow periodic measurement and an assurance certificate that has already aged by the time organisations deploy a system. In a world where we are quickly becoming ranchers, wouldn’t it be great if we were continually verifying that systems were operating correctly and could detect anomalous behaviour?
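As a flavour of what ‘continually verifying’ could look like for a random number source, the sketch below implements something in the spirit of the repetition count health test from NIST SP 800-90B, which runs on every sample rather than as a one-off measurement. The h_min value and the sample_stream input are assumptions for illustration.

```python
import math

def repetition_count_cutoff(h_min, alpha_exp=20):
    """Cutoff for a repetition-count style test: C = 1 + ceil(alpha_exp / h_min),
    giving a false-alarm probability of roughly 2**-alpha_exp per window."""
    return 1 + math.ceil(alpha_exp / h_min)

def monitor(sample_stream, h_min=7.5):
    """Continuously watch a noise-source stream and flag stuck outputs.
    h_min is the previously assessed min-entropy per sample (assumed here)."""
    cutoff = repetition_count_cutoff(h_min)
    last, run = None, 0
    for sample in sample_stream:
        run = run + 1 if sample == last else 1
        last = sample
        if run >= cutoff:
            raise RuntimeError(
                f"Noise source failure suspected: value repeated {run} times")
```

The appeal of this style of check is that it costs almost nothing per sample, yet it raises an alarm the moment a noise source gets stuck, rather than waiting for the next certification cycle.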

In her keynote speech, Karen Reinhardt from Entrust Datacard reminded us that cryptography is the fundamental ‘nuts and bolts’ of any digital system, from embedded IoT devices to massively scalable computing platforms in the cloud. Karen gave a graphic account of how we need to be prepared for a future where quantum attacks are feasible in a ‘break one, break all’ scenario, especially against systems that rely on asymmetric cryptography. She emphasised that the question with the quantum threat is not ‘if’ but ‘when’.

It’s exciting to think that we at Entrust Security, with our range of nShield hardware security modules, literally underpin security for digital systems, including the PKI systems that prevail so widely across a multitude of digital ecosystems and use cases such as those Karen and Alan discussed. HSMs can be considered the bottom turtle of the security world: they store the root of trust, ‘turtle 0’. There is definitely more work to be done to evolve how systems can be secured now and in the future, and it’s nice to be in a place where our technology lays the foundation.

In an earlier blog, the Entrust Security Office talked about the security enhancements introduced in FIPS 140-3, and in follow-up blogs we shall discuss other security aspects, including confidential computing and containerisation, PKI and post-quantum cryptography.

For more information on how Entrust helps businesses protect critical data, check out our product page.