Quantum Computers and PQ: The Cryptographer’s Perspective
In this episode, world-renowned quantum academic Dr. Michele Mosca, deputy director of the Institute for Quantum Computing and founder and CEO of evolutionQ, and Mike Ounsworth, Software Security Architect at Entrust, will be gazing into the quantum computing crystal ball. They'll be discussing timescales, the NIST PQC competition, and post-quantum digital certificates, and then cogitating on what organizations should be doing to prepare for and use PQC algorithms.
Transcript
Samantha Mabey: Welcome to Entrust Engage, an open forum for the most innovative leaders in security technology. I'm Samantha Mabey, and I'm your host. Today, we're still looking at quantum computing and post-quantum. We'll be discussing quantum computers themselves, post-quantum cryptography, and then some takeaways on what organizations should be doing to prepare. I'm pleased to introduce the two guests who are joining me in this conversation today. First off, we have Dr. Michele Mosca, who is a mathematician at the University of Waterloo, co-founder of the Institute for Quantum Computing, and founder and CEO of evolutionQ.
And also joining today is my colleague Mike Ounsworth, software security architect at Entrust. To get started, I figure we should probably talk about quantum computers, which are essentially the catalyst of all of this. Michele, I believe your work involves building and researching quantum computers. I'd love to get a bit of an overview from you on some of the progress you're making there and some of the findings you've come across.
Dr. Michele Mosca: Great. Quantum computers are about harnessing the quantum framework for physics, which is a very profound thing to harness. It's almost like 100 years ago or more, when we understood electromagnetism, right? When you harness that, you get ICT and all the wonderful technologies that have driven the economic and social developments of the last 100 years. We're harnessing something actually even more fundamental than electromagnetism. What are the impacts going to be? Even harder to predict than it was, 100 years ago, to predict wireless communication and the many other impacts that understanding electromagnetism would imply.
But we're getting there now. A hundred years ago, we started to understand that there was this new framework, which was exponentially richer. In the last few decades, we've become able to build technologies. What are they good for? We didn't know 20 years ago. We started to get a taste of what they're good for in the mid '90s, and that's roughly when I joined the field, when Peter Shor showed there's this exponential advantage for solving certain problems like discrete logarithms, and an almost exponential advantage for factoring large numbers. There's potentially this exponential power lurking within quantum physics.
If we could build technology that fully harnesses it, at least for one problem and maybe more, there's this exponential reduction in the amount of resources you need to solve it. It's not a flat, faster clock speed; you need exponentially fewer steps. It takes about as many steps to break RSA as it does to use it. You can't just go to larger keys, for example. And I got into it back then trying to understand what else it can do. In the last 20 years, the field of quantum algorithmics, at the top of the stack, has matured tremendously. And now there are companies out there trying to commercialize this computing power. Much of it is still to emerge, but we have first prototypes. We're getting a better understanding, still early days, of which useful applications quantum computers might be good for.
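(To make the scaling Michele describes concrete, here's a rough back-of-the-envelope sketch in Python. The number field sieve heuristic is the standard textbook formula, and the O(n^3) gate count for Shor's algorithm is a simplification; neither is a real resource estimate.)

```python
import math

def gnfs_ops(n_bits: int) -> float:
    """Heuristic cost of the general number field sieve, the best known
    classical factoring attack: exp((64/9)^(1/3) (ln N)^(1/3) (ln ln N)^(2/3))."""
    ln_n = n_bits * math.log(2)  # ln N for an n_bits-bit modulus N
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))

def shor_ops(n_bits: int) -> float:
    """Very rough O(n^3) gate-count scaling for Shor's algorithm,
    constants omitted; real resource estimates are far more involved."""
    return n_bits ** 3

for bits in (1024, 2048, 4096):
    print(f"{bits}-bit RSA: classical ~2^{math.log2(gnfs_ops(bits)):.0f} ops, "
          f"Shor ~2^{math.log2(shor_ops(bits)):.0f} quantum ops")
```

Doubling the key size roughly cubes the quantum work but barely dents the exponential gap, which is why "just go to larger keys" doesn't save RSA.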
One of the main focuses of my work is: what are the tools you need to compile those algorithms to run on available hardware, or eventually available hardware, and do resource estimation, to help understand, is this a million-quantum-bit problem or a 1,000-quantum-bit problem, and how long will it take to solve? So I'm working a lot on that, as are many other people around the world. And there are tens of billions of dollars being spent on building the computing platforms. Something that's really been evolving in the last five years especially is more system-level research, development, and commercialization. There's a whole commercial ecosystem now that is bridging the hardware to end users, even in these early days. We now have an integrated stack, and we're really just continually enhancing it. So it's a very exciting time for the field.
Samantha Mabey: Yeah, it would be, for sure. So you're mentioning, again, the building and evolution of quantum computers. And of course, one of the things we're looking at is the quantum computer that's going to break cryptography as we know it. We've addressed that threat timeline, and there's a general consensus out there that it's roughly a decade away. You're famous, too, for the prediction that quantum computers would break modern RSA with a one-in-seven chance by 2026 and a 50/50 chance by 2031. I'm not going to ask you to make a new prediction live here or anything. But you said it's difficult to predict things like this, the advancement of the technology. Why is it so difficult, and what sort of research advancements are you looking for when you make a prediction like this?
Dr. Michele Mosca: Yeah, great question. Prediction is hard, especially about the future. I was watching a Jamie Dimon interview recently, and he was saying, "I don't know how the war in Ukraine is going to play out. It could get better, it could get worse, it could stay the same. I have to be ready for all three of those possibilities." And similarly, the name of the game isn't to guess a year and put all your chips on that number; that would be foolish. We need to plan, prioritize, and properly resource a plan for all those potential outcomes. So the right question is: what is the likelihood? Does Mike want to jump in?
Mike Ounsworth: Yeah, I was just going to make the point: we're cryptographers. When you ask how secure this data is, an answer of one over two to the 128 is a comfortable probability that my data's going to get broken. When you say one over two within 10 years, as a cryptographer, that's a terrifyingly large probability.
Dr. Michele Mosca: So, I don't think we should assume 10 years. I think there's a small chance it's less than five years in our recent report. I have my own predictions, and I'll explain a bit about how I come up with them, but I don't want to just rely on my views. We've interviewed 45 of the world's thought leaders, the people whose opinions I care about, who understand very viscerally and deeply all the different challenges that I'm aware of. I take those all into account in making my predictions. But I went to the sources and asked them point blank: if you had to estimate the likelihood of breaking RSA 2048, what is it? What about 10 years? What about 15 years? And we gave them a range, right? And if you take the low end of the ranges and average them, you get about 2%.
And if you take the high end of the ranges and average them, you get about 9%. You can post-process that data differently if you want. If you want to discard the most optimistic half, go for it, do whatever you like. But we're talking about a few percent. And like Mike said, if you're responsible for critical infrastructure that is protecting not just information but OT and safety and lives, that's an unacceptable risk. You already need a plan now for that in less than five years. And by 10 years, if you look at those numbers, it's already into the low 20s and even low 30s, percentage-wise, within 10 years. So clearly, you need a plan to be ready in 10 years. And if it ends up being 15, great, but you certainly can't say, "Oh, I'm going to let it ride, I'm going to go with the other 75%."
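(A minimal sketch of the post-processing Michele describes, with made-up numbers standing in for the 45 survey responses; the real data is in evolutionQ's quantum threat timeline report.)

```python
# Hypothetical stand-ins for the surveyed (low, high) likelihood ranges of
# breaking RSA-2048 within 10 years; the real responses are in the report.
ranges = [(0.01, 0.05), (0.02, 0.10), (0.05, 0.25), (0.01, 0.02), (0.03, 0.08)]

low_avg = sum(lo for lo, _ in ranges) / len(ranges)
high_avg = sum(hi for _, hi in ranges) / len(ranges)
print(f"low-end average: {low_avg:.1%}, high-end average: {high_avg:.1%}")
```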
Because with something that critical, you have to be ready for all the different scenarios. Now, of course, you can track and pivot a bit depending on how things unfold, but we don't have a time machine where you can go back and make up the lost time. So our threat timeline report gives you a risk profile. You have to come in with your risk tolerance, your risk appetite, and decide what year you're going to plan for now. Might it shift later, in either direction? Yes, of course. But again, you can't make up lost time. Nothing really bad is going to happen if you're a bit early. Bad things could happen if you're late.
Again, what goes into predictions? For me, it's really, and this is an oversimplification, but we kind of know roughly how to do it. One of the most important milestones is integrating all the pieces. Actually, we have almost all the pieces working. The hard part is getting all the pieces working at the same time in the same system, and not blowing up, basically. Because when you try to do all these things in the same system at the same time, they all start interfering with each other. So, how close are we to really implementing this fault-tolerant error correction that we all, in theory, know how to do? We've implemented more and more of the pieces, and more and more of the pieces at the same time, and we're getting closer. This is, again, simplified, but roughly speaking: when do we reach the point where they're all working together and the main focus is on scaling it? And scaling it with a justified belief that it's going to scale just fine.
When is that going to happen? What rate of scaling do I anticipate? And lastly, how big a computer will I need? Because we know what's sufficient today, but many of us, including my team, are working on reducing the number of quantum bits you need to break RSA. So the requirements keep going down, the availability keeps going up; when do those two curves cross over? I look at different scenarios for both of these things, and there are exponentially many paths in a sense, and I compute probabilities for them and add them up, and that gives me my estimates. And my estimates land roughly in the middle third of what you get if you go and interview those 45 thought leaders.
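(A toy illustration of those two curves crossing. Every number here is an assumption chosen for demonstration, not a forecast.)

```python
# All numbers are assumptions chosen to show the shape of the argument.
available = 1_000        # quantum resources available today (arbitrary units)
required = 20_000_000    # resources needed to break RSA-2048 (arbitrary units)
growth = 1.5             # assumed yearly growth in availability
reduction = 0.9          # assumed yearly reduction in requirements

year = 2024
while available < required:
    available *= growth      # hardware keeps scaling up
    required *= reduction    # algorithms and compilation keep needing less
    year += 1
print(f"under these assumptions, the curves cross around {year}")
```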
Samantha Mabey: Sure. Okay. You have something to add, Mike?
Mike Ounsworth: Yeah, you mentioned that the engineers and physicists are working on building it, on the robustness, the fault tolerance, and the scale, and the algorithmists are working on better algorithms that use the hardware more efficiently, with fewer resources, and those curves cross. We don't hear a lot about the algorithmic advances. They don't seem to get as much press. We all know about Shor's and Grover's. Are there other fundamental algorithms being developed, or on the horizon?
Dr. Michele Mosca: So definitely, there are. Are they relevant to factoring? Not yet. Mostly there have been tweaks on Shor's algorithm. There's one thing, actually, that I did with Dan Bernstein and Jean-François Biasse, in which we showed you can actually boost the number field sieve with a quantum computer with asymptotically only n to the two-thirds qubits. Sorry to get into the... But you need on the order of n qubits, like roughly 2n qubits, to factor an n-bit number. We showed that with only n to the two-thirds qubits, which is a lot less than n qubits, but it's big O of n to the two-thirds, which is hiding a lot of complexity in practice, you can actually speed up the number field sieve, right? So asymptotically, anyway, you can get a huge speedup over the number field sieve way before Shor's algorithm can be implemented. It doesn't keep me up at night, because I think the big O will not make it practical for n equals 2000.
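(The arithmetic behind those two qubit counts, with the big-O constants dropped, for n = 2048:)

```python
n = 2048                           # modulus size in bits (RSA-2048)

shor_qubits = 2 * n                # roughly 2n logical qubits for textbook Shor
boosted_nfs = round(n ** (2 / 3))  # O(n^(2/3)) qubits, big-O constants hidden

print(f"Shor, ~2n:               {shor_qubits} logical qubits")
print(f"boosted sieve, ~n^(2/3): {boosted_nfs} logical qubits (up to big-O factors)")
```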
But it does say: let's not assume. In other words, it's a really good question you're asking. There's no fundamental reason why you need 4,000 logical qubits to break RSA. That said, we've optimized it; it's not glamorous research. There are ways of optimizing the implementation of Shor's algorithm, and some are algorithmic improvements, but a lot of it is under the hood of how you compile it to run on a fault-tolerant system. And there are further optimizations: are there better error-correcting codes that need fewer physical qubits? We've been constantly tracking this, year after year: what are all the improvements in compilation and fault tolerance? That helps reduce the overall resource count for implementing Shor's algorithm.
None of it is glamorous, and that's why you don't see it in Nature or on the news. But those of us tracking it are seeing, for the most part, a gradual decline, with occasionally a little drop, like a step function downward. But we're obviously aware that there could be a non-trivial step downward with a really modest advance in algorithmics. I'm actively working on that kind of thing myself, as are others. But this is high-risk stuff; you wouldn't give it to a master's student or a PhD student, because it's lowish probability of success and high impact. But we have to be ready for that too.
Samantha Mabey: Very cool. So one of the things I want to move the conversation to is post-quantum cryptography. Again, knowing that quantum computers are going to break or sunset cryptography as we know it, like RSA, I'm looking at what NIST is doing; as far as I understand, everyone's looking to NIST to set the standards. I'm wondering, and maybe Mike, you can jump in here as well, could you provide a quick overview of the NIST competition and what exactly they're doing there?
Mike Ounsworth: I can take a first shot at this. So I'm a software engineer; I'm building systems that use cryptography. My day job doesn't involve building the cryptography itself, but using it. For example, PKI is an infrastructure that uses a lot of digital signatures to build trust networks. We need those mathematical primitives, the digital signatures, to be strong and robust and available so that we can build PKIs around them. The NIST competition is looking at defining the next generation of mathematical primitives. We've had RSA and elliptic curve since the '70s and '80s; they're mature, they're robust, they're everywhere. What are the next 30 or 40 years of mathematical cryptography going to look like? From my perspective, that's the role NIST is filling, and it's largely a collection of mathematicians and academics proposing these really low-level structures. And then there's a whole halo of engineers like me who are watching the NIST competition with bated breath.
How do we use it? How is it different? How does it compare to RSA and elliptic curve? Is it bigger, is it slower? Can you use it in the same places? Do we have to adjust our software to accommodate it in different ways? How much do we trust it? Are we expecting CVEs and vulnerabilities and buffer overflows? How new is it and how tested is it? How do we make it more robust and fast and integrable and interoperable? That's not directly part of the NIST competition; I think NIST has intentionally not waded into those spaces. But there is this whole community of people surrounding the competition trying to figure out how to use it.
Samantha Mabey: Okay. That leads into one of the questions that I had: when NIST is done, and I guess at that point it's not going to be NIST anymore, it's probably going to be a bit like the Wild West, where we see, I don't know if it's going to be other standards bodies, vendors, organizations; everybody's going to try to move really quickly after that. So one issue I see is: who's going to be reining it in? Is it going to be governments or non-government standards bodies? Just thinking about all the software vendors being on the same page, how do we ensure that what organizations are doing or implementing will work together, or be compatible?
Dr. Michele Mosca: Yeah, good point. This has a long tradition of an open process, because people have to trust the selection, so they're trying to be as transparent as they can in engaging the brightest cryptographers around the world. But then you're not done, and these standards feed into other standards. There are layers of standards: the X9 people in the banking community are already anticipating this and preparing. ETSI has been anticipating and preparing all the standards and pre-standards work around the algorithmic choice. Because there's so much more beyond the algorithm; you don't declare a new algorithm and magically everything's fixed. It takes years and years and years.
We've been doing a lot of proselytizing and outreach for a decade or more now. So there are many pockets of the ecosystem that have been anticipating this. They've been using open-source platforms like Open Quantum Safe to do a lot of their testing and prototyping. The team at Entrust has been doing really great work anticipating and engaging with the community to look at how we're going to do certificates and so on in practice. A lot of this work matters because, again, you don't get to hit the pause button and say, "Oh, great, now wait, stop building a quantum computer while we figure out how to use the NIST standards." So there's been a lot of preparatory work. That said, there's still a lot more to do. So who's coordinating it all?
There is, of course, no overall coordination. There never has been. But there are pockets of leadership around the world. Obviously, NIST has been doing great work. ETSI has. There's a lot of corporate leadership, like Entrust and others. The White House has been issuing a series of directives, and they're pretty nice; they're light touch, kind of sending a signal without being too heavy-handed and prescriptive. But it sends a signal that makes it clear you need to get ready. If you're not ready already, you need to be soon. And they're giving people enough time to get ready, because what we don't want is for this to get managed as a crisis. Then we'll make things worse than they already are. So it is a bit of chaos we're trying to manage. But that's how it's been all along, in a sense, because the quantum threat has been peculiar in that we knew about it in 1994.
Usually you don't hear about zero-days 30 years before they happen. You usually find out after you've been hacked or someone else has been hacked. So we have actually had more years to prepare than ever before. Yes, we procrastinated most of that time, but not all of it. So on the positive side, I'm hopeful; while we're still unprepared in many ways, we're probably more prepared than we've ever been for this sort of massive crypto migration. We started the ETSI-IQC workshop in 2013, and NIST was part of the founding team of that. They were part of the first program committee; they came to France for the first meeting.
So we've been bringing the entire ecosystem together as people joined it. Up until the pandemic, we physically met every year to discuss what we'd achieved in the last 12 months and what we need to do in the next 12 months. We've been trying, as much as possible, to coordinate the ecosystem and build a community of people who often didn't know each other in advance but know they have to work together to eventually effect this migration. So there is informal coordination, I would say, and more and more standards bodies are engaging and referencing each other's work.
Samantha Mabey: Yeah, ecosystem is very much the word I figured you might bring up, because digital security today is very much about the ecosystem, and why would that change moving forward? That makes a lot of sense, for sure. So, moving on a little bit, again to post-quantum cryptography. Mike, perhaps you could lay a foundation for our listeners, just so we're clear on exactly what digital certificates are and why quantum computers are a threat to them.
Mike Ounsworth: Yeah, that's a question I can do. Digital certificates: most people encounter them in their browser, that little green padlock in the top bar that tells you which website you're talking to, with cryptographic strength. There are entities on the internet called certification authorities, and they certify that you are the website you claim to be. You're connecting to entrust.com, and a certification authority has said, yes, this is the legitimate entrust.com, and that trickles down into displaying a nice green padlock in your browser. That all rests on top of a type of cryptography called a digital signature. The certification authority has signed that attestation statement.
When your browser and the website talk to each other, the website is going to use that certificate to sign your session, to authenticate that the session is from the legitimate server, and we call it a trust hierarchy. You can roll that trust relationship all the way back up to the certification authority and see that, yes, everything checks out; the cryptography is valid at every step. And that is, as we've mentioned, all based on a type of cryptography, RSA or elliptic curve, that is known to be susceptible to Shor's algorithm. So this is one of those zero-days 30 years in the making that Michele referenced. We will need to migrate it. How does it migrate? What are the implications of migrating it?
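(For readers who want to see what "rolling the trust relationship back up" looks like in code, here's a minimal sketch using Python's `cryptography` package. It checks raw signatures only, assumes RSA-signed certificates, and skips everything real path validation requires, such as expiry, name chaining, and revocation.)

```python
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import padding

def verify_link(child: x509.Certificate, issuer: x509.Certificate) -> None:
    """One link of the hierarchy: the issuer's public key must validate the
    signature over the child certificate's to-be-signed bytes."""
    issuer.public_key().verify(
        child.signature,
        child.tbs_certificate_bytes,
        padding.PKCS1v15(),               # assumes RSA-signed certificates
        child.signature_hash_algorithm,
    )

# Walking a chain leaf -> intermediate -> root; any raised exception is a
# broken link. (Certificates would be loaded elsewhere, e.g. from PEM files.)
# chain = [leaf, intermediate, root]
# for child, issuer in zip(chain, chain[1:]):
#     verify_link(child, issuer)
# verify_link(chain[-1], chain[-1])       # a root certificate is self-signed
```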
Samantha Mabey: So with that, I'd love to talk about some of the work you're doing on composite and hybrid certificates: what they are, and why they're considered quantum safe.
Mike Ounsworth: I guess that really gets at the question of: what do we trust? We know we don't fully trust the existing stuff, the RSAs and elliptic curves. We know that for now it's fine, and at some point it's not going to be fine. When is it not going to be fine? We don't know. Okay, what about the new stuff? All the stuff in the NIST competition: there's SPHINCS+, XMSS, Dilithium, Falcon, Rainbow. The community got a bit of a shock earlier this year; we're in round three of the NIST competition, and one of the finalists for the signature schemes, Rainbow, got critically broken. That's pretty shocking; we're late in the cycle for a break that big to come out.
So what else is lurking below the surface that hasn't been discovered yet? From my perspective as a practitioner, do we have that two-to-the-128 guarantee that if we pick the NIST finalists, our data will still be secure in 2050? Given that Rainbow revelation, maybe not. Maybe the other algorithms have similar discoveries waiting to be found. So we don't trust the current stuff, and we're not sure we trust the new stuff either. Which, in some ways, makes this post-quantum migration unique. I don't think we've ever had a crypto migration like this, where we're not sure we trust either the old or the new. So how do you bridge? The answer, I think, is actually something NIST outlined right from the beginning: what they're calling hybrid and dual modes.
Take the old stuff, take the new stuff, layer them together. Layer them together in such a way that an attacker would have to break both. Or, if you're using more than two algorithms, break three, break four. They have to break all of them at the same time to break the data security. It's not necessarily a silver bullet; you can imagine everything failing, but the chance that everything fails all at once, that there are zero-days against everything in a very short time, is starting to be a safely low probability. So any sort of hybrid system would at least give you time to migrate off in the event of a break. On that front, Entrust has really been championing this. And I choose my language carefully here because, as Michele said, we're trying to lead the community.
We're trying to really strongly build consensus. We're working through the Internet Engineering Task Force, the IETF standards body, which is a very open and public standards body. We are the primary author, but really, we see ourselves more as an editor. We're trying to collect all the community feedback, from all the other vendors in this space, our competitors, the other software vendors we have to interoperate with. And how do we design a certificate that takes this hybrid concept, this composite concept, and layers multiple signatures together? So you get a PKI, a trust hierarchy, that rests on two or more algorithms in this layered, secure fashion.
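(A sketch of the AND logic at the heart of the composite idea. Since standardized post-quantum libraries are still settling, two classical algorithms, ECDSA and Ed25519, stand in for the classical and post-quantum components here; the point is that every layer must verify.)

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, ed25519

def composite_verify(message: bytes, keys_and_sigs) -> bool:
    """A composite signature is valid only if EVERY component signature
    verifies, so an attacker has to break all of the algorithms at once."""
    try:
        for public_key, signature in keys_and_sigs:
            if isinstance(public_key, ec.EllipticCurvePublicKey):
                public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
            else:
                public_key.verify(signature, message)   # Ed25519
        return True
    except InvalidSignature:
        return False

ec_key = ec.generate_private_key(ec.SECP256R1())
ed_key = ed25519.Ed25519PrivateKey.generate()
msg = b"to-be-signed certificate bytes"
layers = [
    (ec_key.public_key(), ec_key.sign(msg, ec.ECDSA(hashes.SHA256()))),
    (ed_key.public_key(), ed_key.sign(msg)),
]
print(composite_verify(msg, layers))   # True only if every layer checks out
```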
Samantha Mabey: That's fascinating. That makes a lot of sense. And like you said, it does feel late for a break. But it's a constantly moving space, and I don't want to get all gloom and doom, but I do have a question. When we're talking about quantum computers, we've mentioned Shor's algorithm and Grover's algorithm. What's the likelihood that there are other mathematical algorithms or search algorithms still to be discovered that can harness the power of a quantum computer, adding further challenges to classical crypto as we know it? Is the quantum computer we're planning for the quantum computer we're going to get?
Dr. Michele Mosca: Well, we'll definitely get what we know, and there are almost certainly going to be new, powerful quantum algorithms. And also, remember, these post-quantum algorithms can also be attacked with classical computers. So overall, the question is: what's the likelihood that new algorithms will break the post-quantum schemes? Rainbow was probably the one, if we had to bet, that we were most nervous about. And of course, there are different flavors of breaking. One option is: oh, just go to a bigger key, which is not a great consolation prize for the person who just lost all their confidential data, or had their power plant blown up, or their car driven off a bridge.
There are different types of breaking, and some are: I need a month of the world's most powerful supercomputers to get one key. Others are: I need 10 seconds, or a millisecond, or whatever. So there are different degrees of how bad the break is. I think there's at least a 10% chance that at least one of the main NIST algorithms will be broken in less than a decade. That's 10%, but how broken? Well, it'll be a spectrum. If you break up that 10%, for some of it, it'll be: you need a supercomputer plus a quantum computer running for a month. For others it'll be: no, actually, you could do it on your laptop. The example Mike mentioned, that was on a laptop; that was an easy break. That one could be patched by going to a bigger key, for now.
So what do you do about this, right? Because we're not going to get it down to two to the minus 128. I think it's 10%, and you could argue, "Oh, it's 20," or "it's 5." I've heard people argue both. Well, one thing you can do is the kind of dual and hybrid stuff Mike was mentioning: put in a few extra layers. It won't bring the probability down to zero, because you could have a Shor-like event that really breaks more than one scheme. The other thing is to really take this 30-plus-year head start we got and design things. Because if we had to manage this as a crisis, you could forget about hybrid, forget about agility; it'd be a panic rush, and we'd slap on some band-aids and then say, "Don't move it, don't touch it, because it'll break."
But now we've had time, so we can design things to be more agile and have some of these dual and hybrid type approaches that help mitigate the risk. So at least if something is broken, we can update it relatively quickly, versus some catastrophe where we're like, "Oh, great, we're 10 years away from fixing it. Okay, everybody, stop using the internet for 10 years while we fix this." That's obviously not an option. And there are other things we can do. I'm a big believer in post-quantum crypto; that's why I got into quantum algorithms 26 years ago, and I think it's going to be a beautiful first layer of defense, and it doesn't need quantum technology. Obviously, we complain that it's not as efficient as ECC, but at least it just uses classical tech. I think Mike wants to jump in.
Mike Ounsworth: A couple of points, to keep perspective here. You and I are sitting here as cryptographers, really concerned about the security implications of this migration. There's also a lot to be said about the migratability implications. How quickly can you move? Saying the technology exists is one thing, but how easy is it to replace hardware? How easy is it to patch software? Think of things like ATMs: they're out in the world, possibly in places that are hard to send a service tech to. You might not have over-the-air patching capabilities. You may have to wait until that physical hardware is replaced to be able to make it speak post-quantum. And so that's the other aspect, I think, to all this hybrid, dual, composite stuff: also trying to design for heterogeneous environments, where some of your components speak post-quantum and some of your components don't. You get the extra security benefit where you can, but everything still works in the meantime.
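(One way to picture that heterogeneous coexistence: a toy negotiation where each side advertises what it supports and the strongest mutual option wins. The algorithm names here are hypothetical placeholders, not real protocol identifiers.)

```python
def negotiate(client_algs: list[str], server_algs: list[str]) -> str:
    """Pick the strongest mutually supported mode: hybrid when both sides
    can, with a classical fallback so legacy endpoints keep working."""
    preference = ["hybrid-ecdsa-mldsa", "mldsa", "ecdsa"]  # hypothetical names
    for alg in preference:
        if alg in client_algs and alg in server_algs:
            return alg
    raise ValueError("no common algorithm")

# A modern client talking to a legacy ATM falls back to classical crypto:
print(negotiate(["hybrid-ecdsa-mldsa", "ecdsa"], ["ecdsa"]))  # -> "ecdsa"
# Two upgraded endpoints get the layered protection:
print(negotiate(["hybrid-ecdsa-mldsa", "ecdsa"],
                ["hybrid-ecdsa-mldsa", "ecdsa"]))             # -> hybrid
```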
Dr. Michele Mosca: That's a great point, because building a quantum computer is hard, and people say, "This is over 10 years away, maybe even 20." And I'm like, "I know. And we're at least 10 to 20 years away from migrating." I think it's largely been two distinct communities: the security world, which understands very well how non-trivial it is to change from one algorithm to another without messing up something important, and the quantum community, which understands how hard it is to build a quantum computer. I've been, by fluke, in both communities for over a quarter of a century. So I've been trying to translate and say, "Look, we really need to get the migration show on the road, because while there are still some important milestones to be achieved in building a cryptographically relevant quantum computer, we still have to do a lot of hard work to be secure against one."
But back to what I was saying: post-quantum crypto is a beautiful and essential first layer of defense, because you can really put it on any endpoint with modest computational performance, maybe not an RFID or something really low-performance, but for most endpoints today, it's beautiful. The risk is it could be broken. So for critical platforms, we do need a plan for that. And again, hybrid helps mitigate it. But we still have time, and we need to think about what we do to deal with the systemic risk of these new algorithms being broken. That's where more and more people are looking at things like quantum cryptography, which can't be mathematically cryptanalyzed. How can we add that as an additional layer of defense for some of the more critical platforms?
Samantha Mabey: Fantastic. So we've gotten into it a little bit, but one of the things that you stress, Michele, is that the time to prepare is now. That's something we very much echo with our customers in everything we say around this space. So I'd just love to get some takeaways from you; I understand you've worked with banks and governments on helping them prepare. For our listeners, what are some of the things they should be thinking about or doing to create that sort of readiness plan or migration plan?
Dr. Michele Mosca: Yeah, we've come a long way in the last 25 years. Twenty-five years ago, the question I'd get was, "Can I put this off till retirement?" But if you start analyzing the threat timeline, the shelf life of your information, and the migration timeline, you realize you'd better have already started planning. Part of the challenge has been that, 10 years ago, even 5 years ago, if you logically realized you needed to do something, you'd quickly realize you didn't know where to start. There were no reliable open-source platforms for testing things. Few others in the ecosystem were engaging, and there's only so much you can do on your own. That was usually enough inertia to not do anything. But that has changed. The main players around the world are working on this. So one thing you can do is search for the Canadian Forum for Digital Infrastructure Resilience quantum-safe best practices.
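(The threat-timeline analysis Michele opens this answer with is often summarized as his well-known inequality: if the shelf life of your data plus your migration time exceeds the time until a cryptographically relevant quantum computer arrives, you're already late. A sketch, with illustrative numbers:)

```python
def must_start_now(shelf_life_yrs: float, migration_yrs: float,
                   collapse_yrs: float) -> bool:
    """Mosca's inequality: if x + y > z, data encrypted today will still
    need protection after a cryptographically relevant quantum computer."""
    return shelf_life_yrs + migration_yrs > collapse_yrs

# Illustrative numbers only:
print(must_start_now(shelf_life_yrs=7, migration_yrs=5, collapse_yrs=10))
# True -> you should already be migrating
```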
In those best practices, we compiled guidance from all around the world and battle-tested it with our banking friends, the Bank of Canada and other big Canadian banks. You can see that we broke it up into six stages to get quantum ready, in the sense of secure against quantum attacks. The first three phases are really preparation and assessment. The last three phases are mitigation, migration, and validation. And you don't need to do this on your own. There's so much you can share. In most cases, don't worry, you're not going to be first anymore. In some sectors you might be first, but there are so many others that will share best practices. So you kind of know what you need to do, which is a great starting point, because otherwise you're like, "What do I do, and how do I know I'm doing the right thing? How do I get resources to do this?"
That road has sort of been laid out now. You can basically go through the stages of understanding: doing your risk and vulnerability assessment, then engaging your vendor ecosystem. Because once you do your assessment, you realize, I really need to have this system secured by 2030. You work backwards, and you realize there's no way you can do that, or it's going to be really hard, but you're going to try your best without panicking and making things worse. And you start to map out your dependencies, and fortunately, most of that is being taken care of, so you can focus on the parts that are not. Which standards does your industry need? Which regulators do you have to anticipate here and steer in a direction, so that it'll be smart regulation and not heavy-handed, prescriptive regulation that makes things worse?
So you can zero in on the bit that really does require proactive action. You can start doing your proofs of concept, your pilots; you can start understanding your supply chain dependencies. Know your vendors: roughly speaking, if you're using cloud, apps, and some in-house tools, talk to your cloud providers to make sure things are on track there, talk to your app providers to make sure things are on track there, and do a little more work to fix your own in-house systems. You have to do a bit of training to make sure the right people on your own team are ready to do what they need to do. There's a bit of coordination that needs to be done. So there's still work to be done, but there are a lot of best practices being shared out there.
So you can do it with the minimal necessary work, but you do have to do the work. And the sooner you start, the easier it can be. The real objective for us here is: let's make this part of technology life-cycle management and really make it a non-event. There is no crisis, right? By the time we have cryptographically relevant quantum computers, it'll be like, "Oh, well, we've already essentially taken care of that." Or we flip a switch, we do that update we were preparing for, and it's really a non-event, and we can just focus on all the other things we really should be focusing on.
Samantha Mabey: Absolutely. It's going to take time, but we have time as long as organizations are thinking about it and talking about it and planning for it now. So, fantastic.
Dr. Michele Mosca: I think boards of directors, from the top all the way down, need to assure themselves. There have to be KPIs tied to this, so that it's managed by life-cycle management, not crisis management. And it has to be tracked at least twice a year to make sure that's happening.
Samantha Mabey: Wonderful. Well, I think that's a promising note to conclude on. Again, there's lots organizations can be doing to make this a non-event; I love the way you positioned that, if you prepare and look at this now. So with that, I think we'll call it there. I'd like to thank you both so much for your time, Mike and Michele. I really appreciate you taking the time out to have this conversation with me today.
Dr. Michele Mosca: Thank you.
Mike Ounsworth: Thanks. This was a fun chat.
Samantha Mabey: Yeah, absolutely. So that's it for today's podcast, and that concludes our six-episode series on the topic of quantum computing and post-quantum. Keep up with episodes by following us on LinkedIn and Twitter, using the links in the episode description. Thanks for listening to Entrust Engage.