Analysis Summary
Performed authenticity
The deliberate construction of "realness" — confessional tone, casual filming, strategic vulnerability — designed to lower your guard. When someone appears unpolished and honest, you evaluate their claims less critically. The spontaneity is rehearsed.
Goffman's dramaturgy (1959); Audrezet et al. (2020) on performed authenticity
Worth Noting
Positive elements
- This video provides a clear, high-level explanation of the technical differences between functional code and FIPS-certified cryptographic modules in the context of PQC.
Be Aware
Cautionary elements
- The video uses 'Harvest Now, Decrypt Later' as a marketing hook to create immediate commercial urgency around a threat that is still technologically nascent.
Influence Dimensions
About this analysis
Knowing about these techniques makes them visible, not powerless. The ones that work best on you are the ones that match beliefs you already hold.
This analysis is a tool for your own thinking — what you do with it is up to you.
Transcript
Quantum computing may sound like science fiction, but it is moving closer and closer to reality, and there is a cybersecurity threat happening right now that most organizations don't even know exists. Adversaries are waiting for quantum computers to break today's encryption, which they will. Bad actors have started harvesting encrypted data that they cannot decrypt right now, but because storage is cheap, they're storing it and planning to decrypt it the moment quantum computing becomes powerful enough. It's dangerous, and it's called harvest now, decrypt later. And if your organization's sensitive data has a shelf life longer than 5 to 10 years, you're already at risk. So how do you protect data that is being stolen today but won't be cracked until tomorrow? That's exactly what we are going to find out in this episode of Secure by Design with Jeremy Allison, Distinguished Engineer at CIQ and co-creator of the Samba project that we all use. Jeremy, it's great to have you on the show. >> Oh, it's great to be here. Thank you very much for inviting me. >> Jeremy, let's start with the basics. Let's start with what's driving all this urgency around post-quantum cryptography. What exactly is PQC, or post-quantum cryptography? And why are we talking about it now, when practical quantum computers might still be years away, still in the realm of science fiction? >> Quantum computing, I mean, it's essentially magic. [laughter] My own, very unofficial, description is that it uses computing power in parallel worlds to solve problems in ours. I'm not a quantum cryptographer, I'm an engineer, so I make the code that protects against it work. But essentially, the fear is that quantum computers will be able to crack our current key exchange algorithms, which are mostly based on elliptic curves and RSA. The mathematics behind a lot of those comes down to the factorization of large integers.
Now, it turns out that one of the earliest algorithms written for quantum computers was Shor's algorithm, which focuses on factoring large integers, and that is basically the problem RSA relies on to protect all of our banking, all of our medical records. So the idea is to use post-quantum algorithms, algorithms standardized by NIST, the US National Institute of Standards and Technology, to protect against attacks where somebody collects data now as it flows over the internet. They can't decrypt it, they can't do anything with it, but storage is cheap, right? Everybody's got 20-terabyte disk drives. You keep storing this stuff, and then later on, when quantum computers come online, the fear is that you can decode all of it and read the plain text of everything people were communicating at the time. And the fear is this could happen sooner than expected. I mean, look at AI: it's happened faster than anyone expected. So the goal is that by 2030 we should have moved to post-quantum algorithms, algorithms created by mathematicians (I don't fully understand them, got to be honest) that will protect against quantum-computer decryption. Now, the main protocol everyone is concerned about is TLS, Transport Layer Security. That's basically the web protocol; it's what we're using right now to communicate in our browsers. It's what Amazon uses when you're doing S3 or cloud computing, and what Google and Microsoft use. They all use TLS to protect data in transit, and so the algorithms that have been created are actually about protecting the key exchange.
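The factoring connection just described can be made concrete with a toy example. This is a classroom-sized sketch, not anything resembling real RSA parameters: recovering the private key from the public modulus is exactly the integer-factorization problem that Shor's algorithm solves efficiently on a quantum computer.

```python
# Toy RSA with tiny primes, purely illustrative: factoring the public
# modulus n immediately yields the private key.
p, q = 61, 53              # secret primes (real RSA uses primes ~1024 bits)
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

m = 42                     # a message
c = pow(m, e, n)           # encrypt with the public key
assert pow(c, d, n) == m   # decrypt with the private key

# An attacker who factors n can rebuild d and read every message:
factor = next(x for x in range(2, n) if n % x == 0)
d2 = pow(e, -1, (factor - 1) * (n // factor - 1))
assert pow(c, d2, n) == m  # factoring n is equivalent to breaking this key
```

For tiny numbers the trial-division loop is instant; for real 2048-bit moduli it is infeasible classically, which is precisely the assumption Shor's algorithm removes.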
The bulk data encryption, usually AES, is mostly considered safe from quantum computers. But the key exchange is different: if you can get the keys that are exchanged when you first make that connection, then it doesn't matter how good your encryption is after that, you know how to decrypt it. So the goal is to protect the key exchange, and also the signatures. There are two algorithms, ML-KEM and ML-DSA, that are the post-quantum algorithms, and those are the ones we have taken through FIPS. Well, we're on the modules-in-process list, but our algorithms have been tested as correct, and we're getting FIPS certification for them. >> You mentioned harvest now, decrypt later attacks. Walk me through what that actually means in practice. Are we talking about a theoretical threat, or is it actually happening right now? >> Oh, I'm sure it's happening. I'm sure it's happening right now. Governments and bad actors are all collecting as much encrypted data over TLS as they can possibly get their hands on and storing it away, in the hope that when quantum computers arrive, they can just decrypt it and read it as though people were conducting plain-text conversations over the internet. It would be like using Telnet in the 1990s: you could basically read everything that was going over the internet. So that is going on right now, I'm absolutely sure of that. Now, the goal is to move everybody onto the post-quantum algorithms as soon as possible. But this is going to take time. Think about all of the equipment that is deployed right now: all of the embedded systems, all of the Linux servers, everything. We need to start now to transition all those devices onto modern code bases using post-quantum algorithms. I mean, there's nothing we can do about the stuff that's already been collected under the older algorithms.
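The key-exchange machinery ML-KEM standardizes has a three-operation shape: key generation, encapsulation, decapsulation. The sketch below shows only that interface shape; the hash-based stand-in is deliberately INSECURE toy plumbing (it is not the lattice construction), and all function names are illustrative.

```python
# Toy KEM interface in the shape of ML-KEM's keygen/encaps/decaps.
# NOT secure, NOT the real algorithm: it only shows what TLS plugs into.
import os, hashlib

def _mask(pk: bytes) -> bytes:
    return hashlib.sha256(b"mask" + pk).digest()

def keygen():
    sk = os.urandom(32)                           # decapsulation (secret) key
    pk = hashlib.sha256(sk).digest()              # stand-in public key
    return pk, sk

def encaps(pk: bytes):
    m = os.urandom(32)                            # fresh randomness per session
    ct = bytes(a ^ b for a, b in zip(m, _mask(pk)))   # "ciphertext" to send
    ss = hashlib.sha256(b"ss" + pk + m).digest()      # sender's shared secret
    return ct, ss

def decaps(sk: bytes, ct: bytes) -> bytes:
    pk = hashlib.sha256(sk).digest()
    m = bytes(a ^ b for a, b in zip(ct, _mask(pk)))
    return hashlib.sha256(b"ss" + pk + m).digest()    # receiver's shared secret

pk, sk = keygen()                 # server publishes pk
ct, ss_sender = encaps(pk)        # client sends ct, keeps ss
assert decaps(sk, ct) == ss_sender  # both sides now hold the same secret
```

In real ML-KEM the security comes from a lattice problem (Module-LWE) rather than a hash mask, but the calling convention, and therefore the TLS integration work the interview describes, looks like this.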
But what we can do is roll this code out as soon as possible so that people can start using it as quickly as possible. And that's what the FIPS certification is about. It's basically saying, and I believe it's targeting 2030, that by 2030 you must be using these post-quantum algorithms, especially governments and government-approved systems. And I'm sure banks and major corporations will all follow. Anyone who's serious about protecting their data needs to start moving to post-quantum algorithms. >> What's interesting about CIQ is that you folks have achieved something pretty significant: your NSS module for Rocky Linux has become the first to get CAVP certification from NIST for post-quantum cryptography algorithms. But you have said the algorithms were already feature-complete. What is the difference between having working code and actually being FIPS compliant? >> So, working code is somewhat the easy part. There are many libraries out there that implement post-quantum algorithms. They're correct, they're done by open source engineers, people who understand this stuff. They're feature-complete; you can use them. But in order to get FIPS certification, there is a step way beyond that. Here's an example. The code I was working on was originally written in Rust and converted into C by an automated process. That automated transpilation process leaves a lot of intermediate copies of data, and those all have to be zeroed out in order to make a secure implementation. The implementation works, nothing wrong with it, but it's not certifiably secure, because it doesn't zero out all of the intermediate values that are left lying around in the computer's memory.
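The zeroization requirement can be illustrated in miniature: keep key material in a mutable buffer and overwrite it once it has been used. This is only a sketch of the idea; CPython makes no guarantee about other copies the interpreter creates, and the real FIPS work is explicit memory wiping in the C module.

```python
# Miniature zeroization: use bytearray (mutable) rather than bytes
# (immutable) so the secret can be overwritten in place after use.
def zeroize(buf: bytearray) -> None:
    """Overwrite every byte of the buffer in place."""
    for i in range(len(buf)):
        buf[i] = 0

intermediate = bytearray(b"transpiler-left intermediate key material")
# ... use the secret here ...
zeroize(intermediate)
assert all(b == 0 for b in intermediate)  # nothing readable left behind
```

In C the analogous step is a wipe the compiler cannot optimize away (e.g. `explicit_bzero` or a volatile-pointer memset), applied to every temporary the transpiler left behind.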
So I had to go in and basically add all of that code to the existing open source code. And we've published all this stuff; there's nothing proprietary, nothing secret. It's all published on the website linked to by the blog post. You have to zero things out. The other thing you have to do for FIPS certification is that when the library is first loaded, you have to prove, you have to run tests that say: yes, I ran this algorithm through a certain test vector and it produced the correct output. And in some cases you also have to say: and I corrupted the input test vector and made sure I got a different, wrong output. All of that code is the unglamorous part of making these algorithms; this is the FIPS certification part. Actually writing the algorithms and having it all work, that's the fun, sexy part, right? Then you make sure you actually run these tests, which are called KATs, known answer tests. And when you generate keys, you have something called pairwise consistency tests. All of these things need to be added into the code. And usually they're the last thing to get added, because they're the boring part, the part nobody wants to work on. In this case some of them were just missing, so we had to add those. And again, we've submitted all this stuff upstream. This is what we have to do for all of the libraries we work on, but this was the first library we worked on that had the post-quantum algorithms ready for certification. There are more coming. There will be many more libraries that we're tracking in open source. And you know, this is why a lot of the open source libraries don't really care about FIPS. It's not their target market. They want to get the code working, get the algorithms out there, get them widely used.
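The load-time self-test pattern described above, a known-answer test plus a deliberate-corruption check, can be sketched as follows. SHA-256 and its well-known "abc" test vector stand in here for the PQC algorithm under test; the function name is illustrative.

```python
# Sketch of a FIPS-style power-on self-test: a KAT plus a negative check
# proving the test can actually detect a wrong answer.
import hashlib

KAT_INPUT = b"abc"
KAT_EXPECTED = bytes.fromhex(
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
)

def power_on_self_test() -> bool:
    # 1. KAT: the algorithm must reproduce the stored answer exactly.
    if hashlib.sha256(KAT_INPUT).digest() != KAT_EXPECTED:
        raise RuntimeError("KAT failed: module must refuse to operate")
    # 2. Negative check: a corrupted input must NOT match the stored answer.
    if hashlib.sha256(KAT_INPUT + b"\x00").digest() == KAT_EXPECTED:
        raise RuntimeError("corruption check failed: test cannot detect errors")
    return True

assert power_on_self_test()   # run at library load, before any crypto use
```

The pairwise consistency tests the interview mentions follow the same spirit at key-generation time: sign-then-verify (or encapsulate-then-decapsulate) with the freshly generated pair, refusing to release a key that fails.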
Making it certified by government is very low on their priority list. So basically that's where CIQ and other vendors come in and actually do that work, because it's not the fun, exciting stuff, but it's essential work that has to be done if you're going to go through the certification process and get your certificates that say: yes, this is correctly implemented, it's tested, you can deploy this safely. >> I do understand that open source developers don't have that much time. Mostly they work in their free time; they volunteer on all these projects. But now we have laws like the CRA, where we no longer have the luxury of not caring about code and security. So what impact can the CRA have on these kinds of projects? >> Oh yeah. I mean, it's not fair to expect the open source projects to do all this work for free. The maintainer burnout problem is hard. Everyone's seen that xkcd comic of the entire infrastructure held up by one guy working on his own in Nebraska, implementing this essential piece. That's really kind of true. You can't expect open source projects to care about the details of FIPS certification. That really is for the people who want to take that code and deploy it. They're the ones who have to go through the standardization processes and make sure everything works. Now, obviously, we submit that stuff upstream. We make it available. We don't want to keep anything proprietary around this. But the hard part is actually going through the certification process, because it's a very involved process: collaboration with a government lab, lots of back and forth, code reviews, bug reports, fixing bug reports. It's quite a long process.
Which is why we're so pleased to get there. [laughter] >> Here is what I'm curious about. You are tracking PQC implementation across five different FIPS cryptographic modules. Why is it so complex? Why can't you just flip a switch and make everything quantum resistant? >> Well, why is it so complicated? These are implementations that have to modify the TLS key exchange protocols, and somebody worked out that if you printed out the TLS specification, it would be several feet high worth of documents. And there are different ways of doing this. You can do what's called hybrid post-quantum, where you mix the post-quantum algorithms in with existing algorithms, so essentially you're getting the best of both worlds. Or, in the future, they may standardize on quantum-resistant only, and this very much depends on whether mathematicians are able to crack the post-quantum algorithms. This is well beyond my area of expertise; I just watch the mathematicians and the papers that come out, and look at the recommendations coming out of the IETF. But it's difficult and complicated because you have to mix this into existing protocols in a way that existing devices that don't use these things still work, and yet modern devices that have post-quantum can recognize each other securely and say, "Oh, you want to use post-quantum, I want to use post-quantum, let's do that." And you have to do that without what's called a man-in-the-middle downgrade attack, where someone sits in the middle, pretends to be the other side to each end, and forces you to downgrade to a non-post-quantum algorithm so that they can collect and decrypt later.
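The hybrid idea can be sketched as: run both a classical and a post-quantum exchange, then feed both shared secrets through a key derivation function, so the session key survives unless both are broken. The concatenate-then-derive shape below loosely mirrors the hybrid groups proposed for TLS 1.3; the secrets, salt, and label here are placeholders, and the KDF is a minimal one-block HKDF-SHA256.

```python
# Hybrid key derivation sketch: session key depends on BOTH inputs.
import hmac, hashlib, os

def hkdf_sha256(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()            # HKDF-Extract
    okm = hmac.new(prk, info + b"\x01", hashlib.sha256).digest()  # first Expand block
    return okm[:length]

ecdhe_secret = os.urandom(32)   # stand-in for an X25519 shared secret
mlkem_secret = os.urandom(32)   # stand-in for an ML-KEM shared secret

session_key = hkdf_sha256(
    salt=b"\x00" * 32,
    ikm=ecdhe_secret + mlkem_secret,   # an attacker must break BOTH halves
    info=b"hybrid-demo",
)
assert len(session_key) == 32
```

The design choice is defensive: if the lattice math is ever broken, the classical half still protects the key, and vice versa, which is why hybrid is the common first deployment step.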
So it's actually quite a complicated process, and of course you have to test all of this. And this being open source, there are five different implementations of the same thing. Some projects use one, some projects use another, and so in order to be really secure you have to make sure that each of those has the post-quantum algorithms implemented. NSS is just the first. I think the next one, and they're already very well ahead on this, is OpenSSL. GnuTLS will probably come along a little later; I'm still tracking that. I monitor upstream very carefully to see what's coming down the line. And then the Linux kernel. Well, the Linux kernel is the Linux kernel. That depends on the taste of Linus, and he doesn't like FIPS very much, because security kind of gets in the way of what he wants to do. So that will probably be done by the security team, the people who maintain the crypto code in the Linux kernel. >> Well, I have great faith in them, but they're not there yet. Who actually needs to care about this right now? What industries are most at risk from these harvest now, decrypt later attacks? And what do compliance regulations actually look like for PQC today? >> Everybody should be thinking about it. [laughter] Even the people who make the routers that sit in people's houses should be thinking about this. Obviously defense, finance, healthcare, the people who are really going to get fined big time if they have data breaches, they're the ones who need to do this first. But eventually it's going to affect everybody. My big fear is the Internet of Things devices, where you have these $20 devices that use TLS. They're never going to get fixed; no one is ever going to upgrade the code on those.
So they have to get replaced, you know, and security is only as strong as its weakest link. Everybody has to start thinking about this, but it's very hard to move the entire industry forward on secure communications. The whole point is to move everybody as fast as we possibly can to the highest level of security, appropriate for what they're doing of course, but anyone who's dealing with customer data really has to care about this. >> You are dealing with two different algorithms here, ML-KEM and ML-DSA. Break those down for us. What do they actually do, and why do we need them both? >> Well, ML-KEM is the key exchange algorithm. When you are first setting up a TLS connection, you need to exchange keys so that each side has a common encryption key that they can then use for bulk encryption, and that's where ML-KEM comes in. The ML-DSA algorithm is used for digital signatures, where you have data that you have to sign, and then verify that it came from the person who holds a particular private key. So those are the two different uses, but you do need both. A rough way to look at it, not completely accurate, is that ML-KEM is for data in transit and ML-DSA is more for data at rest. >> Now, another achievement for Rocky Linux is that you folks are the first enterprise Linux distribution advancing towards full FIPS 140-3 validation for PQC. What makes this validation process so brutal for open-source projects, and why isn't the upstream community just doing this work themselves? >> Well, open source projects are not going to do this because it's expensive. You can have all the best code in the world, and all of the code we've got is open source, and yet you still have to go through the lab.
The lab has to test this stuff. The lab has to write up your security policies. This costs a lot of money. Open source projects are not going to do it, and you can't expect them to; it's not really their role. The major vendors all have FIPS certifications to some extent. I think this is the first post-quantum one, and I'm sure everyone else is eventually going to do this, but it's nice to be first, and of course I would recommend people look at CIQ's Rocky Linux. But yes, it's a matter of going through the process, and the process is expensive, long, and very, very detailed, because, as I said, the lab has to examine all of the code. You have to work with them on feedback, you have to propose patches they agree with, and all the algorithms have to be tested. It's a detailed process because the labs are essentially putting their reputation on the line. They're saying: we guarantee that if you use this code, you are using correct algorithms, correctly implemented and tested. So they can't make a mistake; they put their reputation at risk if they do. I'm not saying whether everyone else is doing this too; I'm just happy that we got NSS done with PQC first. [laughter] I'm sure others will follow. >> Let's get practical. You are expecting full FIPS 140-3 validation for NSS in the second quarter of 2027, and OpenSSL validation is starting later this year. What should companies be doing right now to be fully prepared? What does a realistic migration path look like? >> You need to start turning off older algorithms. This is a security policy kind of thing.
You have to start looking at the applications your business depends on and the libraries they depend on, conduct an audit of what algorithms those libraries are using and whether they are going to be FIPS compliant and post-quantum secure, and start turning off the algorithms that are not. And as soon as you do that, of course, things will break. Lots of applications that people are using will break. For example, I'm originally a storage engineer, obviously, from Samba, and the original iSCSI security algorithm is completely deprecated now; it shouldn't be used anymore. So sometimes you have to say: okay, we can't really use that protocol anymore. We have to migrate to a more modern one, or we have to rev the standard so that you can put stronger levels of encryption, more modern algorithms, in there. Look at Microsoft and how long it's taken them to get rid of NTLM, the authentication algorithm used in the original Windows. They're only just getting rid of that now in Active Directory. So it takes a long time; the larger your deployed user base, the longer migration will take. But you need to start looking now, and at least in your test labs you need to start turning off algorithms and asking: does my stuff still work if I turn this off? That's how you need to start approaching it. But in order to do that, you first need a security audit of what you're actually using, because you can't fix things you don't know you have. So get a full audit of the software bill of materials that you're using. Everybody uses open source code.
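A very first pass at the audit step described above is to enumerate the cipher suites the local TLS stack would offer and flag names matching a retirement policy. The marker list below is an illustrative example policy, not an official NIST or CNSA list, and name-based matching is only a rough heuristic; a real audit inspects libraries and configurations, not just suite names.

```python
# First-pass algorithm audit: list locally offered TLS cipher suites and
# flag ones matching an (illustrative) retirement policy.
import ssl

DEPRECATED_MARKERS = ("RC4", "3DES", "DES-CBC", "MD5", "NULL")  # example policy

ctx = ssl.create_default_context()
offered = [suite["name"] for suite in ctx.get_ciphers()]
flagged = [name for name in offered
           if any(marker in name for marker in DEPRECATED_MARKERS)]

assert len(offered) > 0   # a default context always offers something
# 'flagged' is the worklist: suites to disable, then retest applications
```

The same enumerate-then-disable loop is what the interview recommends doing in a test lab: turn a suite off, rerun the applications, and record what breaks before touching production.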
It used to be that you could go to Microsoft and say, no, I'm only using Microsoft stuff, but these days everybody, Microsoft included, uses open source code. So you need to know what you have in your organization and what algorithms are being used, and create a migration plan to move to a more modern, more secure future. >> The NSA's CNSA 2.0 sets transition milestones starting in 2027, with full migration by 2035. That sounds like a long runway, but you are saying organizations need to act now. Why the urgency, and how should the people watching this discussion start preparing right now? What is your advice to them? >> We stand on the shoulders of giants. We've taken open source code, worked with it, and passed it through certification. Other people can do the same. Hopefully other people will take the code we've published and do the same for themselves. We're hoping to move the whole industry forward. I think we're great, but there's nothing special about what we do; many people can do it, especially as the code is public. You can take it, you can examine it. So even if you're not going through lab certification yourself and you don't need a certified product, you can still use the same code that has been certified, because it's freely available to you. And so, if you take anything away from this, it's: please learn from other people, use other people's work. They've made it freely available for you. It's there for the taking, and it will help make your environment, your industry, your company more secure, and that's the goal we all want. Obviously I'd love it if you bought it from us, but I'm an engineer, not a salesperson. [laughter] Go to ciq.com.
I'm sure they'll happily point you at the right salespeople for whatever it is you want to buy, but as an engineer, I would like to move the whole industry forward to a more secure post-quantum future. >> Jeremy, thank you so much for joining me and breaking down post-quantum cryptography and what organizations need to be thinking about right now. The shift to quantum-resistant infrastructure is coming faster than most people realize, and I look forward to another discussion. Thank you so much. >> Oh, thanks so much for having me on. It's been a lot of fun. I really enjoyed our discussion, and I hope it's been useful for your viewers. >> And for those watching: if you are running infrastructure that needs to be secure 5 or 10 years from now, make sure to check out CIQ and Rocky Linux. And don't forget to subscribe to TFIR, like this video, and share it with the team. Thanks for watching.
Video description
Adversaries are collecting your encrypted data right now, waiting for quantum computers powerful enough to crack it. This "Harvest Now, Decrypt Later" attack isn't science fiction—it's happening today, and if your data has a shelf life longer than 5-10 years, you're already at risk. Jeremy Allison, Distinguished Engineer at CIQ and co-creator of the Samba project, breaks down post-quantum cryptography (PQC) and why organizations need to start preparing now. CIQ recently achieved a major milestone: their NSS module for Rocky Linux became the first to receive CAVP certification from NIST for post-quantum cryptography algorithms, and Rocky Linux is advancing toward full FIPS 140-3 validation for PQC. In this conversation, Jeremy explains what PQC actually is, why quantum-resistant algorithms like ML-KEM and ML-DSA matter, the brutal complexity of FIPS certification for open-source projects, and what migration paths companies should be building today to protect their future. Read the full story at www.tfir.io #PostQuantumCryptography #PQC #CyberSecurity #QuantumComputing #RockyLinux #CIQ #FIPS #OpenSource #Encryption #TLS