All right, everybody. Thanks for being here. We're going to kick off the next talk with speaker Christian Paquin, who will be covering migrating to quantum-safe crypto to protect against the quantum hacker. Thank you very much.

Thank you. I'm very happy to be back this year to discuss our progress towards migrating to post-quantum cryptography. The work I'm presenting today is in collaboration with a lot of people, some in my team at Microsoft Research, also with some other organizations, but this particular work has mainly been done with Douglas Stebila at the University of Waterloo.

I don't know if you remember, and some of you might not have been born 20 years ago, when South Park introduced these very weird characters, the little gnomes. They went around with a plan to make money, a kind of interesting business plan: they would steal people's underpants, and then, in some unknown way, they would try to turn a profit. I'll get back to these guys later.

On an unrelated note, something else that happened 20 years ago is that I started to study quantum computing in my graduate studies at the University of Montreal. It was a fascinating subject, and I kind of abandoned it. I came back to it a few years ago, this time not trying to use quantum computing, but trying to defend against it. Because I'm sure you all know that quantum computers, although they would be fantastic for the field of algorithmics, since you can solve a lot of problems more efficiently, are really bad news for the world of cryptography. Due to Shor's algorithm, a quantum computer would break RSA, DSA, ECDH, and all the elliptic-curve variants, essentially, because it can solve the underlying mathematical problems on which these schemes are based: it can factor numbers and find discrete logarithms very efficiently, in polynomial time. What does that mean?
It means that a quantum computer would break all the public-key crypto we use today. It would break HTTPS and TLS, it would break SSH, it would break peer-to-peer communication and messaging systems like Signal, break certificates, software update channels, and Bitcoin. If somebody had a quantum computer, they could steal all the bitcoins.

There's another algorithm that's important in quantum computing, called Grover's algorithm. It affects hash functions and symmetric primitives like AES, but that's not a big problem: it only gives a quadratic speedup, so we can just double the key sizes and the hash sizes and we're fine. Shor is really the main problem here.

There's a wide spectrum of estimates of when quantum computers will be built. Some people say one will never be built, some people say it's going to be done in five years, but there's a wide consensus among academics and experts that a specialized quantum computer able to break RSA could be built within a decade or two. That means we need to start thinking about migrating to quantum-safe cryptography.

Quantum-safe cryptography, or post-quantum crypto, is not cryptography that runs on a quantum computer. It's a class of cryptographic schemes that run on normal computers, but for which we know no way to break them with quantum computers. So I often get asked: why do we care now? Why don't you just tell me a year before the quantum computer is built, and then we'll scramble and rush to fix everything like we did for Y2K? That's a valid argument. The problem is that the data we encrypt today on the web is at risk. It is at risk of being captured now, stored for a couple of years, and then decrypted. So if you're sending important data or trade secrets, or if you're a whistleblower sending confidential information, and this data could be decrypted in 10 years and ruin your life, then you need to start protecting it differently today.
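The key-doubling argument for Grover mentioned earlier can be made concrete with a quick back-of-the-envelope sketch (illustrative only: Grover's quadratic speedup halves the effective bit strength of an exhaustive key search):

```python
def grover_effective_bits(key_bits: int) -> int:
    """Effective security of a key against Grover's quadratic speedup:
    searching 2^n keys takes roughly 2^(n/2) quantum operations."""
    return key_bits // 2

# AES-128 drops to ~2^64 quantum search effort -- uncomfortably low.
print(grover_effective_bits(128))   # -> 64
# Doubling the key size restores the margin: AES-256 -> ~2^128.
print(grover_effective_bits(256))   # -> 128
```

This is why the symmetric side of the migration is easy compared to Shor's attack on public-key schemes, which breaks them completely rather than merely weakening them.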
Another problem is that once we're ready to transition, it's going to take a long time to change the standards. TLS, SSH, all these things will take years to update, and consequently it will also take time to update the software stack, all the code. There's this notion of crypto agility: programming your software so that you can replace algorithms. We've done that many times, replacing MD5 with SHA-1, SHA-1 with SHA-256. So we're used to that, but you'd be surprised how many places RSA is just hard-coded into code bases. It might be hard to transition some software stacks. And these new algorithms don't necessarily behave like the old ones: some of them have huge keys, some run slower, so they might have a critical impact on some software. Bottom line: if you have data that needs to stay secure in 10 years, and you count all these steps backwards, we kind of need to start today.

We're back to these guys now. They have a new idea for how to make money. They don't need to collect underpants anymore; they can simply collect ciphertext, wait a few years, and then turn a profit once they have access to a quantum computer.

Fortunately, on the crypto side, we're on top of things. NIST, the National Institute of Standards and Technology, started an effort to provide new standards to replace RSA, ECDH, and all these things. They started in 2017 and asked the community to propose new schemes. 69 submissions were accepted, and in January of this year they started round two. So Thanos happened: a bunch of these just got snapped away, some more got merged together because they were very similar, and now we're left with 26 schemes. We're two weeks away from a workshop organized by NIST where the submitters will discuss these round-two candidates and the updates they made in the second round.
And if all goes well, within three to five years NIST is going to publish new standards. So it's coming up fairly soon. Our goal is to experiment with these new schemes and see what the impact is on the software we use today. This project was started by the University of Waterloo, and we joined, among many other collaborators, to create a framework into which you can integrate new crypto schemes, which we can in turn integrate into higher-level software. The Open Quantum Safe project is an open-source project, available at this URL. Our goal is to integrate all the round-two schemes, and we're on the way to doing that. We also have integrations into OpenSSL, OpenSSH, and OpenVPN, which allow you to try these things out, experimenting with post-quantum cryptography.

What's new this year is that we kind of completed the roadmap of all the use cases. We can do full post-quantum and also the hybrid case, classical plus post-quantum, for both the key exchange and the authentication (the signature part), for both TLS 1.3 and SSH. We also have TLS 1.2 support, which I'll talk about in a second, but it's not as complete. And we implemented wrappers so you can use the library from C++, C#, or Python.

Using this framework, we did some case studies and analysis of different ways you can plug these algorithms into our target protocols, SSH and TLS. We just released a paper describing this; it's going to be presented in more detail at the NIST workshop. We also have an extra collaborator on the paper, Eric Crockett from Amazon, who provides some insight from their s2n TLS integration. I'll present a few results from this analysis to give you an idea of what's entailed when you try to migrate to post-quantum cryptography. First, I mentioned this before, but we support what we call hybrid deployments.
So hybrid means that if you take a post-quantum algorithm, you don't want to transition to it right away. You don't want to dump RSA and start using, say, a Picnic signature, because these schemes are fairly recent. RSA has been around since the late '70s; there's been a lot of cryptanalysis done on it. These post-quantum algorithms may be broken in five years, not even by a quantum computer but maybe by a classical computer. Maybe there's a flaw in the algorithm that we haven't seen. So the way to achieve more safety is to combine a classical algorithm with a post-quantum one. You take, let's say, ECDH and combine it with a SIKE algorithm, and you create a new scheme. The resulting communication will be secure if either of the two is secure. If there are no quantum computers, you're safe: you have your elliptic-curve Diffie-Hellman. And if there is a quantum computer, your communication will still be secure, because you've protected it with a post-quantum scheme.

TLS and SSH already have this notion of negotiating algorithms. The client sends its supported list, the server responds, and they negotiate which algorithm to pick. But there's no way to pick two algorithms at the same time and combine them. That's where we had to work, and there are multiple ways to do it; we list them in the paper. The simplest is just to create a new scheme. It's like a PB&J sandwich: you combine peanut butter and jelly and the result is a new thing, PB&J. Same idea here: we take a classical algorithm and a post-quantum one, combine them, and that's a new scheme. The great thing is that this is backward compatible. When you say "I support these algorithms," you just name the new one, which is a combination of both, and if the other peer supports it, you can have this exchange that provides extra security. There are also a bunch of more advanced ways to negotiate the two algorithms separately, and they're a bit more complicated.
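The hybrid property described above can be sketched in a few lines: derive one session key from both shared secrets, so the result stays secret as long as either input does. This is a minimal illustration with stand-in byte strings, not the actual OQS code; real stacks feed the concatenated secrets into the protocol's own key schedule rather than a bare HMAC, and the function name here is hypothetical.

```python
import hashlib
import hmac
import secrets

def combine_shared_secrets(ss_classical: bytes, ss_pq: bytes,
                           context: bytes = b"hybrid-kex") -> bytes:
    """Derive a single session key from both shared secrets.
    The output is unpredictable as long as EITHER input is secret."""
    return hmac.new(context, ss_classical + ss_pq, hashlib.sha256).digest()

# Stand-ins for the two real exchanges, e.g. ECDH P-384 and SIKE.
ss_ecdh = secrets.token_bytes(48)   # classical shared secret
ss_sike = secrets.token_bytes(16)   # post-quantum shared secret

session_key = combine_shared_secrets(ss_ecdh, ss_sike)
assert len(session_key) == 32
```

If a quantum computer later recovers the ECDH secret, the attacker still lacks the post-quantum input to the derivation, and vice versa if the newer scheme turns out to have a classical flaw.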
They require protocol changes and might affect a few things. What you need to take care of when you design these is: how is it going to affect backward compatibility and performance? For example, in TLS 1.3, when the client sends its first message to request a connection, it has the ability to pre-compute some data: assuming you're going to pick ECDH, here's a pre-computed key share for that exchange. But if I'm going to send three proposals and have to compute pre-calculations for each, the message gets bigger. We don't want to introduce too many more problems, send more data, or increase the bandwidth requirements, and we don't want to add extra messages and affect the flow too much. These are all things we take into consideration. The first approach we implemented, and the one we use for our experiments, is the first one I mentioned, the combo scheme, because it's the simplest to implement, requires minimal changes in OpenSSH, and doesn't affect backward compatibility. So it's easier all around.

For the TLS use cases we considered, we have some work in TLS 1.2 and the more recent work in TLS 1.3. In TLS 1.3, the base protocol only supports the elliptic-curve group extensions, ECDH; RSA key exchange is not even there anymore. So we have to masquerade our algorithms as curves: we pretend to be an elliptic curve, and then we can run without modifying the stack. One problem is the size of these things. As I mentioned, some schemes have very large public keys, ciphertexts, or signatures, and the protocol specifies a maximum size, in this case 2^16 bytes for public keys and 2^24 for certificates, for example. Some schemes are bigger than that, so they wouldn't fit. But OpenSSL also has smaller internal limits, because RSA, the main thing in use, is not that big; they don't want to provide a huge empty buffer that people could use to DoS them, so they reduced the limit.
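Because the combo approach just adds one more name to the algorithm lists, the existing negotiation logic works unchanged. A sketch of SSH-style negotiation with hypothetical combined identifiers (the exact names in the OQS forks differ):

```python
# Hypothetical combined-algorithm names; the "X+Y" convention here is
# illustrative, not the identifiers used by the real OQS forks.
CLIENT_KEX = [
    "ecdh-nistp384+sike",    # hybrid: classical + post-quantum
    "ecdh-nistp384+frodo",
    "ecdh-nistp384",         # classical-only fallback
]
SERVER_KEX = ["ecdh-nistp384+frodo", "ecdh-nistp384"]

def negotiate(client: list, server: list) -> str:
    """SSH-style negotiation: first client proposal the server supports."""
    for alg in client:
        if alg in server:
            return alg
    raise ValueError("no common key-exchange algorithm")

print(negotiate(CLIENT_KEX, SERVER_KEX))   # -> ecdh-nistp384+frodo
```

A peer that has never heard of the hybrid names simply skips them and lands on the classical fallback, which is exactly the backward-compatibility property the talk describes.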
So in some places we had to tweak or remove some of these limits in OpenSSL to be able to run our experiments. OpenSSH is kind of similar, but it has a bigger limit, 2^32, so all the proposed round-two schemes would fit there in theory. OpenSSH also has internal limits that we had to tweak in some places to fit our algorithms. In this case we support both client and server public-key authentication, so it's pretty much full coverage of what we want to achieve here.

As a summary of what we tried, here's a table showing the key encapsulation, the key exchange. As I mentioned at the beginning, on the OQS slide, there are some algorithms in there and some that are not. There's no particular reason why schemes are excluded, other than the fact that the people involved in the project contribute their own schemes, and for the others we try to fill in the gaps with implementations we can find from the submitters. At some point we hope to have all the NIST candidates in that table. The little yellow check mark here, for example Frodo, means larger artifacts for which we needed to tweak OpenSSL's internal limits to make them fit. NTS-KEM, which is a code-based scheme, and Classic McEliece, a similar scheme, have very large public keys, and we were not able to fit them in the software; we had problems. The spec in theory would support that size, but we were not able to make it work. So that's a good reason to try these experiments in practice: even if in theory the sizes should fit, there are a lot of code paths in the software that didn't work. For the signatures, we have a lot more of these yellow boxes, meaning we needed to augment the internal sizes in OpenSSL and OpenSSH to fit the large signatures or public keys. And we only had two failures: Picnic levels three and five were expected, because their signature sizes were bigger than what's allowed by the protocol.
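The size limits discussed above can be checked against some ballpark round-two public-key sizes. The figures below are approximate values from the submissions (treat them as illustrative, and note that sizes vary per parameter set):

```python
# Approximate round-2 public-key sizes in bytes (ballpark figures from
# the submission documents; exact values depend on the parameter set).
PK_SIZE = {
    "SIKE p434":         330,
    "Kyber-512":         800,
    "FrodoKEM-640":     9616,
    "Classic McEliece": 261120,
}

TLS_PK_LIMIT = 2**16   # max key-share size in the TLS 1.3 wire format
SSH_LIMIT = 2**32      # SSH's much larger field limit

for name, size in PK_SIZE.items():
    fits_tls = size < TLS_PK_LIMIT
    print(f"{name:16} {size:>7} B  fits in a TLS key share: {fits_tls}")
```

This mirrors the talk's findings: Frodo squeezes in once the internal buffers are enlarged, while the code-based schemes blow past the TLS wire-format limit entirely, even though SSH's 2^32 limit would accommodate them in theory.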
The round-two version of Picnic all fits: they did some optimizations that reduced the signature size and allowed Picnic to be used in TLS. On the SSH side, we had some failures with Rainbow's higher-level parameter sets.

So now I'd like to show you a quick demo of SSH. If you know and use SSH, this should look exactly like what you're used to seeing; the only thing we did is add these post-quantum algorithms, and nothing else is different. If you've never seen an SSH connection, it's just going to look a little bit weird. Sorry for the unreadability of these consoles. What I'm starting on the right-hand side is a server. It's expecting ECDHE with the NIST curve P-384 combined with a SIKE algorithm. There it is. On the left-hand side is a client requesting the same algorithm. I've put the debug output there just so there's something to show on the screen; otherwise it would just be nothing. And we see the connection was successful. Similarly, the authentication was done with the Picnic hybrid: the Picnic signature and also ECDSA. The point I'm trying to make here is that we modified these software stacks, but they work exactly as they did before. One step that I skipped was the generation of the key pair and adding it to the authorized keys on the server side. It works exactly the same, which means you can basically go out and try it, and even deploy it.

So what's next? For us, we want to add more of the round-two schemes, the ones we're missing, so we can have a full list of all the algorithms. And we want to do some performance testing. We did some tests a year ago with round-one candidates and it was very promising. The fast lattice schemes, for example, were competitive with, and sometimes even faster than, state-of-the-art ECDH.
And some slower ones, which might not be suitable for handling many connections a second on a TLS server, could be great for more one-on-one interactions in a messaging system or SSH. So you really have a broad choice of deployment options if you're a system integrator. We also want to tackle more protocols; there are all sorts of things that need to be made quantum safe, and we want to start looking into that.

And for you, an interesting thing to consider is: are you working on projects or with software that protects data that needs to stay secure for a couple of years, or for 10 years? If so, you might want to start using post-quantum cryptography to protect that data, either by integrating it into your code or by using some of our forks and deploying SSH or VPN with them.

All right, that's my time. Thank you for your attention. I'll be posting the slides right after the talk, and I'm happy to take some questions.

All right. Thank you. If you have questions, please come up to the human microphone, that is me. Come on down. Form an orderly line.

My question was, obviously I haven't looked at the GitHub repo yet, but did you use the OpenSSL module API to build your provider, or did you hack OpenSSL to extend it?

The engine API, you mean? Okay, so no, we didn't. The engine API in OpenSSL is a way to provide an alternate implementation: if you implement RSA in hardware, you can create an engine to plug it in. Since these were new algorithms unknown to OpenSSL, we had to modify OpenSSL itself. OpenSSL is in two layers, the crypto layer and then the SSL layer. In the crypto layer you could maybe get away with using an engine, but in the SSL layer, where you define the specifications for the actual algorithms, that didn't work. So we modified the code directly. We have a fork.
You can see the diff between the original project and ours.

I just have a question. Obviously you're aware that IBM has created the Q System One, correct? I was at CES in January, where they showcased it. Knowing what it is, how do you think the creation of the IBM Q System One, maybe not the Q System One itself, but the later iterations once they streamline it, is going to help you, and the people who come after you, with migrating away from regular crypto?

So yeah, there are a lot of vendors working on quantum computing that provide specialized chips. A lot of what's out today, the D-Wave systems and some others, are very specialized; they're not general computers that can do all these things. The estimates I gave at the beginning, 10 to 15 years, are the output of very serious quantum computing experts who took everything into consideration: research, time, money invested. They give something like a 50% chance of having something in 10 to 15 years, including all this development. None of this would come as a surprise: you can see the scaling of quantum computers, and if you're trying to defend against them, you have to plan pessimistically, so our mark is 10 to 15 years.

I mean, I just wanted to get your opinion on whether the later iterations of quantum computing will streamline the migration from regular crypto to PQC.

Well, the migration doesn't use quantum computers at all, right? If they exist, they threaten what's encrypted today, and you need to migrate to this stuff.
So what's going to trigger this is NIST publishing their new standards, and then the protocols will update. What we're trying to kick-start is the idea that you might want to migrate before all of that takes place, because it could be five to seven years away, and all the data you encrypt between now and then is at risk of being decrypted later. So if you want to migrate to something, a hybrid scheme, you have the tools to do that today, expecting that one of these systems, D-Wave, IBM, Microsoft is working on them, Google, everybody has a quantum computing team, will be functioning in 10 or 15 years.

Hi. Do any of the algorithms being evaluated have potential application in IoT environments, where bandwidth, computational power, and latency are critical issues?

Yes. There are a lot of different flavors of algorithms: some are very big, some very small, some very slow, some very fast. And some that might not be competitive in speed with, say, the fast lattice ones have other advantages going for them: very short keys, fitting in very small embedded systems. That's the selling point of some of these. And NIST has said they won't pick a single winner; they're more likely to pick a bunch of them for different use cases, including IoT, which is a very important scenario.

That's all the time we have for questions, but hopefully Christian can be available afterwards outside. Yes. All right. One more round of applause for our speaker. Thank you very much.