I'm just going to dive right in since I've been introduced. About three years ago, I gave a talk on the issues of post-quantum crypto. So today I'm going to be talking about the NIST contest for determining and standardizing post-quantum crypto algorithms for use in our systems. Here's the plan for today: I'll do a quick review of some of the material from three years ago, just to level set. Then we'll talk about the specifics of the NIST contest, what NIST is looking for, what kinds of algorithms they got, et cetera. I'll do a very quick review of the four major post-quantum families I discussed three years ago. Then we'll look at some of the unique algorithms, the ones that don't fall into those categories, and then we'll get to some conclusions. What I'm not going to do is go into the math behind any of these systems, and we're not going to do any deep dives into how these algorithms work, other than for some of the unique ones.

First thing: quantum computers are coming. IBM last month announced the commercial availability of their quantum computer; they've actually had it available for researchers for over a year. It's 20 qubits, so we're not worried about it breaking anything. But the fact that I'm talking about a real machine that does 20 qubits is something I couldn't say three years ago when I gave this talk. So it's moving faster; it's now looking like not if, but when, we're going to have quantum computers.

Quantum computers allow us to solve some problems that we can't solve quickly on classical computers, due to a certain kind of parallelism. Not every hard problem can be solved with quantum computers. Unfortunately, the ones that can include all of our existing public-key systems. So RSA, DH, DSA, EC: all toast once you have quantum computers. There are also quantum algorithms for general search. If a problem has a single answer, and you can check a candidate answer, then Grover's algorithm can find that answer in roughly square-root time, rather than the linear time our existing systems need. But that doesn't really break our symmetric algorithms; it means our key sizes have to be twice as big in order to have the same security. Hashes would be the same story, but our hash functions are already twice as big because of collisions: the birthday attack already gives us a square-root speedup. So, not really a whole lot of hurt to our symmetric and hash algorithms.

There are families of algorithms, quantum-safe or at least thought to be quantum-safe, that we can do public-key operations with, and we'll talk about some of the issues with them. When we need those algorithms depends on how long you want to secure your data. If you want to secure data for 10 years, make sure it's encrypted with a symmetric key; don't use a public-key system, because it will probably be broken by then.

Well, because we're running up against the point where we really needed these post-quantum algorithms yesterday, NIST has started a new contest, just like the AES and SHA-3 contests, looking for proposals for replacements for our public-key algorithms. NIST announced this contest in 2016, and the submission deadline was November 2017. So all last year, they've been evaluating the submissions.
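To make that Grover arithmetic concrete, here's a back-of-the-envelope sketch. It's purely illustrative; the security levels here are rules of thumb, not precise attack-cost estimates.

```python
# Grover's algorithm searches N candidates in about sqrt(N) quantum
# operations, so an n-bit symmetric key gives roughly n/2 bits of
# security against a quantum attacker. Illustrative arithmetic only.

def grover_security_bits(key_bits: int) -> int:
    """Brute force drops from ~2^n classically to ~2^(n/2) under Grover."""
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~{key_bits}-bit classical security, "
          f"~{grover_security_bits(key_bits)}-bit quantum security")

# Hence the rule of thumb above: double your symmetric key sizes,
# e.g. move from AES-128 to AES-256 to keep 128-bit security.
```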
NIST got 82 submissions, of which they considered 69 properly formatted. That is, those 69 met the formal criteria: you had an algorithm, you had an implementation of the algorithm, and you had all the right descriptions and documentation in place in order to be accepted. Of those 69, five were broken and later withdrawn. Another four or five are broken, but the teams have not withdrawn them yet. There was a conference in April 2018 where everybody presented what their algorithms were, and there'll be a second workshop in August of this year, assuming the government shutdown didn't push that out. NIST plans to cull the list down to 20 or so algorithms. So we officially have 64 algorithms still in the contest, and that should drop down to about 20. Seven of those 64 don't fall into the traditional post-quantum families we'll talk about. NIST is going to select more than one algorithm for standardization; they're looking at maybe three or four of the algorithms. And they're also saying that when an algorithm drops out of their contest, it may come back in some future standardization. They're just trying to get the list down so that we can really focus on a couple of them and make sure that they're really secure.

OK, so most of the submissions were from teams of people; only six had a single person submitting. And in two-thirds of the submissions, somebody on the team was also on another submission's team. That was the more common case, not the less common one. There were 26 submissions where everyone on the team was on only that one algorithm, and 10 of those 26 were broken. So almost half of the single-team submissions are broken. Of the other 43, about four were broken. And three of the unique, completely-invented-out-of-the-blue algorithms were broken immediately.

When NIST announced this, what they asked for were replacements for our signing, key-encapsulation, and regular encryption algorithms. They're looking for them at five different levels, which correspond to the current security of AES-128 through AES-256, and they asked the submitters to concentrate on the first three levels. They're going to evaluate the algorithms on how well they meet that security level. They're also going to evaluate how well they perform: can you do an operation in less than a day? And then they have some more nebulous criteria, like being a drop-in replacement: can we take an existing protocol and drop in this algorithm without having to change the protocol? How strong are they against side-channel attacks? As you saw in Simo's talk, even RSA has some difficulty in that area, and the post-quantum systems are even more complicated and have more issues there. Perfect forward secrecy, simplicity, et cetera.

So here are the base cryptosystems that I talked about three years ago; most of the algorithms fall into one of these. Hash-based: last time, one of the issues I talked about with hash-based systems was that you couldn't use the same key twice. Well, all the ones in this contest are what's called stateless. What they've done is extend the number of one-time keys to the point where every possible signature index for that size is covered. So they don't need to keep track of how many times you've signed; your signature itself determines which one-time key you use, and that key gets used exactly one time.
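To show what a one-time hash-based key actually looks like, here's a minimal Lamport-style signature sketch. The contest submissions layer fancier few-time schemes and tree structures on top of this idea; this is a toy illustration of the one-key-per-signature core, not any submission's actual scheme.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Two secret 32-byte values per message bit; the public key is their
    # hashes. Signing reveals one preimage per bit, which is exactly why
    # a key pair must never sign two different messages.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal the preimage selected by each bit of the message digest.
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(msg_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"hello post-quantum world")
assert verify(pk, b"hello post-quantum world", sig)
```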
One of the important parts of hash-based systems is the Merkle tree. I only bring this up because it turns out other cryptosystems use the same thing. It's a way of taking a very large key and compressing it down to a single hash; the cost is that you make your signatures very large.

Here are the systems proposed for hash-based crypto. There are only two proposals. Most proposals come in multiple flavors, and because there were only two here, I was able to list all the flavors that were available. As you can see, the public key sizes are pretty reasonable: 32 bytes, 64 bytes. That's smaller than RSA. But the signature sizes are on the order of 10,000 bytes, and the times can be fairly long, up to half a second, some even over a second, to sign. So it's kind of heavyweight in signature size and signing time, but there may be some applications that can deal with that.

Code-based. Code-based crypto uses the notion of error-correcting codes. You have a noisy line, and you want to be able to send your data through without having to send it three times. You send your data through once with some extra stuff attached; the data gets corrupted, and you're able to use the extra stuff to recover the data. It turns out you can turn this into a cryptosystem. You take the matrix that puts that extra stuff into your data, and you obscure that matrix with some additional matrices, so people can't tell what error-correcting structure you put in. Only you know what it is, and that's your private key. To encrypt, you simply encode some data with the obscured matrix and make it noisy: you XOR some random errors into it to simulate a noisy line, and that's your ciphertext. You decrypt it by running the error correction, which undoes the errors and gives you your data back.

This is actually a very old system, first proposed in the 70s. We have lots of confidence in it because it's been analyzed a lot. Unfortunately, the key sizes are fairly large, megabytes, and most attempts to reduce them have been broken. So the proposals here are mostly ways of reducing the key size. One of the proposals is the original system, Classic McEliece. You can see it's got a one-megabyte public key, and key generation can be a bit expensive, almost a second. Others are under 10,000 bytes: still high thousands of bytes for your key sizes, and thousands of bytes for your encapsulated key. Some get into the hundreds; who knows if those are actually going to be secure and make it to the end. Oh, and code-based is only for encryption and key encapsulation. There were some proposals for doing signatures with code-based systems; all of them turned out to be insecure, all of them were broken. So there are no code-based signature algorithms that survive.

OK, lattice. I'm going to completely hand-wave at lattices because they're complicated. There are two main categories: LWE, learning with errors, and the Ring-LWE systems. LWE has a security proof, and we have pretty good confidence in it; at least, we have confidence that it's as secure as the underlying hard problem it rests on. The ring systems haven't fleshed out the proofs as much, but they get smaller key sizes. Lattice is the family that gets closest to our current systems, and if you look, this is by far the most common proposal. I did not include all the different variants of all the different algorithms; I've only included each one's smallest flavor so I could fit them all here.
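Since the Merkle tree keeps showing up, here's a minimal sketch of that compression trick: many one-time public keys collapse into a single 32-byte root, and each signature pays for it by shipping an authentication path. This is a toy, hash-only illustration.

```python
import hashlib

def H(*parts: bytes) -> bytes:
    h = hashlib.sha256()
    for part in parts:
        h.update(part)
    return h.digest()

def merkle_root(leaves):
    # Hash pairs upward until one node remains; that root is the tiny
    # public key standing in for every one-time key below it.
    level = [H(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [H(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, index):
    # The sibling hashes on the way to the root. Shipping this path in
    # every signature is the "signatures get very large" cost.
    level = [H(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        path.append(level[index ^ 1])
        level = [H(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify_leaf(leaf, index, path, root):
    node = H(leaf)
    for sibling in path:
        node = H(node, sibling) if index % 2 == 0 else H(sibling, node)
        index //= 2
    return node == root

# Eight stand-ins for one-time public keys, compressed to one root.
leaves = [bytes([i]) * 32 for i in range(8)]
root = merkle_root(leaves)
assert verify_leaf(leaves[5], 5, auth_path(leaves, 5), root)
```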
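And since I hand-waved at the lattice math, here's a toy single-bit LWE encryption. The parameters are tiny and nowhere near secure; the point is just to show where the noise lives and why these systems can have decryption errors, which comes up again later.

```python
import random

q, n, m = 3329, 16, 64   # toy parameters, far too small to be secure

s = [random.randrange(q) for _ in range(n)]   # secret key

# Public key: m random vectors a_i with noisy inner products
# b_i = <a_i, s> + e_i (mod q). The small noise e_i hides s.
A, B = [], []
for _ in range(m):
    a = [random.randrange(q) for _ in range(n)]
    e = random.randrange(-2, 3)   # small noise term
    A.append(a)
    B.append((sum(x * y for x, y in zip(a, s)) + e) % q)

def encrypt(bit):
    # Sum a random subset of the public samples and hide the bit in the
    # high-order half of v. The noise terms accumulate; if they ever
    # exceeded q/4, decryption would fail -- the "decryption error" risk.
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(B[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    # Subtracting <u, s> leaves bit*(q/2) plus the accumulated noise.
    d = (v - sum(x * y for x, y in zip(u, s))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

assert all(decrypt(*encrypt(b)) == b for b in (0, 1) for _ in range(50))
```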
You can see that key sizes here are getting down under 1,000 bytes. And by the way, whenever you see a line in orange or yellow, that's an algorithm that's already broken. There's encryption and there's signing that can be done with lattices, so it's a pretty flexible system.

Multivariate. Since I'm running short on time, we're not going to talk about how it works; we'll jump right to the key sizes. One of the interesting things about multivariate is that it's another family that can use Merkle trees to compress the public key. So you can see we've got fairly large public keys, on the order of hundreds of thousands of bytes, but some variants have really, really small public keys, at the cost of signatures on the order of tens of thousands of bytes. So if you have a system where you're going to generate the key and then sign, it's good to use the small-key variant, because the sum of the public key and the signature is smaller than it is for the other variants.

OK, I'm going to turn this one into a homework problem for you. Mersenne is one of the systems that is not based on the families we've covered, and it actually looks pretty secure. It's cool. What's interesting is that it's simple enough that you can go read the paper, without a whole lot of math, and understand how it's working. The other important thing about it is that the scheme it uses is related to the scheme some of the smaller lattice systems use, so you can get an idea of how that scheme works without having to dive into all the lattice details. And under the covers, it's using error correction; it can have a decryption error when the error gets too big. Same thing with lattice. So if you see error-correcting codes inside a system, it doesn't mean it's code-based; it could be a lattice-based system, or a system like this one. Error-correcting codes are becoming a new cryptographic primitive that we'll just have to deal with.

Guess Again. Guess Again is an interesting algorithm, and I want to bring it up because it was just completely off-the-wall bizarre. Basically, it uses probabilities: the idea is that you have a better-than-half probability of guessing something, but the attacker has only a one-half probability of guessing the same thing. One issue is that encrypting a single bit takes about 2,000 times the size of whatever your numbers are, and the numbers are fairly large to begin with. The other problem is that a simple script broke this algorithm completely.

So, SIKE. ECC has new life. SIKE is ECC, but instead of using point multiplication, it uses mappings to different isogenous curves. Roughly, two curves are isogenous if there's a structure-preserving mapping between them: you have the set of points on one curve, the set of points on the other curve, and a mapping between those points. So instead of points, the public keys are actually curves, and we send these curves around. SIKE is more compute-intensive than ECC and has slightly longer keys, but it's actually still pretty good; we're still under 1,000 bytes for our keys. You can see the encryption times are much higher than some of these other algorithms we're looking at, though. So ECC is back again; ECC is still part of crypto.
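Since SIKE's public keys are curves that get sent back and forth, the protocol has exactly the classic Diffie-Hellman shape, which comes up again in the questions at the end. Here's that classic exchange as a toy sketch; the parameters are deliberately tiny and insecure, and real SIKE replaces the secret exponents with secret isogeny walks and the public group elements with public curves.

```python
import secrets

# Classic Diffie-Hellman with toy parameters. SIKE follows the same
# shape: each side applies a secret transformation to a shared public
# starting point, they exchange the results, and each applies its
# secret transformation again to land on the same shared value.
p = 0xFFFFFFFFFFFFFFC5   # 2**64 - 59, a prime; far too small for real use
g = 5                    # illustrative base element

a = secrets.randbelow(p - 2) + 1   # Alice's secret (in SIKE: a secret isogeny)
b = secrets.randbelow(p - 2) + 1   # Bob's secret

A = pow(g, a, p)   # Alice's public value (in SIKE: a public curve)
B = pow(g, b, p)   # Bob's public value

shared_alice = pow(B, a, p)   # Alice applies her secret to Bob's public value
shared_bob = pow(A, b, p)     # Bob applies his secret to Alice's public value
assert shared_alice == shared_bob
```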
RSA. RSA is back again. Not really; this is mostly a joke, but it made it into the first round of submissions, and it's good to think about these kinds of things. The question is: can we make RSA big enough to make it safe from quantum computers? And the answer is yes, but it takes a one-gigabyte public key and a four-gigabyte private key. It takes three and a half days to generate a key, 20 minutes to verify a signature, and two and a half hours to make a signature.

Anyway: the quantum computers are coming, and we have alternatives. We now have a good list of possible replacements, and that number is going to get smaller in the next year. And there's where you can go find more information about everything. So, any questions? Yes.

Were there any proposals that are drop-in replacements for the systems we have now? The lattice-based proposals are fairly close to drop-in replacements in their sizes. Actually, the isogeny ECC is pretty close to drop-in too; the only issue is the performance. Its key sizes are bigger than ECC's, but smaller than RSA's. So yes, the ECC ones are very Diffie-Hellman... I'm sorry, the question was: are there any systems that are Diffie-Hellman-like? And the isogeny ECC is very, very Diffie-Hellman-like. The math and the way the protocol is structured are exactly Diffie-Hellman. So, yes?

You mentioned some systems with 32-bit keys; how can those be secure? Oh, OK, I'm sorry, I should have mentioned it: I did not label those slides. Those numbers are not bits; those numbers are bytes. So when you see 328, that's 328 bytes, which is bigger than an RSA key. You multiply the bytes by 8 to get bits, so the 32-byte entries are actually 256-bit hashes. The times are in milliseconds, so these are all still sub-second; but if you see something like 1,000, it's over a second. Any other questions? OK, thank you.