Awesome. Welcome to NorthSec Psychedelic Edition. You're not seeing it, but I don't have the Q&A screen anymore; it's all right, I have it here. So welcome to the cryptography block Q&A panel discussion. We're doing this live, so it'll take its own form. There have been a lot of questions on Slido, so I'm quite happy about that; very interesting topics to be discussed. I think the first group of questions is about drand: quite a lot of interest in the resilience of the network, the threshold, how many new node introductions would become a problem, what kind of protection mechanisms exist to avoid these kinds of takeovers by nefarious actors, maybe people thinking about what we have heard could potentially happen with Tor or other types of distributed systems. I don't know if you want to talk a bit about this, Yolan. Can you hear?

Yeah, so that's a thing I haven't really discussed in my talk, but the trust assumption and the way the randomness is generated make it so that it's enough for just a single node to generate proper randomness during the setup process for the whole thing to be properly random. That's a very strong result, because you don't need to trust any node but one, and then you're sure you get proper randomness at the end. So that's super cool. The current threshold is 12, so half of the network could go down, as I've discussed, but the thing is, as long as, among the 12 signers, you have one honest party and the signature checks out at the end, you're good to go. The main issue is as soon as you have a threshold of malicious nodes: then they can collude, they can work together and produce signatures ahead of time, they can go faster, they can break the trust assumption you have in the network. That's the main issue you have if you have a threshold number of nodes that are malicious.
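As an aside, the one-honest-party intuition can be illustrated with a toy sketch. drand itself uses a distributed key generation and threshold BLS signatures, not a plain XOR, so the code below is only a simplified illustration of the underlying principle: if contributions are combined with XOR, the output is uniformly random as long as at least one contributor chose their value uniformly and independently.

```python
import secrets

def combine(contributions):
    """XOR fixed-length contributions together. The result is uniformly
    random as long as at least ONE contribution is uniform and independent
    of the others, which mirrors the one-honest-party intuition."""
    out = bytes(len(contributions[0]))
    for c in contributions:
        out = bytes(a ^ b for a, b in zip(out, c))
    return out

# Two "dishonest" nodes pick predictable values; one honest node is random.
dishonest = [b"\x00" * 32, b"\x42" * 32]
honest = secrets.token_bytes(32)
beacon = combine(dishonest + [honest])
```

No matter what the dishonest values are, XOR-ing in one uniform value makes `beacon` uniform; the real protocol achieves the analogous guarantee at setup time via the DKG.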
So the bigger the threshold, the safer you are against collusion, but the bigger the threshold, the less safe you are from a liveness point of view, because if a few nodes go down, say AWS goes down and we have maybe six nodes on AWS, then we need to make sure the rest of the network keeps working properly.

Thanks, yeah. All right, that's a very good answer. There's of course lots of concern about resilience, but probably also about capacity: what's the speed, how many random nonces can you generate per second, per minute, per hour? What are we looking at? Is the addition of nodes going to influence that, these sorts of things? I don't know if you can cover a bit of that.

Yeah, so the current way it works is that each node contributes a partial signature every round, and the current network is generating one random 256-bit value every 30 seconds. That's really the current bandwidth. The new network, which launched on testnet, is generating one value every three seconds, so it's way faster. The lowest we tried to go is one second. You could go lower in theory, but then you start to have issues with latency: the current latency for signature aggregation is around 800 milliseconds. So if you wanted to go below one second, you would need very good connections between the nodes, and it then also becomes very difficult to consume. So yeah.

All right. Quite a few questions for you, Christian, about Claim QR and the SHC. At the end of the talk, talking about digital identity, some people are asking, I guess, because it's in the air, so to speak, right?
Especially here in Quebec, when we're talking about the efforts the government is making on digital identity: could Claim QR help, for instance, in replacing social security numbers or other elements of identity that we use on a, not daily, but quite regular basis? And we've seen issues with leaks and whatnot.

Well, I think these types of credentials can help in day-to-day life, maybe not so much for social security numbers, because those are used very infrequently, and the way they leak is not when they're presented but mostly when they're stored in a database and the database gets exposed. But it's easy to imagine scenarios for things we use already, right? The driver's license is the one most people use to get into a bar or whatever. But once you have these types of things in place, you can imagine a lot more scenarios, especially online: let's say your school could issue you an alumni credential you could use to get some rebates online or at some restaurants, or the same thing with employers for employment verification. I hate to use this word, but metaverse identity: you could give your online avatars the ability to present some actual identity attributes. And of course they wouldn't be in the form of a QR code there, but they'd just be packaged in a type of credential for which the user controls the key, and therefore the ownership, and they can decide and select how this information is disclosed and to whom. So it's user-centric in the sense of user empowerment: OK, that's what the verifier wants to see, and then I can make the decision whether I want to present that information and what's the minimal set I can present. That's why I like to use the term minimal disclosure: the minimum that's needed for a particular transaction. Sometimes that's nothing; it's anonymous.
Sometimes it's your full identity, for, you know, crossing a border. So the QR form is interesting because it bridges the gap: for these online credentials you need a key store and maybe a smartphone, but there's a wide range of scenarios before that where it's just a QR code you can literally carry on paper. So there are a few opportunities, I think, that hopefully the ecosystem can help build.

Thanks a lot. And are there any attempts to unify these various health cards that exist around the world? You mentioned that there were multiple competing standards, I guess. And when we're talking about digital identity, it might be broader than just vaccine scenarios. Are you aware of any such attempts to unify some of these standards?

Yeah, so there's no effort to unify them, the same way there's no effort to unify passports: each country does its own thing. And that's mainly the result of the whole planet trying to solve the same problem at the same time in a very rapid time frame. These identity specifications typically take years to develop, so it's a miracle that it happened so fast; I think the framework is only about a year old. And the reason it succeeded is that it was rooted in the medical world, where they had wanted something like this for a long time: for your kids' school vaccination history that you have to carry over from state to state and be able to present to schools, or for travel immunizations in the future. So they said, OK, let's use this situation to build something that's going to be lasting and reusable. That's why it was successful here. And other jurisdictions in the world had to solve the same problem for their own case. So in Europe, they decided their green certificate is not just the medical fact, it's also a decision: it could be a lab test, it could be an exemption.
And you get a green check mark out of that. So it was a different scope, a different decision. Here the approach is: we don't want to make these decisions; here are the facts, and whoever checks them makes a decision based on the current policy.

Yeah, thanks. All right, I want to go back to talking about entities and how these get set up. You had to work to set up the League of Entropy, I like this name, but there were a few questions about how it was set up as an organization and how one gets to join it. Is there any paperwork involved, any formalities? Or is it just open to anyone? I guess people want to see what it entails and whether they can help.

Yeah, so it's basically a question of how you do governance on your network, right? Currently, the League of Entropy has a notion of each member voting on new members to join. Maybe we'll come up with a better way to do the governance at some point, where we can delegate voting power or whatnot, but currently that's how it is, because, as I said, we are only 16 members, so it's still OK. To join, you just need to show interest, show that you've got the proper setup and infrastructure, and then, if you get accepted, you can onboard on the testnet. Once you've shown you were able to run a node on testnet and it worked, you can upgrade to mainnet. So that's how it works on our side.

So I guess if you've got a bunch of lava lamps, you're good to go. Yeah, that's how it works. Or even just a fish and a bubble generator in your aquarium, you know, that's also chaotic.

Awesome. So there was a mention in your talk about time-lock encryption, and I think there are a few folks here asking how this works.
I think it's very new research and might be a bit complex, but can you maybe drill down a bit on how this is thought about, whether there's going to be future work on it, and what it's going to entail?

Yeah, so time-lock encryption is something I'm super excited about, because in my opinion it's super cool to be able to encrypt something toward the future. The thing is, we're still working on the paper and the blog post, so it's going to be released in July or maybe August, hopefully, and we will also release an open-source tool that lets you use it. So no worries, it's coming. Technically it's a bit difficult to explain out of the blue without a whiteboard, but what you do is: you take the value you want to encrypt, you pick a random value, which is secret, you XOR the two together, and you get an encrypted message, which is basically a one-time pad; you've encrypted something with a one-time pad. And now that random value gets committed toward a round in the future. Using the pairings, we can decrypt, or decommit, that commitment as soon as the signature is published for that round. That might give you a bit of an intuition of how it works, I guess.

So I guess it's upcoming work. Maybe NorthSec 2023? Yeah, we hope to be done before that, I guess.

All right, awesome. So let's step back a bit and maybe go into the more theoretical, well, theoretical applied crypto. There have been some questions; I know you've got a bit of background there on quantum cryptography. What's your opinion, and Yolan, of course, you as well, being a cryptographer, what do you think? How far are we from mainstream adoption? What are the hurdles, basically, for post-quantum crypto not being adopted today? What kind of standardization do we need? In your opinion, what's the road to getting there?

Yeah, first a little terminology explanation.
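Going back to the time-lock answer above: the one-time-pad core Yolan described can be sketched in a few lines. This is only the XOR split; the actual research contribution, committing the pad toward a future drand round with pairings so that the round's published signature decommits it, is omitted here.

```python
import secrets

def otp_encrypt(message: bytes):
    """Split a message into (ciphertext, pad): pick a random pad the same
    length as the message and XOR them. In the time-lock scheme, the pad is
    what would be committed toward a future round."""
    pad = secrets.token_bytes(len(message))
    ciphertext = bytes(m ^ p for m, p in zip(message, pad))
    return ciphertext, pad

def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
    """XOR again with the pad to recover the message (once the pad is
    recoverable, i.e. once the future round's signature is out)."""
    return bytes(c ^ p for c, p in zip(ciphertext, pad))

ct, pad = otp_encrypt(b"reveal me at round 1000")
```

Without the pad, `ct` reveals nothing about the message; the whole security of the construction then rests on when the pad becomes recoverable.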
So quantum cryptography typically means quantum key distribution, which is the ability, using quantum mechanisms, to establish a shared key that you can then use with conventional encryption. I think that has very little chance of being widely deployed; it's been deployed in pilots and very specialized environments, but the classical cryptography we have today is sufficient, I think, for real-world scenarios. The other term is post-quantum cryptography. That means normal cryptography that you run on ordinary classical computers, but which we don't know how to break with a future quantum computer. A quantum computer is a machine that, if we end up building it, could break the crypto we use today. And why is that a problem now, if it's only going to be built in 20 years? Because you can record traffic today and decrypt it in 20 years, which in some cases, if you're Coca-Cola and you're trying to protect your recipe forever, might be a problem. So, as I mentioned at the beginning of the talk, NIST, which is the National Institute of Standards and Technology, is working on quantum-resistant replacement algorithms. They're now waiting, any day now, after a three-plus-year process, to see which is going to be the next round of algorithms to replace RSA and ECDSA. These are the ones that are very close to being picked, and then it's going to take a year or two to write up and standardize. So we can comfortably, you know, change the underpinnings and the machinery of the crypto pipelines before it's too late. I think that's in good shape.

Yeah, and if you look also at what's being done on the mailing lists and online, people still aren't agreeing on how we should do it. Like, should we combine current classical algorithms with the new quantum-resistant algorithms?
Should we switch over to the new algorithms entirely, even though the current ones have been battle-tested and we know they work properly, or at least we hope they do? These are questions that still need to be answered, because doing hybrid encryption, where you combine quantum-resistant algorithms with the classical algorithms, is not the same as saying, OK, we ditch the classical stuff and jump on the new cool quantum-resistant train, you know. And people are getting worried as well, because recently, I think it was in the U.S., they said all the secret and top-secret stuff needs to move to quantum-resistant tech within five years or something like that, and people are like, what's happening? They haven't even standardized the post-quantum algorithms yet, so why do that now?

Yeah, I can just add a note to that. It's a fair point that there might be this transitional period where we're going to use hybrid encryption, hybrid in the sense of combining more than one scheme, and that's something we've implemented. It's not something that's necessarily pushed by NIST, as their goal is to specify the new standard, but it's something the industry is wondering about, whether it would be a good idea. There's this project, which I talked about two NorthSecs ago, the Open Quantum Safe project; if you look that up, it's led by the University of Waterloo, and we've prototyped all the new algorithms, including in combinations, in a hybrid way, with the current ones, with RSA or ECDSA or, sorry, ECDH.
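The "hybrid" idea being debated here can be sketched very simply: derive one session key from both a classical and a post-quantum shared secret, so the result stays secure as long as at least one of the two schemes is unbroken. This is only an illustration of the concatenate-then-hash combiner; real hybrid designs (e.g. the TLS hybrid key-exchange drafts) specify exactly which additional context, such as ciphertexts and public keys, gets bound into the derivation.

```python
import hashlib

def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes,
                           context: bytes = b"hybrid-kem-demo") -> bytes:
    """Derive a single 32-byte session key from two shared secrets.
    An attacker must recover BOTH inputs to learn the output, so breaking
    only the classical scheme (or only the PQ scheme) is not enough."""
    return hashlib.sha256(context + classical_ss + pq_ss).digest()

# e.g. classical_ss from ECDH, pq_ss from a post-quantum KEM decapsulation
session_key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32)
```

The `context` label and function name are illustrative, not from any standard; the point is just that the two secrets are combined through a one-way function rather than used separately.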
So you can run that in the OpenSSL and OpenSSH forks and see how it looks. At the end of the day, for some people it's going to be their choice, their policy, weighing performance against expected security, and there's going to be a large body of people who won't care and will just follow the government standards, whichever they pick. So hopefully NIST and the academic community will have picked good replacements, and at some point that's going to be the new one.

Thanks. I think that leads well into the next question, maybe even a bit broader. We've seen the pickup of certain popular software, like WireGuard, to give an example, versus OpenVPN, very different approaches. And you've just mentioned that NIST will standardize some form of algorithm, but maybe people can use some other algorithm. In cryptography there have at times been discussions about enforcing sane defaults versus letting users choose for themselves which algorithms to use. What's your take on this? What would be the preferred approach: should we let users choose which algorithm once things get standardized, or some configuration? But then you could get into the SSL/TLS issues, basically death by a potential thousand paper cuts. So what's the theme there in the discussions around quantum-resilient cryptography?

Yeah, I'll give kind of my large-vendor answer. At Microsoft we have such a varied set of customers from all over the spectrum that it's really hard to pick for them. As a crypto developer, it's kind of harmful to give too much choice, and it's easier to give just a very minimal set of configuration so you can't shoot yourself in the foot. On the other hand, it depends, you know:
the cryptography in China is very different from that in the US, and we provide software all around the globe, so the platform itself needs to be adaptable to that. But I think it's recognized, right, and I won't call them mistakes, because everyone was doing the best they could, that the crypto community learned a lot of lessons in the last ten years, with all the TLS attacks like Heartbleed and all these things year after year. And TLS 1.3 is, I think, a good example of a standard that had good analysis and review by the academic community, a lot of the research was applied there, and there are way fewer choices. So when it came out of the box, it was: OK, great, good configuration. And then the industry, or some part of it, fought back, trying to put back some features like RSA encryption, because they need it in middleboxes for traffic interception, for some sometimes valid reasons, so that, you know, kids don't go and browse the nasty parts of the internet when they're at school, right? So there's going to be an equilibrium, but I tend to like the direction practical security is going. It's a complicated answer and situation, for sure.

All right, Yolan, do you want to chime in? We discussed this topic a bit earlier on.

Yeah, well, I would say crypto agility is a good thing for protocol designers and for software developers, but it's a terrible thing for users, in my opinion, because it lets non-crypto developers shoot themselves in the foot way too easily. Like, if you look at the JOSE spec, you could choose the "none" algorithm, which wouldn't sign or encrypt anything, and that kind of issue. I mean, sure, they wanted to do good and provide a lot of options, but in the end it was too many. So I also like the way TLS 1.3 reduced the number of options to just a safe set, and how they did it, and I think it's super important for us as
protocol designers to be able to quickly switch to new algorithms if we see something got broken and isn't working anymore. In my opinion, that's also a good reason to go the hybrid way rather than jump directly to the quantum-resistant algorithms. But yeah, we'll see.

All right, so I guess the debate lives on, but there's a tendency toward simplifying things and leaving fewer options to shoot yourself in the foot. All right, this was a great session, lots of good questions, and I went through most of them. Thanks a lot for all the help in answering these questions, Christian and Yolan. If you have any more questions, you can probably find both gentlemen grabbing a beer at the bar, or later in the chill room. Thank you all for submitting all these great questions, and I guess see you at the next block; that will be the red team block, in almost 30 minutes now. Have a good one, everyone.