It is high time that we develop the next generation of cryptographic protocols and organize the transition in our security infrastructures. Japan is facing a cybersecurity talent shortage, so what can OIST do to help? OIST is small and a newcomer in the field of quantum science and technology in Japan, but it can help Japan become a frontrunner in quantum security, research and technology. We will focus on our strength, which is fundamental research at OIST, especially in quantum and optical physics, and integrate mathematics, computer science and quantum hardware to create a new seed of quantum cryptography. The OIST advantage is the ability to attract a wide range of globally recognized researchers. This is based on our strong reputation as a world-leading interdisciplinary scientific research institution and a graduate school with excellent and internationally diverse faculty and students. Today we are very pleased to be joined by several experts in the field of quantum computing and cybersecurity, and I welcome all of you to this exciting webinar.

Thank you, Peter, for this overview. The key really is that OIST, the Okinawa Institute of Science and Technology, wants to become a focal point for the debate. And not just the debate amongst the scientific community, but also the linkages and interaction with the engineering world and with the business world. And we sincerely hope that today's forum serves as a kick-off where we build a little bit of a community to have a free exchange on the challenges and the real-world applications of quantum going forward.

So we can see the shrinking computer. Every year or two, the typical feature size in our computing devices pretty much shrinks by half, so we are marching towards the quantum world. And then obviously the question you may ask at this point: once we are at the level of molecules and atoms, what should we expect? What is it, really, that makes a qualitative difference?
Okay, we'll be able to compute faster, but is that the only thing we achieve? The answer, and it's actually quite fascinating, is that it's much more than that. It's not only that we'll compute faster and faster; we can compute differently. We can use inherently different physics to achieve things that are simply impossible classically, because certain quantum phenomena have no classical counterpart. So in the next two slides I will try to explain one concept, and if there's one thing I would like you to take away from this little presentation, it is this concept of quantum superposition.

Imagine a very simple experiment where you have something that physicists call a beam splitter, or half-silvered mirror. That is a piece of glass such that if you send a single photon onto this semi-transparent mirror, put two photodetectors on the two output ports, and run this experiment sending a single photon over and over again, then in a perfect world, assuming perfect detector efficiency and so on, you will notice that only one detector will click each time. The photon never splits. So you look at this experiment and your first reaction is: well, it is kind of like a random switch, a quantum random switch. You send a photon into a port that indicates zero, and it will end up with probability 50% in an output port which I call zero, or in another one which I call one. So then you say, okay, I understand, this is a quantum random switch, that's fine; there's nothing really mysterious about it.

But then look at the second experiment, which I will now challenge you to explain. If you look at the bottom left corner of this diagram, you see our previous experiment. But now I'm just going to collect the output ports; I'm not going to put photodetectors there anymore.
Instead I'm going to put proper mirrors and refocus the two paths onto a second beam splitter, so I have two consecutive beam splitters. The second experiment is identical to the first one, except that I simply collect the output from the first and channel the photon to the second beam splitter. And then I challenge you to explain what's going to happen in this experiment. You will say, and this is a perfectly valid line of logic: okay, I understood the first experiment, I saw a random switch, so the photon will emerge randomly at either output zero or output one. When it reaches the second beam splitter, the same thing will happen. So you expect that again you will see the photon with probability 50% at either output zero or output one of the second beam splitter; each photodetector should fire with probability 50%.

Well, that's valid, solid classical logic, but completely wrong. That's not what happens. What happens in this experiment depends on how you adjust the optical paths, the distances between all those mirrors, but you can, for example, adjust it in such a way that only one detector ever fires at the end. Only one detector registers the photon, as you send one photon after another. You would expect that two random switches in a row make another random switch, but now, somehow mysteriously, the randomness cancels, and the second beam splitter generates only one particular output. If you find this really bizarre, you are not the only one; if there's a mystery to quantum physics, it is somehow nicely encapsulated in this experiment. The question is: what the hell is going on in between those two beam splitters?
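The amplitude arithmetic behind both experiments can be sketched in a few lines. This is a minimal sketch, not anything shown in the webinar; the beam splitter matrix below uses one common symmetric convention, and other phase conventions exist, but the probabilities come out the same.

```python
import numpy as np

# One 50/50 beam splitter, in a common symmetric convention.
# The state is (amplitude in path 0, amplitude in path 1).
B = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)

photon_in = np.array([1, 0])  # a single photon entering through port 0

# Experiment 1: one beam splitter, detectors on both outputs.
probs_1 = np.abs(B @ photon_in) ** 2
print(probs_1)  # [0.5 0.5] -- a fair quantum random switch

# Experiment 2: two beam splitters in a row (equal path lengths),
# detectors only after the second one.
probs_2 = np.abs(B @ B @ photon_in) ** 2
print(probs_2)  # [0. 1.] -- only one detector ever fires

# Classical "two random switches" logic predicts [0.5 0.5] again;
# instead, the two amplitudes for reaching detector 0 cancel exactly.
```

The key point is that between the beam splitters the photon is described by two amplitudes, not two probabilities, and amplitudes can cancel where probabilities cannot.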
How come the photon, which emerges randomly from the first beam splitter when we measure it there, mysteriously stops behaving in a random way at the second one? And this is for real: you can set up the experiment and really demonstrate the whole thing. The device I showed you is known as a quantum interferometer, and it is quite a striking indication that the quantum world is indeed different, that it behaves in a different way.

Now this has impact, a big impact, in particular on the whole field of cybersecurity, simply because many methods of encryption are based on the fact that certain computational problems are difficult, and therefore difficult to crack. If quantum computers can solve certain problems that underpin our cryptosystems, then obviously there is reason to be a little bit concerned, because it means that once we have a quantum computer, those ciphers will not be secure anymore.

When people realized that quantum superposition is such a powerful tool, usable in all kinds of ways, it was heralded as the second quantum revolution. The first quantum revolution brought us semiconductors: iPhones and all kinds of digital and electronic devices that you see around you. Those rely on quantum phenomena such as quantum tunneling, but they do not use quantum superposition to our advantage. These coherent quantum phenomena are completely different; they open a completely different set of possibilities. And we now know they have enormous impact on privacy, on secure communication, and on generating and certifying randomness.
They also have amazing impact on ultra-precise frequency standards and sensors: you can use quantum superpositions to build devices that measure magnetic fields and gravitational fields with amazing precision. And last but not least, they offer amazing power to computing devices. I stress that I am focusing on concepts here, because my colleagues will tell you how those concepts can be applied. But let me just say that there is amazing progress in quantum technology. Quantum communication, for example, is essentially a reality, almost a commercial proposition, with amazing achievements both in building quantum systems and in communicating in a secure, quantum way from space.

I now want to turn to what it takes to build a quantum computer and why they're so different. Here I'm going to quote my colleague Bill Phillips, Nobel laureate at Maryland. He's fond of saying that a quantum computer differs more from a laptop computer than the laptop differs from an abacus. The point is that the laptop and the abacus are actually the same type of computer: they're both classical Turing machines. One's a lot faster than the other, of course, and much more useful, but the quantum computer is completely different. The laws of computing at the foundational level are revolutionarily different. And to me the corollary to this quotation is: why should we expect a quantum computer to look anything like a classical computer? So I've adapted a slide that was published in Science magazine about five years ago, where the editors reviewed different technologies, and they all have a certain exotic character to them. They're either cold or they're in a vacuum chamber: things that you wouldn't see in a normal computer.
I highlighted ions and superconductors; these are really the only two systems right now that are being built into full systems and that have lots of industrial support behind them. I think the photonic platform is coming on strong, as are neutral atoms, but they're not there yet. So I want to talk about these two technologies briefly.

First, atomic ions, and I'll talk a little more about this in a few minutes: we store individual qubits in individual atoms, one at a time. In this schematic, the gold is a bunch of electrodes patterned on a silicon chip and coated with gold. There's nothing quantum about this chip; the atoms are the quantum part, and they're levitated above the surface. The atoms are actually cold, but they're laser-cooled: you don't need a refrigerator for laser cooling, you just need a laser beam, and that's actually the simplest part of dealing with atomic qubits. The atoms are very well isolated in this vacuum chamber, and because they're charged atoms, ions, they have a very strong Coulomb interaction, and we can modulate that Coulomb interaction with laser beams. The laser beams are like the wires: they allow us to entangle, to do gates. The atoms, when poked at with lasers, are sort of like our transistors. We can turn the lasers off, and the atoms are idle; and when the atoms are idle, they're perfect. Being idle is not very useful, though, so when we do poke at them with laser beams, there are effective errors from the lasers that look like decoherence. We don't have intrinsic decoherence in trapped ions, and we don't have to worry about the manufacturability of qubits: they're all identical, with no noise of their own. It's all about optimizing the controller around the qubits.

Over the last 20 or 25 years, there has been steady progress in the error rates of two-qubit gates, the analogue of the NAND gate for quantum computers, on these platforms.
The errors are getting better every year: 10%, 1%, 0.1%. The fidelity, naively, is just one minus the error, so 99% fidelity means a 1% error. You can see superconductors coming down, neutral atoms coming down; ions were kind of the first on the scene, about 25 years ago. I remember when I met Arthur around 1995 and he started teaching us, even before error correction, what that meant. And with Dave Wineland in Boulder, Colorado, we were building better atomic clocks by entangling the atoms, and in doing so we were actually building gates. These gates have gotten better; the community has now gotten down to about three nines. We know the fundamental limit for these laser-based gates is maybe five nines.

This is an arguable summary from the Boston Consulting Group, which did a big study on quantum computing and wanted to see when things would come along. You can certainly dispute these potential market sizes. The interesting thing is that they split it up into three phases, and I think that's kind of reasonable. The first phase is the one we're in right now: the so-called noisy intermediate-scale quantum computers, NISQ, a term John Preskill coined. Here we have systems that are not error corrected. Remember when I said we have three nines: that means we can do about a thousand operations. So hopefully there's a good application with just a thousand gates, because if we need to do more, we're going to run into noise. The technical barrier in phase one, over the next several years, is error reduction: get your native qubit and your native gate really good.
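To make the three-nines arithmetic concrete, here is a rough sketch. It assumes, naively, that gate errors are independent and simply multiply, which is the back-of-the-envelope reasoning behind the "about a thousand operations" figure above.

```python
# Naive estimate: if each gate succeeds with fidelity f, a circuit of
# n gates succeeds with probability roughly f ** n (independent errors).
def circuit_success(gate_fidelity: float, n_gates: int) -> float:
    return gate_fidelity ** n_gates

three_nines = 0.999  # ~99.9% two-qubit gate fidelity

# About a thousand gates is roughly the break-even point:
print(round(circuit_success(three_nines, 1_000), 3))   # ~0.368
# An order of magnitude more gates and the output is essentially noise:
print(circuit_success(three_nines, 10_000))            # ~4.5e-05
```

The same arithmetic shows why five-nines gates (0.99999) would push the useful circuit depth toward a hundred thousand operations before error correction is needed.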
That might be obvious; that's the first phase of, I guess, any technology. Phase two, in 10-plus years, will involve error correction, where we use theoretical encodings to create logical qubits that can survive errors. I'll talk about error correction in a minute, but the idea is that it allows you to do much longer computations at the expense of needing more qubits. If qubits are cheap, and I think solid-state qubits can be cheap, you can print millions of them; if we can just do the error correction, that's really the hope in those platforms. Phase three is the scaling phase, where, as I hinted earlier, we need some modular architecture. I would argue that anything complex in nature is modular. Even classical CPUs are now modular; they're multi-core architectures and so forth. So we need a modular architecture in the long haul.

There are two big companies building superconducting quantum computers. You see here the designs of their dilution refrigerators for their next-generation systems. In this Google rendition, they're planning for one million physical qubits. Why one million? Because they're going to need so much overhead for error correction that a million physical qubits might give them only several hundred good logical qubits. And you can see the dilution refrigerator itself is the size of a small building and costs many hundreds of millions of dollars.

The nice thing about atomic qubits is that they can be held at room temperature. This is the entire vacuum chamber, strapped on top of a chip. In the future this will hold several hundred very high-quality qubits; today's versions hold 10 to 100. If you had looked at one of my laboratories even five or six years ago, you would have seen a mess: a single ion trap with lots of lasers and lots of wires. That's an experiment; we keep flexibility on every knob.
And of course as we shrink things down, once we know what we want, we can freeze the design. Recently we've put pretty much all of that into this box, about a cubic meter in scale. My colleague Jungsang Kim at Duke, also co-founder of IonQ, has miniaturized it even further, down to about a cubic foot, in his academic laboratory. And I showed this picture before; this is Jungsang Kim's group again, and this miniature mount is a few centimeters across. We're not there yet, but some groups at Sandia National Labs and MIT Lincoln Laboratory have pretty much integrated the optics on the chip itself.

The point to take home is that if you have many qubits but not many operations, it's just not interesting. This gray area represents what we can simulate classically. If you can only do three or four gate operations, it doesn't matter how many qubits you have; and no matter how many gate operations you can do, if you have fewer than about 100 qubits, it's still going to be classically simulable. (Sorry, the scale on this plot didn't come out right; this asymptote should be at about 100 qubits, maybe even less.) The point I want to make is that all of our current technologies sit, unfortunately, in this gray area. We're starting to see hints of crossing over into the orange here, but you can see, up in the far right, that factoring a number interesting enough to break codes is a long way away, and it's going to involve error correction. You can see how many orders of magnitude away we are. I hope Arthur's right and we have experimentalists who are really clever and can figure this out.

So, with any new disruptive technology, it's important to get ready.
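As an aside on that gray "classically simulable" region: one rough way to see where brute-force classical simulation gives out is memory, since a general n-qubit state needs 2^n complex amplitudes. This is a back-of-the-envelope sketch, not from the talk, and clever simulators can do somewhat better than brute force.

```python
# Memory needed to store a full n-qubit state vector:
# 2**n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

print(statevector_bytes(30) // 2**30)  # 16 -- GiB: fine on a workstation
print(statevector_bytes(40) // 2**40)  # 16 -- TiB: a large cluster
print(statevector_bytes(50) // 2**50)  # 16 -- PiB: beyond any single machine
```

Every added qubit doubles the memory, which is why the classical ceiling sits somewhere in the tens-of-qubits range regardless of how fast computers get.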
The stages of quantum readiness are similar to those for other disruptive technologies. You have to understand what the technology is, what it can do, and how ready it is, from the technology perspective. Then understand what that means to you, to your company, to your country. And then you come up with a plan. Chris touched upon many ways to potentially benefit from the disruptive capabilities of quantum computing, and we also have to plan to mitigate any threats this new technology brings.

This is taking off all around the world. There are major global players developing the hardware platforms in North America, Europe, and China. And we're not just building the devices; there are entire software stacks, and commercial entities delivering the software tools, to do focused work on specific potential application areas. And there are users, in the aerospace sector, the financial sector and others, trying to figure out: how might this disrupt us? This is what I call an epsilon chance, but of exponential impact.

The prime example we know of is cryptography: RSA, Diffie-Hellman. We thought they were fine, but they weren't; they're completely broken by this hypothetical quantum computer. These public key algorithms: most people don't care, they've never heard of these acronyms, but they're the basis for secure web browsing, automatic updates, all these secure protocols built on strong cryptography, and these will be broken. And all of our digital platforms around the world directly or indirectly use these protocols.

Do I need to worry about it today, or can I wait till I retire? This is the question I've been asked since the 1990s, and the answer has always been: well, that depends. What is the security shelf life of the information you're protecting? Is it a trade secret? If it gets out, will it mean the end of your company?
That might have a long shelf life. Obviously national security is another important area, and private human information or DNA information, and so on: long shelf life. But much of our information has a very short, ephemeral life. Call that shelf life x years.

The next parameter is the migration time. Most users of systems don't really understand the intricacies of migrating a system from one cryptographic algorithm to another, but in any real-world deployment, if you're lucky it's five years; that's pretty rare, though, and it'll typically take five to 20 years. You can go faster, you can go more assertively, but there's a fundamental limit, and if you push past it you're going to compromise the implementation. Call that y years.

Lastly, what is the collapse time: when will quantum computers break these cryptosystems and read all this data? So for the next y years, we're stuck with quantum-vulnerable systems. And if x plus y is bigger than z, you're already too late.

There are three inflection points coming in this transition. The first is on the solution side. One of the questions was: are there new algorithms that resist quantum attacks? There are indeed algorithms that resist the known quantum attacks, and NIST in the United States is standardizing them, a first suite of algorithms to be done by around 2024. For many people, the final stage of deployment requires a standardized algorithm: they're doing their preparation, but they have to wait for that standard algorithm to be ready. When it's ready, they go to the next phase.

Now, when that fault-tolerant, logical qubit that Chris and Arthur talked about is demonstrated, I think you need to be 90% ready. We don't know exactly how many years it will take to go from 10 logical qubits to enough to break RSA, but once we're counting them, that's really the home stretch. The protectors of our digital platforms, and the users of our digital platforms, had better be able to tell everyone: don't worry. The day we have a fault-tolerant quantum computer should be a great day, because we're totally ready on the cybersecurity side. And there'll be another inflection point shortly after that, a few years later, which is commercial long-distance quantum cryptography and quantum communication. It will be important to be ready to benefit from that as well when it comes.

And what do I mean by 75% ready, 90% ready? Well, look at the four stages of readiness: understanding, doing an assessment, doing your planning and testing as phase three, and then the last 10% is implementing the plan and validating it. So I think when fault-tolerant qubits are available, you'd better be starting the deployment, that last 10%. You need the NIST standards to finish your preparations and migrations, but before they're out you'd better be well into planning and testing; your plans had better have already been started. When the NIST standards come, they will enable you to finish the preparations. That's where I think we need to be at these inflection points.

As for solutions, we kind of know what to do in theory. The first line of defense is to deploy new algorithms that we don't know how to break with a quantum computer. That's the best first line of defense, because you don't need quantum technology: you can deploy it on a smartphone, on PCs, on any classical platform. But we can do better: we can add quantum cryptography, which doesn't carry that risk of cryptanalysis.
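The x, y, z rule of thumb above is simple enough to encode directly. This is just a sketch of the inequality; the year figures in the example are hypothetical placeholders, not predictions.

```python
# The rule of thumb: if security shelf life (x) plus migration time (y)
# exceeds the time until quantum computers break the crypto (z),
# some protected data will still need protection after the collapse.
def too_late(shelf_life_x: float, migration_y: float, collapse_z: float) -> bool:
    return shelf_life_x + migration_y > collapse_z

# Hypothetical numbers, purely for illustration:
print(too_late(shelf_life_x=2, migration_y=5, collapse_z=15))    # False: in time
print(too_late(shelf_life_x=25, migration_y=10, collapse_z=15))  # True: already too late
```

Note that the inequality can fail even for short-lived data if the migration time y alone exceeds z, which is why the migration has to start well before any collapse date is certain.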
And these are not in competition with each other. Some people frame it as a competition; it's not. They are more tools in the tool chest. And you don't need to wait. A lot of people say, wait till the NIST standards are out, then do something. Well, no: you can do 80 to 90% of the work without the standards. There are great open-source platforms to experiment with; this is one we started, and I think it's probably the most widely used. I also want to emphasize, with regard to quantum cryptography, that we should go where the puck is going, as the Canadian Wayne Gretzky saying goes. I have no doubt that brilliant experimentalists and physicists like Chris Monroe and others will build a quantum internet. Our job as security specialists is to figure out how to leverage it to make our digital platforms more secure, to make our citizens more secure, our economies more secure and resilient. That's what we need to do.

The action items for business, for policymakers, for government don't require the Schrödinger equation; there are no h-bars here. It's very simple. A company needs to put somebody in charge of producing a quantum readiness plan, if they haven't already. It can't be a hobby they do in their spare time when they're bored; it has to be their main job, and their key performance indicators need to be based on the quality of that quantum readiness plan. No quantum physics is required; even the person implementing the plan doesn't need quantum physics. They also need support: you can't put someone in charge of a task they will fail at. This is a very cross-cutting issue. It's not just an IT problem, it's not just a risk problem; it cuts across the business lines as well. So you need to give this person, and their team if they have one, top-level support from your business executives. And then they're going to have to do more than just study it internally.
After you do the initial awareness and assessment internally, you can start reaching out to external stakeholders, like the relevant standards bodies. People often say, I should do something, but there's no standard, as if they're helpless without standards. You need to start talking to the standards bodies and others in your ecosystem to make sure the standards you need are available when you need them. For this migration, there's a lot you can do in-house, inside your own company, but ultimately you do need to talk with others in your sector, for the standards, for example, and for regulations. And there are cross-sector dependencies as well, so you're going to have to go to these cross-sector forums and start talking with the other important stakeholders in the supply chains. Another very low-hanging fruit, and again, this is all stuff you can do in the next six to 12 months: start updating your procurement policies and your requests for proposals, and start engaging with your vendors, because you will rely on them. And if you're a vendor, your customers will rely on you for this migration to happen in a timely fashion.