I know some of you weren't born in 1982, but I started young. All right, so as you can see, they, for some weird reason, left out the very important term, quantum computing, from the title of this conference, so I just got rid of all the rest of it anyway. Quantum computing is one of these really, really interesting things lately, and about three quarters of what you're going to read about it in the media is really wrong. So as we go along here, I'm going to try to tell you about the state of the state, at least from our perspective, but also tell you some things never to say out loud, because ultimately they will come back and embarrass you once you actually learn more. Think of that part as a public service. Now, even though this is a FinTech conference, when you're talking about the motivation for quantum computing, you really can't get away from chemistry. This goes back to a conference in 1981, where the physicist Richard Feynman pointed out that if you're going to model nature — that is, exactly represent natural processes in a computer — then a classical computer has gaps in what it can actually compute. And a classical computer is really anything built with the architecture that started in the mid-1940s, from the ENIAC all the way up to your smartphone. There's no a priori reason to say that just because we have developed this particular type of computer, it's going to solve every single problem that comes along. Now, people have, I think, gotten the wrong impression through the years and through the decades, because computers just kept getting faster. We had Moore's Law: roughly every two years, they became twice as fast, twice as small, and used half as much energy. We became junkies for getting things faster, and we thought this would always happen. Now, depending on to whom you speak, Moore's Law died 10 years ago, or it's napping, or whatever it's doing.
But that's almost missing the point, in that these types of computers will not do for every type of problem, and there are some mathematical arguments for this. So the work really began around 1981, when Feynman pointed out: since nature — and here we mean basically atoms, molecules, and the particles below that — is based on quantum mechanics, something which goes back to the early 1900s, the 1920s really, Schrödinger, Heisenberg, and so forth, you should build a computer with a fundamental gate set that responds the way quantum mechanics responds. So here we are not talking about bits, and ands, and ors, and nots, which we build up into adders and multipliers and subtractors until eventually you get to Python. We're not talking about that. At the lowest possible level, it's different. The best classical software engineer in the world is not the best quantum programmer in the world. It's a completely different thing. It's as if aliens came down and said: hey, we developed this completely different type of computer based on absolutely different principles, here it is, good luck, figure it out. And that's where we are — we're figuring out what quantum computing will be good for, and we're very early. So as just a very quick example, this is the caffeine molecule. Think of it from a quantum chemistry perspective: the ground state energies. In atoms you have neutrons, you have protons, but more importantly, you have electrons, and these exist in a probabilistic cloud within the molecule. If you were to write down how much information is expressed in a single molecule at a single instant with respect to this energy configuration, it turns out that the number of bits — zeros and ones — you'd need to represent that is 10 to the 48th. Yeah. How does nature do that? We don't know. People have started religions over trying to figure out how quantum mechanics works. It's weird.
The number of atoms in the Earth is estimated to be between 10 to the 49th and 10 to the 50th. So worst case, the number of bits you need is comparable to 10% of the number of atoms in the Earth. Well, that's different. Now, if we think about qubits — quantum bits, and I'll talk a little more about them shortly — when a quantum computer with 160 of these qubits is running, it could store that amount of information. I'm completely skipping over how you get all that information in there, and I'm completely skipping over what you then want to do with it. But just to give you an idea: classically, it is completely impossible to represent exactly. At IBM, our largest quantum computer is 50 qubits, and it's reasonable to think progress in the next few years will get us to 160 and beyond. So there's hope here. So, first thing — as I said before, things not to say. Never say "quantum computing will," because we don't know. There is real science — theoretical physics, theoretical mathematics — that has to be developed, and you can't just toss it off and say, oh, well, smart people will figure it out, it'll really only be four years. No, right? There are going to be a lot of extremely important scientific milestones between here and wherever you want to be where, frankly, we don't know how to do it. We have ideas about getting from here to there, let's say, but we know that there have to be major breakthroughs. So when people say something will definitely happen in five years, do not believe them, because they're probably not actually building quantum computers and they don't know the actual issues. A lot of times people will say — especially about crypto; the number of wrong things people say about quantum and crypto — quantum will break cryptography next Tuesday at 3 o'clock, a quantum apocalypse is coming. The largest computer we have right now, as I said, is 50 qubits. Maybe others have slightly more, but whatever.
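The caffeine numbers above check out with a little integer arithmetic. This is a minimal sketch assuming, as described in the talk, that n running qubits give access to 2**n pieces of information:

```python
# Back-of-the-envelope check of the caffeine example from the talk.
# Assumption: "pieces of information" means the 2**n amplitudes an
# n-qubit register has while it is running.

bits_for_caffeine = 10**48   # bits quoted for one molecule's energy configuration

amplitudes_160 = 2**160      # information capacity of 160 running qubits

# 2**160 is about 1.46e48, so 160 qubits is just enough; 159 is not.
print(amplitudes_160 >= bits_for_caffeine)   # True
print(2**159 >= bits_for_caffeine)           # False
```

That is why the talk lands on 160 qubits specifically: it is the smallest register whose 2**n capacity crosses the 10**48 threshold.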
The number of qubits we're going to need to break the types of encryption such as RSA is on the order of 100 million. We have 50. So sleep well this weekend, if you're feeling insecure. Now, breakthroughs may happen, but it's not Tuesday. So, motivated by that example, chemistry is an area where there are likely to be some early breakthroughs, and we've seen some good algorithmic matchups to certain problems — we had an article a year ago September on the cover of Nature. Material design is another area, particular types of materials; oil and gas — midterm. We usually say near term is up until the end of the 2020s, and if we're really optimistic, we'll say three to five years, just to give you an idea. So: chemistry, mid-2020s; drug discovery, much later. Much later — you need very large quantum computers, if we can build them. If. We might, possibly. These are all important words here. Artificial intelligence — two possible areas. One, we use quantum computing to speed up the low-level math we do, particularly linear algebra, somehow, some way. Except quantum computers are not big data machines: you cannot push a lot of information into a quantum computer. Therefore you have to reduce the size of the problem, something which you do classically, and that raises questions — what does sketching mean for input to a quantum calculation? The other area, which to me is frankly a little more interesting, is to ask: are there completely new algorithms that take advantage of quantum features such as entanglement — fundamental principles of the programming model — that somehow map to something about the data and allow you to do, let's say, classification much more efficiently? And then financial services. People are figuring this out; it's happening a little slowly. The big houses, the hedge funds, have people who are working on this — traditionally physicists who can code.
They know a lot of math, and they're trying to understand how these certain types of algorithms will match up. We think the first areas of interest will be replacements for Monte Carlo sorts of simulations — that's for risk analysis. With quantum computers — and again, we've only done very small problems, but there's a nice paper — you can essentially do your simulation working with much smaller sample sets, on the order of the square root of what you'd need classically: a quadratic speedup. And, oh, by the way, the samples you're taking are higher quality. That's the name of the game there, so I encourage you to look at things like this. Quantum advantage is a period — this is what I talked about before, the 2020s — where we can show that quantum computing is significantly faster than anything we can do using classical methods: classical algorithms on a classical computer. Anything we know how to do, which isn't to say somebody can't be really smart and come up with faster classical methods. That's what algorithm people do, so there's a race here. But in any case, the point is that the future is a hybrid. We are not going to be replacing the current classical computers with quantum ones, except maybe in like 200 years. They're glacially slow for traditional computing tasks and ridiculously fast for a special class of problems. So the question is: how do you combine those two things? How do you maybe rethink your whole attack on a certain problem? And when I say significant advantage, I mean like 1,000 times better, not twice as good. You want something twice as fast? Run it twice as long — big deal; everybody has hardware, clouds, whatever you've got. So that's what we're looking for, this quantum advantage, and we're looking for it in all those fields I mentioned and several others as well.
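To make that quadratic-speedup claim concrete, here is a hedged sketch of the sample-count arithmetic. The standard framing (which I'm assuming here, not something spelled out in the talk) is that classical Monte Carlo error shrinks like 1/sqrt(N), so hitting a target error eps costs about 1/eps**2 samples, while quantum amplitude estimation costs about 1/eps queries:

```python
# Illustrative cost comparison for Monte Carlo-style risk analysis.
# Assumption: classical error ~ 1/sqrt(N)  =>  N ~ 1/eps**2 samples;
# quantum amplitude estimation error ~ 1/N =>  N ~ 1/eps queries.
# Constant factors and error-correction overheads are ignored.

def classical_samples(eps):
    """Approximate classical Monte Carlo samples for target error eps."""
    return round(1 / eps**2)

def quantum_queries(eps):
    """Approximate amplitude-estimation queries for the same target."""
    return round(1 / eps)

eps = 1e-4  # hypothetical target precision on a risk estimate
print(classical_samples(eps))  # 100000000
print(quantum_queries(eps))    # 10000
```

The square-root relationship is exactly why the talk says "much smaller sample sets, on the order of the square root of what you'd need classically."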
And also, if you get down into some of the algorithms for the areas I mentioned, those things can be used elsewhere as well — that was just a subset of industries. I think especially some of the things that will be used early on for hedge funds, portfolio analysis, and so forth are very broadly applicable in many, many different places. OK. If you're not going to let me use any math, then neither I nor anyone else is going to give you a good explanation of what a qubit is. There's the spinning-donut family of explanations — literally a donut — the spinning coin, all sorts of things like this. Here's something else not to say: never say a qubit is 0 and 1 at the same time. People love this expression, and for those of you who know some math, it's because these people have never heard of a linear combination. It's like saying "go three blocks north and four blocks east" and being so excited that you're north and east at the same time. All we're basically saying is that instead of this kind of zero-dimensional situation where you have two points, 0 and 1, the computational space for a given qubit is in fact two-dimensional. This is called a Bloch sphere, B-L-O-C-H: those two dimensions are wrapped around the surface of a sphere, and we arbitrarily label the north pole 0 and the south pole 1. They're just labels; they're not numbers. When a qubit is operating, the value it takes on can be anywhere on the surface of that sphere, but at the end of the calculation, when we poke it and measure it, it must collapse to 0 or 1. Where does it go? Think of its position on the sphere — particularly how high or low it sits — as giving the probability of collapsing to 0 or to 1. See, quantum computing, because it's based on quantum mechanics, is probabilistic. So these numbers here, in this simple case, when you take their absolute values and square them, actually represent probabilities.
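The "absolute value squared" rule above takes one line of code to demonstrate. A minimal sketch, using made-up amplitudes for illustration: a qubit state a|0> + b|1> with complex a and b, where |a|**2 and |b|**2 are the measurement probabilities and must sum to 1:

```python
# A single qubit as a linear combination a|0> + b|1>, with complex a, b.
# The Born rule: measurement probabilities are |a|**2 and |b|**2.
import cmath
import math

a = 1 / math.sqrt(2)                            # amplitude on |0>
b = cmath.exp(1j * math.pi / 4) / math.sqrt(2)  # amplitude on |1>, with a phase

p0 = abs(a) ** 2
p1 = abs(b) ** 2
print(round(p0, 10), round(p1, 10))  # 0.5 0.5 — the complex phase leaves probabilities unchanged
```

Note that b carries a complex phase yet the probabilities are unaffected; phases only matter once amplitudes interfere with each other, which is where the "complex numbers change everything" remark in the next part comes in.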
So here's what you've got to know when you really start getting into this, if you want to completely ignore the physics: about five minutes' worth of probability, a couple of hours' worth of complex numbers, and then probably a little more linear algebra. That's the basis for the mathematical treatment of quantum computing from there on. So if we start here, we can say a qubit is a linear combination of these two things, arbitrarily labeled 0 and 1. And what's even cooler is that a and b are complex numbers — not integers, not real numbers, complex numbers. That affects the mathematics tremendously and is basically why all of this works. Now, when we add one more qubit, we double the power. Compare a traditional computer — this is a POWER9 core. Under Moore's Law, if you want a computer twice as fast, twice as powerful, you double the number of transistors, roughly speaking. Here, you just add one more qubit. It is not always trivial, though, to add one more qubit. This is a 16-qubit device. You program it via microwave pulses, and if I'm sending a pulse down to program this qubit, is the ringing of that pulse actually affecting nearby qubits? There are all sorts of problems with how close together they are, the intensity of the pulse, the size of the pulse, and things like this. So quantum computing is in many ways even more complicated than ordinary semiconductors, where you already get noise and crosstalk, because these qubits really are dying to entangle themselves with each other in ways beyond what classical computers do. So, exponential growth — at this conference I don't have to tell you what that means. I will tell you, though, that Americans in general think "exponential" just means really, really fast, and that's the sum total. One qubit: two basis states, with coefficients a and b. Two qubits: four basis states.
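The doubling-per-qubit claim is the tensor product mentioned a moment later, and it can be sketched in a few lines of plain Python (no quantum library needed, so treat this as an illustration rather than real tooling):

```python
# Why one more qubit doubles the computational space: the joint state of
# n qubits is the Kronecker (tensor) product of the single-qubit states,
# so the amplitude vector doubles in length with each added qubit.

def kron(u, v):
    """Kronecker product of two amplitude vectors (plain Python lists)."""
    return [x * y for x in u for y in v]

plus = [2 ** -0.5, 2 ** -0.5]  # one qubit in an equal superposition

state = [1.0]                  # empty register: a single amplitude
for n in range(1, 4):
    state = kron(state, plus)
    print(n, len(state))       # 1 2 / 2 4 / 3 8 — that is, 2**n amplitudes
```

Keep going to n = 160 and the list would need about 1.46e48 entries, which is exactly why no classical machine can hold the full state and why the caffeine example needed 160 qubits.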
Think of that as four pieces of information — the coefficients, which are complex numbers. So with two qubits I can hold four pieces of information, and because it's exponential, with three qubits I hold eight. And now we're getting interesting; the growth is certainly not linear. So with n qubits, we have 2 to the n coefficients — 2 to the n pieces of information. And that goes back to caffeine, and why with 160 qubits we can represent that information. That's how it works. For the mathematicians: it's a tensor product. All right — the largest computer we have is 50 qubits, as I mentioned, so 2 to the 50th is how many pieces of information it could store while it is running. That's very important: while running, not at rest. That's the computational space it can work with. So it's problems that tend to explode exponentially inside a classical computation that this may be suitable for — not problems that have a million pieces of data, but problems you can represent compactly that blow up on you once you start computing. That's where you look to see if quantum computing might be useful to you. If we get to 275 qubits, the amount of information is larger than the number of atoms in the observable universe. And we hope that in 10 years' time this will be considered a small computer. How does it work? We don't know. OK, that's your weekend reading: the philosophy of quantum mechanics. It gets strange, but it's also fascinating for so many different reasons. These are the two principles. I talked about superposition in several ways — on the surface of the sphere, and in this exponential blow-up as well. Entanglement says that two particles, wherever they are, are so correlated that when you know about one of them, you know about the other. Einstein hated this; he called it spooky action at a distance. He really tried to prove it didn't work — and he helped prove this is the way it actually works.
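Entanglement can be shown with the smallest possible example. This is a toy simulation, not anything from the talk's hardware: the two-qubit Bell state (|00> + |11>)/sqrt(2) has only two nonzero amplitudes, so a measurement can only ever return 00 or 11 — learning one qubit's bit instantly tells you the other's:

```python
# Toy entanglement demo: sampling measurement outcomes of a Bell state.
# Amplitudes are indexed by basis states 00, 01, 10, 11.
import random

bell = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]  # (|00> + |11>)/sqrt(2)

def measure(amplitudes):
    """Sample one basis state with probability |amplitude|**2 (Born rule)."""
    probs = [abs(a) ** 2 for a in amplitudes]
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return format(i, "02b")
    return format(len(probs) - 1, "02b")  # guard against float round-off

outcomes = {measure(bell) for _ in range(1000)}
print(outcomes)  # only '00' and '11' ever appear: the qubits are perfectly correlated
```

The correlation is what "when you know about one of them, you know about the other" means operationally; what the simulation cannot show is that the real correlation survives even when the two particles are far apart, which is the part Einstein objected to.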
He wasn't always right. Now, going to the title: please stop counting qubits. If you have 4,000 really lousy qubits, it is not better than having 50 really good qubits, for lots of reasons. It's a multidimensional measurement of the quality of a set of qubits and how many things you can do with them. So here are some of the factors: the number of physical qubits, the connectivity, what gates you can apply, the errors on the individual qubits, the errors on the connections between qubits, decoherence, and things like this. So when someone says "I have more qubits than you," ask how many of them actually work, or how many are usable — they usually shut up after that. OK, so, some numbers here. Shor's algorithm — that's where the 100-million-qubit figure from before comes from. This is going to play out as a multi-decade sort of thing. I think it will probably be the most significant computing technology of this century, from a core hardware perspective. All right, this is what a quantum computer looks like — I'm finishing up here. Let me just tell you that when we enclose this and cool it down, right down here at the bottom it operates at 15 millikelvin, close to absolute zero. The base of a quantum computer is one of the coldest places in the universe: outer space is 2.7 kelvin, and this is much colder than outer space, right? And being able to do this and keep it consistent is part of why we can even begin to build quantum computers. All right — we just passed the 100,000-user mark in the last 24 hours on the IBM Q Experience. You can go and play with 5-qubit and 16-qubit real quantum computers for free on the IBM Q Experience; you can join these other people. There have been 130 papers — it's actually up since I wrote that — and 6.7 million calculations have been done, all by nonpaying customers. How un-IBM, right? OK, so just to conclude a couple of things here: Qiskit is the open source SDK. Yes, that's sort of why I'm here at this conference, right?
It extends Python 3, and it has several levels of programming below that: there is a quantum assembly language, and there is a microwave pulse definition below that, which is also open source. Qiskit Aqua is a set of libraries on top of that; Qiskit Terra is the set of open source libraries at the very lowest level. Why spend millions of dollars and build your own quantum computer when we will let you program ours at the lowest possible level of the pulses, OK? So with that, there are lots of ways of joining us. Through the IBM Q Network, we bring in universities, and people can partner with us. Those are the advanced machines — these will always be the top-quality machines, 20 qubits and beyond. So with all of this, to end: this is what a modern quantum computation center looks like. That's in Yorktown Heights, New York, about 30 miles north of here. This is not a lab experiment — 100,000 people, right? And this is what we're aiming for. So if you come visit me in a few months, you'll see one of these up at the IBM Research Lab in Yorktown. Thank you very much.