The story of computing is the story of humanity. This is a story of ambition, invention, creativity, vision, avarice, and serendipity, and it is all powered by a refusal to accept the limits of our bodies and our minds. There is a very famous AI researcher from the early days, Allen Newell, who once observed in an essay called Fairy Tales that our technology offers the possibility of incorporating intelligent behavior in all the nooks and crannies of our world; with it, we could build an enchanted land. I want to take you on a journey of twelve parts to examine that particular landscape and what your place in it might be.

Let's begin at the beginning: what computing itself means. The earliest ways we looked at the world were attempts to bring certainty and predictability to a very uncertain and unpredictable world. This is the central premise of science, that the cosmos is understandable, and it led, of course, to the scientific process and scientific ways of thinking. As scientists, we look at the grandeur of the cosmos, we wonder why and how, and we try to reduce what we see to the simplest kinds of concepts. This is the nature of the standard laws of physics. The motion of the galaxies, the motion of water in an ocean, the motion of water in a teacup: they are all governed by the same principles, and the standard model of physics attempts to capture that. We also make the premise that the mind is computable; in a sense, it is fair to say that consciousness is an exquisite but perhaps very rare consequence of the laws of physics.

The central premise of computing, however, is very different. The central premise of science is that the world is understandable; the central premise of computing is that the cosmos is computable. Here we start with the simplest ideas, the basic concepts of a Turing machine, which underlie the entire nature of computation, and from that we build worlds of our own invention. This is not a real world but a simulated world of a black hole, taken from years upon years of observing the cosmos, rendering it in models, and regenerating worlds of our own. This is the nature of computing.

So we probably understand what scientific thinking is, but what does it mean to do computational thinking? I would observe that it is based upon the following premises. First, an assumption that the cosmos is discrete. It may or may not be in the end, but there is reasonable ground to believe we can look at the world, view it as non-continuous, and make some remarkable observations about it in that regard. We also assume that information is the foundation of reality; this is indeed one of the premises of Claude Shannon, who will appear later in our story. We further assume, in computational thinking, that data is an abstraction of reality: if information is the foundation, then the abstractions we build through that data are our representation of reality. We use our algorithms to form abstractions, and it is a very powerful combination, the yin and the yang of algorithmic abstractions together with object-oriented abstractions, that lets us build these models of the world. It is also important that we consider the promise of scale, not just scale in space but also scale in time: the ability to do computations faster than the physical processes of the world themselves, and the ability to do computations that serve not just one person but potentially billions upon billions of people.
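Since the Turing machine anchors everything that follows, here is a minimal sketch of one in Python. The machine and its rule table are hypothetical, chosen only to show the moving parts, a tape, a head, a state, and a table of transitions; this one simply flips every bit on its tape and halts at the first blank.

    # A minimal Turing machine: a tape, a head, a state, and a rule table.
    # This particular machine is illustrative only: it flips each bit it
    # reads and halts when it reaches a blank cell.
    def run(tape, rules, state="start", head=0, blank="_"):
        tape = list(tape)
        while state != "halt":
            symbol = tape[head] if head < len(tape) else blank
            state, write, move = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1
        return "".join(tape)

    # (state, symbol read) -> (next state, symbol to write, head move)
    rules = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }

    print(run("1011_", rules))  # prints 0100_

Nothing here depends on the substrate: the same rule table could be realized in vacuum tubes, in silicon, or, as we are about to see, in Tinker Toys.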
It is also the case that in computational thinking, we build upon the universality of computing. Given that we assume the world is computable, and computability means, in effect, being able to run on a Turing machine, just about any device with certain characteristics can indeed be a computer. I could as easily run a simulation of a black hole on a supercomputer as I could on a computer built out of Tinker Toys. It would take longer, but functionally it would be quite equivalent. So whereas scientific thinking has evolved as a means of understanding the world, computational thinking has evolved as a means of controlling the world, and controlling it at a level of fidelity that our myths once relegated to the gods and goddesses. Stewart Brand, who wrote the Whole Earth Catalog and was very active in the counterculture movement of the 60s, observed that we are as gods and might as well get good at it, and that is what computing brings us.

It is the case not only that the story of computing is the story of humanity, but that the story of computing begins with women. I start here with Ada Lovelace, who worked in the shadow of Babbage but who was perhaps the first human to understand the potential power of computational thinking: that it was about more than just numbers, that those numbers could represent something. Babbage's engine, on the hardware side, was one thing, but Ada's software point of view changed everything.

Now we come to Boole, who was present around the same time as Ada and Babbage, but who held the audacious view that we could take the laws of thought and compress them into the simplest kinds of things. It is a bit of hubris to call these the laws of thought, and it was only a beginning, but from this we built the ideas of Boolean logic: the ability to take mathematical, logical concepts and use them to describe the world.

Women gave birth to computing. Around the same time we also had the Harvard computers. This looks like your typical scrum stand-up meeting, but the women here are the ones doing the real work. The woman in front, looking through the magnifying glass, is particularly interesting. She is Annie Jump Cannon, one of the first human computers, whose work back then revolutionized the way we classify the stars today; she also plays a role in computing because she helped organize how these teams worked.

And now we come to Claude Shannon. Take the idea of Lovelace, that symbols can mean more than just numbers; the idea of Boole, that we can refine these notions into Boolean logic; and then add Shannon, who observed that there is a wonderful connection between entropy, information, and complexity, such that we now have laws that help us understand the very nature of information itself. This is the foundation of computational thinking, and it is the foundation of all that computing brings to us.
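To make Shannon's connection concrete, here is a small worked sketch, assuming nothing beyond the textbook definition of entropy, H = -Σ p·log2(p), measured in bits.

    import math

    def entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p))
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))    # a fair coin: exactly 1 bit per toss
    print(entropy([0.99, 0.01]))  # a biased coin: ~0.08 bits, little surprise
    print(entropy([1/8] * 8))     # a fair 8-sided die: 3 bits, three yes/no questions

The less predictable the source, the more information each symbol carries; that single observation is what lets us put hard numbers on codes, channels, and compression.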
That being said, it is a tough thing to realize, but all of modern computing was really built upon the progress that came about during World War II and the Cold War. I therefore think it is fair to say that computing is woven on a loom of sorrow: much of what we do today is built upon things we had to do back then. Let's look at one particular case. During the Gulf War, looking here at the river that flows through Baghdad, we saw a number of smart bombs dispersed on targets along the edge of that river. It turns out this is the very same location as the House of Wisdom, the place where the algorithm itself was devised. There is a bitter irony in our having used the algorithm against that very sacred place.

Women gave birth to computing. During World War II we had computers such as the ENIAC, devised by men but ultimately programmed by women. There were six women who were the actual programmers for it, and I believe one or two of them were still living until recently. They programmed not through software as we see it today, but through these plugboards. In the UK, at the same time, a very remarkable thing was happening. This is the Colossus, a machine again largely run by women, used to decrypt the Lorenz cipher, a cipher even stronger than the Enigma, which Turing had found a way to break with the Bombe, an electromechanical machine. Colossus took us up to the next level: a machine devised and built by Tommy Flowers to decrypt those messages, and it changed the course of the war.

In the Cold War we had devices such as this. This is SAGE, the Semi-Automatic Ground Environment, perhaps the largest software-intensive system of its time. Here we are in the late 50s and 60s, and it easily consumed 20 to 30 percent of the programming brainpower that existed in the United States. Its effects were long-lasting. Look at the displays in the background: these were inspired by Whirlwind, a real-time computer built on the East Coast, and they formed the basis of the displays in SAGE, far ahead of their time. We use those same kinds of displays today in all of our air traffic control systems. The very design of this software inspired the architecture of hundreds, if not thousands, of large-scale systems, including, for example, SABRE, the first airline ticketing system, all based upon the ideas of SAGE. And now we find ourselves in modern warfare. Einstein once said, I don't know how World War III will be fought, but I do know World War IV will be fought with sticks and stones. Well, with all due respect to the great professor, I would observe that World War III will probably be fought with 1s and 0s.

The other great influence upon computing was, of course, economics: follow the money. It was the computational work of the late 50s and 60s where we began to see computing move out of the defense world and very much into the business world, and the IBM 360 tremendously legitimized that. It brought computational power to businesses where it simply had not existed before. From this came an overwhelming number of changes: the rise of programming as a discipline unto itself, the programming priesthood as we called it; the rise of structured analysis and design methods; the rise of higher-order programming languages. But very rapidly thereafter, the economics of hardware changed. In the IBM 360 days the cost of the computer was greater than the cost of the humans who programmed it; as we moved to minicomputers things became more balanced; and as we moved to microcomputers, all of a sudden programmers were far more expensive than the machines themselves. That changed the balance of everything, and it changed the nature of the kinds of systems we could build. And of course the development of the World Wide Web by Tim Berners-Lee changed everything, and the introduction of mobile computing changed everything yet again.
One wonders what the next change will be. These shifts were all driven by very clear and present economic concerns, and they have led us to the systems of scale we see today, to the point where now, as should be true of all technology, computing becomes invisible. It is part of the very atmosphere we breathe, the waters in which we swim, the cities in which we live: there, but not fully present for us, because it is embedded in the interstitial spaces of society.

To that end, computing impacts all of us individually. We are storytellers, and we love trying to understand the world through our narratives. But each of us as individuals, especially now, from even before birth until after death, leaves a digital footprint. In India, for example, in order to deliver services to everyone, fingerprinting is universal, and so we have the first country that has begun, truly at scale, to digitize all of its citizens.

Now, there is an impact to this, because as computing moves its way into the world, it changes us. This is a Mennonite girl; think of Mennonites as, roughly speaking, lighter-weight Amish. Their relationship to technology is interesting. A Mennonite once observed of their use of technology that it is not just how you use the technology; it is about the kind of person you become when you use it. In the world of computing, we are really just beginning to understand how computing affects us at that very individual level.

Computing also wires us together; we are bound to others through a thousand invisible threads. Facebook certainly unites us, for better or worse: a huge percentage of the world's population is connected via Facebook. I say for better because it has given communities a voice, and for worse because, as we have seen in the news of late, Facebook can also be an amplifier of misinformation, dissent, and indeed downright evil. With the bad comes the good. With Netflix, all of a sudden we have universal access to entertainment in ways that never existed before, binding individuals into communities across time and space. Amazon, which started off as the simplest little book company, now dominates the advertising world and online distribution. And while these first three look big and ominous and very much, let's say, out of Silicon Valley, the impact on communities is tremendous: the very presence of microtransactions has changed the way individuals work in even the most distant rural areas. So computing binds us, again, by invisible threads.

We as individuals, bound together in groups, also organize ourselves into nations, and those nations have a very interesting relationship to technology. It used to be that it took literally days for information to reach London from distant parts of the world; there was a major earthquake in India in 1819, and news of it took almost a third of a year to arrive. Computing has made the transfer of information almost instantaneous. Indeed, if an earthquake happens in one place, the Twitter messages that follow will, once you get past about 100 kilometers, actually outrun the seismic waves of the earthquake itself. So computing does things beyond what the physical environment itself can do. And the way governments use this technology can be for good and for bad. In the 1790 census we collected data such as: how many slaves do you have? Today we collect information such as: what is your gender? It may not be male, it may not be female; it could be something else.
The danger, of course, is in how governments use that information against us. There are many different ways to approach computing, and every line of code certainly has an implication, a moral and ethical implication, but there are different moral and ethical frameworks: whereas we in the United States tend to eschew facial recognition, China is a very different world.

It is also the case that computing has changed the way we do science. This is a hole drilled in the surface of Mars. Indeed, the missions to Mars and our return to the Moon are leading us toward an internet that eventually extends into space itself. There is a lot of talk about autonomous vehicles; the same revolution is happening in the oceans. This is the Mayflower, an autonomous vessel that IBM has been working on, part of a movement to use computing to cover every part of our world. Computing has also driven down the cost of genome sequencing; the chart here is a few years old, but where it once cost on the order of $100 million to sequence a genome, we are now down to about $100, and from that we can achieve some amazing cross-discipline discoveries about disease. We use computing to build models of the world and its climate, and from those models we not only do meteorological studies, what the weather will be tomorrow, but we can also observe the breathing of the Earth, the growing climate crisis, and the impact humans have had upon it.

Turning from science, consider its supposed opposite: faith. It is fair to say that computing has impacted our means of faith as well. Leibniz, who devised the binary number system, was a deeply religious man, but he was also inspired by the Chinese I Ching, a book of opposites. He came to the observation that he could combine his faith with these ideas: one represented God, zero represented chaos, and from that was born the binary number system. To be fair, Leibniz reinvented what had been discovered in India some centuries before, but it was then that binary truly entered the mainstream of philosophical thought.
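As an aside, the representation Leibniz celebrated is easy to sketch: divide by two repeatedly and keep the remainders, and any whole number is rebuilt from nothing but ones and zeros. A minimal illustration in Python:

    def to_binary(n):
        # Peel off the lowest bit until nothing remains.
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            n, r = divmod(n, 2)
            digits.append(str(r))
        return "".join(reversed(digits))

    print(to_binary(1703))  # 11010100111: the year Leibniz published his essay on binary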
It is also the case that the earliest days of the internet were a haven for places such as the WELL, and they were a haven, too, for some of the earliest and most scattered faith communities, such as the Wiccans. Wiccans found a home on the early web because their various covens had long been disconnected from one another, and on the web they found a way to come together. And not just that faith: even Judaism has been impacted by computing. Yes, you can get a kosher phone, built under strict rabbinical rules: there are things you can and cannot do on the Sabbath, and there are sites you can and cannot visit. So that faith has adapted, with technology binding its followers to their practice even further. And if you are interested in genealogical studies, you may have visited various sites to ask who is in your past; well, a lot of that data probably came from the data center you see here, and it all springs from a practice of the Mormon church called baptism of the dead. In the Mormon faith, it is possible to be baptized on behalf of those who came before you, people who have already died, so that they may be saved. You may accept this belief system or not, but from it the church has gathered billions upon billions of genealogical records into a database, and that is the foundational database most of us rely on for genealogical studies.

The rise of computing, and now the virus, also gives us pause as to what it means to worship in community. In Islam, one would go to a mosque to pray, as you see in the lower left. But why does that sacred space have to be physical? Why can't it also be virtual, as you see in the upper right? Here we have someone praying in a virtual mosque.

It is also the case that computing impacts art. There are many aspects to this, but let me walk you through one journey. A colleague of mine works in GANs, generative adversarial networks, a form of artificial neural network that is generative, and just as Picasso would grind his own pigments, this particular artist grinds her own tools, using GANs themselves. The earliest kinds of computer art were 8-bit art, pixelated stuff. Not too long after that we began to see art like this: the first line-based art in a film called Hunger, using the ideas of keyframe animation, where the keyframes were drawn by a person and the computer interpolated between them. Next we had the Utah teapot; we began to discover how to represent 3D models and then to skin them and put textures upon them, the beginning of CGI. Once we had learned to do 3D modeling and to put colors and textures on those models, we started getting these kinds of plasticky shapes, which looked kind of real and were getting better. Then that community started doing CGI of creatures. Why creatures? Well, we humans are used to seeing other humans and can tell whether a figure is human or not; but a creature, a triceratops, none of us has ever seen one, so close enough is good enough, and we could generate that through computing. That work was done very organically, but then Mandelbrot's ideas about fractals came into play, and once we learned those algorithms it became possible to develop images such as this, which began to look very realistic. So we had models of what the real world does, via fractals; we could write algorithms for them and begin to generate worlds of our own. The next step up was fur. Imagine modeling each individual hair on Sulley, some 400,000 of them, I forget exactly how many, each one independently modeled: all of a sudden the CGI looks very, very real. Liquids were next. This is from the movie The Abyss, where we began to see the movement of water through algorithms; and if you have seen Moana, very near and dear to my heart because it is based upon Hawaiian legends, you will have seen that the water there looks incredibly realistic. Next came the simulation of clothing itself: these are not real models, but CGI models wearing CGI clothes, and now that we understand the physics of fabric and of such materials, we can generate them as well. And lastly, this is where we are. This is not a relative of mine; in fact, this is not a living human at all. She is entirely computer generated. We are at the point where CGI can generate completely realistic human figures.

So what can we do with that art? We are creating our own worlds, and we are now very much in what a dear friend of mine, Rita King, speaks of as the imagination age: the age in which, in many ways, we are limited only by the nature of our imagination itself.
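Those fractal algorithms are astonishingly small relative to the worlds they produce. As a minimal sketch, here is the classic escape-time rendering of the Mandelbrot set in Python; the entire model is the one-line iteration z -> z*z + c.

    # Escape-time rendering of the Mandelbrot set as ASCII art.
    for row in range(24):
        line = ""
        for col in range(72):
            c = complex(-2.2 + col * 3.2 / 72, -1.2 + row * 2.4 / 24)
            z = 0
            for _ in range(40):
                z = z * z + c        # the entire "model" is this one line
                if abs(z) > 2:       # escaped: this point is outside the set
                    break
            line += " " if abs(z) > 2 else "#"
        print(line)

Forty iterations of one complex multiplication per point, and an endlessly detailed coastline appears; that economy of description is what made generated terrain and texture practical.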
Now consider the following. I can dream something, but if I as a computer scientist want to turn that dream into reality, there are a number of things I must get through to do so. First, there are the laws of physics: I cannot transmit information faster than the speed of light, and I can compress information only so far. That is simply the nature of the physical laws within which we live. Then there are the algorithms we must devise. One that everyone with a mobile phone uses every day without ever thinking about it is the Viterbi algorithm, built by a gentleman of that same name, which allows us to extract signals from the very noisy wireless environment of the 3G, 4G, and 5G world. It was really only once we had devised such algorithms that we broke through to the next level of applications. This first part of turning imagination into reality is the domain of computer science: a lot of hard, true, repeatable science therein.
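To give a flavor of what such an algorithm looks like, here is a minimal sketch of Viterbi decoding in Python over a toy hidden Markov model. The real cellular use decodes convolutional codes, but the core trick is the same: among exponentially many possible hidden paths, keep only the most probable path into each state at each step. The states and probabilities below are invented purely for illustration.

    def viterbi(observations, states, start_p, trans_p, emit_p):
        # best[s] = (probability of the best path ending in state s, that path)
        best = {s: (start_p[s] * emit_p[s][observations[0]], [s]) for s in states}
        for obs in observations[1:]:
            best = {
                s: max(
                    ((p * trans_p[prev][s] * emit_p[s][obs], path + [s])
                     for prev, (p, path) in best.items()),
                    key=lambda t: t[0],
                )
                for s in states
            }
        return max(best.values(), key=lambda t: t[0])

    # Toy model: guess whether the channel was clean or noisy from received bits.
    states = ["clean", "noisy"]
    start_p = {"clean": 0.6, "noisy": 0.4}
    trans_p = {"clean": {"clean": 0.7, "noisy": 0.3},
               "noisy": {"clean": 0.4, "noisy": 0.6}}
    emit_p = {"clean": {"0": 0.9, "1": 0.1},
              "noisy": {"0": 0.2, "1": 0.8}}

    prob, path = viterbi(["0", "0", "1", "0"], states, start_p, trans_p, emit_p)
    print(path, prob)  # the single most probable hidden sequence

The dynamic-programming step is the whole story: instead of scoring every one of the 2^n possible paths, we carry forward just one winner per state, which is what makes decoding feasible in real time on a phone.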
But once we want to turn those things into systems, software engineering comes into play. It is not just an algorithm, but a society of algorithms working together in an architected system, that allows us to interact with the world. In the earliest days we had algorithmic abstractions; then we had object-oriented abstractions; and now we have evolutionary architectures that build upon those. That is the technical side, but all of these systems are built by humans, and those humans have to be organized as well, and so this is where computing, and software engineering in particular, starts dealing with organizational architecture. Jim Coplien's wonderful book Organizational Patterns of Agile Software Development describes some of the patterns in that space. It is also the case that none of this is free: in the real world we have to deal with the economic costs of building such systems, and we may well want to build something we cannot afford. And lastly, there are the human issues. Every line of code we write has a moral and ethical implication, and especially in this world of AI, those human issues begin to dominate.

The earliest systems we built were mathematical, and the fundamentals of computing science were first born in that era. Then Ada's ideas bore fruit, and we moved from purely numerical calculation to symbolic calculation; we were now dealing with the problems of managing complexity. As computing became personal, human-computer interaction began to dominate. As we moved out to global scale, scaling both up and out began to dominate. And now, as we move into these imagined realities, building worlds of our own, the ethical and moral issues dominate.

Well, we are now at the point where computing has touched virtually every aspect of the human experience. What's next? I would observe that the question of what comes next leads us to ask what it means to be human. The first answer lies in the extension of the human body. This is a picture of me at the Johnson Space Center; to be clear, I'm the one in the middle. On the left as you view the picture is Robonaut 2, which for a while was up on the International Space Station; it was brought down for some maintenance and will be sent back up again. This is one way we can extend our reach. In fact, Mars is the only planet we know of that is, at this moment, populated entirely by robots. That is extension in the physical world.

But we also have extension in the virtual world. I'm a gamer; I love Halo. This is the Master Chief from Halo, and the amazing thing about that level of gaming is that it allows us to immerse ourselves in a completely different world. You may be a football fan, and I mean real football, not American football; and we now see the move toward esports, which is beginning to garner amazingly large crowds. This is the nature of future sport. So computing can extend us physically as well as virtually. The question we must ask ourselves is how far we will go. Will we lose ourselves in the worlds of augmented and virtual reality? That depends upon what kind of future we wish to build.

What, then, is the nature of being human? When I speak of artificial intelligence, I have a very simple litmus test: an AI is a system that reasons and that learns, and if it does not do both of those things, then it is not AI. We see a lot of systems that reason, many of them very statistically oriented, but if a system is not learning or being taught, I would not consider it AI. What is appealing, or perhaps interesting, about AI is that it tugs upon the deepest myths of the human spirit. This is a coin depicting Talos, an imagined creature that would walk the shores of Crete and hurl stones at anyone who intruded nearby: perhaps the first robot. The idea kept returning to our imagination across the centuries. Da Vinci had the idea of a mechanized knight, bringing automation to warfare. In movies such as Metropolis, and especially the Terminator films, we began to see AI as perhaps a danger to us. And Turing, remember him from the earliest part of our story, observed that a computer would deserve to be called intelligent if it could deceive a human into believing it was itself human. That leads us to the current day. This is from the movie Her, in which we are on the cusp of building machines that act an awful lot like us.

I am of the belief that the mind is computable, and that we do not require organic structures to build consciousness; we could do so within inorganic ones. This is the premise behind the rise of AGI. In fact, in a bit of parallel history to symbolic computing, we see the same thing happening in AI with the discovery of neurons as the basis of computation and of consciousness itself, and we have since built artificial neurons of our own. We have now mapped the entire nervous system of the common worm, in a project called OpenWorm, and even so we have not yet built any mechanical worms, though we are getting close. This is the neural network in the eye of a fly. The worm has roughly 300 neurons; a fly has roughly 50,000 to 70,000 neurons in just its eye; the human brain has about 100 billion. And the best AIs we have today, such as GPT-3, deal with only millions of neurons, not billions of them. To that end, a lot of contemporary AI is about pattern matching, about inductive reasoning; it is not yet about decision making, abductive reasoning, building theories, or causal reasoning. That is the future of AI.
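To ground that comparison, here is what a single artificial neuron amounts to: multiply inputs by weights, sum, add a bias, and squash. A minimal hand-wired sketch in Python; the weights here are picked by hand so the neuron computes logical AND, whereas real networks learn theirs from data.

    import math

    def neuron(inputs, weights, bias):
        # Weighted sum of inputs, passed through a sigmoid squashing function.
        activation = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1 / (1 + math.exp(-activation))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, round(neuron([a, b], weights=[10, 10], bias=-15), 3))
    # Output is ~0 for every input pair except (1, 1), which is ~1:
    # the neuron "fires" only when both inputs are on.

Everything from the worm's few hundred neurons to GPT-3 is, at bottom, vast numbers of units like this one wired together and tuned; the open question is what more is needed for the abductive and causal reasoning just described.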
And so that brings us back to the beginning of our story. The story of computing is the story of humanity: a story of ambition, invention, creativity, vision, avarice, and serendipity, all powered by a refusal to accept the limits of our bodies and our minds. In a world of abundant resources, where nothing is forgotten and where we are connected in pervasive, unexpected ways beyond our choice, it is reasonable for each of us individually to stop and ask: what kind of world do we hope to create?

Let me leave you with this, as developers, as computer scientists. Software is the invisible writing that whispers the stories of possibility to our hardware, and you are the storytellers. Go forth and write some wonderful stories. Thank you all very much for having me. By the way, I'm very accessible; I answer all my email personally, so here is how you can reach me. And every time I gain a Twitter follower, an angel gets its wings, so feel free to follow along for the ride.

Thank you very much, Grady. Great presentation. I think we have a little bit of time for some Q&A here. There are a lot of questions, but most of them are about the same topic. You talked a lot about the story of computing, but people are asking about the future of computing, and very specifically about your opinion on quantum computing. How do you see the future of computing, and specifically quantum computing?

We are at an interesting place in computing at the moment. In the earliest days, for decades really, it was largely about symbolic computing. Then AI brought us the notion of deep learning and therefore neural computation, so now we have two modes of computation, and quantum brings us a third. We have a long way to go on the hardware; we are probably still a decade or two away from building truly practical quantum computers that work at scale, but we are already beginning to see the foundations laid for where quantum fits within that triad. I have just been working with a gentleman at IBM who is helping to build out the software stack for quantum. The last thing you want, if you want to program a quantum computer, is to have to know high-energy physics and the notion of Hamiltonians, so we are trying to build a software stack that brings quantum computing, which can accelerate certain kinds of computation to astronomical speeds, to the masses. We are on a journey, quantum will be a piece of that journey, and I would say we are at the beginning of the beginning of it.

OK, thanks so much, Grady. It's a pleasure having you here; keep in touch. As you know, this is an enormous community, so for sure many, many people will drop you a line.

My pleasure. And one of these days, maybe I'll finally be able to travel and see some of you.

That would be amazing. Next year, here in Madrid. Take care.

Let's hope so. Stay safe, everyone. Thank you very much for having me.