So brains are composed of neurons and synapses, those are the fundamental units, and computers, loosely speaking, are composed of chips. So let's just compare the basic operating characteristics of these smaller units, neurons and chips, before we even get into higher-level systems differences, like parallelism and so on. What is the clock rate for a computer? That's how many gigahertz you have, right? That's how many computations can be performed per second. Roughly on the order of, say, 10 to the 9th; it's higher for some of the faster computers, and it keeps going up, but call it 10 to the 9th. What's the speed with which neurons can compute things? Well, a neuron can only spike; that's the way information is transmitted from one neuron to another, with an electrical signal sent down its axon, where it stimulates the dendrite of the neighboring neuron. About 200 spikes per second would be really high, so let's say, best-case scenario, the clock rate is 10 to the 3rd, that being about one spike per millisecond, which is probably faster than neurons can actually manage. So you have 10 to the 9th for the computer, 10 to the 3rd for a neuron. What about signal transmission, how fast the signal moves along the wire or the axon? In computers, electrical signals travel at close to the speed of light, on the order of 10 to the 8th meters per second. How fast do signals move along axons? Roughly 10 to the 2nd meters per second. So 10 to the 9th versus 10 to the 3rd for clock rate, 10 to the 8th versus 10 to the 2nd for signal transmission speed. What about noise, the signal-to-noise ratio? In computers, the noise problem is basically solved: if you run a program the same way twice, you always get the same answer, right? There's very little noise in these circuits; accounting for some thermal noise, the signal-to-noise ratio is estimated at around 10 to the 6th. In neurons, the signal-to-noise ratio is hard to estimate, because we don't even really know what the signal is, but it's in the ballpark of 1, charitably 10. So the noise is very, very significant. If you look at just these three operating characteristics, and we can discuss some of the systems-level differences as well, you have roughly 10 to the 6th, six orders of magnitude, separating these operating characteristics.
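To put those three gaps side by side, here is a minimal back-of-the-envelope sketch in Python; the values are just the ballpark orders of magnitude quoted above, not measurements.

```python
import math

# Ballpark figures from the discussion above: (computer, neuron).
# These are illustrative orders of magnitude, not precise measurements.
specs = {
    "clock rate (ops or spikes per second)": (1e9, 1e3),
    "signal speed (meters per second)":      (1e8, 1e2),
    "signal-to-noise ratio":                 (1e6, 1e1),  # neuron: charitably ~10
}

for name, (computer, neuron) in specs.items():
    # Express the gap as a power of ten.
    gap = round(math.log10(computer / neuron))
    print(f"{name:40s} computer ~{computer:.0e}  neuron ~{neuron:.0e}  gap ~10^{gap}")
```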
So then why is the computer used as a metaphor? And there's a good reason why the computer is used as a metaphor, right? I mean, there have been all kinds of metaphors for the brain throughout history: hydraulic engines, watches, electrical telephone signals. The prevailing technology of the day, and many articles and books have been written about this, generally becomes the leading metaphor for the brain, because the brain is obviously capable of the most remarkable functions possible, so we want to match it with whatever the most capable technology is. And this dates back to, you know, ancient Greece basically; these metaphors have evolved over time. So now we are able to get computers to do very, very remarkable things. And because we can do remarkable things with computers, and we can do remarkable things with our brains, there's a natural tendency to say brains and computers are alike. They're not, okay? How we're doing those things is entirely different. And then you mentioned the other differences, like parallelism. Computers do not really work in parallel. There have been some efforts at parallel processing, neuromorphic engineering; none of those have panned out. Standard computers operate basically in series, and very rapidly, okay? So the motif for computing in a computer is lots of serial computations. In the brain, you've got slow, sloppy, imprecise computing, but it's highly parallelized. And that's because each neuron in the brain connects to 10 to the 4th other neurons, roughly ten thousand. That's a rule of thumb; some are more and some are less. But the whole architecture is different. Maybe you've heard of that game, six degrees of Kevin Bacon. Six degrees of separation. Six degrees of separation, right. You start with one name and then you make these associative jumps. Well, something like that actually exists in the brain. It's called a small-world architecture. And the idea is that no neuron anywhere in the brain is more than three or four synapses removed from affecting any other neuron. These are just testaments to the degree to which there's this parallel architecture where everything is talking to everything simultaneously. But somehow the system is organized in a way that it's able to make sense of it all and perform these remarkable functions, okay?
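To see that small-world effect concretely, here is a minimal sketch using networkx's Watts-Strogatz generator. The node count, degree, and rewiring probability are arbitrary toy values, nowhere near brain scale, and nothing here is a calibrated model of neural wiring; it only illustrates the graph-theoretic idea.

```python
import networkx as nx

# Watts-Strogatz small-world graph: n nodes on a ring, each linked to its
# k nearest neighbors, with each edge rewired to a random long-range
# target with probability p. The "connected" variant retries until the
# resulting graph is connected.
G = nx.connected_watts_strogatz_graph(n=2_000, k=20, p=0.1, seed=42)

# Despite mostly local wiring, the few long-range shortcuts put every
# node within a handful of hops of every other node -- the same flavor
# of effect as "every neuron within three or four synapses of any other."
print("average shortest path length:", nx.average_shortest_path_length(G))
print("average clustering coefficient:", nx.average_clustering(G))
```

With these toy parameters the average path length comes out to only a few hops across two thousand nodes, even though almost all the wiring is local.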
But none of this means a computer is doing it anything like the way the brain is doing it. In fact, again, the differences are striking. How did these object-recognition algorithms manage to all of a sudden perform at a human level of performance? Was it because there were architectural changes? No, the architecture has been the same since the early 80s. Was it because of algorithmic changes? No, the algorithms have been the same since the 70s. What's changed is processing speed and the amount of data. You've heard of Moore's Law, processing speed doubling, what is it, every two years or something? That has been maintained for a long time. This is what's going on. We have gotten good at this: we have developed the technology to enslave electronic circuits into performing trillions and trillions of calculations in series, to perform tasks amenable to those operations. And that's why object recognition is a problem where we've made a lot of progress. It's not going to be the same for all problems. As I said, the problem of motor control, playing tennis like Roger Federer, is hard for a variety of reasons: you have all these interaction torques between all the degrees of freedom of your body, and it's a continuous problem as opposed to a discrete categorization problem. Those are hard problems, all right? When there's a computer or a robot that can play tennis like Roger Federer, then we'll have solved those problems. It's going to be a long time before we ever get to that stage, because it's just a different type of intelligence. Artificial intelligence will work better for emulating some types of natural intelligence than others. Now, you asked about differences between computers and brains. Let me give you another strong one, which also harkens back to an area where I do research, as I was discussing: mnemonic techniques. And so the question is, how is information stored and processed? In a computer, information is stored as bits in registers, right? I have my 10101010; everything is reduced to a binary code. And I have an address by which I can connect to that register. So if I want some piece of information, I need its address, okay? Then I go get the information. One point about that information is that it's segregated: it's localized, it's only there, okay? And the address is what connects the user to that information. The brain is entirely different, okay? All information is stored relationally, associatively, in terms of its content. And it's all stored in a distributed fashion, meaning that information is encoded in distributed activation patterns that share neurons across different pieces of information. It's not segregated in one register; it's spread everywhere, okay? And nobody reads it out with an address; it's organized by content. So in other words, you have this vast associative web inside your head embodying information about all your previous experiences and knowledge, and information that you newly learn has to make sense with, and be referenced or grounded by, this associational web. That's how it enters. So again, this is just an entirely different paradigm of how we learn, and how we think, than how a computer does it.
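One classic toy model of that kind of content-addressable, distributed storage is a Hopfield network, sketched minimally below. To be clear, this is an illustration of the paradigm, not a claim about how the brain implements it; the network size, corruption level, and Hebbian storage rule are textbook choices, nothing more.

```python
import numpy as np

rng = np.random.default_rng(0)

# A classic Hopfield network: several random patterns are superimposed
# in one shared weight matrix (no addresses, no separate registers), and
# a corrupted cue is enough to pull back the whole pattern.
n_neurons, n_patterns = 200, 5
patterns = rng.choice([-1, 1], size=(n_patterns, n_neurons))

# Hebbian storage: every pattern contributes to every weight, so each
# "memory" is spread across all connections.
W = (patterns.T @ patterns).astype(float) / n_neurons
np.fill_diagonal(W, 0.0)

# Build a cue by flipping 30% of the bits of the first stored pattern.
cue = patterns[0].copy()
flipped = rng.choice(n_neurons, size=60, replace=False)
cue[flipped] *= -1

# Recall: iterate the network dynamics until the state settles.
state = cue.astype(float)
for _ in range(20):
    state = np.sign(W @ state)
    state[state == 0] = 1.0  # break ties consistently

match = (state == patterns[0]).mean()
print(f"recovered {match:.0%} of the pattern from a 30%-corrupted cue")
```

The point of the sketch is that retrieval never uses an address: the corrupted content itself is the query, and every stored pattern is smeared across the same shared weights.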
And one way that you can see this is in the use of mnemonic techniques to improve people's memory. I don't know how much your audience is familiar with them; lots of people are. These are ancient techniques dating back to Greek civilization, to a gentleman named Simonides, and actually even further back than that. And if you apply these techniques, which I'll just describe very cursorily, you can remember huge amounts of information. Now why was this important? Well, in the old days papyrus was expensive, they didn't have computers, and there was no printing press. So if you wanted to learn vast amounts of information, to memorize your speeches, et cetera, you practiced at this, okay? And all of these techniques were based on association. You would take some new information, relate it to some old information, and concoct some kind of story. And voila, the magic of these techniques was that it became really easy to remember lots of things. This is now a lost art. We completely ignore memory because we have these electronic devices. Kids are not even learning the multiplication tables in school. Or handwriting. Or handwriting, right? But think about what an incredible mistake this could be if these memory techniques embody a more general associative computing faculty that is being ignored. In other words, suppose association is a fundamental paradigm of how we process information, i.e., relating new information to old information and experience and putting it in context. That helps us not only memorize things, but also with pattern formation, concept formation, even creativity, linking different things to each other. It's all part of general neural computing, which we don't really understand. And now, all of a sudden, we omit one major component of it, memory, because the way memory is done in computers is so easy that we have an effectively infinite amount of it. That would be like not running on the treadmill, not staying in shape, because I can drive a car or take a train wherever I want to go. It wouldn't make sense. You've got to exercise the fundamental faculties of your brain. And if associative memory is only one instantiation of a more general faculty, an important faculty, and you're ignoring it, that's a problem. But you only realize that if you think carefully, in a nuanced fashion: what are the differences between how the brain works and how a computer works? They are profound; the differences are far greater than the similarities. The only similarity is that they're both capable of remarkable feats.