I'd like to start off by sharing with you a text message that I got one day last fall while walking to my office. It was sent directly from the brain of a person who is paralyzed from the neck down, using only his thoughts and some clever computer software. As you might expect, I got pretty excited about this message. In fact, I've been working toward it for most of my life.

On my fourth birthday, my family of five was happy and healthy, and my father was an engaged and attentive parent. Six months later, he was involved in a terrible car accident that left him barely able to move or speak. I grew up hardly knowing him, frustrated that I was unable to understand the things that he was trying to tell me. In my young child's brain, there had to be some way to fix this problem that the adults said was unfixable. I was inspired by science fiction, like a novel that described connecting the brain of a disabled person to the controls of a starship. But the more I learned about the complexity of the brain, the more difficult the problem seemed to become. How could we extract signals from the brain and route them to the outside world while bypassing the broken body?

The answer lay in understanding the language of the brain. To do this, we rely on a phrasebook that's been developed over decades of fundamental neuroscience research. We need to listen to the neurons, the cells that are the citizens of the brain, and decipher their language so that we can derive signals to steer our starship or to write a text message.

The best way to listen to the neurons is to get very close to them. In our laboratory, we use a sensor about the size of a baby aspirin, made of silicon, with 100 electrodes that penetrate into the outer layers of the brain to about the thickness of a quarter. These probes nestle down among the millions of neurons that control movement of the opposite arm. Communication in the nervous system happens by brief electrical impulses called action potentials. Each of these probes can record from one or many neurons, letting us read out the activity across the entire ensemble. We are now listening to the opinions of our neural citizens.

These tiny signals are then routed out of the brain, amplified and digitized. Once in digital form, we can use them like any other data, for example to decode them using our phrasebook. One fundamental observation that forms the basis of our phrasebook is that neurons communicate by firing faster or slower depending on what they want to communicate. You can think of it almost like putting an applause meter on each neuron. This neuron, for example, may clap like crazy with leftward movement, and we can denote that with a left arrow. This neuron doesn't really care if I'm moving my arm downwards, but it may applaud with upward movement. We can now tell, on a second-by-second basis, how all of our neurons are voting.

Each neuron has its own preferred direction, which we can map out over the entire array. If we denote the firing rate with the size of the arrow, we can tell how our neural population is voting. So, for example, if I want to move up and to the left to touch a target that appears on a computer screen, the neurons will vote depending on how much they like that movement, letting us directly read out the brain's movement signal. This has allowed me to finally realize my dream of restoring function to people with paralysis.
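To make that population-vote idea concrete, here is a minimal sketch of classic population-vector decoding in Python. This is an illustration of the general technique, not the laboratory's actual decoder, and all of the preferred directions, baseline rates, and observed rates below are made-up numbers.

```python
import numpy as np

# Hypothetical population-vector decoding sketch:
# each neuron "votes" for its preferred direction, weighted by how far
# its firing rate rises above its baseline ("applause meter") level.

# Preferred direction (unit vector) for each of four recorded neurons.
preferred_dirs = np.array([
    [1.0, 0.0],    # likes rightward movement
    [0.0, 1.0],    # likes upward movement
    [-1.0, 0.0],   # likes leftward movement
    [0.0, -1.0],   # likes downward movement
])

baseline_rate = np.array([10.0, 12.0, 8.0, 11.0])  # spikes/s at rest (made up)
observed_rate = np.array([25.0, 20.0, 5.0, 11.0])  # spikes/s in this time bin (made up)

# Each neuron's applause above baseline becomes the weight on its arrow.
weights = observed_rate - baseline_rate

# Sum the weighted arrows to get the population's movement vote.
population_vector = weights @ preferred_dirs

# Normalize to get the direction the cursor should move on screen.
direction = population_vector / np.linalg.norm(population_vector)
print(direction)  # points up and to the right for these made-up rates
```

In practice, each time bin of spike counts from the array would be run through a step like this (or a more sophisticated decoder) to update the cursor's velocity many times per second.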
Given the ability to move a cursor, our research participants can use an unmodified Android tablet to surf the web, compose text messages, and do things they might not otherwise have been able to do.

I've been particularly inspired by Jean-Dominique Bauby's incredible book, The Diving Bell and the Butterfly. The former editor-in-chief of Elle magazine, he suffered a brainstem stroke that left him unable to move, completely paralyzed except for the ability to blink. He wrote the book one character at a time, using eye blinks to indicate his intention.

Many of you have heard of ALS, or Lou Gehrig's disease. This disease causes progressive paralysis and sometimes results in the inability to speak or communicate. This 52-year-old woman with ALS is using the innovations developed in our laboratory to type at about six words per minute, or about half the speed of a teenager texting on a cell phone.

One of the major goals of our laboratory, as well as of the larger BrainGate consortium of which I'm a part, is to provide access to a computer system using only thoughts, 24 hours a day, 365 days a year. Think of it as plugging a USB or Bluetooth device into a computer and being able to use it with your brain.

We're on the threshold of building replacement circuits for the brain. The potential is enormous, both for restoring function and for understanding how the brain works. As our phrasebook turns into a dictionary, we'll be able to decipher the inner workings of the brain and route around damaged circuits. In the very near future, we'll be able to restore function for people with paralysis. But what if we could also provide some advantage by leveraging action at the speed of thought? These are the types of questions that we're going to have to consider as our technology continues to advance. But in the meantime, I think restoring the ability to connect with loved ones should be our major goal.