Greetings. Make sure I'm on my slide here. Thanks for the introduction. I am Thomas Reardon. I'm the CEO of Control Labs. I lead a merry band of neuroscientists and engineers in New York who are building what we believe is the world's first non-invasive, always-available, always-on neural interface technology. We started the company because of a really simple problem: there is a yawning gap between human input and human output. You have a phenomenal ability to take in information all day long, at all times, via your vision, your audition, your proprioception. You have an amazing ability to process that information. What you lack is sufficient output. You output through your muscles, and that is slow and error-prone. We want to close that gap. And we want to do it because of the new computing platforms that are upon us now: pervasive computers like an Apple Watch, where my fingers didn't get smaller when it showed up; new kinds of robotics that have high degrees of freedom, but that our bodies weren't really meant to control directly; and maybe the most exciting of all, the new immersive platforms in AR and VR, in which there aren't natural means of interaction other than movement. We believe that with a neural interface you can explore new kinds of interactions and exploit new powers you didn't know you had. I believe that the devices we've used over the last decade or so mock us and limit us, whether I'm trying to do two things at once, adjust the temperature on the wall, or play in VR and have an interaction that makes some sense virtually. And I love this picture, because it captures what is probably the most frustrating thing for me: taking something as joyful and easy as driving and conflating it with what it's like to type on a touchscreen. That's the thing that causes us cognitive load: I have to engage the corrector of the autocorrector.
I believe that when the iPhone came out, and with many of the technologies of the last 20 years, they caused us to regress as a species. We are not at all exploiting our real capacities. So we think the question at Control Labs is not, how do we make our devices more capable? It's, how do we ourselves become more capable? Neural interfaces are the root solution to this problem. Neural interfaces translate the activity of your nervous system to make you more powerful than all those devices. We created a specific technology we call intention capture that allows us to solve the human output problem and close the gap between human input and human output. So let me give you our point of view about how you work in the world today. You have a brain. Everyone here does. Your brain actually continues down; it's not just what's up in your head. It's a full, continuous brain that goes down into your spinal cord. Your spinal cord is sort of the USB port of the brain. You generate command signals that turn muscles on and off. That's what your brain does. That allows you to move in the world, but most importantly, it allows you to use the skillful parts of your body: your hands and your mouth. And ultimately, you manipulate devices. You cause a microphone to go off, or a camera, or you move a mouse or type on a keyboard. And it's those devices that translate your activity, your movements, into control over a myriad of digital devices all around us. But it doesn't have to be that way. What if instead (sorry about that, it was a build here), what if instead we took the electrical activity of the brain directly, not the movements of your muscles, and decoded that electrical activity and used it to directly control devices? That's the heart of a neural interface. And when you explore that more fully, what you realize is that it's not just control as you would naturally experience it, say, moving your five fingers.
It gives you the potential to do something like have a sixth finger. And it allows you to really dream and do something kind of exotic, like dream about having eight arms and propelling yourself like an octopus. To not be victimized by your muscles. So I'm going to do a little bit of neuroscience for you and walk you through how we do it, and then I'm going to show you the results of this work. First, as I said, what we try to do is decode the electrical activity of your motor nervous system as it goes into the most skillful part of you, your hands. We take this very convoluted, complex electrical signal that courses through your arm as it's controlling your hand, and we break it down into the individual activities of your muscles. And when we do that, we can effectively recreate, or virtualize, anything you do today, like moving a mouse or typing on a keyboard, without you actually moving the mouse or typing on the keyboard. But this is where the neuroscience comes into it. What actually underlies the electrical activity of the muscle are the impulses that course out of motor neurons. These are the motor neurons that live in your spine and directly turn fibers in your muscles on and off. What we've done at Control Labs, for the first time, is noninvasively listen to those neurons as they send what we call action potentials, single spikes. We listen to the muscle and we can recreate the zeros and ones of motor neurons, the output neurons of the brain. And that allows us to do some magical and somewhat unimagined things with this neural interface technology. Now, what we've done here, if you think about it, is to take the biological output of neurons, the zeros and ones that they generate, and feed that directly into a machine learning network, a deep network. So we're taking biological neurons, sending them into a deep network, and decoding your intention.
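To make the decomposition idea concrete, here is a toy Python sketch of the very first step: pulling discrete spike events out of a noisy single-channel signal. Everything in it is invented for illustration; real surface-EMG decomposition works across many electrodes with overlapping motor units and is far harder than a single threshold.

```python
import numpy as np

def detect_spikes(emg, threshold, refractory=5):
    """Toy spike detection: report samples where the rectified signal
    crosses a threshold, enforcing a refractory gap so each action
    potential is counted only once."""
    spikes = []
    last = -refractory
    for i, v in enumerate(np.abs(emg)):
        if v >= threshold and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes

# Simulate a quiet channel with two clear "action potentials".
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.05, 200)
signal[50] = 1.0
signal[120] = 0.9

print(detect_spikes(signal, threshold=0.5))  # → [50, 120]
```

In the real system, the analogous step is recovering the zeros and ones of individual motor neurons from the combined electrical activity of the muscle.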
And then further, we're taking that output and feeding it back to you, so you now have a symbiotic relationship with the deep network. You are in some sense adapting with it, but you are enslaving that network; it's working in service to you. I think this is maybe the most powerful idea we have: that AI and machine learning can be dominated by us and our neural networks. Now I'm going to show you how this actually works. I said intention capture is this technology to decode the biological output of neurons using digital neurons. Now I want to show you the magic behind intention capture. This is Control Kit. Control Kit is seeing the light of day for the first time here at Slush. This device is up and running, it's sampling right now, and it's what we hope to put in developers' hands in the first quarter. You can wear this, or you can wear two if you wish, and you can wear them all day long. It's a surface electromyography device. It takes the signals of the nerves and sends them over the air to, say, your phone or computer, to allow you to control those things at a distance. You become the controller. I'll come back to this at the end, but what I want to do right now is show you how this works in real life. First I'm going to show you one of the forms of intention capture we call myo control, which is extracting the meanings of your movements. This is something we did over a year ago, showing what it would be like to actually type. Here I've just pushed away the keyboard and I'm just tapping on the desk. I'm wearing a band on each arm, and I'm typing "all your interface are belong to us", my shout-out to the nineties, and this is the future. Here what we've done is virtualize the hand: the full, continuous motions of the hand, the dynamics of the hand. I can wave it back and forth, obviously, and each individual finger is active.
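As a rough mental model of that myo-control path, and emphatically not the actual pipeline, here is a sketch: window the multichannel signal, reduce each window to a feature vector, and classify which finger tap it resembles. The mean-absolute-value feature is a standard simple sEMG feature; the nearest-centroid classifier and the two-channel "calibration" centroids are deliberately minimal stand-ins for a deep network.

```python
import numpy as np

def emg_features(window):
    """Mean absolute value per channel over one time window."""
    return np.mean(np.abs(window), axis=0)

def classify_tap(features, centroids):
    """Nearest-centroid decode: which finger tap does this window look like?"""
    labels = list(centroids)
    dists = [np.linalg.norm(features - centroids[k]) for k in labels]
    return labels[int(np.argmin(dists))]

# Hypothetical per-finger centroids learned during a calibration phase
# (2 channels here for brevity; a real band has many more electrodes).
centroids = {"index": np.array([0.8, 0.1]),
             "middle": np.array([0.1, 0.8])}

window = np.array([[0.9, 0.1],
                   [0.7, 0.1],
                   [0.8, 0.2]])  # simulated 3-sample, 2-channel window

print(classify_tap(emg_features(window), centroids))  # → index
```

Each decoded tap would then be emitted as an ordinary key event, which is why the keyboard itself becomes optional.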
If I can decode all the actions of the hand, I can decode what the hand would otherwise be doing when it's moving a mouse or a joystick or the throttle of an airplane. This is a hard machine learning problem. The neat thing is, of course, that cameras aren't relevant here. This works all the time. Your hands could be in your pockets or behind your back. It's always there, and it's the full, continuous activations of your hand. And again, it's the intention, not the movement. Here he's restraining himself, but the nerve is telling him to open his hand. The virtual hand responds to the nerve even when his hand doesn't move. And I'm showing you that we can decode the forces you generate, not just the movements but the forces. So when you're grasping, how much are you grasping? Or pinching? Here he's going across each of the joints. It turns out most of the manipulations you do are some kind of fine force control across your joints, which is why getting this accurate is really, really important. It's sort of the way your brain generates commands. This is a demo we did on Magic Leap to show how you can have control over things at a distance. This is a force demo. What's happening here is he's pinching to set up this tractor beam. He's going to wave the tractor beam back and forth until he can fix on an object, and then he's going to be able to grab that object and bring it to him. And it's not just that it recognizes that his hand is closed; it's how much force he's exerting as he opens his hand and co-contracts. He can push it away, or pull it, slowly or fast. It's this kind of magical action at a distance that we all kind of have lucid dreams about. Now I want to tell you about the other feature of intention capture, which is neuro control. This is no longer about how you move; it's about the activity of neurons. And this gets a little bit crazy. What you can do with your neurons is not a problem of science, it's a problem of imagination.
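Before going on: the continuous force decoding just described can be caricatured as a regression problem, mapping per-window EMG features to a measured force. Here is a minimal ridge-regression sketch on simulated data; the channel count, the linear model, and the units are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated training set: 16-channel mean rectified EMG per window,
# paired with a measured grip force (arbitrary units).
X = rng.uniform(0, 1, (200, 16))
true_w = rng.normal(0, 1, 16)          # hidden "true" channel weights
y = X @ true_w + rng.normal(0, 0.01, 200)

# Ridge regression: w = (X^T X + lam*I)^{-1} X^T y
lam = 1e-3
w = np.linalg.solve(X.T @ X + lam * np.eye(16), X.T @ y)

force = float(X[0] @ w)  # continuous force estimate for a new window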
What would it be like if you could control a sixth finger? Your muscles won't let you do that, but your brain will. Your brain will let you control the eight legs of an octopus and swim around with full fidelity. Your brain can do this, and it can allow you to have the force. Oops, sorry about that. Here again, we're at the level of individual neurons. When a single neuron fires, here that blue one fires, the dinosaur and dino rider jumps up and down. It's a simple game, one dimensional, but it's with millisecond accuracy. The point is you don't need to move to do this. Here my colleague's gonna demonstrate playing it by hitting the space bar and the dinosaur jumps. There's no big surprise there. But what's crazy is he can do the same thing without moving at all, by just activating that single neuron. And he can do it in any kind of posture you want, whether he's folding his arms or doing some other task. With millisecond accuracy. I do love the hands and pockets. I wanna show you a slightly more complicated version that this is asteroids. This is my co-founder playing asteroids, which is a high degree of freedom. Again, two degrees of freedom plus pushing buttons. And he actually has full control without any movements. He's just being able to stare and get his attention to move left or right, and his attention to control the thrust without movement. Just by decoding the activity of neurons. And I'm gonna end with this. Whoops, excuse me. These videos are not queuing properly, so. So I showed you this when I came out on stage. What I wanted to show you is the neural control version of this. He's doing little teeny pulses, but then he's gonna go into a neural control version of this, where it's just trying to perform and respond to him in a high degree of freedom. Think of this as six to seven degrees of freedom without actually moving at all, just by activating neurons. It's something which he has actual control. 
It's not just a random response to his nervous system. Intention capture is really universal control. It's for everything, and it's inevitable. I hope you'll join us on this journey to really empower people in a way they've never imagined before. To allow them to get control over a myriad of devices, and to do it flexibly, and in a way that engages their entire brain, and for us to leave behind the sort of bad past of devices that force us to do what they want. Thanks so much for your time today. Thank you so much.