Welcome to the session on augmenting human capabilities. As the lovely intro said, I'm Oliver from CBAS; I co-founded the company about three years ago, and I lead the R&D efforts there. So what does it mean to connect the future of humans and machines? We have lived in the age of digital technology for less than 100 years. Less than 100 years ago, the first computer came into our world. And in reality, it's only been 10 to 15 years that digital technology has been ubiquitous in our lives. And we chose to make it ubiquitous because it makes us superheroes. Digital technology that we use outside of our body allows us to connect to every piece of media that has ever been created, anywhere in the world, in any language, at any time, and view it whenever we like. It allows us to connect to every person in the world whenever we want to. And it makes every one of us the smartest person in the room by giving us a connection to every piece of information. Most of us yesterday probably googled who Finland gained independence from 100 years ago. But that digital technology on the outside of our body has not yet been brought into our body. What will it mean for the future of digital technology when it comes inside us and is used to solve the major problems there, such as health care? This is what real health care looks like. Health care is not the newest augmented surgical robot. And health care is not the drug that an AI algorithm invented for us. 86% of health care looks like this. It's manual. And it involves constant care for small chronic conditions, things that have gone wrong with us and that we care for every day. Someone with diabetes manually checks their blood glucose. They look at a piece of paper, and it tells them how much insulin to take. And then they manually administer that insulin. Someone with an amputation manually sets the position of their thumb.
And then they move their hand to that position, which allows them to pick up the object they wanted. It's a conscious decision. Health care is not yet automated. It is not yet AI. To get there, this is where bionics come in. Bionics are that digital technology that we know and love on the outside of our body, brought inside. Bionics lets you standardize that interface and automate the manual processes that people go through every day. Bionics lets you control devices naturally, by thinking about it. And bionics lets devices care for you. In order to make bionics, you have to have a connection between the person and the machine. We all know how machines communicate. But in the body, neural data is how your body communicates internally. Your DNA is the hard drive: it's the storage system, and you store information very efficiently in those molecules. But the way your body tells everything what to do in real time is through the nervous system. In your limbs, there's a little under a megabyte per second of data flowing down each limb. And that data is what's controlling all of your motion. It's what's driving the 26 degrees of freedom that make up complex human arm motion. It lets you play the piano. It lets you interact with your phone so you can call your friend. And it also encodes all of the sensory information you're feeling from the surfaces and temperatures and other senses around you. If you come slightly further in, your organs are connected to your brain through slightly bigger nerves, running at around four megabytes a second. Those nerves are telling your organs, in real time, what to do and how to behave. The nerve running to your liver was, about six hours ago, probably telling your liver to get a move on, because it was dealing with all that ethanol you drank last night.
And then your pancreas, a couple of hours later, started being told to catch up, because it now had a load of blood glucose to deal with once that ethanol got broken down. And your heart, which has two nervous systems running to it: mine's probably beating faster right now, partly because I'm up here and it's a bit warm, but also to deal with the blood glucose that came from that ethanol I had at 1 a.m. So those are your organs. When you come even further in and you get to your spine, you're looking at around a gigabyte per second of data flowing up and down it. And if you could read all that data, if you could see every signal and understand what all of it meant, you would be able to tell the limb position of every part of your body. You would be able to tell exactly what information was being fed back to your brain from your heart about your blood pressure, from your lungs about your current oxygen state, and from your limbs about the surfaces they were in contact with. If you could read all of that, then you could know everything, and you could interface with it. But that is exactly the problem: the interface. You have to be able to talk to all that data. In order to connect a true bionic device, a piece of digital technology inside your body, to that nervous system's data, you have to solve the interface. And the interface is a very wide problem. It's everything from the piece of metal and plastic that has to sit next to your tissue without irritating it and killing the cells, to the data connection, where you have to interpret this very high-bandwidth data in real time, data that is encoded in a way we only marginally understand. That's what we do at CBAS: we've grown a team that solves that range of problems. Personally, my background is as a tissue engineer. And once you've solved that very wide problem, you can turn that interface into a standard.
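To make the bandwidth figures above concrete, here is a purely illustrative sketch of what consuming a nerve-scale data stream in real-time windows might look like. The sampling rate, channel count, and feature choice are all assumptions for illustration, not figures from CBAS; the signal is simulated noise standing in for real recordings.

```python
import math
import random

# Hypothetical parameters chosen so the total rate lands near the talk's
# "a little under a megabyte per second" per limb nerve.
SAMPLE_RATE_HZ = 30_000   # assumed samples per second per channel
CHANNELS = 16             # assumed electrode count
BYTES_PER_SAMPLE = 2      # assumed 16-bit samples

bytes_per_second = SAMPLE_RATE_HZ * CHANNELS * BYTES_PER_SAMPLE
print(f"{bytes_per_second / 1e6:.2f} MB/s")  # -> 0.96 MB/s

def windows(stream, window_ms=10):
    """Split one channel's sample stream into fixed real-time windows."""
    n = SAMPLE_RATE_HZ * window_ms // 1000
    for i in range(0, len(stream) - n + 1, n):
        yield stream[i:i + n]

def rms(window):
    """Root-mean-square amplitude: a trivial stand-in for a decoder feature."""
    return math.sqrt(sum(x * x for x in window) / len(window))

# One second of simulated raw nerve signal (Gaussian noise) on one channel.
random.seed(0)
signal = [random.gauss(0.0, 1.0) for _ in range(SAMPLE_RATE_HZ)]
features = [rms(w) for w in windows(signal)]
print(len(features))  # -> 100 (ten-millisecond windows in one second)
```

The point is not the arithmetic but the shape of the problem: a decoder has to keep up with this stream continuously, window by window, with no pauses.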
And as we all know, when engineers have standards, they build fantastic things on top of them. When you have standards, when you have open APIs which connect one thing to another, any engineer and any designer is limited only by their creativity in what they choose to interface with the body. And their bionics can then be better than anything any of us can think of today. So I'm going to show you the bit of this you're probably most interested in, which is the data. This here is live data from an experiment we did in March. This is what your body looks like to a computer. That is the raw data coming down a nerve, in this case in a leg. And that's what you have to interpret in order to interface with the body. You have to be able both to read it and to put new data back in. In that study in March, we outpaced the largest neural data set in the world by an order of magnitude, creating over a terabyte of data. That is the scale of data you need to build state-of-the-art machine learning that can understand those signals and work out how to put information back in. This is what you have to interface with to bring digital technology inside the body and enable our fantastic digital world internally. We are starting to do that at CBAS with a first product. That first product is called the prosthetic interface device. It's for an amputee, someone who has lost an arm or a leg. And it's an implementation of that technology as an open standard interface between a bionic limb and the human wearing it. When that person has a USB port for their arm, they can connect to any device they like. It allows the person with that implant to think about moving their bionic limb, and it intuitively goes where they want it to. They don't manually set the position of the thumb. It just sets itself to the right size, because the device is understanding the data that's still coming from their brain.
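The talk doesn't show what the open standard actually specifies, but the idea of a contract that any limb designer can implement can be sketched in code. Everything below is hypothetical: the type names, fields, and methods are invented for illustration and are not CBAS's API.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class GripIntent:
    """Decoded motor intent read from the nerve (hypothetical)."""
    thumb_aperture: float   # 0.0 (closed) to 1.0 (fully open)

@dataclass
class Sensation:
    """Sensory data to be written back into the nerve (hypothetical)."""
    pressure: float         # normalised contact pressure

class BionicLimb(Protocol):
    """The 'open standard': any limb implementing these two calls plugs in."""
    def actuate(self, intent: GripIntent) -> None: ...
    def sense(self) -> Sensation: ...

class DemoHand:
    """A trivial implementation standing in for real hardware."""
    def __init__(self) -> None:
        self.aperture = 0.0

    def actuate(self, intent: GripIntent) -> None:
        self.aperture = intent.thumb_aperture

    def sense(self) -> Sensation:
        # Pretend a more closed grip presses harder on the object.
        return Sensation(pressure=1.0 - self.aperture)

# Because the contract is standard, limbs are swappable like USB devices:
hand: BionicLimb = DemoHand()
hand.actuate(GripIntent(thumb_aperture=0.5))
print(hand.sense().pressure)  # -> 0.5
```

The design point is the `Protocol`: the person's implant speaks one interface, and any number of limb designs can sit on the other side of it.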
It allows that person to put their bionic limb next to something and intuitively feel that surface and know where it is, without having to look at it and make a conscious decision. And when you make both reading from the bionic and talking to the bionic an intuitive process, you make that person forget that the limb is robotic. They start to understand that the limb is their own. They identify with it. And additionally, what's super cool is that the person can change that limb at will. It's a USB port. They can clip it off and put on a new one. They can have one for going to the shops and one for making their dinner. What's on that list is limited only by the creativity of the designer or the engineer. So, I'm from Cambridge Bio-Augmentation Systems. In order to create an interface and augment human capability such that people can interact with digital technology internally, you need to define that open standard for the body: the API that allows engineers to talk to the person's natural communication system, their neurons. You allow digital technology, which has done so much for us outside the body, to come inside the body. And that is how you solve care, and how you augment human capability. Thanks very much.