Right, hi. I'm James. [The opening of the talk is unintelligible in this transcription.] I'm going to show you some cool examples of what people have done with it, what my company has done with it, and then I'll give you a quick demo if we have time, slash if everything is charged. So, yeah, let's go for it. So, EEG is electroencephalography. Basically, it's a method that's used to track brain activity by putting electrodes on your head. Specifically, how it works is: your brain has loads of neurons in it, and neurons communicate with each other through axons and dendrites. So, the orange cable that's going between those two neurons is an axon, that's the transmitter, and the little orange bits on the edges of the neuron are the dendrites, the receivers. So, essentially, neurons communicate with other neurons by sending a voltage between each other. When that happens, there's a polarity shift in the two cells. That is what the sensor picks up and reads. So, its normal use is for understanding epileptic seizures, or understanding migraines, or behaviour abnormalities, or brain death, or predicting comas; another use of it is for research. So, in neuroscience, people study the nervous system, or in marketing, my old company actually used EEG as a way to detect whether people were actually focused on an advert they'd just made, which is quite a cool use of it. But I think the really cool use is when people like us get hold of it and hack stuff together. So, this guy, he bought a little toy EEG headset, and he 3D printed a robot hand, and he made this really cool thing. I'll let the video play. I'm really prepared for this.
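Those polarity shifts reach the electrode as tiny, noisy voltage fluctuations sampled many times per second. A minimal sketch of what such a sampled trace looks like; every number here (the 128 Hz sample rate, the 10 Hz alpha rhythm, the noise level) is an illustrative assumption, not a spec of any particular headset:

```python
# Sketch: what an EEG sensor "sees" -- a tiny synthetic voltage trace.
# The sample rate, alpha frequency, and noise level are all assumed
# values for illustration, not taken from any real device.
import math
import random

FS = 128          # samples per second (a common rate for consumer headsets)
ALPHA_HZ = 10     # an alpha-band oscillation

def synth_eeg(seconds=1.0, noise=0.5, seed=42):
    """Return a list of voltage samples: an alpha rhythm plus noise."""
    rng = random.Random(seed)
    n = int(seconds * FS)
    return [math.sin(2 * math.pi * ALPHA_HZ * t / FS) + rng.gauss(0, noise)
            for t in range(n)]

trace = synth_eeg()
print(len(trace))  # 128 samples for one second of "brain activity"
```

Everything the headsets in this talk do starts from windows of samples like this one.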
Easton LaChappelle was 14 when he first started taking apart toasters. Five years on, he's being touted as a global leader in robotics for his range of low-cost, anthropomorphic robotic hands developed in his bedroom. Some can be controlled by a user's mind. A good example: we actually had an amputee use the wireless brainwave headset to control a hand, and he was able to fluently control the robotic hand in about 10 minutes. So, the learning curve is hardly a learning curve anymore. LaChappelle taught himself how to design, make, and code his creations. Using a device that picks up on electrical impulses coming from the brain, he can manipulate his robotic hand's fingers. We actually track patterns and try to convert that into movement. So, with this, I'm actually able to change grips based on grip patterns, based on facial gestures, and then use the raw actual brainwaves and focus to close the hand or open the hand. LaChappelle's robotics aren't the first to be controlled by brainwave frequencies. Scientists in Austria fitted a truck driver with something similar in 2010, but that's not where the magic ends. 3D printing allows you to create something that's human-like, something that's extremely customized, again, for a very low cost, which, for certain applications such as prosthetics, is a pretty big part of it. The hands cost as little as $600 to make. LaChappelle wants others to use his work as a platform to create customized versions for themselves. He's made his software open source. That could eventually mean robots being sent in to carry out search and rescue missions, as well as improving the lives of amputees globally. So, that's basically just an entry-level tool that he was using to pick up those EEG waves. This next one uses a slightly more advanced EEG headset. I won't let the whole video play. I'll just sum it up, but basically what's happening is they're tracking different ranges of brainwaves. So, there are five.
There's alpha, beta, theta, delta, and gamma. Basically, they're distinguished by the frequency of the brainwave oscillations. So, essentially, each of the five relates to a different brain pattern, and when one is more active than another, it'll activate. So, those circles, they're speakers. It'll activate the speaker so sounds will play. So, it's quite a cool arty piece. So, those are the patterns there. Cool. So, you get a good understanding of that. So, this one was done by the BBC. They took the EEG headset and they basically made it so you could control iPlayer. So, people who struggle to use remote controls, or have problems getting access to their TV if they're just sitting down, can just put on this headset and think. And by focusing or relaxing, depending on how you set it... So, right. So, on the left, basically, you can see it's moving up and down. That's your focus level. And then when you relax your mind, it selects the show you want to watch. So, this is still in R&D. I don't think this is live yet. But this is the kind of innovation that's on the way. So, finally, what we did with it, this is my company: we made a mind-controlled car for Money Supermarket. The campaign was about smarter driving. So, I don't know how many of you know, but car insurers are going to be putting little black boxes in your car in the future. And these boxes are going to track where you're going, how fast you're going, how you're driving, basically. And they determine your premium rates based on that information. So, they wanted to market it in a positive way. And so, this whole idea of rewarding smarter drivers came about. And they wanted to do something really cool, innovative and techy. So, we proposed making a mind-controlled car that all the best drivers got a chance to use. So, I'll just play the video. This is our website.
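The "five brain patterns" are the standard EEG frequency bands. A minimal sketch of how a piece like the speaker installation could decide which band is currently dominant (and so which speaker to fire), using a naive DFT over a short window; the band edges are the usual textbook ranges, and the signal source and speaker hookup are assumptions:

```python
# Sketch: find the dominant EEG band in a window of samples, the way
# the art piece decides which speaker to activate. Band edges are the
# standard textbook ranges; everything else is illustrative.
import cmath
import math

BANDS = {            # frequency ranges in Hz
    "delta": (0.5, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta":  (13, 30),
    "gamma": (30, 45),
}

def dft_power(samples, fs):
    """Naive DFT: power at each frequency bin (fine for short windows)."""
    n = len(samples)
    return {k * fs / n: abs(sum(s * cmath.exp(-2j * math.pi * k * i / n)
                                for i, s in enumerate(samples))) ** 2
            for k in range(n // 2)}

def dominant_band(samples, fs=128):
    """Sum power inside each band and return the strongest band's name."""
    power = dft_power(samples, fs)
    totals = {name: sum(p for f, p in power.items() if lo <= f < hi)
              for name, (lo, hi) in BANDS.items()}
    return max(totals, key=totals.get)

# A pure 10 Hz sine should register as alpha.
alpha_wave = [math.sin(2 * math.pi * 10 * i / 128) for i in range(128)]
print(dominant_band(alpha_wave))  # alpha
```

A real installation would use an FFT library and smooth the band powers over time, but the decision rule is the same shape.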
This is a unique UK-first driving experience where we give people the opportunity to drive a real car live around the track just using their minds. We've got a highly advanced EEG headset that has 16 sensor pads that can measure brain activity. We then have software that can take that different brain activity and assign it to certain functions. Our software is then programmed to send a radio-frequency signal to a robot inside the car, turning the wheel and pressing the accelerator and the brake. I'm now going to demonstrate this working. So, I'm now going to think left. As you can see, the wheels turned left. I'm now going to think right. The car is now turning right. I'm involved with Money Supermarket and the Car Insurance Epic Mind Drive because this is all about reminding drivers about the power of thought and about using their heads while driving. Obviously, be safe behind the wheel. Use your brain at all times. This is all about using the power of thought, and all of my life I've been a strong advocate of using your brain as much as you possibly can. This really is the ultimate test drive for the brains of Britain. We're going to have people controlling cars with their minds. This is epic. So, yeah, that was really epic. So, yeah, that was last year and that was Carol Vorderman. She was actually quite good at it, but she didn't like it very much. I was the one who taught her how to use it, and I had to put this device on her head. She wasn't a big fan when I asked her to remove her metal hair extensions, because they were interfering with the signal. I don't know how much time we have left, because I know we started late. I'm going to try and give a quick demo now. Of course, everything is going to work. I'll just talk you through roughly how things worked for the mind-controlled car.
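The pipeline just described (headset measures activity, software assigns it to a function, a radio link drives the robot on the pedals and wheel) ends with turning a classified command into something transmittable. A sketch of that final dispatch step; the byte encoding, start byte, and checksum are invented placeholders, since the talk doesn't give the real radio protocol:

```python
# Sketch of the command-dispatch step: once the software has decided
# "left", "right", or "forward", build a frame for the robot in the car.
# The byte values, start byte, and checksum are assumed placeholders --
# the real rig's radio-frequency protocol isn't described in the talk.
COMMANDS = {
    "left":    b"\x01",  # turn the steering wheel left  (assumed encoding)
    "right":   b"\x02",  # turn the steering wheel right
    "forward": b"\x03",  # press the accelerator
    "neutral": b"\x00",  # release everything / brake
}

def frame_for(command: str) -> bytes:
    """Build the frame to transmit for a classified mental command.

    Unknown commands fall back to neutral, so the car never keeps
    acting on a stale thought.
    """
    payload = COMMANDS.get(command, COMMANDS["neutral"])
    checksum = bytes([sum(payload) % 256])
    return b"\xAA" + payload + checksum   # 0xAA = assumed start byte

print(frame_for("left").hex())  # aa0101
```

The fail-safe default to neutral matters in a setup like this: if the classifier hesitates, the car should coast, not keep turning.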
This is sort of a similar-ish demo to that, in the sense that basically what I had to do was: you put this headset on, you have to calibrate it, which you're going to see me doing in a minute, and once it's got... two of them... oh, my God, right, everything's going wrong with this talk. Right, depending on how well my computer behaves, I might just do the demo afterwards if it's going to be difficult, but basically the way this works is you have to train the software to understand what a left signal means or what a right signal means to you. What it does is I say to you, think left, and you then in your head think left, whatever that is to you, and I then press a record button, and this software records that those patterns of electrical signals happening in your brain equal left. Then I do the same for right, I do the same for forward, and so then what you've got is patterns. This software basically recognises the patterns. Then, once it detects left, right or forward, it sends a radio-frequency signal to a robot inside the car, which then controls the steering wheel and controls the accelerator and the brake. Oh, wow, it's working. I'm surprised, because normally when I put this headset on... Those green bits there mean that every node on my head is connected and is getting a signal from my brain. I've never had it where I've put it on my head and it's just worked. Great. Ideal, right. When we actually built it for Money Supermarket, this is the example software. We didn't code this. This is the sample software that comes from Emotiv, but we used the SDK that came with it to build the similar functionality that you're about to see into essentially our own app, which then worked with the radio-frequency controllers and the robot and so on. I'm going to add some actions here that I can train. First of all, I've got to train a neutral state so it understands what my...
It starts from neutral, then from there, if I have any active thoughts, it records them, so it's got to have a neutral starting point. I'm going to tell the software what my neutral is so it can then learn other actions from there. It's now accepted my neutral. Now I'm going to train it to understand what left means, and we'll go from there. I'm going to cheat this to make it a bit faster. Normally, with this kind of thing, you'd have to train it for hours to understand what left is, because to pick up the signal, it's not as simple as thinking the word left and then suddenly it just works. It's much stronger when it comes to understanding muscle movements, so if I was to clench my fist, because there's a strong signal having to go to my arm, it picks that up much more clearly. Equally, it's also really good at picking up facial expressions, so I think what I'm going to do to make this a bit quicker is I'll just pull a silly face and then it'll pick that up. That's a very strong signal for the sensors to understand, so hopefully it'll speed the process up. Try to do left. Cool. Left. Go left. Do it. Okay, let's try again. I'm going to record it. This is all on video as well. Great. Catching at my finest. Okay, so, neutral. I've got mind control. Okay, cool. How much time have we got left? Five minutes. I think so. Let's close it. Okay, cool. All righty. Well, listen, that's actually the end of the talk, but yeah, I'm going to stick around a bit longer after this, off the stage, and let the next person set up. If anyone wants to have a go, put it on, or ask me questions, feel free to just grab me, and yeah, I hope you enjoyed the talk.
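The train-then-recognise loop just demonstrated (record a neutral baseline, press record for each action, then match live signals against the stored patterns) can be sketched as a nearest-centroid classifier. Commercial SDKs use far richer features and models, so this is only an illustrative toy, with simple per-channel mean power standing in for the real features:

```python
# Sketch of the train-then-recognise workflow: record a few labelled
# windows per mental state, average them into a centroid, and classify
# new windows by nearest centroid. Real systems use far richer features;
# per-channel mean power here is just an illustrative stand-in.
import math

def features(window):
    """One feature per channel: mean signal power over the window."""
    return [sum(s * s for s in ch) / len(ch) for ch in window]

class MindTrainer:
    def __init__(self):
        self.centroids = {}   # state name -> averaged feature vector

    def record(self, state, windows):
        """'Press record' for a state: average its training windows."""
        feats = [features(w) for w in windows]
        self.centroids[state] = [sum(c) / len(c) for c in zip(*feats)]

    def classify(self, window):
        """Return the trained state whose centroid is closest."""
        f = features(window)
        return min(self.centroids,
                   key=lambda s: math.dist(f, self.centroids[s]))

# Toy 2-channel data: "left" = strong channel 0, "right" = strong channel 1.
trainer = MindTrainer()
trainer.record("left",  [[[3, 3, 3], [0, 0, 0]]] * 3)
trainer.record("right", [[[0, 0, 0], [3, 3, 3]]] * 3)
print(trainer.classify([[2.5, 3.1, 2.8], [0.1, 0.0, 0.2]]))  # left
```

This also shows why the clenched fist and the silly face are such effective shortcuts: a big muscle artefact pushes the feature vector far from neutral, so the nearest-centroid match becomes trivial.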