Hello, my name is Andreas Klostermann, and I'm going to tell you a lot about brainwaves and how we as Python programmers can use our skills and tools to explore our minds. First, a little medical disclaimer: I'm not qualified to give medical advice on medical conditions or their treatments. So whatever I say about these treatments, take it with a grain of salt and consult a qualified medical doctor if you need advice. First of all, what are brainwaves? As you probably know, your brain cells communicate via electrical signals, among other things, and a side effect of this signalling is that they emit electromagnetic waves. If you sum up the potentials from all the brain cells, you get a kind of summation signal that you can measure on the scalp. But since every brain cell contributes only a little, you can't really isolate single cells. Often you can't even resolve particular brain regions; with medical EEG that is sometimes possible, but it is quite restricted. The signals are extremely weak and noisy, and virtually every facial muscle produces a stronger signal than the brain itself. Relatively early on, in the 1960s, researchers asked whether you can consciously control your brain waves, and they did some experiments on cats. They trained the cats to exhibit certain brain wave patterns, simply by rewarding them for those patterns, and the cats learned to produce them by holding very still. Later, these cats were used in experiments to test epileptic medication, or rather the opposite: medications that elicit epileptic seizures. It turned out that these cats were more tolerant of those medications than other animals, so they were more resistant to epileptic seizures.
Over the decades, several studies have indeed shown an effect for neurofeedback as a therapy for epilepsy, but it's not very common today. Epilepsy is also a very severe condition, so it's not really amenable to us as amateur brain scientists. The more important application of neurofeedback is in the treatment of ADHD, attention deficit hyperactivity disorder, which is characterized by a short attention span and high impulsivity, among other things. The consequences include a higher risk of depression and other conditions, lower academic achievement, worse social skills, and a higher risk of delinquent behavior. This is not a complete discussion of ADHD, because I don't have the time for it and it's a very complicated condition. But the research around ADHD is highly relevant to general mental training, for example the training of mental focus. There's one part of the brain which is especially relevant, responsible for the executive functions of the brain, for impulse control and for all sorts of positive traits, and that is the prefrontal cortex, which sits right at the front, as the name implies. It is also one of the most distinctively human brain areas, because other animals don't have such a developed prefrontal cortex, and that makes us pretty human. If you've ever watched a dog, you know you can't really hold its attention very long, except with very good training. It turns out that all sorts of activities have been shown to train the prefrontal cortex. For example, meditation in the form of awareness meditation, which was propagated mainly by Buddhism; in psychology it is often called mindfulness training or mindfulness-based therapy.
In the most basic form, you concentrate on the sensations of your breathing and try to keep your focus on them. But after a few seconds, especially as a beginner, you just drift away and think of something else, or your mental voice goes on and your concentration is gone. Then you bring it back again when you notice, which can be minutes later. So it's a training that is very simple to describe but very hard to do. Wouldn't it be nice if there was a signal: hey, you're going off track, you're not focusing anymore? That is basically the idea of neurofeedback training. A computer could give you feedback on your focus state, but it has to have some way of telling whether you're focused or not. That is done with EEG sensors, and a big problem with these devices is that they are often very expensive and not very practical to use. The company Neurosky has developed an ASIC module, a small board with a chip, which does all the amplification and translates the signal into a digital one, along with some preliminary analysis that is relatively simple but works quite nicely. It is used in several devices, about a million or so, the company claims. For example, there are board games, if you ever saw them, where you control physical objects with your brain. They never really took off, and several people said they were sort of bogus, but the chip inside probably does exactly what it's supposed to do. Then there is a very nice device: a headset with cat ears mounted on little motors, and according to your focus state the motors move the ears, which is sort of popular in cosplay. And there is the MindWave headset itself. Several versions of it exist; one is a wireless version where a USB dongle communicates with the headset.
Here I have the mobile version, which uses a Bluetooth connection, which is actually more useful, but they all speak the same protocol, because they all have the same chip. The protocol gives us the raw signal plus attention and meditation values. These values are computed inside the chip; Neurosky doesn't really tell us how, but I think it's based on frequency bands, which I'll get into later. The attention and meditation values lie between 0 and 100. The protocol itself is designed to be extremely hackable and is therefore very concise. For example, if you hook the module up to an Arduino, the Arduino can tell whether the user is paying attention just by reading a few bytes. The protocol also reports frequency band powers; I'll explain later what they mean, but essentially they are values computed inside the chip itself, so an Arduino wouldn't have to do a Fourier transform. Then there is detection of blink events. As I said earlier, facial muscles all have stronger signals than brain cells, so it's nice to know when a blink is going on. And there is poor-signal detection: essentially, when the device isn't on or detects no signal, it's nice to know that too. Now we're moving on to the live demonstration, if it works. The first thing is to make a connection via Bluetooth. I have to excuse myself: before I came into this room I had a connection, and I hoped it would stay that way. But now I have a connection again. My IPython live reveal setup also doesn't work quite as well as expected. Here I have a code sample: a feedback loop, which I'm going to use later for some feedback. I have written a parser that reads the protocol, and the loop here reads data from the Bluetooth socket, feeds it to the parser and then yields the values back to the calling function. The most simple form of feedback is to just print out the last attention value.
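The parser itself isn't shown in the transcript; below is a minimal sketch of how such a ThinkGear-style packet parser could look. The packet layout (two 0xAA sync bytes, a payload length, the payload, and a one's-complement checksum) and the value codes (0x02 poor signal, 0x04 attention, 0x05 meditation, 0x80 raw) follow Neurosky's published ThinkGear serial protocol, but all function names are mine, and `stream` is assumed to be any file-like object with a blocking `read`, such as a wrapper around the Bluetooth socket.

```python
def parse_payload(payload):
    """Yield (name, value) pairs from one ThinkGear payload."""
    i = 0
    while i < len(payload):
        code = payload[i]
        i += 1
        if code >= 0x80:                        # extended row: explicit length byte
            length = payload[i]
            i += 1
            data = payload[i:i + length]
            i += length
            if code == 0x80:                    # raw sample: 16-bit signed, big-endian
                yield "raw", int.from_bytes(data, "big", signed=True)
        else:                                   # single-byte rows
            names = {0x02: "poor_signal", 0x04: "attention", 0x05: "meditation"}
            if code in names:
                yield names[code], payload[i]
            i += 1

def packets(stream):
    """Sync on 0xAA 0xAA, check length and checksum, yield parsed rows."""
    while True:
        byte = stream.read(1)
        if not byte:
            return                              # end of stream
        if byte != b"\xaa" or stream.read(1) != b"\xaa":
            continue                            # resync on the double sync byte
        head = stream.read(1)
        if not head or head[0] > 169:           # payload length sanity check
            continue
        payload = stream.read(head[0])
        check = stream.read(1)
        if check and (~sum(payload)) & 0xFF == check[0]:
            yield from parse_payload(payload)
```

The feedback loop from the demo then reduces to something like `for name, value in packets(sock): if name == "attention": print(value)`.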
If there is an attention value, I print it. The code isn't really fault-tolerant, but I have to move on. So now it's working. The value lies between 0 and 100, and getting a good signal is somewhat harder when you're standing up. The first time I gave this talk I wondered why I had such a bad signal, but it turned out that the whole time I was practicing for the talk I was sitting down, and then at the presentation I stood up and wondered why I didn't get a significant signal. The problem is that when I'm standing up my facial muscles are a bit more tense, and the brain is in a different state: it has to keep the body upright and balanced and so on. Biology sometimes confuses things. And reveal.js is also acting up... Do you think I'm frightened? I'm not afraid of you. With live code you can also mess up the code. This is live and animated now. The way I control the attention value is by a sort of mindfulness meditation: I concentrate on my breathing, I try to consciously flip this focus lever, and I also concentrate on the area below my eyes, which seems to work reasonably well. The Neurosky module also computes a meditation value. I won't comment on whether it's really meditation; what works for me is to think of something cute or adorable, a smiling baby or a baby animal, and that mostly works. Then there is something called brain-computer interfacing, which means using this EEG device to control something else. At the most basic level you would want a button. But the problem with both of these values is that you need not only a way to consciously push the button, but also a way to avoid pushing it, and that is much harder to do. Now I'm going to show you the raw data. In the same manner as before, we had a few frames that were relatively good. Now I'll show you what happens when I move my eyes; as you can see, every muscle is stronger than the brain signal. Now I'm going to clench my teeth.
And I can abuse this signal to do a sort of brain-computer interfacing, but it's not really brain, it's muscle. That would normally be EMG, an electromyogram, and you would use other sensors, but this works too. I do it by processing the raw signal: I take about a quarter of a second worth of data, compute the first-order difference, take the absolute values and then the mean. So in this case I'm getting... what happens now? The Bluetooth is down again. That's actually a good sign, because afterwards it usually makes a connection. If you wanted to program a real consumer-friendly app, automating this kind of connection handling would be high on the list. I have two demonstrations left. How much time is left? Okay, then I'll make one more try to get it working. I essentially need to wait for the blue LED to stop blinking... and now it works. At the top is the computed value, and in this example I have a cutoff of 30, so I can more or less consciously, mostly consciously, turn the switch on or off. In more sophisticated applications you would use machine learning, especially if you really wanted to use EEG data rather than EMG data. Also, this is a one-channel device, it has only one channel of measurement, and brain-computer interfacing usually requires more data. But you can do some of it with just one channel. Now back to frequency bands, for more sophisticated analysis of EEG data. I don't think I explained EEG yet: that's electroencephalography, or the electroencephalogram. The idea behind frequency bands is that the brain waves tell us something about what the brain is doing and how it's doing it, and different frequencies tell different stories about the brain. But the relationship between what the frequencies mean and how you measure them is very complicated, so I'm only giving a very superficial overview.
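The clench-detection metric just described (a quarter second of raw samples, first-order difference, absolute value, mean) fits in a few lines of NumPy. This is a sketch under my own assumptions: 512 Hz is the chip's nominal raw sampling rate, the cutoff of 30 is the one from the demo, and the function names are mine.

```python
import numpy as np

RATE = 512            # nominal raw sampling rate of the chip (assumed)
WINDOW = RATE // 4    # about a quarter of a second of samples
CUTOFF = 30           # empirical threshold from the demo

def muscle_activity(samples):
    """Mean absolute first-order difference over the last window of raw samples."""
    window = np.asarray(samples[-WINDOW:], dtype=float)
    return np.abs(np.diff(window)).mean()

def switch_state(samples):
    """True while the muscle-activity metric exceeds the cutoff (switch 'on')."""
    return muscle_activity(samples) > CUTOFF
```

Clenching produces large sample-to-sample jumps in the raw signal, so the mean absolute difference spikes well above the resting level, which is what makes such a crude metric usable as a switch.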
The band boundaries here I've taken from Wikipedia. Below 4 Hz we have delta waves, associated with deep sleep. From 4 to 7 Hz, theta waves: drowsiness and hypnosis. From 8 to 15 Hz, alpha waves, associated with relaxation and closed eyes. From 16 to 31 Hz, beta waves, associated with attention and wakefulness. In ADHD neurofeedback therapy you would try to up-train beta for attention and down-train theta. There are other ways of detecting attention, but that is one of them. Here I have a code example. There is a library called PyEEG that can compute various things, among them the power in certain frequency bins. I have the frequency boundaries here as a vector, as noted in the documentation. I'm calculating these values from about one second of data, and you can try different lengths of time: on the one hand you want the frequency computation to be accurate, on the other hand you want to give immediate feedback, not feedback after five minutes. Then I'm unpacking this vector into the individual band values and displaying just the beta together with the theta. Now the Bluetooth is acting up again. I'm not going to bring it back up, because this was the last demonstration. Essentially, I can consciously manipulate this value, but it's a bit more tricky than with the attention and meditation values. Now on to one of my favorite topics, which is data science. I designed a little experiment. At home I recorded two sessions: a baseline session and a training session. During the baseline session I tried not to pay a lot of attention, which is really difficult, because paying attention is a very physiological thing and happens all the time, especially unconsciously. So I just took five minutes of that data and counted on being more often distracted. Then I recorded 15 seconds of mindfulness, and I wrote both data sets to files.
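The band-power step above goes through PyEEG's `bin_power` in the demo, which isn't shown in the transcript. As a self-contained stand-in, here is a rough FFT-based equivalent in plain NumPy, using the band boundaries from the talk as half-open intervals; the function name and the 512 Hz sampling rate are my assumptions.

```python
import numpy as np

# Band boundaries from the talk: delta < 4 Hz, theta 4-7, alpha 8-15, beta 16-31
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 16), "beta": (16, 32)}
RATE = 512  # assumed raw sampling rate in Hz

def band_powers(samples, rate=RATE):
    """Summed spectral power per EEG band for one window of raw samples."""
    samples = np.asarray(samples, dtype=float)
    spectrum = np.abs(np.fft.rfft(samples)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)   # bin frequencies in Hz
    return {name: spectrum[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}
```

With one second of data at 512 Hz the frequency resolution is 1 Hz, which illustrates the trade-off mentioned above: longer windows give finer resolution but slower feedback.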
So I just took the Bluetooth data and dumped it into a file, and later I can parse it again for data analysis, which makes things more reproducible. If I have a bug in the parser, and I did have one before, and I correct it later, I can reprocess my data quite easily. Here's the data as a histogram; I've collected many values over time. Especially in the top graph there are two modes, and I think that's a data-quality problem. If I wanted to be more sophisticated about it, I would have to exclude eye blinks and situations where I am not entirely confident in the signal, because moving the device or your head also affects the data. We get a general sense that the bottom graph is shifted a bit to the right, but it's hard to tell. We could do a t-test, but as always, Bayesians know better. There is a toolkit on GitHub called BEST which does that, with PyMC as the Markov chain Monte Carlo engine. So: import best and best.plot, and then I just sample the model, which I've already done, and then I can show this graph. I don't want to go into too much detail, but essentially these two histograms show the posterior belief of the model about where the mean values should be. The 95% high-density intervals, Bayesians often don't like it when you call these confidence intervals, don't overlap here, which is a good thing, but the bottom interval is quite a bit broader. That is because I only had 15 seconds of recording. So, on to further ideas. You could explore brain-computer interfacing some more; as I said, it's very difficult to design workable switches that work between sessions and even between people. Another thing you could try is watching MOOC videos and monitoring when you are getting out of focus. You could make actual games with attention-based game mechanics.
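The record-and-replay workflow described above, dump the raw Bluetooth bytes to a file, then re-parse them later with a possibly fixed parser, can be sketched as below. `sock_recv` stands for any receive callable (for example a Bluetooth socket's `recv`), and all names here are my own invention.

```python
import time

def record(sock_recv, path, seconds=300):
    """Dump raw bytes from a receive callable into a file for later analysis."""
    deadline = time.time() + seconds
    with open(path, "wb") as f:
        while time.time() < deadline:
            f.write(sock_recv(4096))

def replay(path, chunk_size=4096):
    """Yield the recorded bytes again, chunk by chunk, as if from the socket."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk
```

Because `replay` presents the same byte-chunk interface as the live socket, the same parsing code can run over a recording, which is what makes a later bug fix in the parser cheap to apply to old sessions.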
You could make the game mechanics such that the character jumps when you concentrate, or you could make a shooting game where the jiggle of the gun is controlled by attention: the more attention you are paying, the steadier your aim. And there's bioelectric identification, which is basically making a password out of brain waves: you try to identify a person just by their brain waves. I could recommend several courses, but as I'm already a bit over my time, I won't go into that. I'd like to thank you for your attention. We can do some questions. Yes. So first of all, thanks very much, Andreas, especially for his courage in doing live demos, which is worth an extra plus. Everybody with a question, please line up; there are two microphones, one on each side. If I want to start doing this right now, doing what you're doing on stage, is the Neurosky thing what I should get, or where do I start? There are several devices out there, but the Neurosky MindWave is probably the cheapest, and from what I have seen, though I don't own the other devices, it is really one of the most useful. It has a dry sensor, so you don't need gel or moisture to make it work, and it just needs a triple-A battery, which is good for safety, and it's wireless. I think the MindWave Mobile is the way to go. They also have a more comfortable headset at about double the price; if you wanted to use it while doing sports or even sleeping, that would probably be the better idea. How much does that one cost? I don't know off the top of my head, but it's at least double the price: this one costs about 110 euros and the other one about 200 or 300. I think the next question goes here. The question I had was about resolution, trying to get better detail: would a second sensor work? If you took the Neurosky one and effectively had a separate sensor on a separate part of the head...
Would that help, or is more needed? Essentially, to get more channels with these devices... I don't think you can really mount several of them on your head, that would look kind of stupid. But they offer these modules as developer kits, so if you are a hardware hacker you might be able to hook something up. It's complicated. So fundamentally, different channels would mean looking at different parts of the brain? Yes, you would get more spatial information, but even that is a bit restricted. Okay, thank you. I've got two questions. First: how much data per second or minute, in bytes, does this device produce, roughly? I don't know off the top of my head, but perhaps 5 or 10 kilobytes per second. So we're talking kilobytes? I think so, but that's mostly the raw values; you can discard most of it if you don't want the raw signal. And have you logged your brain waves while drinking beer? Not yet. I have a question: would it be possible to trigger different switches with different feelings? That is certainly a machine learning question, and I don't even know if you can develop a reliable switch with just one channel; that requires more work. It's really a classification problem, and in particular you must avoid overtraining on the conditions you are actually using. There's a danger of overfitting on your current thoughts: you might try to distinguish between two different thoughts, and then some other thought triggers the same response, and you don't know that beforehand. It's really tough, I think. Thank you. Thanks again, Andreas Klostermann, for his talk.