Okay, we'll do a quick stream; we haven't done one of those for a while. So, we've been playing around with this tool. I actually used GitHub Copilot for this. We used GitHub Copilot for this, and as you can imagine, it generates music out of hand gestures. There are all sorts of issues with it that we are trying to resolve. This time there was no back-and-forth copy-pasting with ChatGPT; only GitHub Copilot was used for this, so it should be better.

I don't know why, but look at the screen: it keeps detecting a hand on my neck, so I'll have to change its sensitivity. We're already at 250 lines of code and it's working pretty well. Let me explain it quickly; it should be on the site later today or tomorrow for you to play with. Essentially, there are musical notes mapped to the left and right hand. We still have to modify this mapping because it's pretty annoying as it is on the monitor. Moving this way goes from low frequency to high frequency, and this direction adjusts the amplitude. The other hand is mapped to a different range of notes, again with volume up and down. And yes, you can play both of them simultaneously; essentially the idea was to have two different musical instruments mapped to each hand. So we'll see. Actually, let's try that. Didn't quite work.

The other thing it does: if you keep your palm closed it will do this short burst, and if you keep it open it will play longer notes. Same on the other hand. You can make it lower this way. There's something with the motion as well, but that mapping is a bit elusive at the moment; I have to keep testing it, probably during live streams from now on. Let me know what you think. And yes, it will be available; it's currently called Cheshire Groove and it will be on the site later today or tomorrow. Yeah, I don't know why it keeps detecting a hand on my neck all the time. Those errors from before were asking to adjust the sensitivity and the lag.
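Since the tool is JavaScript, the mapping described above could be sketched roughly like this. This is a minimal illustration, not the actual Cheshire Groove source: the scale contents, function names, and coordinate conventions are assumptions (MediaPipe-style landmarks with x and y normalized to [0, 1], y growing downward).

```javascript
// Sketch of the gesture-to-note mapping described above (illustrative,
// not the actual Cheshire Groove code).

const LEFT_SCALE  = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"];
const RIGHT_SCALE = ["C5", "D5", "E5", "F5", "G5", "A5", "B5", "C6"];

// Equal temperament, A4 = 440 Hz. SEMITONE holds each letter's
// distance in semitones from A within the same octave.
const SEMITONE = { C: -9, D: -7, E: -5, F: -4, G: -2, A: 0, B: 2 };
function freqOf(note) {
  const octave = parseInt(note.slice(1), 10);
  return 440 * Math.pow(2, (SEMITONE[note[0]] + 12 * (octave - 4)) / 12);
}

// Horizontal position picks a note: left of frame = low frequency.
function noteFromX(x, scale) {
  const i = Math.min(scale.length - 1, Math.floor(x * scale.length));
  return scale[i];
}

// Vertical position sets the amplitude: higher hand = louder
// (y grows downward, so invert it).
function gainFromY(y) {
  return Math.min(1, Math.max(0, 1 - y));
}
```

With something like this, each hand gets its own octave range, which is why the two hands can play simultaneously as two instruments.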
Yeah, GitHub Copilot struggled a little bit; it doesn't always have all the context. It gave us this one line that was obviously problematic, and we had to remove it. You still need to play with the attack, decay, sustain and release of the synthesizer for each note. But yes, there is a note length being controlled elsewhere, both by the palm being open or closed and by the speed of motion of your hand on the screen. And this is obviously related to what we already have live, the EEG-to-music conversion; we're using the same protocols. We were trying to create some Christmas chimes with it. It didn't work all that well, though it actually sounds more Christmassy on some of the channels. It sounds a bit down, less festive, on channel 7; I think channel 14, from memory, sounded more Christmas-like. That's the... no, sorry, that's the other tool doing the head gestures; I should probably turn it off. There will be not much development today, just a quick rundown of what's going on. Although if I turn it off, my video will also turn off, because I'm using the browser for the video monitoring.

Obviously there should be some sort of volume control and other controls later. And yeah, I have no idea how to fix that hand detection, so let's quickly ask GitHub Copilot: are you able to solve false detections of hands? I don't know why, but the microphone works differently to the one on the OpenAI website, and there is no read-aloud, and I'm hopeless at reading. But yes, it keeps referring to the model complexity, detection confidence and tracking confidence, so I'll have to play with those to see what these magic numbers actually mean. It was suggesting increasing min_detection_confidence and min_tracking_confidence. The first parameter controls the minimum confidence score for the initial hand detection; by increasing the values you make the model more conservative. Those were probably set too low, so let's bring them back up to 0.5. That's for the hands.
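The attack/decay/sustain/release shaping mentioned above can be written as a pure envelope function of time. In the actual synth this would be scheduled on a Web Audio GainNode (e.g. with ramp calls), but the curve is the same; the parameter names and values here are illustrative, and `holdTime` (how long the note is held before release, the thing the palm gesture would control) is assumed to be at least `attack + decay`.

```javascript
// Sketch: an ADSR amplitude envelope as a pure function of time t
// (seconds since note-on). Returns a gain in [0, 1].
function adsr(t, { attack, decay, sustain, release, holdTime }) {
  if (t < 0) return 0;
  if (t < attack) return t / attack;          // rise 0 -> 1
  if (t < attack + decay) {                   // fall 1 -> sustain
    const k = (t - attack) / decay;
    return 1 + k * (sustain - 1);
  }
  if (t < holdTime) return sustain;           // hold at sustain level
  if (t < holdTime + release) {               // fall sustain -> 0
    const k = (t - holdTime) / release;
    return sustain * (1 - k);
  }
  return 0;                                   // note finished
}
```

A closed palm would then map to a short `holdTime` (the burst), an open palm to a long one.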
We should have fewer of those false detections now. Yeah, max number of faces; I'm not actually using the face anymore, it's only the hands. It seems like that false detection went away, which is good, and it's still working. That other hand is annoying. With the palm open the notes should be longer; it's calculating the distance between... so those should be longer notes. Obviously this will need more work, but it's less sensitive now, which is good: we're not getting false hand detections on my face anymore. But yeah, I'll have to watch that number. The maximum number of hands is 2. And what does the model complexity parameter do? What is it? A higher value corresponds to a more complex model that can potentially provide more accurate results at the cost of increased computational resources. I think the complexity is already fine; we're not dropping frames in the live stream. Yeah, we're getting errors, so might as well bring it back to 1. It seems to work anyway. Yeah, that was a false detection of the other hand.

I still don't know what this is. Not the error messages, but the warnings: there's something with the browser trying to protect the user from notes being played straight away. Let's see what GitHub Copilot says. Yeah, it's a feature of the web browser's autoplay policies, which prevents audio from playing before the user interacts with the page. Sure, I'm already calling this on start. No, I'm not; I need to do it on mouse-down, because we're actually using it for testing. And I said there would be no development today. Yeah, a closed hand means a short note, an eighth note. I'm sure this will not do much; it's the model complexity for the hands. I think we have it again. Ah, that's for the pose; that was a false detection. Yeah, we're getting errors when we increase the model complexity above 1, so keep it at 1. Hopefully we'll be able to try it out very soon on the site; it's pretty much ready.
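For reference, the settings we converged on above map onto the MediaPipe Hands JavaScript options roughly like this. This is a browser-only sketch, not the tool's actual code: `Hands` comes from the `@mediapipe/hands` package, and the CDN path is the usual jsDelivr one.

```javascript
// Sketch: MediaPipe Hands options matching the values discussed above.
const hands = new Hands({
  locateFile: (f) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${f}`,
});
hands.setOptions({
  maxNumHands: 2,               // one instrument per hand
  modelComplexity: 1,           // values above 1 gave us errors
  minDetectionConfidence: 0.5,  // raise to suppress false hand detections
  minTrackingConfidence: 0.5,
});

// The autoplay warnings: browsers block audio until the user interacts
// with the page, so resume the AudioContext on the first interaction.
const audioCtx = new AudioContext();
document.addEventListener("pointerdown", () => audioCtx.resume(), { once: true });
```

Raising the two confidence thresholds makes detection more conservative, which is what removed the phantom hand on my neck.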
It's all JavaScript. We just need to make the last few updates and release it; obviously for more updates we don't have time right now. The question is which hand is the annoying one. One of them is producing very high-pitched notes that could be dropped down a notch; which hand is producing the higher-frequency notes? The left scale is C4 to C5 and the right is C5 to C6, and a higher octave number means a higher pitch, so C5 is higher than C4. Copilot could suggest a different scale for both hands, one that sounds more pleasant. Right, so we did the same for the left hand. I don't know if this sounds more pleasant or not. Okay, now they overlap to a larger extent; the overlap is supposed to sound more pleasant, more harmonious. Maybe a third note linked to the shape of the mouth or something.

Anyway, this is something we were playing with over the break. I mean, it is biomedical-engineering related, and you can use it for exercising as well, with dancing. Who doesn't love dancing, especially after eating all the Christmas food? Could you suggest how the head position could be used to map an additional musical instrument or something? I think yes: we can translate it to pitch as we do with the X coordinates, or we could use modulation. Yeah, we tried this before; we'd have to download more sound fonts and things, which we were having trouble with last time. Anyway, we'll release it soon for you to play with so you can provide your feedback. We also have a couple of blog posts coming about EEG processing and the types of feature extraction in EEG. So stay tuned, subscribe, like, comment, and I'll see you next time.