Hello, hello, how are you? That is my face, by the way, in case you were wondering why it's put off to the side like that. That's the Python code doing this face mesh, using MediaPipe and NumPy. What is it using NumPy for? It should primarily be MediaPipe. And no, as far as I'm aware it's too heavy to turn into a web application. I wish I could do that, that would be nice, but it would not run in the browser. For anything to be a web application, there's essentially a combination of stuff running on the front end and on the back end on the server. This one is a bit too heavy; I know it because even when I use it myself, I use it through the website, so the traffic goes over the internet, and if I use it too much I start dropping frames in my live stream, which I don't want. You could probably kill my live stream by overusing the website, since I'm using the same machine to host the website and do the live streams. I probably shouldn't be giving you ideas.

Now, converting EEG into sound first is kind of a no-brainer; there are a lot of other tools out there that do that. We want to convert it into music using MIDI libraries, but we're having some trouble loading those. And we don't really know how to code, so we're using ChatGPT for that. I do know what I want; the question is, can anyone deliver? So if you're keen to help, if you don't like ChatGPT and think you can write better code, please do: this is a not-for-profit project. If you'd like to contribute, take your code and try implementing it on the site, turning it into a Flask application. Otherwise we just use GPT, which has proven to sometimes (not actually most of the time) generate decent code.

So in this case we just have this description, as mentioned, and this main.js somewhere; we'll open that one next. We use plot EEG, converting EEG to music. This calls a back end, which we have in our app.py, and that has the convert-EEG-to-music function. We only have these three files, about 300 to 400 lines of code, and GPT-4 is able to keep all of that in context, which is great. The problem is that it's not playing anything. We do have a file; the output MIDI does exist. Let's see if it's being regenerated. I think it was only generated once.

Okay, let's go one by one and check that the convert-EEG-to-music route is being called correctly. I don't think it's being called; let's check with F12 and the Network tab. We reload the page and see all this stuff being loaded: data image, get EEG data. Convert EEG to music doesn't seem to be called. Okay, can you do this for us? We check the network log in the browser: it's not there. In main.js we search for convert EEG to music, and we have it only once, in the function definition. I think this function is actually not being called properly; I don't think it's wired up. Wait, is it wired to any event? We do have play MIDI, which should just play a tone. That doesn't work, and it's not generating any errors either, which is not cool. ChatGPT: "If play MIDI is a button, you might want to wire it to a click event." That's a great suggestion. Okay, it'll work now. Okay, now it's working. Can I hear anything? I should hear a tone for one second. No, actually, it wasn't working: we had the frequency generator working, but not the tone generator. Why is it not just working out of the box? Yep, that's the console output. ChatGPT: "Your code is working correctly in terms of sending MIDI messages, but the issue seems to be with the MIDI output device."
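To make that wiring fix concrete: here is a minimal sketch of a button hooked to a click handler that plays a one-second test tone through the Web Audio API. The button id and the 440 Hz pitch are my assumptions for illustration; the real names and values in main.js may differ.

```js
// Minimal sketch: wire a "play MIDI" button to a click event and play a
// one-second test tone. The id "play-midi" is hypothetical.
const playButton = document.getElementById('play-midi');

playButton.addEventListener('click', () => {
  // Browsers only allow audio to start after a user gesture such as this
  // click, which is another reason the button needs a click handler.
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.type = 'sine';
  osc.frequency.value = 440;      // assumed test pitch (A4)
  osc.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + 1);  // stop after one second
});
```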
The console shows "MIDI Through" as the available MIDI output device. ChatGPT: "This is typically a virtual MIDI port used for routing signals between software applications and doesn't produce sound on its own. To hear the MIDI files, you'll need a MIDI synthesizer or a software application that can receive MIDI messages." Understood; so, the JS initialization. One thing is for sure: we do not have a physical MIDI device. Let's go with the first option. Did it give me fake links again? I hope not. Can I get a specific guide, not a general guide? Check if this link actually... okay, it has something on it.

Okay, so in the HTML we have a bunch of stuff in there. Now in the JavaScript we have an audio context already, and the constants section with the piano, and then the MIDI bit; we don't need any of that, do we? I'm happy to try; let's comment that out for a sec. The piano, though: will this work at all? It's a new thing. Yeah, what's this? An error: what, is it supposed to be a global object? We need to define it here somewhere. The browser should be compatible. Yes, there was a problem with this bit as well. Hey, we are in business. Still loading, loading... right, it was 1.5 megabytes, the acoustic grand piano. That's quite a bit, but it works (this piano loading is sketched below), so we shouldn't be complaining, should we? Okay, so we are in business.

What we want to do now is display. So this was just a note test; the MIDI file is being stored somewhere but it's not actually loading. Now, can we make sure the EEG-into-music function is actually being used, and that we can play it whenever any of the input values on the front panel change? We have this one; it's working, I don't know if you can hear it. It's too soft, but that's okay, better than too loud. Yeah, we need to integrate the EEG-to-music conversion. We have the function, and we've shown the function works okay, but we need to make sure we're actually integrating it properly. And then the display-EEG button should be renamed to test piano.

Do we need the MIDI file at all, or can we just play straight in the browser? Okay: instead of converting into a MIDI file, convert into a series of musical notes and return the series of note frequencies as the JSON response from the convert-EEG-to-music route. That sounds good. We're already doing this, convert EEG to music, but it's saving a file; we need to change that. Then we have a play-sequence function on the JavaScript side. Okay, it looks legit, but how do we change the Python code? ChatGPT: "EEG data is assumed to be a list of frequency-like values. These values are scaled and mapped to musical notes using linear interpolation. The list of corresponding musical notes is returned as a JSON response, which can be easily consumed by your front-end JavaScript. Please adjust the EEG data processing part (eeg_frequencies) to wherever your actual EEG data is retrieved. This function assumes a simple scaling; your actual EEG data might require a more complex mapping to musical notes." GPT-4 is stuck. No, it's alive again; it's alive!

Let's see, are we able to integrate these changes or not? Okay: a function to handle the sequence of notes. Our function is called something else; it fetches a URL, it's fetching JSON, it's already doing that. Is the response okay? If not, throw new Error; otherwise response.json(). Okay, this bit: the notes, then play sequence. Create a function to play the sequence when the sliders move. Of these event listeners we have three at the moment, but we don't have one for the data sliders, so we can add one here (see the sketch below). Just double-check that we don't have this already.
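On the piano loading mentioned above: the stream doesn't name the library, but a roughly 1.5 MB acoustic_grand_piano download is characteristic of the MIDI.js soundfonts, which the soundfont-player library can load. A sketch under that assumption; the actual library used on the site may differ.

```js
// Assumes soundfont-player is loaded via a <script> tag, exposing a global
// `Soundfont`. This is my guess at the library, not confirmed from the video.
const audioCtx = new AudioContext();

Soundfont.instrument(audioCtx, 'acoustic_grand_piano').then((piano) => {
  // By this point the ~1.5 MB of piano samples is downloaded and decoded,
  // which is why the first load takes a moment.
  piano.play('C4');                              // middle C, now
  piano.play('E4', audioCtx.currentTime + 0.5);  // E, half a second later
});
```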
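And here is one way the front-end half could look once the route returns note frequencies as JSON. The route path, the response shape, and the slider selector are all assumptions, since the exact names in main.js aren't shown on stream.

```js
// Hypothetical names throughout; adjust to the real route and element ids.
async function convertAndPlay() {
  const response = await fetch('/convert_eeg_to_music');
  if (!response.ok) {
    throw new Error(`convert_eeg_to_music failed: ${response.status}`);
  }
  const data = await response.json(); // assumed shape: {"notes": [440, 494, ...]}
  playSequence(data.notes);
}

// Play each frequency for 0.3 s, back to back, with plain Web Audio.
function playSequence(frequencies) {
  const ctx = new AudioContext();
  frequencies.forEach((freq, i) => {
    const osc = ctx.createOscillator();
    osc.frequency.value = freq;
    osc.connect(ctx.destination);
    osc.start(ctx.currentTime + i * 0.3);
    osc.stop(ctx.currentTime + (i + 1) * 0.3);
  });
}

// The missing listener for the data sliders, alongside the three existing
// listeners; the "data-slider" class is an assumption.
document.querySelectorAll('input.data-slider').forEach((slider) => {
  slider.addEventListener('input', convertAndPlay);
});
```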
In updateData we should have a function for this; that's existing code, and we need to keep all the existing code. Use the WebMIDI.js code? We'll do that later. Let's reload the page. So the note test is working, okay. Let's check for errors: we have seven errors. Any suggestions from ChatGPT on how to fix this? "Once you confirm that the static response is working correctly, you can proceed to implement the actual EEG data processing logic." It's always the technical issues that we're having trouble with; we seem to have hit some sort of ceiling with this code. We might start a new ChatGPT chat to try and solve this one. Essentially the problem is that the JSON is not transferring the musical notes as it should, so we'll try debugging that later (a quick first check is sketched at the end).

Let me know if you have any questions; put your comments down below. Anything meaningful: suggestions, complaints, future directions, collaboration requests, anything like that. Put them in the comments or find me on LinkedIn. And I'll see you next time. Bye.
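For anyone who wants to pick up that JSON debugging before next time: a first step could be to dump the raw response from the route in the browser console, to see whether the notes are missing server-side or being lost during parsing. The route path is again my assumption.

```js
// Log the raw body of the (assumed) route before any JSON parsing, so
// malformed or empty responses are still visible in the F12 console.
fetch('/convert_eeg_to_music')
  .then((r) => r.text())
  .then((body) => console.log('raw response:', body))
  .catch((err) => console.error('fetch failed:', err));
```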