We should be live now, running those two MediaPipe pipelines, so the CPU is doing extra, extra work. We want to change the avatar to an updated version: we want to add the body to it, add the pose. Now create a Pose and run it alongside the FaceMesh, so mp_face_mesh with max_num_faces=1, that's okay, and a minimum detection confidence, which is probably why it sometimes detects my ear as a hand. I can see it on the screen, but let's add the pose, so we have hands as hands. Okay, we've got to change this; just regenerate the whole code so it works with no errors. While it's generating, I just posted the link to the stream in case someone wants to join in, put it on Twitter. I don't have much going on on Twitter; I don't have much going on on LinkedIn either. It's still generating. So this code is much longer; it has a pose as well, landmarks. I don't know what's happening with the colour; stop this for a sec. We also need the name: this one is called the MediaPipe avatar. Wait, did we take the code? Yeah, this one is the second avatar; I have to change that. Just check that it's using the right camera. It's using index zero; no, it should not be zero, it should be two. Did I set two to begin with? No, it was zero. Okay, now I'm confused. The avatar is using zero, so it should be the second camera. That's my body, next to my face. Select window, MediaPipe avatar two, share. It doesn't have to be mirrored either, but it doesn't matter; we're still mirrored, so it looks like they're talking to one another. Yeah, it could be good to have a neck; it can do a line dance. It's the elbows, it's the elbows in the frame. Should we do just hand detection alone? Because it's obviously trying to detect the face and body in that image. It could be nice, though, to detect the whole lot. That means repositioning the camera again. So far, so good.
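As a rough sketch of the setup being described here, this is one way to run FaceMesh, Hands, and Pose together on every webcam frame. The camera index, confidence values, and function name are my assumptions, not the code generated on stream:

```python
# Rough sketch: FaceMesh + Hands + Pose on one webcam feed.
# Camera index, confidence values, and names are assumptions.

CAMERA_INDEX = 0  # the stream switched between index 0 and 2 for its two cameras

# Keyword arguments for each MediaPipe solution, kept as plain dicts
# so they are easy to tweak.
FACE_KWARGS = {"max_num_faces": 1, "min_detection_confidence": 0.7}
HAND_KWARGS = {"max_num_hands": 2, "min_detection_confidence": 0.7}
POSE_KWARGS = {"min_detection_confidence": 0.7}


def run_avatar():
    """Run all three models on every frame (needs opencv-python and mediapipe)."""
    import cv2
    import mediapipe as mp

    face = mp.solutions.face_mesh.FaceMesh(**FACE_KWARGS)
    hands = mp.solutions.hands.Hands(**HAND_KWARGS)
    pose = mp.solutions.pose.Pose(**POSE_KWARGS)

    cap = cv2.VideoCapture(CAMERA_INDEX)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        # Three models per frame instead of one -- this is the extra
        # CPU cost complained about above.
        face_results = face.process(rgb)
        hand_results = hands.process(rgb)
        pose_results = pose.process(rgb)
        # ...draw the returned landmarks onto a blank canvas to build
        # the avatar, then cv2.imshow() the canvas...
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
```

Raising `min_detection_confidence` is one plausible way to reduce the ear-detected-as-a-hand misfires mentioned above, at the cost of more dropped detections.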
I'm trying to improve this avatar, and I'm having lots of trouble doing that. It tried to be flipped; no, it's not flipped anymore. It's also a bit jumpy when I move, but it can detect my face sideways. This camera is meant to have a wide angle; is that actually being used? Let's see what the bot says. I don't actually know the height and width, and I don't want to check, because it will probably stop working altogether. Do we look upset at the moment? No. It's a good question. Yeah, we might move on for now.

We're actually turning EEG into music, if anyone is interested, and we already had something working, with a fairly complicated file structure for it. Wait, I'll have to close this one so I can actually use the visuals. A bit less CPU is being used now, which is great. Just move my face a bit. This one has only hands; it isn't trying to add the body and other things, ears, whatever. But so far it's just gulping power, CPU usage, and it's not much better than the other one.

Right, so we're doing this EEG-to-music, and we have the index.html and JavaScript code for it that we generated yesterday. There are a couple of Python scripts: load_data, filter, the EEG-music conversion, and run.py. This one should display the whole dataset like that, and that's working okay, except it's being a bit too slow for my liking, but we'll deal with that later. The main thing is to get the music going. It's pre-recorded EEG from many years ago, and it's very high quality EEG, so there's no doubt this is actually EEG and not artifact; there should be very few artifacts in this signal, mainly because it was recorded directly from the brain. It's from an implant, recorded directly from the surface of the brain, and it contains an epileptic seizure. It's a 16-channel recording grid: intracranial EEG.

I should mention the source of the data; don't forget to mention the source of the data. It's coming from the same dataset as the spectrogram tool, so we can copy that line from the spectrogram page. Let's do it now, so we do not forget, because we have a bunch of tools on the website using the same dataset. Pop this into index.html. No, that's an old one; we need the one in templates. It should have the source of the data at the bottom somewhere. Yeah, this one here; let's pop it in at the bottom as well and refresh the page.

The next thing for us is sorting out the music conversion bit. I'm actually using the filter, but I don't think it's working; this is filter.py. Should we start a new chat or continue with this one? What do you reckon? Here I had a couple of errors that we were able to sort out. I'm pretty sure this filter_eeg is not actually being used, but how about we try to do the music conversion without any filtering first? Music conversion. It started with some generic stuff; let's start a new chat quickly. Really, half an hour, wasted half an hour. Okay, we have an index.html, we have a JavaScript file, and we have run.py, which is meant to pull it all together. I don't want to fix everything yet; I want to fix the music conversion and then actually use it in run.py. That's what we want to do, so I can stop it generating. In this one, the music conversion — okay, we actually don't have that problem anymore. Wait, let's try this again, because that problem we fixed last time, and that was a simple fix. I want to fix the whole file, the whole code, functionality-wise. It imports the data and converts it into music using mido, which is a library for working with MIDI messages and MIDI files, the music files. Okay, so again, a bunch of changes: time delays between notes, velocity, melody and rhythm, handling extremes in the EEG data, error handling, documentation. What is it doing currently?
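The conversion being described — scale each EEG sample between the recording's minimum and maximum into MIDI note numbers, then write them out with mido — could be sketched roughly like this. The function names, note range, and output path are my assumptions, not the actual script:

```python
# Hedged sketch of the EEG -> MIDI step: min/max scaling into note numbers.
# Names, note range, and paths are assumptions, not the real run.py code.

def eeg_to_midi_note(sample, lo, hi, note_lo=36, note_hi=96):
    """Linearly map one EEG sample from [lo, hi] into a MIDI note number."""
    if hi == lo:  # flat signal: avoid division by zero
        return (note_lo + note_hi) // 2
    frac = (sample - lo) / (hi - lo)
    return round(note_lo + frac * (note_hi - note_lo))


def eeg_to_notes(samples):
    """Scale a whole channel using its own min and max."""
    lo, hi = min(samples), max(samples)
    return [eeg_to_midi_note(s, lo, hi) for s in samples]


def write_midi(notes, path="output.mid"):
    """Write the notes as a single-track MIDI file (requires mido)."""
    from mido import Message, MidiFile, MidiTrack

    mid = MidiFile()
    track = MidiTrack()
    mid.tracks.append(track)
    for note in notes:
        track.append(Message("note_on", note=note, velocity=64, time=0))
        track.append(Message("note_off", note=note, velocity=64, time=120))
    mid.save(path)

# Usage (hypothetical channel data and folder):
# write_midi(eeg_to_notes(channel_samples), "music_output/output.mid")
```

Velocity and the per-note `time` delays are fixed here; the change list above (velocity, rhythm, handling extremes) suggests driving those from the signal as well.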
It's taking the minimum and maximum, generating notes, and saving them into an output file. Test that output file — what is the output file? Okay, so the output file should be in this folder, music_output. How to change the code? And then from that folder I want it played in the HTML. When we say music, that's an overstatement; it should just be called noise. We're just translating EEG into sound; we don't know if it will sound like music or not, probably not, but we want to see if seizures might sound more like some sort of drumming, maybe. Okay, so we have convert_eeg_to_music, output file: None. We already have that over there; this is the folder. Okay, I get it: the default name is None, everything else is the same. Do I need to use it in run.py? Okay, in run.py: Flask, os, the music conversion — we already have it; app static, that's fine. Okay, convert EEG, start, load the data; loading the data again, that's okay. We don't need to do it over there, because the folder is already specified in the convert_eeg function, so we can call convert_eeg_to_music directly.

And how do we play the music in the HTML — how do we change the HTML to play the MIDI file? There's a MIDI JavaScript library. What are the soundfonts? Okay, we need to learn about music now. You can host the files on your server or use a CDN. The instrument is set to a default instrument, such as acoustic grand piano, that's nice. So we'd be hosting them on our server. Okay, I need more information on this one, obviously, because I don't have those files. Then you build the files. Okay, this wouldn't work: soundfontdownloads.com, which was apparently a static soundfont site. No luck; there's a 404. Is it just making stuff up, or can it search for this thing? Soundfonts. Okay, you're making stuff up, aren't you: you made up that website you mentioned to me, the links don't work. Okay, now you're actually doing a search with Bing, and this is on archive.org, a soundfonts collection. We'd need about 500 of them, though, probably. What is it, 36 gigabytes? Can I just download one? Someone's already asked about downloading just one; that's not possible, is it? Would I have any installed on my machine? The file names are meant to be .sf2, but these are sfArk files, quite large, some of them. What's an sfArk? It's a compressed soundfont. Okay, that would require another piece of software; I should have guessed that sfArk is for "archive". Can I really use them? Is there a CDN with .sf2 files? I don't like this. Of these other websites, archive.org is okay; just search archive.org. Yeah, one uncompressed. What's this code? It's meant to be an audio file; could I use any file? Use a CDN: a MIDI.js CDN. Okay, well, first we need to change the code; how quickly can we do that? So that's just MIDI.js; I assume that's for playing the file, so we need to add that one and modify the HTML with a play-MIDI button. We already have a play button, but for a different file; this is what we have so far, and this one is playing MP3s. Should we convert to MP3? I mean, MIDI should be more native, more suitable for music, shouldn't it? But finally, a CDN that works: github.io. Okay, can I use this one? Go with this CDN, change the HTML. Okay, we have the MIDI.js library included in the HTML, that's fine. Add a MIDI playback button; where do we want the button? Let's put it over here: an HTML button. Add the MIDI initialization and playback script. But why is it not in a separate JavaScript file? Okay, let's put the script over here. Can we get it here? Yes, but we still have "channel undefined". Why? Nothing happens. Okay, time to move this into the main JavaScript file.

Okay, I've reached the current usage cap for GPT-4, even though we're paying for it. That hasn't happened for a while, and I wasn't really leaning on it that heavily; it just came right after a break, so I'm not sure — it's meant to be a moving window or something, but it's obviously not. There's something weird going on.
The good thing is that I can continue, and the time penalty is only short. It says I can continue with the default model now, or try again after 12:41; it's now 12:37, so it gave me a timeout of about five minutes. Okay, so we take another five-minute break, and we'd like to talk about something else quickly. My face over there for a second. So this is my current understanding of a simple neural network, done in LabVIEW: a block diagram. I used LabVIEW for many years, so that's what I'm used to seeing. Now, interestingly enough, popping it into GPT-4, it's able to explain what it is we are looking at. It says the image I uploaded appears to be a schematic representation of a simple neural network, a neuron model used in machine learning. Yes, we know that. It shows the basic components of a neuron, including inputs and weights — okay, it can just read the text, that's fine — bias, activation function, in this case a sigmoid function; it must have just read that title on the output. By the way, in LabVIEW, by default, a function like that will not have a title, so go figure what this one actually means; that one is also not absolutely clear, but all the rest is very straightforward. Not sure why not just use a simple maths equation instead.

So it gives a brief explanation of each part: you have the inputs to the neural network; the weights, with a general explanation of what weights are; a bias, akin to the intercept in a linear equation, used to adjust the output along with the weighted sum. Okay, and then the sigmoid: this is a type of activation function that is commonly used in neural networks to introduce non-linearity to the model. Okay, it maps the input values to an output range between zero and one, fine. And the output is the final output of the neuron after the weighted inputs have been summed with the bias and passed through the sigmoid activation function. Okay. And it does recognize that this is actually LabVIEW, because it says the layout and visual style suggest it might be created with specific software for educational or simulation purposes, possibly something like LabVIEW. It's not entirely sure, but yes, it is LabVIEW, which is often used for creating simulations of systems and control models. Now, if you've used LabVIEW a lot, you'll be familiar with this, but most people have no clue what LabVIEW even is. So can GPT-4 translate this LabVIEW block diagram into maths, which is far more common? Surprisingly — well, I don't know if it's surprising or not, you let me know in the comments — apparently yes, it can. This seems to be a legit equation: much neater, much nicer, one line. Yes, you need to know what each component is, but there is a quick explanation of how to use the output. The output is represented by the sigmoid activation function, and there is a separate definition of the sigmoid, which is the standard textbook equation of a sigmoid function. This bit represents the weights corresponding to each input, that bit represents the inputs themselves, plus the bias, and n is the number of inputs. So, pretty straightforward. I'm not sure you could get this much information from the LabVIEW diagram above.

It gives a summary as well — oh sorry, it also says that the sum inside the sigmoid function is a dot product (yeah, sounds legit) of the weights and inputs which, when added to the bias term, yields the neuron's total input; the sigmoid function then transforms this input into an output between zero and one, which is the final output of the neuron. That sounds great. One note, though: the sigmoid activation has all sorts of problems, so there are many other activation functions worth looking at.

Continuing further, GPT-4 is also able to translate this into Python, and even though I used LabVIEW for many, many years, this Python looks pretty straightforward — the code is much cleaner, more straightforward. It has the basic equation for the sigmoid, much simpler than the LabVIEW thing. It actually shows you what's inside this box here, what's inside the sigmoid function: it takes the dot product for the weighted sum, adds the bias, and returns the sigmoid of the weighted sum. Then you have your inputs and weights, define them however you want, set your bias, and off you go: you have your output. So really, this equation here is just these three lines of code; the diagram doesn't even show you what's inside the sigmoid function, which is itself another very simple equation. Interestingly, when asking GPT-4 to generate LabVIEW code again, it generates this wicked image that means absolutely nothing — well, maybe it means something to someone else, not me; let me know in the comments. So ChatGPT was able to translate this image into simple maths, which is the gold standard, the most common notation that most people are familiar with. There's nothing overly complicated in this equation: a sum, a dot product, and the sigmoid activation function, which has its own separate equation. And GPT-4 is able to translate that equation into Python in a very straightforward way — probably because there are many examples on GitHub and other websites — but it is not able to translate it back into LabVIEW. I was kind of surprised it was even able to recognize LabVIEW code, which was quite interesting; this example must be sitting somewhere on the web that the image models were trained on as well. So there you go. I might eventually turn something similar into a web application, more likely a Flask web application, so people can learn more about a simple neural network with a single neuron.
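The equation and the "three lines of code" described above are the standard single-neuron model: y = sigmoid(sum of w_i * x_i + b), with the textbook sigmoid(z) = 1 / (1 + e^(-z)). A minimal sketch, with function names assumed rather than taken from GPT-4's actual output:

```python
# Minimal single-neuron sketch: weighted sum of inputs plus bias,
# passed through the textbook sigmoid. Names are assumptions.
import math


def sigmoid(z):
    """Standard logistic sigmoid, mapping any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))


def neuron_output(inputs, weights, bias):
    """Dot product of weights and inputs, plus bias, through the sigmoid."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(weighted_sum)


# Example: two inputs with equal weights and zero bias.
y = neuron_output([1.0, 0.0], [0.5, 0.5], bias=0.0)  # sigmoid(0.5) ≈ 0.62
```

As noted above, the sigmoid has well-known problems (saturation, vanishing gradients), so swapping in another activation here is a one-line change.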
But we can go from there. Right, so back to EEG into music. I'd reached the cap that said try again later; okay, so now it's working again, still on GPT-4, still trying to figure out how that timeout works. Okay, we already have the MIDI.js script, so we can get rid of this script tag here. Where does it go? Let's just see. Oops, sorry, that was wrong; this is the JS. Okay, now we're bombarding GPT-4; before, when I got the timeout, I thought I was pretty light on the pedal. Okay, so the MIDI file, using acoustic grand piano; that often works. I don't actually know about the onsuccess function and the player. Should this MIDI file just be in the static folder? I'm not sure. Copy relative path; there's nothing there at the moment. main.js needs a quick update; it should be the same function, a MIDI player. Yeah, this should be the relative path, because we're already in music, so it would be just like that. It's loading; it's loading forever. Controller 5. Yeah, music.mp3 is not there, that's okay. Where's the data? Right, there's the data, and nothing happens when the user presses the button to change it, plus quite a few other things.

Okay, main.js, the HTML: we have the button, play-midi, document.getElementById, click handler. Let's double-check this one as well; this is the same. So first of all, on loading the page we're having a problem. Remember, we select a channel and the data is actually displayed, but when we press play-EEG-music, nothing happens. Why? This is the music conversion; there's something wrong there. How do we even know the JavaScript is supported? Okay, one main problem is that we don't have anything in music_output: the folder is empty. It's the output; it should have that file in it. Let's do some error handling. Where is it being called? This is where the function is, in run.py; we should have had the file name in there. And why doesn't it just display the thing on loading? There's nothing in music_output, and we were expecting the MP3 to be there, so something is wrong. Can we print those as well? If there's no output file, create one; we're already doing that. Write the track; that's the same. Let's clear this and run it with Ctrl+F5. Yeah, we're getting an error on loading; we have to select the channel, and then it's actually doing something. Okay, pressing this button, nothing happens; it's not actually getting past, it's not being called correctly. Yeah, this bit — we're not doing it properly, are we? We're not calling this convert_eeg_to_music function anywhere, and something is wrong with this code. How to fix it? It's not working; it's not actually running at all. This is a problem with MIDI.js, with the cdnjs MIDI.js. Yeah, this bit — that Cloudflare CDN is not working, none of them are. Are you giving me fake links, GPT-4? Yeah, this one has some uncompressed soundfonts. And MIDI.js: there's nothing there, is there? Okay, there's something, but it's jQuery; it doesn't have the MIDI JavaScript.

Okay, should we convert to MP3 instead? Or WAV — WAV files would probably be more like it. Even if the MIDI works, we'd have trouble playing it in the browser. Okay, I'll try WAV files as well, just something easy to convert the EEG into; it should be easily convertible into a WAV file. Okay, we might continue this next time. See you later; don't forget to check the website.
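The simpler fallback floated at the end — skip MIDI and soundfonts entirely and write the EEG samples straight into a WAV file — can be sketched with only the standard library. File names and the sample rate here are assumptions:

```python
# Hedged sketch: EEG samples -> 16-bit mono WAV, standard library only.
# The output path and sample rate are assumptions, not the stream's code.
import math
import struct
import wave


def eeg_to_wav(samples, path, sample_rate=8000):
    """Normalise samples to [-1, 1] and write them as 16-bit PCM."""
    peak = max(abs(s) for s in samples) or 1.0
    frames = b"".join(
        struct.pack("<h", int(32767 * s / peak)) for s in samples
    )
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)       # mono
        wf.setsampwidth(2)       # 16-bit samples
        wf.setframerate(sample_rate)
        wf.writeframes(frames)


# Example: one second of a 440 Hz tone standing in for an EEG channel.
tone = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]
eeg_to_wav(tone, "output.wav")
```

Unlike MIDI, a WAV file plays natively in every browser with a plain `<audio>` element, which sidesteps the whole MIDI.js/soundfont CDN hunt above.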