We used to do something on speech: audio transcription and speech analysis. I don't know if it still works; it should be using JavaScript. It is transcribing. Sorry for the ads. Well, not sorry, I shouldn't keep apologizing for the ads, but that's how we pay the electricity bill. So the transcription still works. We were thinking of doing more with it. Currently it only does volume metrics and speech rate; if there is interest, we could also extract other parameters of your speech, frequencies and the like. Spectrograms would be nice, and in the future maybe a trained model. I can only train it for myself, but you could potentially train it on your own voice so it recognizes the words you say more accurately. I haven't been monitoring it, so it might be making a lot of mistakes; for example, I didn't say "Monday". So it's mostly working. This one should be all JavaScript as well; I can check. We host everything locally, but if it's JavaScript it just runs in your browser, so I have less control over what it actually does and how well it's working. There is a disclaimer at the bottom: it only works on Chromium-based browsers, so I don't think it will work in Firefox or in whatever you use on Apple devices. You should probably just throw those in the bin, but that's a whole different story. There's a bunch of tools currently listed on the website. Let's open the transcriber in another tab and see if it still works there. What else have we got? It's mainly around time series: EEG brain waves and electroencephalography. Did it get me saying that? No, because I'm probably mispronouncing it. I won't try again. There's the ECG game, which we're currently extending with a robot that plays it using fuzzy logic, and potentially neural networks later as well.
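The two metrics the speech tool currently reports, volume and speech rate, can be sketched in a few lines. The actual tool runs in JavaScript in the browser; this is just an illustrative back-of-the-envelope version in Python, and the function names, frame length, and threshold are all made up for the example, not taken from the real code:

```python
import numpy as np

def frame_rms(signal, frame_len):
    """RMS energy per non-overlapping frame (a simple volume metric)."""
    n = len(signal) // frame_len
    frames = signal[:n * frame_len].reshape(n, frame_len)
    return np.sqrt((frames ** 2).mean(axis=1))

def speech_rate(signal, sr, frame_len=400, threshold=0.1):
    """Very rough bursts-per-second estimate: count rises of the frame
    energy above a threshold (each energy burst ~ one syllable nucleus)."""
    rms = frame_rms(signal, frame_len)
    active = rms > threshold
    onsets = np.sum(active[1:] & ~active[:-1]) + int(active[0])
    duration = len(signal) / sr
    return onsets / duration

# Synthetic check: three 0.25 s tone bursts in 3 s of audio at 8 kHz
sr = 8000
t = np.arange(sr) / sr
burst = np.sin(2 * np.pi * 200 * t[: sr // 4])
sig = np.zeros(3 * sr)
for k in range(3):
    sig[k * sr : k * sr + len(burst)] = burst
print(round(speech_rate(sig, sr), 2))  # → 1.0 (one burst per second)
```

A real speech-rate estimator would smooth the energy envelope and band-limit it first, but the idea is the same: loudness comes from frame energy, rate from counting energy bursts over time.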
And what else have we got? Yes, the IntelliQ EEG signal generator. We had a better version in LabVIEW in the past; this one is more basic. You can select your... this is not updating, which is pretty annoying. 10 is the default; it should tell you what the current value is, but it's not doing that. Then you can load whichever band you want: delta, theta, alpha, beta, gamma. Let's open another tab and push my server a bit, see how well it handles multiple tabs. There's also 1/f ("inverse f") noise, which actually mimics baseline EEG pretty well. This tool is using plotly.js, and you can regenerate the signal just by refreshing the page. I keep getting this Monopoly ad. It's always tempting to use an ad blocker, isn't it? But please don't, because ads are how most creators make ends meet. As I was saying: 1/f noise. Ideally this tool will have more options, but the 1/f noise function mimics baseline EEG quite well. Why is it not updating? It just takes some time; it must be too many data points. The standard noise option is Gaussian white. You also get its frequency response, power over frequency. Aperiodic is fine too, but 1/f more closely resembles baseline EEG. Anyway, what else have we got? The ECG signal generator. This one is quite extensive: it's using NeuroKit2, a Python library, plus Flask and NumPy. It's a Flask application, so it has some sort of back end as well. You can change the blood pressure. Those numbers should update, shouldn't they? That's a bit odd. It does load really quickly, though; this one is pretty nice and quick. You can select your window size. The isoelectric voltage adds a DC component to the signal. And the delays are for some reason greyed out; I don't know why.
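A 1/f noise source like the one in the generator is commonly built by shaping the spectrum of white Gaussian noise and transforming back. This is a sketch of that standard construction, not the tool's actual code; the function name and normalization are my own choices:

```python
import numpy as np

def one_over_f_noise(n, alpha=1.0, rng=None):
    """Generate n samples of 1/f^alpha noise ('pink' noise for alpha=1)
    by scaling the FFT of white Gaussian noise so its power spectrum
    falls off as 1/f^alpha, then inverting the FFT."""
    rng = np.random.default_rng(rng)
    white = rng.standard_normal(n)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                      # avoid division by zero at DC
    spectrum *= 1.0 / freqs ** (alpha / 2)   # amplitude ~ f^(-alpha/2)
    noise = np.fft.irfft(spectrum, n)
    return noise / noise.std()               # normalize to unit variance

x = one_over_f_noise(4096, rng=0)
print(len(x), round(x.std(), 3))  # → 4096 1.0
```

The reason it resembles baseline EEG is that resting EEG power spectra also fall off roughly as 1/f, with the oscillatory bands (delta through gamma) sitting on top of that aperiodic background.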
The millisecond timings for the P, Q, R, S and T waves. This is actually interesting, because it's something we're looking at currently for the bot that plays the cardio quest game. I was making more changes to it yesterday; this is what we currently have. The game is meant for you, a human, assuming you are human. Let me know if you are a human watching this. It's targeted at anyone interested in ECGs who wants to learn about labeling data as normal or abnormal. We've spent quite a lot of time on this one. The game is available for you to play and try out; it's called the ECG game. But that version doesn't have the bot playing it, so you're just competing against yourself. In this version, this is your score (assuming you are human), and then there's the score for the machine, so you can compete against the machine. The machine currently is not doing very well when the noise level is increased; when it's zero, it does perfectly well, at least it did last time. Those are all abnormal ECGs. The bot looks at the number of positive and negative peaks, the amplitude, and the R-peak sharpness: it takes the peak value, goes halfway down the peak, and measures the distance on the left and on the right. It also looks at the PR and RT intervals, essentially the intervals before and after the R peak, and gives you an abnormality score. For the second one, for example, the score was 0.8, and for this one it's 0.5, rounded up. And for the... oops, I clicked on it by mistake. This was a normal ECG, and a normal ECG again, and the abnormality score is 0.1. When you play this game as a human, you should not click on that waveform, because you will get minus 10 points if you do. That was an example.
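The R-peak sharpness measure described above, taking the peak value, going halfway down, and measuring the distance on the left and right, is a width-at-half-height calculation. Here is a minimal sketch of it; the function name and the synthetic waveform are mine, not the game's actual code:

```python
import numpy as np

def half_height_width(signal, peak_idx):
    """Width of a peak at half its height: walk left and right from the
    peak until the signal drops to half the peak value, and return the
    total distance in samples. A smaller width means a sharper R peak."""
    half = signal[peak_idx] / 2.0
    left = peak_idx
    while left > 0 and signal[left] > half:
        left -= 1
    right = peak_idx
    while right < len(signal) - 1 and signal[right] > half:
        right += 1
    return right - left

# Synthetic check: a narrow triangular "R peak" centred at index 50
sig = np.zeros(101)
sig[45:56] = [0, 0.2, 0.4, 0.6, 0.8, 1.0, 0.8, 0.6, 0.4, 0.2, 0]
print(half_height_width(sig, 50))  # → 6
```

Dividing that width by the sampling rate gives the sharpness in seconds, which can then be compared against typical QRS durations to contribute to the abnormality score.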
But the machine, when there is no noise, does extremely well and is able to distinguish between normal, where the abnormality score is very low, and abnormal, where the abnormality score is high. So that's what it does: you label ECG waveforms and benchmark yourself against the robot playing the same game. And the robot is not cheating; it doesn't have the labels. The waveforms are generated in JavaScript, so on the front end, and the fuzzy logic robot runs in Python code on the back end, so it doesn't see the labels. It just looks at the waveform exactly the same way you do. Well, it doesn't have eyes; it just gets the data points. And there is no filtering involved either, so it's not doing anything fancy, literally just looking at those parameters. Let's see what else we've got. Let's cover the ECG feature extractor. That's working pretty well, but there's not much you can do with this tool. The segment range is essentially the window where we look for the peaks, because if you change the segment range, it stuffs up the detection. I'm getting an error for one of the waveforms, which probably means the input doesn't meet any of the fuzzy rules, so the output cannot be calculated, likely because the rule system is too sparse: "Check to make sure this set of input values will activate at least one connected term in each antecedent via the current set of rules." So yes, we need to make sure all the rules are covered. This will be coming from the analyzed waveform. If you haven't checked balickchaos.com, please do so; this is what we are running the streams for. There's a bunch of tools there. Some of them are pretty old and will need some work. It's all prototypes, so it's all under development; some tools are more developed than others.
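That "activate at least one connected term in each antecedent" error is what a fuzzy control system reports when no rule fires for a given input. To show the mechanism, here is a tiny dependency-free fuzzy scorer on a single input; the membership shapes, rule outputs, and input choice (R-peak width) are all made up for illustration and are not the bot's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def abnormality(width):
    """Toy fuzzy scorer on one input (R-peak width at half height, in
    samples). Two rules: sharp peak -> normal (score 0.1), wide peak ->
    abnormal (score 0.9). All numbers here are illustrative."""
    fire_sharp = tri(width, 0, 5, 12)    # degree to which "sharp" fires
    fire_wide = tri(width, 8, 20, 40)    # degree to which "wide" fires
    total = fire_sharp + fire_wide
    if total == 0:
        # No rule activated: the rule base is "too sparse" for this
        # input, the same failure mode the extractor just reported.
        raise ValueError("no rule fires for this input")
    # Weighted-average defuzzification of the two rule outputs
    return (fire_sharp * 0.1 + fire_wide * 0.9) / total

print(round(abnormality(5), 2))   # → 0.1 (sharp peak, looks normal)
print(round(abnormality(20), 2))  # → 0.9 (wide peak, looks abnormal)
```

With a width like 100 samples, neither membership function is non-zero, so the function raises; making sure every reachable input region is covered by at least one rule is exactly the fix mentioned above.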