All right, so you can see May 2020 — I think that was when I started creating the slides, and I think Pazly was asking me even before that. Apologies for the long delay; a lot of things cropped up in between, and I finally got to it this month. So the slides are nothing much to shout about. I can't remember when I did this particular hack to build a thermal camera with the Pi. The whole point of it is really me trying out this new sensor that I bought on a whim. I was trying to do something with it, then the COVID thing came about, and I thought, instead of doing what I originally wanted, why don't I build a thermal camera and just muck around with it? And so I did. I can no longer show you the actual camera because I lent it to one of my colleagues who wanted to do his own hack and couldn't buy the sensor anymore — it's out of stock in many places. In fact, a lot of thermal camera hardware and sensors are out of stock everywhere. Let me show you my Pi 4 setup though. So that's the picture, and that's the real thing. Right, so this is my Pi 4. I do a lot of stuff with it because I like to muck around with the Pi, but mostly it's software development — I hardly do any hardware stuff. This is one of the few things I've done with actual hardware. In fact, I started off trying to make the Pi 4 a development platform for my iPad, and that gradually graduated into other stuff. Anyway, the concept of the thermal camera is quite simple. First, I read the data from the thermal camera sensor, which is this one, the AMG8833. Then I convert the data into temperature readings, because the raw data are just numbers. And from there, I generate thermal images based on the readings.
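As an aside, the raw-data-to-temperature step mentioned above can be sketched in Go — assuming the AMG8833's documented pixel format of 12-bit two's-complement values at 0.25 °C per bit (the function name here is mine, not from the talk's code):

```go
package main

import "fmt"

// rawToCelsius converts a raw AMG8833 pixel reading into degrees Celsius.
// Each LSB represents 0.25 °C, and negative temperatures are encoded in
// 12-bit two's complement, so the sign bit (0x0800) must be extended.
func rawToCelsius(raw uint16) float64 {
	v := int16(raw & 0x0FFF) // only the low 12 bits carry data
	if v&0x0800 != 0 {       // sign bit set: negative temperature
		v -= 0x1000
	}
	return float64(v) * 0.25
}

func main() {
	// 0x0064 = 100 ticks of 0.25 °C, i.e. a 25 °C pixel.
	fmt.Println(rawToCelsius(0x0064))
	// 0x0FFF is -1 in 12-bit two's complement, i.e. -0.25 °C.
	fmt.Println(rawToCelsius(0x0FFF))
}
```

Reading all 64 pixel registers and passing each pair of bytes through a function like this yields the 8x8 grid of temperatures the rest of the pipeline works with.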
The steps are actually quite simple and straightforward, and I will go through them in Go. By the way, is anyone on this call a Go developer? Luther, raise your hand. Luther is sticking out his tongue at me — I don't know why. So yes, it's based on Go. There was some Python code for this sensor around somewhere, but I wanted to do the whole thing in Go because I prefer doing stuff in Go, so I wrote the whole software in Go. I'll show that to you later on. OK, let me get back here. So this is the sensor — pretty small — and the camera is just this guy. Can you see the pointer? Yes, we can. Yeah, so this one. And the legs here, now that I remember it, were soldered on by King Ming. King Ming very kindly helped me solder them, and in fact he did both sides — I think I only asked him to do one side. The reason I asked him is not that I'm bullying him or anything like that; it's just that I couldn't see well enough to solder it anymore. My eyesight is quite bad. So I asked him very kindly, and he soldered both for me, and they actually work — it stands on both ends, otherwise it would be lopsided. So these are the different pins on the sensor. VIN is the power pin, and I connected it to the 3.3 volt on the Pi 4, which is pin 1. The 3.3 output I don't use. Ground I connected to pin 9 on the Pi 4. Then the two more important ones, the SCL pin and the SDA pin, connect to pin 5 and pin 3. The interrupt pin some people use to detect movement or changes; I didn't use it at all. I just read the data and use software to do the detection — I don't use the interrupt for any of the monitoring.
And this is my hacked-up little monstrosity, a DIY from some scrap styrofoam. Basically, I cut some spare styrofoam that I happened to have on my desk, stuck everything in, and put it on my Pi 4. It actually works pretty well — I'll show you how later on. For the software, I use this particular package to access the AMG sensor, and I also use a pure Go image resize package. There are a few available out there — I tried a few of them and they all work about the same — but this is one of the smaller ones, and I wanted my code base to be relatively small, so I just stuck with this one. OK, so this is the code itself, and it's relatively simple. I actually have a mock-up: this is the part where I mock the data, which makes it easier for me to do testing and development. I would show you the actual code, but my main computer conked out this morning and I had to switch to an older MacBook — that's why I can't do virtual backgrounds now — and I'm not sure where my code is, so I'm not sure I can show it to you live. But anyway, this is a screenshot of the code I built earlier. This does the mock-up, which makes development easier. If I'm not mocking, then I connect to the sensor: I have some settings, I set it to normal mode, I do an initial reset, and then I set the frames per second. Once I can connect to the sensor, I run a goroutine and jump into this particular function. I set everything up as a web server — a simple Go web server that continually serves out the images. It basically continually pumps out the images, and you can view it from a browser anywhere on the same network. So the start thermal cam function here, again, is pretty simple.
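The mock-mode-plus-web-server shape described above might look something like this minimal sketch. The `mockGrid` and `render` names and the 26–32 °C range are my own stand-ins, and the real program pushes base64-encoded images to a self-refreshing page rather than plain text:

```go
package main

import (
	"fmt"
	"io"
	"math/rand"
	"net/http"
	"net/http/httptest"
)

// mockGrid stands in for the AMG8833 in "mock mode": a fake 8x8 frame of
// readings in a plausible indoor range, so development can happen without
// the sensor attached.
func mockGrid() [64]float64 {
	var g [64]float64
	for i := range g {
		g[i] = 26.0 + rand.Float64()*6.0 // roughly 26–32 °C
	}
	return g
}

// render formats a frame as 8 rows of readings. The real server renders a
// base64-encoded image instead; plain text keeps this sketch dependency-free.
func render(g [64]float64) string {
	out := ""
	for i, t := range g {
		out += fmt.Sprintf("%5.2f ", t)
		if i%8 == 7 {
			out += "\n"
		}
	}
	return out
}

func main() {
	// In the real program the served page refreshes itself every few
	// hundred milliseconds; here we just do one request round trip.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		io.WriteString(w, render(mockGrid()))
	}))
	defer srv.Close()

	resp, err := http.Get(srv.URL)
	if err != nil {
		panic(err)
	}
	body, _ := io.ReadAll(resp.Body)
	resp.Body.Close()
	fmt.Print(string(body))
}
```

Swapping `mockGrid` for a function that reads the real sensor is then the only change needed to go from development to the live camera.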
What it does is read the pixels from the sensor at a certain interval. This interval is set in the HTML page itself, at a particular number of milliseconds, so it basically loops and refreshes. And it will keep doing that — it doesn't stop. So that's the goroutine for starting and reading the pixels, which just pushes them into this particular variable called grid. And this index page, which is the only page in the software, is what generates the images. Using this goroutine again, it generates a frame and sends the information to the HTML page, and this is the part that creates the image. It's a base64 string — a data string — that gets sent to the page, and this is the function that shows the image, OK? So I create an image, I sleep, and then I push the image again. Again, it goes infinitely; you have to Ctrl-C to stop it. To create the frame from the image, I take the data from the buffer, encode it, and create the frame from base64 — I turn it into a base64 string and push it out to this guy. So this frame here is basically that frame. Then I use a particular color heat map to display the image, going from blue to red. There are 104 values, so I didn't show everything here, but basically this is it. The image function itself is actually quite straightforward: it creates an RGB image from the sensor readings, and from there I get the R, G, and B and set the pixels. After that I use resize — remember the package I talked about earlier? — with this algorithm, Lanczos3, if I'm pronouncing it correctly, and I resize it to a particular size.
Because the sensor itself is quite small — it's 8 by 8, only 64 pixels — if I want to see something clearer, I can't. So what I do is use this to interpolate and resize it to something larger. I wasn't sure it would work initially, but I tried it and it actually works pretty well, and that's how I managed to get the image at the end of the day. And then, of course, if I want to get the color index, this is it — get the R, G, and B — quite straightforward. And this is how I view it: I VNC into my Pi 4 and capture this. This is the VNC part of it, this is from the browser, and this is a captured image. So if you think about it, this was originally an 8 by 8, 64-pixel image, extrapolated to look like something. Do you know what this image is? Anyone want to take a guess? You, waving your hand. Yeah, basically I was sitting a little bit cross-legged, and I was just raising my hand like this — 8 by 8 pixels, extrapolated this way. So actually, it works pretty well. Let me show you — this is from YouTube — this is how it looks at the end of the day. OK, you've got to use a little bit of imagination; it's not super clear. But basically, let me just go back to the beginning here: this detected me coming into the room and sitting down on my armchair. Then I raised my right hand and put it down, raised my left hand and put it down, and stood up again to leave the room. Oops — yeah, it's on YouTube. So that was the image, and basically that was the camera. All the source code is on GitHub. I forgot to put the link in here, but you can get it on GitHub; I will share the link afterwards. I'll probably send it to Buzz Lee, and then you guys can take a look at it.
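The color-index lookup and the upscale described above can be sketched roughly like so. The `colorIndex` function and the 26–32 °C bounds are illustrative assumptions on my part; the 104-entry blue-to-red palette and the Lanczos3 resize are from the talk, with the third-party resize call shown only in a comment to keep this runnable with the standard library:

```go
package main

import "fmt"

// colorIndex maps a temperature onto an index into the heat map palette
// (104 entries, blue to red, in the talk's code). Readings outside the
// chosen range are clamped to the coldest and hottest palette entries.
func colorIndex(t, minT, maxT float64, n int) int {
	if t <= minT {
		return 0
	}
	if t >= maxT {
		return n - 1
	}
	return int((t - minT) / (maxT - minT) * float64(n-1))
}

func main() {
	// Each of the 64 readings becomes one palette lookup; the resulting
	// tiny 8x8 RGBA image is then enlarged with the pure-Go resize
	// package, along the lines of:
	//   big := resize.Resize(360, 360, img, resize.Lanczos3)
	for _, t := range []float64{24.0, 28.0, 30.5, 40.0} {
		fmt.Printf("%.1f °C -> palette index %d\n", t, colorIndex(t, 26.0, 32.0, 104))
	}
}
```

Lanczos3 interpolation is what smooths the 64 hard pixel edges into the blobby, recognizable shapes seen in the demo video.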
And if you want to see step-by-step how I actually did this, I also have a blog post where I describe it blow by blow. What I just described verbally is basically what I wrote down on my blog, so if this particular presentation wasn't as clear, I hope the blog will clarify certain things. So anyway, questions — please feel free to ask. Great, thank you, Sau Sheong. Yes, please, any questions? I think there's one from Jiaxing. Jiaxing, do you want to ask it yourself? Yeah — your Pi, doesn't it need a lot of processing power? Doesn't it overheat, since it doesn't have a fan on the case? Mine overheated really fast. Let me show this to you again. You see this? I stuck this on it and it helps a lot. Initially it overheated like crazy, and I was afraid it was going to burn a hole into the Pi itself. Then I stuck these fins on it — I bought them for like two or three bucks off Shopee — and it helped quite a lot. It doesn't really heat up too much now; it's still hot, but it's OK. I also tried using a fan, but the fan was a bit noisy and kind of distracting when I was trying to write code, so I gave up on that and just used this. Great, any other questions? I'm curious about the distance, because there are no actual optics on it... Yeah, there are no optics. What kind of distance can it really... you went quite far, or how far? No, not quite far — I think I was maybe one to two meters away in this picture, which is basically this video. In this video I was sitting maybe one and a half meters away from the camera itself. So the resolution is not very good, and the sensor itself is a very cheap — well, relatively cheap — sensor that sort of works. In reality, even though the spec says it can capture temperature, it doesn't really work too well if you want to do real temperature sensing.
So I was about to get another sensor — I forget the name now — that was supposed to be more accurate because it has more pixels, but it was sold out everywhere, so I never managed to get it. I was also contemplating an even more expensive one, like $100, maybe closer to $200, which would have been a lot better — I think it has a few hundred pixels — but again, it was sold out, so I couldn't get it. I think there was a kit for the Raspberry Pi with the... I think it's the Lepton from FLIR, something like 80 by 60 pixels. Yeah, I couldn't get it; it was sold out. Everything's sold out — even this guy, the AMG8833, is sold out now. That's why I have to... In terms of... can you calibrate it? Or do you get 0 to 124? I mean, you don't have a black body to really calibrate it, but if you see yourself, can you say, OK, that's so much — how does the real temperature match the color? No, the temperature is not very accurate. I think it's off by four or five degrees from body temperature, because of how the sensing goes. And you can calibrate it, but you need to use software — there's no hardware calibration. You can use software and adjust it accordingly, but the adjustments are in steps of half a degree Celsius, so for actual body temperature sensing it's not very useful — half a degree is really not fine enough, and it's not very accurate. You also need to be relatively close before it can sense well. So as an actual thermal camera for body temperature sensing — no, practically it would not work. I don't think it will work. Maybe somebody can tune the software even more to do it, but I found it difficult.
I think some better hardware might be able to do it, but unfortunately I couldn't get any other hardware after this one. So that was that. There's another one — let me try to dig up the name of that sensor. Just give me a second. I didn't know that Panasonic was making those sensors. Yeah, this sensor is from Panasonic, apparently — the actual sensor is from Panasonic, but I think a few people made different breakout versions of it. Let me go and dig out my notes... Is it the MLX one? Give me a second; I need to refer to my notes. So there was another one I was trying to get that apparently had better resolution. Yes, it's the MLX one — the MLX90640. They have a medical version, which means the accuracy is guaranteed to a certain degree. Again, it's all out of stock. The FLIR Lepton one is more than $100. I was also trying to get that, even though I didn't want to spend $100 on one camera, but at the end of the day it was all out of stock, so it's kind of hard to get. The MLX one is 32 by 24 pixels, so it has 768 data points — a lot better, but I couldn't get it anymore. The Lepton has a few different versions, and some of them are pretty good, but for one the prices were a bit exorbitant, and it was also out of stock. So the Lepton is 160 by 120, compared to the AMG, which is 8 by 8, and the MLX, which is 32 by 24. So I think the Lepton could possibly do something much better. Yeah. What do you think such a sensor is good for? Oh, OK, so I originally bought this to do a little bit of detection and automation.
What I wanted to do was, whenever I sat down in my armchair to do Netflix and chill, it would detect my presence — and it does that pretty well — and then turn on the air con for me, turn on the TV, and so on and so forth. I never actually got to that because I was distracted doing this, and after that I was distracted by something else. So yeah, that is the unfinished project. Nice. So we'll see the finished project soon. Next hack. Yeah, maybe. Well, OK. Are there any other questions? If not, I guess we'll thank Sau Sheong for the interesting talk, and we hope to see his completed project soon. Book a slot.