Hello everybody. My name is Federico Ariza. I work at Matrox, here in Montreal, in the West Island, if we can call that Montreal. Actually, I proposed this talk because I'm looking for somebody. I need somebody to work with me at Matrox, so if there are physicists here who know Python, who love to code, and who love to go into the lab, please contact me afterwards.

Okay, I'm going to start with something simple, just to describe what it is to characterize a camera. I guess everybody knows what a camera is, right? I mean, I guess. I hope. So let's see if we can do it live. Man, the resolution is crappy. Is it out of focus, or is it my resolution? Okay, give me a second. I always wanted to do this. [Tries to type into a live session.] On top of that, this little device has a bit of lag between what I type and what I see on the screen. Okay, let's try it. Yeah, it worked. Okay: numpy and... matplotlib. No? Man, this is not going to work; this is taking too long. Let me just open one that I have ready, just in case. You see? Okay.

Basically, characterizing a camera implies I have a camera. Here it's just a little camera simulator in Python, from the emva1288 module; I'm the author of that module. And what a camera does is just give you gray images. Very exciting, you know, all the sexiness of noise. No, because you don't characterize a camera with the lens on and a hot chick in front. That's not the way to characterize a camera, unfortunately. Otherwise I would have a lot of fun in my lab.
Characterizing a camera is basically taking images, without the lens, of a light source that gives you gray images. In this case it's a false-color map, so it looks kind of bluish, but that's what the camera gives you. And what you do is basically increase the intensity of the light and measure some things. For example, here I measure the radiance that I'm giving to the camera, the intensity of the light, basically, and I measure the mean value and the standard deviation of the image. And what that gives me... let me see. I don't see it. You see it? Whoops, it's not here. Why? Oh, okay, here.

The camera tells me: I increase the amount of light, here on the X axis, and the output of the camera increases, and then it reaches saturation and doesn't increase anymore. That means the image is white. Very exciting. And then if I check the noise: hey, look at the noise. It increases and increases, it reaches saturation, and then, bam, back to zero. Again, that's what is supposed to happen. This is a simulated camera, so it works fairly well, as a simulated camera.

Now, let's go to the presentation. What steps are involved in characterizing a camera? First is the image capture; then some data extraction from the images I captured; then data processing on that data; then showing the results; and finally generating a report. And believe me, it's Python everywhere. There is not a single part that doesn't have Python.

The image capture is the most difficult part. Why? Because you don't know what kind of camera you are getting. A USB camera, a GigE camera, a Camera Link camera, one of those freaky wireless cameras: you don't know what it is. You don't know what format the data comes in. You don't know what libraries you need to talk to the camera to be able to grab images. And you don't know what format they're going to give you the images in. Everybody thinks the format is PNG, but no.
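The two curves he describes, mean output and noise versus light, can be sketched with a toy sensor model. Everything below is made up for illustration (this is not the emva1288 package's actual simulator): Poisson shot noise plus a hard saturation level, so the mean climbs linearly and then clips, while the standard deviation grows with the signal and collapses to zero once every pixel is saturated.

```python
import numpy as np

def toy_camera(radiance, shape=(64, 64), gain=60.0, dark=8.0, full_well=4095, rng=None):
    """Toy sensor: Poisson shot noise plus hard saturation.
    Illustrative only -- not the emva1288 package's simulator."""
    rng = rng or np.random.default_rng(0)
    counts = rng.poisson(gain * radiance + dark, size=shape)
    return np.clip(counts, 0, full_well)

radiances = np.linspace(0, 100, 50)
frames = [toy_camera(r) for r in radiances]
means = np.array([f.mean() for f in frames])
stds = np.array([f.std() for f in frames])
# means: rises linearly, then flattens at full_well (the image is "white")
# stds:  grows with the signal, then drops back to ~0 once every pixel clips
```

Plotting `means` and `stds` against `radiances` reproduces the shape he shows: a ramp into a plateau for the mean, and a rising noise curve that falls off a cliff at saturation.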
As for formats: if you're lucky, you get TIFF. I'm talking about industrial cameras here, not your everyday camera. Although even with an everyday camera, take a Canon, put the Canon Hack Development Kit on it and you get the raw images. Good luck opening those.

Next, data extraction. Just to show you something, give me a second. We're not going to do the image capture here; I'm just going to show you what it is. It's hundreds and hundreds of images, with different exposure times, with different lighting. That's it, that's all. To make it simpler, we get a file that tells me, for each image, what exposure time and what amount of light I have.

Now, over here, let's see. I have it here. I point to the directory we were talking about, with all the images, and I call a script that parses that file and tells me what images I have: ah, you have all those images. Then I load the images. It takes a little bit. Then I show the data that I can extract from those images. Very exciting data: means, variances, temporal noise, spatial noise, a couple of extra pieces of information about the amount of light, all that. So the data extraction is basically a bunch of little scripts: open the image, get the data, append it, build an array with the mean values and the standard deviation values, those kinds of things. It's a lot of small functions doing that sort of thing, just to give you an idea.

Then comes the data processing. Now I have all my data; what do I do with it? I start doing data analysis, for the data guys over there. I do some little regressions on my data, just to fit lines, to make sure I have a straight line where I'm supposed to have a straight line. I do a couple of FFTs, for the physicists. These kinds of things.
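The extraction step he describes, parse the index file, load the images, accumulate means and noise estimates, can be sketched like this. The index-file format here is invented (one "filename exposure radiance" entry per line; the real descriptor file is richer), and temporal noise is estimated the standard way, from the variance of the difference of two frames taken at the same operating point:

```python
import numpy as np

def parse_index(text):
    """Parse a toy index file: one 'filename exposure_us radiance' entry per line.
    Invented format -- the real descriptor file from the talk is richer."""
    entries = []
    for line in text.strip().splitlines():
        name, exposure, radiance = line.split()
        entries.append({"file": name,
                        "exposure_us": float(exposure),
                        "radiance": float(radiance)})
    return entries

def pair_stats(img_a, img_b):
    """Mean signal and temporal variance from two frames at one operating point.
    The variance of the difference image is twice the temporal variance."""
    mean = 0.5 * (img_a.mean() + img_b.mean())
    temporal_var = np.var(img_a.astype(float) - img_b.astype(float)) / 2.0
    return mean, temporal_var

index = parse_index("img_000.tif 1000 0.0\nimg_001.tif 1000 0.5")
```

In the real pipeline each entry's file would be loaded with an image reader and fed through `pair_stats` (and a spatial-noise counterpart), appending to arrays of means and variances per operating point.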
FFTs along the X axis, averaged along the Y axis. It's fun. Did I mention I'm looking for an employee? Yeah.

So I take my data, I process it, and I get results. Boom, a bunch of results: all kinds of things that are important for camera characterization. What kind of noise do you have? What is the sensitivity of your camera? What is the quantum efficiency of your camera? What is the dark current of your camera? The kinds of things that will affect the performance of your camera when you're actually using it, for whatever you're using it for. If it's for space exploration, you need one kind of behavior. If it's industrial, you need another one. If it's for commercial use, like taking photos for a catalog, that's another kind of noise, another kind of characterization that you need.

And then, for example, here we see the results. What is important in the results? I have the quantum efficiency; it tells me 45. What does 45 mean? It means 45% of the photons are transformed into electrons. That's physics, but it's done by Python. And I can just print all the results I have, in case you're interested; I guess you are not. And then I can plot the results. Let me see where it went. Yeah. And I get tons of graphics showing the behavior of my camera. For example, these fits that I made: oh, okay, look, I have a little bit of deviation here. How is the noise? Oh, it's not that bad; it's close to ideal. Sensitivity: is it sensitive enough? Yeah, kind of. How does it behave in the dark?

One important thing, if you're going to remember one thing from this talk: if your camera doesn't have noise, it's not a good camera. Really. If it doesn't have noise, it means it's crap; they are cheating you. It has to have noise. Linearity: how linear is my camera? Deviation from linearity: oh, look, I deviate from linearity.
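Those line fits are where the headline numbers come from. Here is a sketch with entirely fabricated numbers (the 45% quantum efficiency, the gain, and the photon counts are all invented for illustration): below saturation, temporal variance is linear in the mean signal with slope K, the overall system gain; given a known photon count per pixel, the responsivity R is the slope of mean versus photons, and quantum efficiency is R/K.

```python
import numpy as np

# Fabricated operating points: variance vs mean is linear below saturation,
# with slope K (system gain, DN per electron) and a dark-noise offset.
mean_dn = np.array([10.0, 50.0, 100.0, 200.0, 400.0, 800.0])
var_dn = 0.05 * mean_dn + 2.0            # pretend measurements: K = 0.05

K, dark_var = np.polyfit(mean_dn, var_dn, 1)

# Pretend we also know the photons per pixel at each operating point
# (constructed so that the quantum efficiency comes out at 45%).
photons = mean_dn / (0.05 * 0.45)
R = np.polyfit(photons, mean_dn, 1)[0]   # responsivity, DN per photon

eta = R / K                              # quantum efficiency, as a fraction
```

With real data the fit residuals are exactly the "deviation from linearity" plots he shows: you fit the straight line, then plot how far each measured point falls from it.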
If it's perfectly linear, it's not a good camera. There are no real cameras that are perfectly linear. Sorry, what is linearity? Linearity means that every step in the intensity of the light should give you the same increase in response from the camera, if it's perfectly linear. But it's never exactly like that; it always deviates a little bit.

We also do histograms, the FFTs that we were showing before, and a few profiles, to see how our images are doing. And it doesn't matter much which sensor: a CCD is going to behave a little better, if it's a good CCD, but a regular camera today is a CMOS, and the behavior is basically not that different anymore.

The results we were talking about: what kinds of things do we look at? How sensitive is the camera? How much noise do I have? How much does the noise increase with temperature? That's important if you are taking pictures somewhere hot; you're in Cuba, or, I don't know. Does it perform well in low light? In high intensity, if you are taking pictures of cars in traffic? Those kinds of things. And these are the graphics I have to produce for that.

Then comes the report generation. I made a big set of templates with Jinja, and I put in all the results that we saw before, with all the little LaTeX includes and all that. And it's supposed to work. Let's see. Report... I say: generate my report. Come on. I have to close the image, sorry. Ah, nice. Now let's see; I'm going to get my terminal over there just to show you. The report is generated here. I'm going to compile it, and let's compile it three times, because everybody who knows LaTeX knows it's better to compile it three times, no? And fit to page. Of course, I didn't do it with my company logo; this is the generic report that gets generated.
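The report step is Jinja rendering computed values into LaTeX templates. A common trick for that combination (the template fragment below is invented, not the real report from the talk) is to give Jinja LaTeX-friendly delimiters so its `{{ }}` and `{% %}` syntax doesn't collide with LaTeX braces:

```python
import jinja2

# LaTeX-friendly delimiters, so Jinja markup doesn't fight LaTeX's braces.
env = jinja2.Environment(
    block_start_string=r"\BLOCK{",
    block_end_string="}",
    variable_start_string=r"\VAR{",
    variable_end_string="}",
    comment_start_string=r"\COMMENT{",
    comment_end_string="}",
)

# A made-up fragment of a report template, not the real one from the talk.
template = env.from_string(
    r"Quantum efficiency at \VAR{wavelength}\,nm: \VAR{qe}\,\%")

report = template.render(wavelength=525, qe=45.0)
print(report)
```

In the real pipeline every computed result and plot path would be passed to `render()`, and the resulting `.tex` file compiled a few times, as he says, to settle the cross-references.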
And it's basically this, with all the information that we saw before, passed through Jinja. Have you ever used Jinja before? It's awesome, right? So I just include all the graphics and I have a nice PDF to send to my clients. Okay, that's basically what it is. We have four minutes for questions. Please, questions. I beg you.

[Is this the reference implementation of the standard?] It is the reference implementation of the standard. I am part of the standard group; we meet in Germany twice a year to talk about the standard, to discuss what algorithms we should use and all that. And this is the reference implementation, so the standard is aligned with the code. I mean, let's say I had the option of writing my own code and making it match the other companies' code, or the option of making my code the standard. So, guess who has to match whom? They have to be exactly the same as me, right? It's a little advantage of open source.

Oh, sorry, in vacuum? No, I'm not measuring quantum dots, I'm measuring quantum efficiency. Quantum efficiency is just the percentage of photons that are transformed into electrons during the photoelectric process. Because it's a diffuse light, the alignment is not that important. It's not a laser; it's just an LED source with a narrow-band spectrum. So you just have to put your camera in front, more or less in the center; a couple of millimeters off is not a big deal, and you're good to go.

[Do you measure only in the visible spectrum, and does that matter?] Yeah, it matters, because it depends on the response of your camera and what you want to measure. In general, if your camera is for, let's say, human use, we would measure in green specifically, and, if we have time, in red and blue.
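His definition of quantum efficiency can be written out. This is the standard physics of it, with symbols as commonly used in EMVA 1288 (written here for clarity, not quoted from the talk): the mean number of photons collected by a pixel of area $A$ under irradiance $E$ during exposure time $t_{\exp}$ at wavelength $\lambda$ is

```latex
\mu_p = \frac{A \, E \, t_{\exp}}{h c / \lambda}
      = \frac{\lambda \, A \, E \, t_{\exp}}{h c},
\qquad
\eta(\lambda) = \frac{\mu_e}{\mu_p}
```

where $\mu_e$ is the mean number of electrons generated, so $\eta = 0.45$ means 45\% of the incident photons produce photoelectrons.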
But you can measure in infrared or in ultraviolet, whatever is of interest to you. Sorry, you had a question before? Yeah.

[Where is the code?] I suck at this, really; I should have put it up here, but we don't have network. It's on GitHub. If you go to GitHub, search for emva1288, and there you go. It's open source.

[Which parameters matter?] It depends what you're going to use your camera for. Which parameters out of that long list you're interested in depends on the application: low light, high light, the middle, the noise, the dark current, temperature, all that.

[Which camera should I buy?] Not really an answer I can give, because it depends on the specific camera. The best guess today would be to go for a USB3 Vision camera; that would be the easiest kind. And no, because you have to characterize the camera that you need; it's not the other way around. You don't pick a camera in order to characterize it; you find a camera that you want to use, and then you characterize it. Yeah, sorry.

[Have you tried other templating engines?] I never have. I just fell in love with Jinja2 and we have been lovers since the beginning. Yeah, I'm faithful.

[What does the job involve?] The task is basically answering questions from clients. Is this camera good enough for X? Is this camera going to behave well doing that? This camera is more expensive than this one; is it worth it? Why do you think we have a standard and an open-source tool to characterize cameras? Marketing. You can never trust marketing.

[A camera] to work underwater? That's nice. Apart from the fact that it has to be IP67 or IP68 or something like that to be able to go underwater, it depends on what light wavelengths you are going to look for. It really depends on that. And then temperature is not going to be an issue, I guess, unless it's boiling water.
So you can get rid of the dark-current problems and those kinds of things, and concentrate on your range of intensities. I think we're done. Okay, thank you everybody.