All right, welcome back. Yesterday we had a nice session on autonomous-vehicle lidars and also atmospheric lidar, and today we will look at a different kind of lidar application. Let me get to the right screen. We're going to talk about lidar for finding fish and insects. That might seem strange to you, because fish and insects don't sound like what you would normally do with a lidar, but I'm going to tell you the story and hopefully it will make some sense.

I told you yesterday that the first 12 years of my career I spent working at the National Oceanic and Atmospheric Administration, or NOAA. You will see that name here multiple times, so you need to understand that it's a government agency, a research lab, and what we did there is develop remote sensing methods and instruments for the atmospheres and oceans of Earth. The guy that I worked with was very famous in the turbulence field, and then he changed direction and started working on lidar for finding fish. The purpose of his work was to find fish so that they could be harvested for food. I ended up doing a slightly different thing.

The guy I'm talking about, who pioneered this method, is Jim Churnside. I don't know if Dr. Consortini is with us again today, but she was yesterday, and she knows Jim; I know because I've talked to her before. He was my colleague and boss for 12 years at NOAA, and we still collaborate even today. He's retired now, but we're going to be flying together this summer.

In the 1990s, mostly, he pioneered this method of putting lidars in airplanes for finding fish. So here's his early fish lidar; let's talk about what makes it a little different from some of the other lidars we saw yesterday. It's a very simple lidar built around a neodymium-YAG (Nd:YAG) laser. Let me see; I always have a problem arranging windows when I'm doing this on my laptop. I'm going to get my laser-pointer cursor so that you can see better.

So here's the lidar mounted in the airplane. You can't really tell in this picture, but it's looking through a hole in the floor of the airplane. The airplane flies at about 300 meters. It's an Nd:YAG laser at 532 nanometers. We do not use 1064 nanometers at all, because there's no penetration into water at that wavelength. It's a very simple laser: just a 10-nanosecond pulse, 100 millijoules per pulse. Again, very bright; this would damage your eyes if you were to put your eye down here near the laser. But we expand the laser beam, we diverge it, so that when it hits the surface of the water it has a diameter of about five meters, and then the energy density is low enough that there's no damage to human eyes or animal eyes. (I'll put some quick numbers on this in a moment.)

For a receiver, we use an eight-inch refracting telescope, so that's about 20 centimeters. That's the biggest one we ever used; after that he started going smaller. But look at the field of view, how wide it is compared to what we talked about yesterday. Yesterday we talked about many lidars having microradian, or tens-of-microradians, fields of view to reduce the background. In this case, we purposely open up the field of view to about three degrees. The reason is that we can get multiple scattering; there's lots of useful information in the multiply scattered light in the water, so we open the field of view up and collect it. We also do dual-polarization detection.
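Here are those quick numbers on the eye-safety point: a back-of-the-envelope sketch using only the round values just quoted (100 mJ per pulse, 300 m altitude, 5 m surface spot). The divergence figure is simple arithmetic from those values, not a quoted system specification.

```python
# Back-of-the-envelope eye-safety numbers for the fish lidar (a sketch
# using round values from the talk, not a formal laser-safety analysis).
import math

pulse_energy_J = 0.100     # 100 mJ per pulse
altitude_m = 300.0         # nominal flight altitude
spot_diameter_m = 5.0      # beam diameter at the water surface

# Full divergence angle needed to grow the beam to 5 m over 300 m
divergence_mrad = spot_diameter_m / altitude_m * 1e3
print(f"full divergence ~ {divergence_mrad:.1f} mrad")      # ~16.7 mrad

# Single-pulse fluence (energy density) at the water surface
spot_area_cm2 = math.pi * (spot_diameter_m * 100.0 / 2.0) ** 2
fluence_uJ_cm2 = pulse_energy_J / spot_area_cm2 * 1e6
print(f"surface fluence ~ {fluence_uJ_cm2:.2f} uJ/cm^2")    # ~0.5 uJ/cm^2
```

Spread over a five-meter spot, each pulse delivers only about half a microjoule per square centimeter, orders of magnitude less than the fluence right at the laser exit, which is why the expanded beam is harmless while the raw beam is not.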
You might remember that yesterday we talked about how, with a waveform-recording lidar, one of the big technical challenges is getting enough dynamic range and enough speed. That was a problem because, remember, this was the 1990s. The best we could do at that time was to buy a one-gigasample-per-second analog-to-digital converter, and it only had eight-bit resolution; in order to get one gigasample per second, we had to accept only eight bits. That means we only have 256 levels to resolve multiple orders of magnitude of dynamic range. So what we did is enhance that dynamic range by using a logarithmic amplifier. That doesn't really change it; you still have only 256 levels, but those 256 levels are now spread logarithmically. It's like looking at a plot on a logarithmic scale: you don't have any more resolution, but it looks better to your eye and you can see more information. (I'll put a small numerical sketch of this idea right after this story.) That gives us about a meter of range resolution in water, with 11.5-centimeter samples, and depth penetration in the daytime down to about 30 meters, and 50 meters at night.

Now, when I first moved to Montana State University, 21 years ago next month, I gave a talk about the research I had done and the research I was planning to do. I mentioned fish lidar, and I said, I don't think I'll ever do fish lidar here. And somebody in the audience raised their hand and said, well, do you know about the lake trout problem? I did not. So he explained to me the lake trout problem at Yellowstone Lake: there are non-native, invasive lake trout that eat the native cutthroat trout. The native cutthroat trout live in shallow water, and they are the primary protein source for bears, otters, pelicans, all kinds of animals. Lake trout live much deeper in the water, and therefore they cannot be eaten by bears and otters and pelicans and other surface animals. So this is an ecological crisis, and it is in the middle of Yellowstone National Park, the world's first national park. This is a region that is cherished here in the United States, and I think probably appreciated, if not cherished, worldwide. So they were very interested when I talked to them about trying to use lidar to help find where these non-native lake trout were spawning.

Here's a little bit of information about Yellowstone Lake. It sits at 2,376 meters elevation, so it's fairly high. It has a large area, 350 or so square kilometers, and nearly 200 kilometers of shoreline. This is a satellite image showing Yellowstone Park. My home is just up here, just north of where this image ends; Yellowstone Park is just south of where I live. This is the entire park, and here's the lake. You can see the road goes along the north side. The south part of the lake is backcountry: wilderness that is very, very difficult to access. And that's why the airplane is such an advantage, because we can take the lidar down there easily.

So what we did is collaborate with Jim Churnside's group at NOAA, and we flew the lidar in a rented airplane, round and round and round the lake. We spent a lot of time over here in what's called the West Thumb, because they know there are lake trout there. They did not know at the time (they know now) where else there were lake trout.
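And here is that logarithmic-amplifier sketch: a minimal illustration of why log compression helps an 8-bit digitizer cover several decades of signal. This is illustrative only, not the actual NOAA signal chain.

```python
# Why a log amp helps an 8-bit digitizer: compare how many distinct ADC
# codes survive across four decades of signal, linear vs. logarithmic.
import numpy as np

levels = 256                        # 8-bit converter
signal = np.logspace(-4, 0, 2000)   # 4 decades of normalized signal

# Linear quantization: one step is 1/255 of full scale, so everything
# below ~0.4% of full scale collapses into the bottom few codes.
lin_codes = np.unique(np.round(signal * (levels - 1)))

# Log-compress first (map the log10 range [-4, 0] onto the 256 codes);
# the same 256 levels are now spread evenly over the decades.
log_codes = np.unique(np.round((np.log10(signal) + 4) / 4 * (levels - 1)))

print("distinct codes, linear:", lin_codes.size)   # small signals lost
print("distinct codes, log:   ", log_codes.size)   # all decades resolved
```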
They suspected that maybe there were lake trout down here in what's called the Southeast Arm, but that's too far away for them to check easily. So we flew around here, and we analyzed all the data by hand. We flew in snow in September; that's what the weather is like where I live in September sometimes.

Here's an example of what the lidar data looked like. We do not scan the lidar; we just look straight down and build up these images. This is altitude on the vertical and time on the horizontal, and the gray scale shows the intensity of the return. The first thing you see is that we actually detect the snow falling below the airplane. Then we see the surface of the water, and we get a signal several meters into the water. This is not ocean water; this is lake water, so it's much, much less clear. I told you that in the ocean we can penetrate during the daytime to 30 meters; in this lake, we were penetrating maybe 10 or 12 meters at most. Here's some underwater surface, the ground under the water coming up. We easily detect that underwater surface when it's close enough. And what we found is the signature of fish, often near the edges of these underwater shelves.

So I'll zoom in and show you what that looks like. You might look at it and think, my gosh, how do you know that's fish? These little wiggles right here and right here. The only reason we know is that Jim Churnside has flown his lidar many, many times looking for fish, and many, many times he has had ground truth: a boat in the water dropping a net to pull up the fish in real time. So he learned what the signature looks like. Now, those words should tell you that these days you should try machine learning. We didn't do it in those days, but now we are. We're actually using machine learning to find these kinds of features, and it's working very well; we are able to identify the many places in the data where there might be fish, and then we look at those with our eyes and decide what we think is the probability of fish. I'll show you a little more about the data later.

Okay, so the primary thing that came out of that early study was this map of Yellowstone Lake, with dots where we found fish. The green dots mean we were confident those are fish; the red dots mean maybe fish. You'll see down here in the Southeast Arm there were several hits very close together, in a place the ecologists and biologists were very excited about, because their ecological models suggested that places like this might be a spawning location, but they had never had time to check it out. Now that they had our lidar data, they went down there with their boat, dropped in their nets, and pulled up fish. Here's a lake trout, and here's what happens when you cut open the lake trout: you find baby cutthroat trout inside. The cutthroat are the native fish; the lake trout are the problem fish. And you see the problem: if you want to get rid of a species, the best way is to kill off the young. I hope none of you are eating dinner right now, because this looks a little gruesome.

Okay, so then we went to work designing our own lidar. We wanted a smaller lidar that could fly in a smaller airplane.
I did not put up the equations again, because yesterday we looked at the lidar equation, which predicts the signal as a function of range. You might remember that one of the parameters in that equation is distance, and we had a one-over-distance-squared term; we also had an aperture diameter, and what else? The laser power and the extinction coefficient. So what we did is use that equation to model a system. This plot shows the received power, in this case in dBm, as a function of field-of-view angle for a fixed five-centimeter aperture, for three different depths of water. You can see that as you increase the field of view you get more signal, and then it sort of flattens out. That's because you've captured all the multiply scattered light and there's really not much more to gain, so as you open up the field of view further you're just getting more background light, which hurts you. Over here on the right side is a plot of received power versus aperture diameter: now we fix the field of view and vary the aperture diameter, and of course as you increase the area of the aperture you collect more light.

The bottom plot shows the signal-to-noise ratio and the signal-to-background ratio for what we chose, which is a five-centimeter aperture and a 15-milliradian field of view. (A rough sketch of this kind of design model appears below.) That's a very small aperture compared to what I showed you before, because we had learned that we don't need as large an aperture as Jim used originally. The signal-to-noise ratio looks very good all the way down to 15 meters. The signal-to-background ratio, however, drops to 0 dB (a ratio of one) at about 12 or 13 meters, and that tells us we are background limited. We could do some things to reduce the background even further, but when you have this wide field of view, remember, that's the price: you get a lot of background light. So we are background limited with this lidar, and that's okay, because the depth we needed was about 10 meters.

Here's a picture of what we built. This is the lidar receiver; actually there are two separate receivers, and the transmit laser is over here on the other side. This rectangular metal frame bolts to the floor of the airplane, and there has to be a hole in the floor under it. Here's the large aperture, which collects the cross-polarized signal, orthogonal to what we transmit. Here's the co-polarized aperture, which is smaller. We did that because we get more co-polarized light, but the cross-polarized light is actually what we use most of the time: even though the signal is smaller, the contrast between the background and the fish is bigger in the cross polarization. We designed this to be very small so that it could fit in a very small single-engine airplane. It's driven by a laptop computer, and the electronics bolt down in the back. Here's a picture showing the transmit laser, the co-polarized receiver, and the cross-polarized receiver. And here's a picture showing it in the airplane. If you look closely, you can see that there are trees down here, so you can see there's a hole in the floor of the airplane. Here are the electronics. It has a NOAA logo on it, but this is actually our lidar; Mike just put that on there because we were collaborating with NOAA.
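And here is that rough design-model sketch. The lidar-equation shape (aperture area, one over range squared, two-way water attenuation) follows the description above, but every coefficient value is an assumed placeholder; the sky-background constant in particular is tuned only so the signal-to-background crossover lands near the roughly 12 meters quoted.

```python
# Toy version of the design model: signal vs. solar background for a
# small-aperture, wide-field water lidar. All coefficients are assumed
# placeholders, not the values used in the real design study.
import numpy as np

H = 300.0          # aircraft altitude (m)
n = 1.33           # refractive index of water
alpha = 0.3        # water attenuation coefficient (1/m), assumed
beta = 1e-3        # volume backscatter coefficient (1/m/sr), assumed
E0 = 0.1           # pulse energy (J)
D = 0.05           # aperture diameter (m): the 5 cm choice
fov = 15e-3        # full field of view (rad): the 15 mrad choice
L_sky = 5e-6       # background constant (assumed, sets the crossover)

A = np.pi * (D / 2) ** 2
z = np.linspace(0.1, 15.0, 300)        # depth in the water (m)

# Lidar-equation shape: 1/range^2 geometry times two-way attenuation
signal = E0 * A * beta * np.exp(-2 * alpha * z) / (H + z / n) ** 2

# Background scales with aperture area and field-of-view solid angle,
# and is flat in depth
background = L_sky * A * np.pi * (fov / 2) ** 2 * np.ones_like(z)

sbr = signal / background
print("SBR = 1 near z = %.1f m" % z[np.argmin(np.abs(sbr - 1.0))])  # ~12 m
```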
We have a power inverter over here for providing power from the airplane and our batteries. Here are our two receivers, and here's the hole in the airplane floor. So you can see that we're flying in this small airplane. This is Mike, my PhD student who built the system for his PhD dissertation, and he's a very tall guy. The pilot sits here, the co-pilot sits here and drives the lidar, and then you take out the back seat and put the lidar in where the back seat used to be. It's a very tight fit, but it fits.

Here's an example of the data we got when we flew with this new lidar system. Again, depth on the vertical and time, converted into along-track distance, on the horizontal; the color shows the intensity of the signal. Once again, you see the same kind of thing, where the fish often show up near the underwater edges. This is a totally different location, but it looks similar: we have a bottom shelf that is only two or three meters below the water, and then it falls off and gets deep. Right at that edge is where the fish tend to cluster. Again, we know that's fish because we've trained our brains to recognize it, and the machine learning algorithms are now learning to recognize it very effectively.

We also accidentally flew over an underwater feature that, when we looked at it closely, we realized is an underwater thermal vent. You might be aware that Yellowstone Park is famous for geysers and thermal features; some of those exist underwater as well. This is actually hot water coming out of the top of the vent, and it creates a layer that attracts algae and little critters, little animals, in the water. So this is a plankton layer gathering near the outflow of this underwater thermal vent.

Okay, so let's talk about what wavelength you want to use. When you design a lidar for this kind of purpose, you really have only a small number of choices, because very few wavelengths propagate well in water. This plot doesn't show it as well as it should, but generally speaking the curves of extinction in water look like a U shape. The minimum extinction occurs down here in the green; actually, for open ocean it occurs in the blue-green, but for lake water it's very much green. So our 532-nanometer wavelength is right here, which works pretty well. On the short-wavelength side, scattering dominates; on the long-wavelength side, absorption dominates.

Okay, I'm going to skip this; these are just some pictures from the early fish lidar work at NOAA, photographs that I took on the airplane with the NOAA lidar. You can see these little balls of fish that we were finding, in this case in the Gulf of Mexico, and here's the signature showing that cluster of fish very near the surface.

Okay, we've already looked at all this, but now I want to show you what the data look like. These are co-polarized depth profiles. We have an algorithm that detects where the surface is, and that algorithm wasn't working very well when we made these plots, so this peak should be at zero; that's the surface of the water. Then the signal decays rapidly, and it becomes constant once you run out of photons. When you see fish or something in the water, you get wiggles on that. Now, I wish I had it, but I don't think I have a plot of this here.
If you take the logarithm of these profiles, you get a straight line; this decay turns into a straight line. The slope of that straight line gives you the extinction coefficient of the water, and when you see wiggles on that straight line, you know you have something in the water. It might be a fish or it might be something else.

Okay, so we talked yesterday about the equations, and Mike did his analysis a little differently, so I'll take you through it. He took the signal all the way to an electrical signal, because he has an RF background and was used to thinking like an RF engineer. This is the laser pulse energy; this is the aperture area; this is the water-surface transmittance squared; there's the PMT response, the water refractive index, and the speed of light in air. And then, of course, here is one of the most important things: the volume backscatter coefficient, attenuated by e to the minus 2 alpha z, where alpha is the attenuation coefficient. (I'll write this out symbolically at the end of this section.) You might remember yesterday we talked about how you can estimate the atmospheric transmittance with a lidar signal. Here we don't really care about the atmospheric transmittance, because we're flying at somewhere between 100 and 300 meters, so the atmospheric path is short and the water attenuation is much, much larger. That's what we put in here. This quantity beta is what we're really after: when we're not seeing fish, this is the property we're trying to derive. You might remember that a little while ago I showed you some design curves. Those design curves were calculated using an estimated beta for the water in Yellowstone Lake, but we had no idea what the real number was. So just a couple of years ago, one of the papers we published was about the attenuation coefficient measured by our lidar in Yellowstone Lake, because nobody knew what it was until we measured it.

Okay, so because this analysis is done with electrical signals, we have a term for the PMT response, which incorporates the quantum efficiency into a responsivity with units of amps per watt; it turns the detected optical power into an electrical current. We square that current and multiply it by the load resistance, and that turns it into electrical power. Then we can calculate the signal-to-noise ratio as the electrical power of the signal divided by the power corresponding to shot noise plus the power corresponding to quantization noise. And the signal-to-background ratio he calculates as the electrical signal power over the power from the background signal plus the power of the dark signal. Those are quantities you can measure: the dark signal you measure with the lens cap on, and the background you measure at a depth below which no laser photons return.

Okay, I already showed you these plots; sorry, I was adjusting things right before and left them in. So now let's shift gears. Let me look at what time it is. Oh, perfect, we're right at the half-hour point, and at this point I want to shift from fish. We're still doing fish measurements; in fact, we are flying our lidar this summer with a different group at NOAA on Lake Michigan and Lake Huron, two of the Great Lakes, which are even larger than Yellowstone Lake.
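Here is that equation written out symbolically. This is a hedged reconstruction from the verbal description above; the exact factors in the published analysis may differ.

```latex
% Hedged reconstruction of the water-column lidar equation described above.
\[
  i(z) \;=\; R \,\frac{E_0\, A\, T_s^{2}}{\left(H + z/n\right)^{2}}\,
  \frac{c}{2n}\,\beta(\pi, z)\, e^{-2\alpha z}
\]
% i(z): PMT current;  R: responsivity (A/W);  E_0: pulse energy (J);
% A: aperture area;  T_s: water-surface transmittance;  H: altitude;
% n: refractive index of water;  beta(pi,z): volume backscatter (1/m/sr);
% alpha: water attenuation coefficient (1/m).
%
% Electrical power into a load R_L is i^2 R_L, giving the figures of merit:
\[
  \mathrm{SNR}(z) = \frac{i^{2}(z)\, R_L}{P_{\text{shot}} + P_{\text{quant}}},
  \qquad
  \mathrm{SBR}(z) = \frac{i^{2}(z)\, R_L}{P_{\text{bg}} + P_{\text{dark}}}
\]
% The slope trick from a moment ago follows directly: in uniform water,
% ln( i(z) (H + z/n)^2 ) is linear in z with slope -2*alpha.
```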
In those large lakes they're not interested in finding fish; they're interested in finding algae and plankton. But we can do that, as you can see here: here are the plankton layers we're seeing in Yellowstone Lake. So we're still doing fish lidar research and developing new systems.

But now I want to shift gears and talk about insects. The way this came about is that I was at a regional Montana conference with a poster about the fish lidar work. Actually, now that I think about it, the poster was about atmospheric lidar, but we also talked to some people about our fish lidar work. One of the professors from a university across the state from me came up and said, I'm a biologist and I study insects, so I guess that makes me an entomologist. Can your lidars measure insects? And I said, I don't know, but why not? We can see molecules, so why not insects? I asked him why, and he told me, I can train honeybees to find landmines. I laughed, because that sounds crazy. Then he explained: I feed them sugar water that has chemicals mixed into it, chemicals like DNT, which is an explosive compound, and then they go looking for sugar water that smells like what they just ate.

So we did this experiment. We took the fish lidar out, because we did not have an insect lidar at that time. This entomologist was at the University of Montana; I'm at Montana State University. He put his beehives on a trailer and fed the bees sugar water laced with DNT. We scanned our lidar back and forth from a trailer and built up histograms of where we saw something. Now, in order to trust a detection, we had to make sure we were seeing something interesting and not just vegetation. If you go look at the original paper I have cited down here, from 2005, it describes these measurements: this was done with a conventional fish lidar, with the beam scanned back and forth repeatedly. In order to look low to the ground and not detect grass, we actually had to mow the minefield. That's a problem, right? You cannot mow a minefield safely, but in this case it was a military minefield, and the real mines buried out there do not have fuses, so they supposedly cannot explode. It still made me very nervous. They mow it with a big four-wheeler with big balloon tires to reduce the pressure on the mines.

We built up a histogram of where we detected insects, and this histogram was so interesting to the military that they stopped talking to us. They did not make it secret; I was surprised they didn't classify it, but they just stopped talking to us. We have reason to believe that we found where their mines are located. We went back to the lab and thought, well, this is really a great method for humanitarian demining, and so we wanted to continue the work. We had a bunch of money from the military.

Oh, these are pictures from that early bee lidar experiment. This is a film crew doing a documentary. Here you can see the field, and you can see these flags marking the grid points; they don't mark where the mines are, because that's a closely guarded secret. Here are the beehives, here's the trailer, and here's our experiment. I want to show you this because it shows what kind of experiment you can do on a very small budget sometimes. We had the fish lidar in the trailer, and we cut a hole in the wall of the trailer.
We put black cloth there, because the bees were coming in and bothering us in the trailer. Then we went to the store and bought a garbage can, because the sun was shining into our receiver too much; we bought a garbage can to use as a sun shade. Here's the laser beam coming out down below, and here's the receiver telescope.

When we came back, we said, okay, we need to do something different, because we need to distinguish insects from a blade of grass. So we came up with a new idea: let's speed up the laser, send out many, many more pulses than we normally do, and when we get the signal back, use the Fourier transform to calculate the frequency content of the return signal. We patented that method, because nobody had ever done it before. So in 2009 we got a patent with this title, but we call it wing-beat-modulation lidar.

Let me show you how wing-beat-modulation lidar works. This is the layout of our very simple lidar system, and this is what it looks like; this is a recent version, on a scanning mount, where the whole lidar scans. Let's start on this top picture. Here's the laser. We go off two mirrors so that we can align the beam. There's a detector sitting here that just detects the ambient light so that we know when the laser pulses. The beam then comes across and down, and when it comes down we expand it to about six centimeters wide. That five-or-six-centimeter beam goes out from the center of the telescope, right in front of the secondary mirror. The rest of the aperture is used to collect light: this is the primary mirror, this is the secondary mirror, and then it focuses through some lenses and filters onto a photomultiplier tube. So it's a very simple direct-detection lidar, but the secret is in the processing.

Okay, so here's a picture of us deploying this at Grand Teton National Park, which is just south of Yellowstone National Park. Here's the lidar under the tent, and these are my students enjoying themselves. Here's the lidar on the scanner: here's the laser box, here's the optical tube, here's the thicker optical tube that has the expanded beam, and the beam comes out this way.

Here's what we do. We send out the pulses, and for each pulse we collect signal at each range. We have range bin one, two, three, all the way out to about 200, and at each range we collect signal from multiple pulses, about 1,000 pulses (1,024, to be exact). That 1,024-point electronic signal is a time-domain signal, because this is pulse number one, pulse number two, pulse number three: time one, time two, time three. We take the Fourier transform of this signal, and it produces a spectrum.

Here's an example from some honeybees, with signal amplitude on the vertical and time on the horizontal. At some range, in this case 85.4 meters, the time-domain signal looks like this: we get a big blip. But that's the same kind of blip you would get from scanning across some grass or a flower. Here's the useful part: this oscillation comes about because the laser is reflecting from the wings, and the wings are flapping up and down. When they flap, you get different reflections.
So you take the Fourier transform of this, and you get a big DC component that we do not care about, and then you get these harmonics, which give us the harmonics of the wing-beat frequency. That's very important, because it tells us, first of all, that we have reflected light from something that's moving at 200 hertz. Even a blade of grass blowing in the wind does not produce 200 hertz; it produces maybe 10 hertz. So we have a distinct signature of the insect. Not only that, but the harmonic content tells you something about what kind of insect it might be. And the lidar is small enough that we've operated it out of the back of a vehicle on multiple occasions. Here we are operating it just two years ago out of a trailer, scanning across: we dwell here for a few seconds, then scan over and dwell there for a few seconds, and so on.

Oh, I guess we still have time. That's all the slides I put together, but let me talk to you about what we're doing right now, and then we can have some discussion. This is good, because yesterday I was having a problem going over, so I removed some things, and now I'm actually a little bit short. Once again, the need for signal processing is high. Let me stop sharing for a moment and open a different document, because I want to show you some of the things we're doing right now. Some of our recent work is developing machine learning algorithms. I need to share this; I'm going to work from a document we created for a talk by one of my students, so I'll shift from the PowerPoint over to this PDF. Now let's see, I'm trying to remember how to do this with PDFs. Well, I don't remember, but this will be good enough.

Okay, so here's the concept. The insect's wings are flapping; the tree is not flapping. So the tree gives us a DC signal, and the insect gives us a frequency that is not DC. What we get is a signal that looks like this: if this is range and time, we get an insect signal, we put it into a machine learning algorithm, and it is trained to ignore all this other stuff and find the insect signal. The hard targets, the trees and the grass and the things you hit with your lidar that you don't care about, just give you a constant signal with some noise on it, but the insect gives you this oscillating signal, and the oscillation frequency, of course, is the important part.

Now, I should also mention something interesting about the processing: clearly, to get a good estimate of these frequencies, you need lots of samples, so the longer the insect is in the beam, the better for detection. Because of that, the next design we're building now is going to take our five-centimeter-diameter beam and try to make it the full diameter of the telescope, maybe 20 or 30 centimeters. That reduces the energy density and makes it harder to detect an insect at long range, but it makes detection easier in the sense that the insect is in the beam longer. If the insect flies through the beam rapidly, we don't get enough oscillations to get a clear picture of the wing-beat frequency. So that's another method that is kind of interesting.
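To make that processing chain concrete, here is a toy sketch of wing-beat detection for a single range bin. The pulse repetition frequency, wing-beat frequency, and amplitudes are invented for illustration; this is not our system's actual parameters or processing code.

```python
# Toy wing-beat detection: one range bin, ~1000 consecutive pulses,
# look for a non-DC spectral line. All numbers are illustrative.
import numpy as np

prf = 3000.0                  # pulse repetition frequency (Hz), assumed
n_pulses = 1024
t = np.arange(n_pulses) / prf
rng = np.random.default_rng(0)

wingbeat_hz = 200.0           # honeybee-like wing-beat frequency
# Hard target: constant reflection plus noise. Insect: the same, plus an
# oscillation (and one harmonic) from the flapping wings.
hard = 1.0 + 0.05 * rng.standard_normal(n_pulses)
insect = hard + 0.3 * np.sin(2 * np.pi * wingbeat_hz * t) \
              + 0.1 * np.sin(2 * np.pi * 2 * wingbeat_hz * t)

freqs = np.fft.rfftfreq(n_pulses, d=1.0 / prf)
for name, sig in [("hard target", hard), ("insect", insect)]:
    spec = np.abs(np.fft.rfft(sig - sig.mean()))    # remove the DC term
    print(f"{name}: strongest line at {freqs[np.argmax(spec)]:.0f} Hz")
```

A blade of grass puts its energy near DC or a few hertz, while the insect's line sits up at the wing-beat frequency and its harmonics; that separation is exactly what the lidar exploits.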
Okay, so these are not my slides, and so I can't talk about all the details, but there's a graduate student right now who's developing machine learning algorithms. What I want to show you is that he did some analysis and found the parameters that really carry information, looking at both the time-domain signal and the frequency-domain signal. What he found is that the most important parameters had to do with the skewness of the probability density function, the width of the harmonic features in the Fourier transform, and the harmonic height ratio, basically the amplitudes of those harmonics. So the wing-beat harmonics, their widths, and the statistical distribution in the time domain all matter, and we're actually doing joint time-domain and frequency-domain analysis. I'm not going to tell you all the details, because I don't know them well enough, but the important message is that we have long data sets with very few insects; in this data set there are about 344,000 measurements without insects.

And over here, let's see what he's doing: this axis is truth, this is predicted. What this shows is that we are finding about two-thirds of the insects with the initial analysis, keeping about 72 percent of the images with insects. By an image, he means an image built from 200 range points and 1,000 laser shots. He ruled out about 98 percent of the images. In other words, we have a lot of data to look at, and if you have to look at it by hand, with a human observer, it's crazy; it takes forever. We went through the data by hand, found the insects, and used that as the training set and as the truth. What he found is that his algorithms could rule out about 98 percent of those images and still find about 72 percent of the ones we had found by eye. So it's not perfect, but it's a pretty helpful method for finding the insects in this large data set.

And something I don't think gets talked about enough: the hardware for lidar is quite often fairly simple; the software is where all the magic happens. That's where all the analysis happens, and in some cases it's very tedious and very difficult. So in both fish lidar and insect lidar, we are using machine learning to rule out the many, many data points without anything interesting and to find the points that are most useful to look at. So that's fun.

All right, I think I'll stop there and see if we have some discussion. Yesterday we had enough questions that we could easily use the time, and it's good that I'm stopping on time so that I don't chew up John's time.

Oh, okay. Thank you so much, Joe, for this wonderful talk. The first question or discussion is from Professor Anna Consortini. Okay, I think you can unmute and talk.

Excuse me. If I understand, you are using a green laser beam. That's right. This is not dangerous, because all animals are used to sunlight. But can this produce some change in the insects during the measurement?

It's an important question. Excuse me, I have to get some water. It's an important question, and we don't really know the answer, but we think that's right: we think we might actually be altering the behavior of the insects a little bit.
Our entomologist colleagues tell us that insects do not see green light as well as shorter wavelengths. Humans actually have peak sensitivity in the green, kind of the yellow-green; insects don't have quite as much sensitivity there. So they think we're not causing too big a problem, but we're concerned about it, and the next design we want to build is an infrared version, because that will be invisible to both humans and insects. I've actually been proposing this for many years, and we've just been so busy doing other things that we've never done it, but we have a proposal out right now, and if it is funded, we will build an infrared version.

And now I have another personal question. I was in Boulder many times making measurements with Jim Churnside, Reg Hill, and so on, once for a long summer of measurements and in some subsequent years. Were you already there at that time? From 1987 up to, I think, 1991, when Professor Tatarskii came to Boulder; when I had used all the money for him, we stopped the collaboration. We had already done what we wanted to do, and we published several papers on the matter.

Yes, I do remember that. I joined the group in 1989. I know, so you were not there. So I was not there when you did your experiments, but I was there when you were publishing them, and that's where I first learned your name, from Jim Churnside.

We made the measurements on top of Table Mountain. It was a very beautiful location for our measurements. I have very nice memories of it. Thank you.

I think I have a picture of that. I'll find you a picture of Table Mountain while we wait for the next question, and then I'll share it.

Is this location still working, still used for measurements and so on? Because it was wonderful, with a lot of apparatus, not only for us but mostly for electromagnetic waves, radio waves and so on.

Right, it's very beautiful, and it's a facility that is still used. So here we go; if you don't mind, everybody, I'm going to put up some pictures to bring back memories for her. Here is Table Mountain, and you can see all the instruments out here for measuring turbulence. And this is a picture of our fish lidar and Jim's fish lidar, inside the dome. We are actually calibrating the lidars by looking at a distant target and trying to figure out where the target is; the target is way out here. Yeah, so here's the picture; there's the target. Thank you. Yeah, good memories. Thank you.

Okay, other questions? Yeah, the second question is again from Hugo. You know, he's from Japan. So Hugo, you can unmute, and you can turn on video if you want, and you can ask. Please, thank you.

Thank you once again. Actually, I'm from Guatemala, but I got a scholarship to study in Japan. I guess I'm not even a master's student; I'm below that, a research student. Once again, a very, very informative talk, very impressive as well. I found it very interesting that your research and your colleagues' research was so effective that the military stopped speaking with you. But I generally have a question. You said the penetration of water is usually a problem, and that's in the ideal case, when there is no eutrophication in the lake, for example.
What happens if the lake is hypereutrophic, meaning there are blue-green algae, like in Lake Erie in the 1970s, if I recall correctly?

It's a very good question. It obviously increases the scattering, so it increases the extinction of the laser beam, but it also provides a scattering signal that we can use to detect that algae. So yes, we can detect the algae from the scattering signal, but if you're trying to look at fish instead of algae, then it just causes background signal. It's either a help or a hindrance, depending on your viewpoint. There are additional factors, but to first order, that's what it's all about. Yeah, very good question.

Thank you very much. Thank you. It looks like Nicoleta's got a question. Okay, I think now you can unmute.

Yes, hello. Very interesting talk yesterday and also today. I have two questions. I'm not working in lidar, but for the past two years I have been doing some optical coherence tomography investigations, so it's a kind of small-scale lidar, and I'm wondering about the insects and the hard targets, the rigid targets, the plants. If, for example, I have a textile sample with some fungus contamination, is it possible to discriminate between these two types of behavior, because one is rigid and the other may still be alive, not dead? Can we use the frequency of the movement to discriminate between these types of species? Because the appearance is almost the same. I can send you some pictures recorded last week, and it's amazing, but I cannot tell the difference, because I cannot see details. There is fluorescence, which can discriminate between them, but is it possible to see some differences with scattering?

Maybe. Maybe you can use a method similar to our wing-beat-modulation method, if the motion is rapid enough. I wonder, is the motion rapid or is it slow?

Or, if they are already dead, can we discriminate just because they are different sorts of material, or because the environment is different? Because otherwise it's impossible.

I think the answer is that with lidar you can detect the difference between two objects if there's a difference in reflectivity or a difference in the frequency content; with the wing beat, we just look at the frequency of the oscillation. Another way to do it, of course, which we talked about several times yesterday, is to use polarization. But for your problem, I don't know if any of those apply, unless the living pieces have a different reflectivity than the dead pieces. I'm not sure; it sounds very difficult.

Yeah, it is. Somebody gave me some samples, and I am playing detective. It would be something useful for checking the condition of pieces in a museum, for the curators, because it's important to keep the samples in the museum safe; sometimes it is a problem for the people who keep these samples in the museum.

Second, I have another question. This one is technical, directly related to the way you record images with lidar. You have the opportunity to take images of objects situated at different depths in the water. Because we are not recording from the same plane, we have a lot of noise in between; how do you remove that noise? For example, when recording 3D images, because if you take a shot in 2D, you can have black-and-white images.
But if you want to reconstruct this in 3D, when you create the whole image you have a lot of noise. In my software I can play with the noise: I can raise the lower limit and reduce the upper limit, and in this way I cut off a lot of the upper and lower parts. But sometimes you can lose very important information doing that.

Sure. Well, we have one advantage with lidar, which is that it's all range gated, so all the scattering that happens between my lidar and the object I can ignore; I just wait until those photons come back to me. That time gating gives a very large advantage. The other thing we do is very carefully filter out the background light that creates interference. But the fundamental problem of noise still exists: as our signal gets smaller and smaller, eventually it's just in the noise, and then we're done; we cannot do good measurements anymore. We can time-average and do all the usual tricks, but fundamentally, noise will eventually stop the measurement. That's what determines the depth to which I can measure.

Yeah, and it was very smart, the way you open up the solid angle you use and increase it. This is very smart, and I will use this. Thank you very much. Thank you. Yeah. Okay. Thank you very much again. Thank you.

Looks like we have one more question, Joe. Yeah. Roger. Roman. This is from India.

Yeah, thank you very much for a wonderful talk. I have been working in the lidar field for the last few years, especially on aerosol profiling and cloud profiling. I was also involved a little bit in system development when I was in Taiwan, at Academia Sinica. So these two topics, the fish-scanning lidar and the insect lidar, are very interesting to me, and I have two questions. We have the mapping lidar looking down from the aircraft, the aerosol lidar looking down from the aircraft, and the fish lidar looking down from the aircraft. What is the major difference between these three lidars?

The mapping lidars typically use infrared wavelengths, which do not penetrate water. That's why we cannot use commercial lidars for this; we have to design a custom lidar. If you do aerosol lidar from an airplane, which some people do (I don't, but some people do), the aerosol lidar is, or can be, quite similar to what we use for the fish lidar. For example, I showed you yesterday a green 532-nanometer aerosol lidar, and it's essentially the same system as the fish lidar. The biggest difference is that the beam is kept small for the aerosol lidar, and I purposely diverge it for the fish lidar. Other than that, the primary difference is in the software, the data processing. Yeah, the data processing, the analysis methods. It's a good question. Thank you.

Okay, I think we have another question from Nicoleta. Okay, you can unmute, please.

Sorry, I heard the discussion. The question: you diverge the lidar beam for fish. Is that to increase the scanned area, or to reduce the risk of affecting the fish?

It's both. We increase the area because we want to sample a larger area. Yes. And it makes the beam, I'd say... Okay, you reduce the intensity and it's safe. Thank you.

Okay, thank you so much, Joe Shaw. I think we can wait for a few seconds; if there is any other question from the participants, they can raise their hand. I've got a question, Joe.
Did you ever solve the fish problem, the invasive species?

Well, we're helping them. They're still working on it; they're still trying to find and get rid of the lake trout. So the problem is not solved, but lidar is a useful piece of the solution. Right, you can map it out. That seems to be a common problem also in my state, Oregon. Yeah.

Okay, so I don't see any more questions, any hands raised. No more questions for now. Okay. So Joe, do you want to introduce John? He's here.

Oh, I think, you know, John is, let's see, are you... I'm here. I'm here, Joe. I scrolled through till I found your name; I see that you're there. I'm not sure I know too much of your background, so I'm going to tell them the bit that I do know, and then you can correct it and add to it. John Howell is a professor in, I believe, the applied physics department, I think at the Hebrew University of Jerusalem in Israel. Did I get the department right, John? No, but that's okay; I'm just in the regular one. Yeah. So he's in the physics department at the Hebrew University of Jerusalem. I visited there a couple of years ago and had a great time. We've done some work together recently, but what he's going to talk to you about today is some work that I guess I first heard about five or six years ago, John, when you first started it; at that time he was a professor at the University of Rochester. I don't remember where you got your degrees, John, so you'll have to tell them that bit. But I do know that he's done some really exciting work, mostly in the quantum optics area, and now he's working on compressive lidar. This is really a fascinating topic for me, so I'll probably try to stay on for at least most of it today. So John, go ahead.

Okay, thanks. Thanks, Joe. I don't have anything quite as cool as Joe there; I'm not scanning for insects or finding landmines or mowing lawns with balloon-tired four-wheelers. That sounds pretty...