Well, that's up to Joe Shaw, but I think we can just keep going. Okay, that's great. You led right into it with those questions, so it's a good time, unless you feel like a break, Joe. No, I think we should just go on. I'm fine. Okay. I had to hurry at the end there quite a bit, sorry about that, but I'm happy to keep going and finish things up at the end, even if we have questions. I need to go back to sharing. Okay, so now we're going to talk about LiDAR for the atmosphere, which really came first. Atmospheric LiDAR came first because, like I said, one of the first things people did with the laser was point it up and start measuring distance to airplanes, distance to clouds, and that kind of thing. I spent the first 12 years of my career working at NOAA, which in the United States is the National Oceanic and Atmospheric Administration; that's this logo right here. At that lab I learned all about atmospheric LiDAR, because we were developing LiDARs for both atmospheric and oceanic research. So in this talk we're going to see the differences between autonomous vehicle LiDAR and atmospheric LiDAR. The short answer is that autonomous vehicle LiDARs are designed to be much smaller and much lower cost, and to give a very rapidly updated signal, whereas atmospheric LiDARs need to see much farther away, so they generally use much larger lasers; they do not need to operate so rapidly, and they generally do not need to be so low cost. So we change our design principles accordingly. Let's talk about the basic principle of atmospheric LiDAR. The idea is that we send laser light into the atmosphere, where it scatters off of particles in the air and off of clouds.
Some of that scattered light, which of course goes in all directions, gets into the receiver, where we detect it versus time and use the speed of light to turn that into range. The range equations are all the same, and the uncertainty equations are pretty much all the same. Okay, so if we have a laser transmitter that puts out N-sub-T laser photons per laser pulse, and the pulse has some width that corresponds to a range bin delta R, we have a transmittance of the atmospheric path. For the scattering, in the hard-target case that we looked at for the self-driving cars, we just multiplied by a reflectance and divided by a solid angle. In this case, we're going to multiply by a scattering coefficient, which describes the amount of scattering per distance and per solid angle. We have the same atmospheric transmittance again, and then we have this quantity, the receiver area divided by distance squared, which is just the solid angle of the telescope as seen from the scattering point. Then we have the overlap factor that we talked about briefly in the previous discussion, and then the transmittance of our optical system, which I was asked about just a few minutes ago. And that produces some count as a function of time, which we express as a function of range. So if we multiply all those pieces together, we find that the number of photons received from range R is equal to the laser pulse energy divided by the photon energy, so this thing in parentheses represents the number of photons that are transmitted, and then we have all the usual terms. The only extra piece I'm adding is that we always have a background signal. So here's the background signal, which comes from scattering by the same particles in the atmosphere, but the source is usually the sun. So this term is scattered sunlight, and all of this is scattered laser light.
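As a quick illustration of how those factors multiply together, here is a sketch of the photon-counting LiDAR equation in Python. All of the numerical values here are invented for illustration, not numbers from any instrument described in this talk.

```python
import math

def lidar_photons(E_pulse, wavelength, delta_R, beta, T_atm, A_rx, R,
                  overlap=1.0, T_optics=0.5):
    """Photon-counting LiDAR equation: expected photons received from range R.

    N(R) = (E / h*nu) * delta_R * beta * T_atm^2 * (A_rx / R^2) * O(R) * T_opt
    """
    h = 6.626e-34  # Planck constant [J s]
    c = 3.0e8      # speed of light [m/s]
    n_transmitted = E_pulse / (h * c / wavelength)  # photons per pulse, E / (h*nu)
    solid_angle = A_rx / R**2                       # telescope solid angle seen from range R
    return n_transmitted * delta_R * beta * T_atm**2 * solid_angle * overlap * T_optics

# Illustrative numbers only: 100 mJ pulse at 532 nm, 15 cm range bin,
# a molecular-scale backscatter coefficient, 20 cm radius telescope, 5 km range.
n = lidar_photons(E_pulse=0.1, wavelength=532e-9, delta_R=0.15,
                  beta=1e-6, T_atm=0.9, A_rx=math.pi * 0.1**2, R=5000.0)
print(f"expected photons per pulse: {n:.3g}")
```

Even with a 100 mJ pulse, the one-over-R-squared solid angle factor leaves only a handful of photons per pulse from 5 km, which is why the dynamic range discussion below matters so much.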
That's the photon-counting version of what we call the LiDAR equation. We can do the same thing with flux, that is, with optical power. Oh, I just realized I have an inconsistency here: I used power here and flux here. I'm sorry; radiometrically we often talk about flux, but I often also talk about power. I'll fix this, but these are the same quantity, the optical power that's received. This is the transmitted power, this is the delta R, the range bin, the atmospheric transmittance squared, the angular scattering coefficient, and the receiver area divided by distance squared, which is the small-angle approximation of the projected solid angle of the telescope, then the overlap factor and the optical transmittance. Let me get my pointer; yeah, this is easier to see. This right-hand term is the background light that's collected: the background radiance multiplied by the throughput of our receiver, which is the area of the receiver times its field-of-view solid angle, multiplied by the optical transmittance. Okay, so you remember this slide where we talked about the equations for a hard target. We had all these equations that we talked about before, and I just wanted to remind you that we had this equation that was expressed with a one over R squared and an F-number. This is the same basic equation; it's just that there we had a reflectance of the hard target, and here we have beta, which represents the distributed scattering. That's the scattering coefficient, which is in units of inverse length and inverse solid angle. So that's really the only difference in the equations: now we are dealing with distributed scattering, in other words, volume scattering. Oh yes, so now I can skip this. You also might remember that we talked about the range resolution being inversely proportional to the bandwidth, the electrical bandwidth, and that for a pulsed laser, that bandwidth is set by the pulse width delta t.
Basically, the bandwidth is determined by one over delta t, and that gave us a 15 centimeter range resolution. I told you that for hard targets we can improve on that by pulse-edge detection. That's no longer the case in the atmosphere: we generally do not have hard targets; we are generally looking at distributed scattering. You also probably remember that we looked at this... sorry, I'm having a hard time with my slides here. We looked at this slide that had a time-to-digital converter. This is the biggest difference between an atmospheric LiDAR and an autonomous vehicle LiDAR: instead of just recording the time that it takes to get a bright scattering return, we usually record the entire waveform. So here's a little diagram that shows you roughly how that is done. We send laser light out, and some of that comes back down from the atmosphere and gets collected in a telescope. We have a filter to limit the optical bandwidth; it may not be shown here, but we usually have a filter somewhere in the system. Then we have a detector, often a photomultiplier tube or an avalanche photodiode, and that electrical signal is amplified and goes to an analog-to-digital converter. This is the most important difference between what we talked about just previously and now, because this analog-to-digital converter can be very expensive. With today's technology, we can get 16 bits of digitization at about one gigasample per second, and that gives you about 15 centimeter samples. That's pretty good in the atmosphere; we often don't need better than that. But if you need faster sampling, to get shorter range-resolution bins, then we have to start giving up something on the bit depth. This one has 16 bits; this one has 12 bits, but we get six gigasamples per second instead of one. The problem with LiDAR is that the signals have a very large dynamic range, so we have many orders of magnitude between large signals and small signals.
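The sampling trade-off above comes straight from delta R = c / (2 f_s); here is a minimal sketch using the sample rates mentioned in the talk.

```python
def range_bin(sample_rate_hz):
    """Range sample spacing for a digitized LiDAR return: delta_R = c / (2 * f_s).

    The factor of two accounts for the round trip of the light.
    """
    c = 3.0e8  # speed of light [m/s]
    return c / (2.0 * sample_rate_hz)

print(range_bin(1e9))   # 1 GS/s  -> 0.15 m, the 15 cm sampling from the talk
print(range_bin(6e9))   # 6 GS/s  -> 0.025 m
print(range_bin(10e6))  # 10 MS/s -> 15 m
```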
Therefore we want 16-bit digitization whenever possible. This one is actually very good, but these things cost a lot of money; they are, you know, 10 or even 20,000 dollars, again talking from the US dollar perspective. But that allows us to record the entire waveform with high fidelity, and then we have to figure out how to process all of that. Ten years ago this was a much bigger problem, because to get one gigasample per second you could only have maybe 10 bits, something like that. But in the last 10 years we've made huge improvements in the analog-to-digital converters that are available commercially. Okay, so let's talk now about how we lay out the optics for a LiDAR. On the left is a photograph of a system we built when I was at NOAA. It was built into a box so that the whole box could be scanned. You can see here's a computer on the bottom, the laser power supply on the side, this is the receiver and transmit optics, and this on the top right is a telescope. You might remember that I talked about bistatic and monostatic optics. In the atmospheric LiDAR world, sometimes we do monostatic and sometimes we do bistatic; it doesn't really matter that much, but it's a little bit easier to achieve overlap with monostatic optics, and that's what I'm showing you here. So in this system, let's start at the laser. We have an Nd:YLF laser, which is basically just like Nd:YAG: it operates at, or very nearly at, 1064 nanometers. It goes through a half-wave plate, a diverging lens, and a converging lens, and the combination of those two lenses acts as a beam collimator. We steer it over here to a polarization beam splitter. The beam splitter is oriented so that the transmit light goes this way through a Pockels cell, reflects off of the secondary mirror and primary mirror, and goes out to the atmosphere. The backscattered light gets collected by the telescope and goes back through the Pockels cell, and the Pockels cell is now
switched so that the backscattered polarization is transmitted by the polarization beam splitter. We have a time-variable polarization state here because of the Pockels cell: basically, we transmit the beam and then change the polarization for the backscattered beam. So the return comes through this polarization beam splitter, and we have a very narrow-band optical filter to reject scattered sunlight. We have a beam splitter here that puts part of the beam here and part of the beam here. There's a narrow field stop, a tiny 100 micro-radian field of view, and this one is a wider, 640 micro-radian field of view. These little pinholes act as the field stops in the optical system, which means they determine the field of view, which is of course the angular field of view out here. That is done with a narrow channel and a wide channel because in the narrow channel the overlap range is several hundred meters away from the telescope. With the wide field of view, the overlap begins just a few meters in front of the telescope, but it's so much wider that the signal gets buried in background light at longer ranges. So the wide field-of-view channel is only for the signal close to the telescope, and the narrow field-of-view channel gets the signal farther from the telescope. Oh yes, and here's a photograph of it; I showed you a photograph of that same system on the opening slide. Here's a smaller LiDAR that is not... oh, and by the way, let me back up and tell you that this whole box was scanning; this is a big scan motor. In this case we were doing polarization measurements, and because we're doing polarization-sensitive detection, we did not want to use a scan mirror, but we wanted to be able to cover much of the atmosphere, so we scanned the whole LiDAR. This next system is also dual polarization, but it is not scanning.
It just has a transmit laser on one side and a telescope on the other side. This is just a commercial astronomical telescope, a small 20 centimeter telescope, and it has a photomultiplier tube receiver on the back end here. But before that there's a box of custom optics, and that custom optics box is shown here. It basically has a filter to reject any light that is not laser light, and then there's a liquid crystal variable retarder and a fixed polarizer. This is a clever technique that I developed about two years ago for doing polarization switching. The liquid crystal allows us to electronically switch our polarization state, very similar to what we were doing with the Pockels cell in the previous system, but I think the liquid crystal is easier to use than the Pockels cell. This is the port through the roof that we point it through. This is at my university; the first thing I told them is that I had to cut a hole in the roof, and luckily they let me do that. So this LiDAR sits in a room that is temperature controlled, and we point the laser beam out into the sky for measuring clouds and aerosols. Aerosols are just particles in the air. The question I had earlier about measuring PM 2.5 particles: those are aerosols, so when I talk about aerosol detection, we're talking about those kinds of particles. Here's a schematic diagram of the LiDAR that I just showed you. Let's start at the laser. Here's the laser; it fires 30 pulses per second. This one is not eye-safe. I told you that the autonomous vehicle LiDARs all had to be eye-safe; this one is not. It sends out, 30 times per second, about 100 millijoules, which would seriously burn your eye if you got that light into your eye. That means we have to coordinate with the government; in the United States, the organization is called the Federal Aviation Administration.
That's the organization that controls the airplanes, and they announce to the airplane pilots that we are firing this laser so that they can steer away from our university. Okay, so the laser fires through some diverging lenses and goes out into the atmosphere. Some of the light comes back and gets collected in the telescope that you saw in the photograph. It goes through a field-stop aperture that is variable, gets collimated, gets passed through bandpass and neutral density filters, then the liquid crystal and the fixed polarizer, and then the photomultiplier tube. Notice that there's a collimating lens here. Many of you probably already know this, but it's very important: in an earlier diagram, I showed an interference filter at the focal point of a telescope. We don't usually do that, and the reason we do not like to do that is because the filter transmittance changes with angle; the wavelength of transmission changes with angle for an interference filter. Because of that, you don't want to have steep rays coming in; you want collimated light coming in if possible. So that's something to remember if you're designing a LiDAR: try to put the filters in a collimated space. Okay, so that's the layout. This is what it looks like when it's operating. It's shooting light at 532 nanometers, so there's this nice green beam; he has laser eye-safety glasses on. You can see the computer is showing you the signal falling off with distance, and then there's a peak, and that peak is a cloud that we are studying. This is what it looks like outside. In this case, this is a photograph of my LiDAR beam going up to study this cloud, and the cloud is creating this colored ring. This is called a corona, and the corona is a set of diffraction rings around the light source; in this case, the light source is the moon. I had been using this LiDAR earlier in the day, and I noticed that these clouds were very, very high, high-altitude clouds.
Historically, many textbooks have taught that a corona is formed only by diffraction by liquid water drops. I take many pictures of things like this, and I did some meteorological analysis in the 1990s and realized that more than half of my photographs of very beautiful coronas were actually taken in conditions where the clouds probably were ice. That was confusing, because ice crystals are much larger than liquid water droplets, so it should not be possible to get visible diffraction rings from ice crystals. So we had a mystery. We went to a person who is an expert on mountain wave cloud microphysics, and he explained that they have measured the particles in these wave clouds, and very often these kinds of wave clouds contain tiny little ice crystals that are not normal crystals; they're just little balls of ice, and they're about the same size as liquid water droplets, about 10 microns in diameter. So that was the solution to the mystery, and this photograph actually shows the first measurement that ever proved it, because this dual-polarization LiDAR proved that this cloud was ice, and the photograph proves that we are getting a diffraction pattern. So let me show you what those signals look like. Here we have a linear display with altitude on the vertical and signal on the horizontal, and I have the co-polarized signal on the left and the cross-polarized signal on the right. What that means is that over here on the left we are detecting the same polarization state that we transmit. So let's say we transmit something that we will call horizontal polarization; that means over here we are also receiving horizontal polarization, and here we are detecting vertical, or orthogonal, polarization. The co-polarized signal is scattered by the molecules in the atmosphere, so this is all molecular scattering. As we go up, there's a thin cloud layer followed by another cloud layer.
Notice that the signal is getting smaller as we go higher. Over here, the signal is zero until you hit the cloud, and that's because the cross-polarized signal is only generated by ice crystals, by something that is not spherical. So these ice crystals create the cross-polarized signal, while the molecular scattering does not generate a cross-polarized signal; we have zero all the way up to the cloud, and then we have only the cross-polarized signal. This is how we prove that that cloud contains ice: liquid water will not create this signal; only ice will. In atmospheric LiDAR, we often use what we call the cross-polarization ratio. A lot of meteorologists and LiDAR people actually call it the depolarization ratio. I don't think that's a good name, because it's not really depolarizing; it's changing the polarization state. So I like to call it the cross-polarization ratio. It's the ratio of the cross-polarized signal to the co-polarized signal. You can see it gets very noisy, especially up high, because we're dividing by nearly zero, but you can see that it's near zero until we hit the cloud, and then in the cloud layer we have about 0.4, or 40 percent. That's a very large cross-polarization ratio, and it tells you that this is, absolutely without any doubt, ice. So these are ice clouds, as they really must be, because at this altitude the temperature was about minus 50 degrees C, and at minus 50 we have to have ice. Down to about minus 35 degrees C you can have liquid; that's called supercooled liquid. And that's why this kind of measurement is so important. Okay, so now I'm going to show you the same kind of plots, but in a different form. This is altitude on the vertical again, but instead of signal amplitude on the horizontal, this is time, so this is what we call a time-height plot, and the color shows you the signal intensity.
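Before moving on, here is a minimal sketch of the cross-polarization ratio computation just described. The toy profile values are invented, and clamping the near-zero denominator is just one simple way of handling the noisiness mentioned above.

```python
import numpy as np

def cross_polarization_ratio(co, cross, floor=1e-12):
    """Ratio of cross-polarized to co-polarized signal, delta = S_cross / S_co.

    Gets very noisy where the co-polarized signal approaches zero, as in the
    talk, so the denominator is clamped to a small floor value.
    """
    co = np.asarray(co, dtype=float)
    cross = np.asarray(cross, dtype=float)
    return cross / np.maximum(co, floor)

# Toy profiles (bottom to top): molecular scattering only, then an ice cloud.
co    = np.array([1.00, 0.80, 0.60, 0.50, 0.45])
cross = np.array([0.00, 0.00, 0.00, 0.20, 0.18])
delta = cross_polarization_ratio(co, cross)
print(delta)  # near zero below the cloud, ~0.4 in the cloud layer
```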
What this is, is a case where we have a cloud near eight kilometers and another little layer near, what is that, maybe between six and seven kilometers. This is the co-polarized signal, and this is the depolarization, or cross-polarization, ratio showing you that we have, you know, 40 percent here and maybe 20 percent here. Both are high enough that we know these are ice cloud layers, and they're thin clouds, which is why we can see two layers; with a thick cloud you would only see the bottom layer. Here's a case where we have two layers: there's a thin layer up here and a thicker layer down here. The cross-polarization ratio shows you only the top one; the bottom one is gone, which means we now know that the bottom layer was either liquid cloud or smoke, and this one was ice. It turns out there is one other thing it can be: in addition to ice, the other material that can scatter and create significant cross-polarization is desert sand. Some of you live in a part of the world where you get big storms of desert sand from the Saharan desert. It can come up into Europe, it can come across the Mediterranean, and it can come all the way across the Atlantic Ocean and be measured here in the United States. Some of that high-altitude desert sand has little grains that are not spherical, so it also generates a cross-polarized signal. And I've seen that; I've seen Asian dust, desert sand from Asia, come over where I live. Okay, so let's move ahead. This is one of the first LiDARs that was ever launched into space. It's called CALIOP, which stands for Cloud-Aerosol LiDAR with Orthogonal Polarization. It's very similar to the LiDAR that I just showed you, except it's in space looking down. It has an Nd:YAG laser that operates at 532 and 1064 nanometers, with 110 millijoules per pulse at each wavelength.
Remember, I told you 100 millijoules is very dangerous, but that's if you're close to it. This LiDAR is so far away that by the time the beam reaches Earth, it has diverged enough that the energy density is eye-safe. So you can stand outside, if you're in the right place at the right time, look up, and see a flash of green light when this LiDAR goes overhead. The laser fires 20 times per second with a 20 nanosecond pulse width, into a one meter telescope with a 130 micro-radian field of view. Remember, I talked about how the field of view of a LiDAR telescope is often in the micro-radian range because we're trying to minimize background light. There are a photomultiplier tube and an avalanche photodiode for detectors. On the 532 channel, there is a 37 picometer etalon filter, very narrow band, to reject the scattered sunlight. You don't need to work so hard to reject scattered sunlight in the infrared, so that's no problem there. Remember, though, I told you the analog-to-digital converter was the big difficulty. This LiDAR was launched in 2006. It was built in the early 2000s, but it was designed in the 1990s, so this is 20 to 30-year-old technology. The sample rate was only 10 megahertz, 10 million samples per second, which means they were faced with the same trade-off. They need a lot of dynamic range; they claim a dynamic range of 4 million, which means they have a lot of bits, and to get that high bit depth they had to accept a smaller sample rate. But that's okay, because if you're sampling the entire atmosphere with 50-meter samples, that's still a lot of data, so this is perfectly acceptable. The layout is that the laser... I need to move the pictures here. The laser fires downward, and the scattered light comes back over a very long distance, because this is in orbit at, I forget the number, but something like 500 kilometers, and all the scattering is happening mostly in the bottom 10 kilometers.
So the one over R squared losses are very large, but the signal comes back, gets collected by the telescope, and gets split by a dichroic beam splitter into a green channel and an infrared channel. The infrared channel is simple: it just goes through a filter to the detector. The green channel goes through a depolarizer, used some of the time for calibration, then through an etalon and an interference filter for rejecting scattered sunlight. A polarization beam splitter splits it into two components, the parallel and the orthogonal; these are the co-polarized and cross-polarized terms. They're doing the same dual-polarization method that I described to you before. Then after the detector there's an amplifier, probably a transimpedance amplifier built around an op amp, and 14-bit digitizers. They would have liked 16 bits, but you simply could not buy them back then, so we have 14-bit digitizers operating at, what was the speed, 10 megasamples per second. And here's what the signal looks like. This is altitude on the vertical and signal level on the horizontal, and the signal level is logarithmic, because again I'm showing you how large the dynamic range is. We have a very weak signal from the upper atmosphere, and down in the lower atmosphere, in the troposphere where we live, is where all the aerosols and clouds are. The dashed line shows the molecular backscatter signal with an offset, and that's calculated; it's just calculated from the pressure and temperature of the air. So up here, we can use that Rayleigh-scattered molecular signal to calibrate the LiDAR signal, because we know that it's pure Rayleigh scattering. But down here, of course, the scattering is generated by things like aerosols and clouds that are much different from molecules. To remind you, a jet airplane flies somewhere in this range, 10 to 12 kilometers, let's say. That's the top of the troposphere, so all of our weather occurs down here, and this is the stratosphere.
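As an aside, the molecular profile used for that calibration really can be computed from pressure and temperature alone; here is a rough sketch. The 532 nm backscatter cross-section value is an approximate literature number I am assuming for illustration, not a value from the CALIOP documentation.

```python
def molecular_backscatter_532(pressure_pa, temperature_k):
    """Rayleigh molecular backscatter coefficient at 532 nm.

    beta_m = N * dsigma/dOmega(180 deg), where N = P / (k_B * T) is the
    molecular number density from the ideal gas law.  The backscatter
    cross-section is an approximate literature value for 532 nm.
    """
    k_B = 1.380649e-23            # Boltzmann constant [J/K]
    dsigma_back = 6.0e-32         # approx. backscatter cross-section [m^2/sr]
    number_density = pressure_pa / (k_B * temperature_k)  # [m^-3]
    return number_density * dsigma_back                   # [m^-1 sr^-1]

beta = molecular_backscatter_532(101325.0, 288.15)  # sea-level standard atmosphere
print(f"beta_m(532 nm) ~ {beta:.2e} m^-1 sr^-1")
```

At sea level this comes out on the order of 1e-6 per meter per steradian, which is the scale of the pure-Rayleigh calibration signal mentioned above.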
Up here is just the outer reach of the atmosphere. So the signal travels a very long distance, and because of that, the signal is very weak. You can see what they sometimes do to overcome that: they average. This is a 20 minute average of the signal. Now here we see some plots as the LiDAR was flying across Africa and roughly the Mediterranean region, and what you see is altitude versus distance. This is the 532 nanometer signal showing cirrus clouds up to 15 kilometers, smoke down here in the bottom three kilometers, and desert dust. So how do you think the LiDAR people were able to determine that this is dust, this is cirrus, and this is smoke? How do you tell the difference? Well, you look at the cross-polarized signal that's shown in the second panel, the perpendicular backscatter. You see that you're getting lots of cross-polarized light from this stuff, but not from this over here. Therefore this is something that could be ice or could be desert sand, but given that we are over the Saharan desert, it's not likely to be ice, so this is desert sand. And over here, this is smoke, because we do not see it in the perpendicular scattering, so it is not desert sand. Here's the 1064 beam; it shows the same kind of thing. You can see there's less scattering in the 1064 channel than in the 532, and we can use the ratio of those two scattering measurements to tell us something about the size of the particles. That's what I was explaining to the gentleman who asked me about measuring PM 2.5 particles: it's not a perfect measurement, but it gives you an idea of the size of the particles. Okay, so this is a complicated argument; I'll try to be clear, but I also need to be quick. When we do this kind of LiDAR measurement, we have lots of unknowns, too many unknowns. The first thing is that the backscatter signal arises from both molecules and aerosols.
So the backscatter coefficient has a term from molecular scattering and a term from aerosol scattering, and the extinction, which means the absorption and scattering losses, comes from a molecular term and an aerosol term: absorption by molecules, scattering by molecules, absorption by aerosols, and scattering by aerosols. So far we have one, two, three, four, five, six unknowns; way too many unknowns to make sense of the measurement. So what we do is use Rayleigh scattering theory to give us the molecular coefficient up here and the molecular scattering coefficient here. That's just Rayleigh scattering theory: if we know the pressure and temperature of the air, we can calculate it. If we operate our LiDAR at a wavelength where the molecules are not absorbing, then this term goes to zero. So now we have three unknowns; we went from six to three. Then what we usually do is combine the aerosol absorption and the aerosol scattering losses into a single extinction term that we just call the extinction coefficient, and that leaves the final unknown as the backscatter coefficient for aerosols, which is what we're trying to measure with the LiDAR. So now we have two unknowns: the extinction coefficient and the backscatter coefficient. But we only have one measurement, so we still have too many unknowns. So what we do is combine these two into one parameter called the extinction-to-backscatter ratio, which is also called the LiDAR ratio: the extinction coefficient over the backscatter coefficient. We either assume it, or we rely on the many published papers that say, for these kinds of conditions, it's generally in this range of numbers. But this is a problem.
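To make that bookkeeping concrete, here is a tiny sketch of the closing assumption: once you assume a LiDAR ratio S = alpha_a / beta_a, one quantity determines the other, and the two-way transmittance follows. The profile values and the 50 steradian ratio are invented for illustration; published values vary widely with aerosol type.

```python
import numpy as np

LIDAR_RATIO_SR = 50.0  # assumed S = alpha_a / beta_a [sr]; illustrative value only

def two_way_transmittance(beta_aerosol, delta_r, lidar_ratio=LIDAR_RATIO_SR):
    """Given an aerosol backscatter profile and an assumed LiDAR ratio,
    compute alpha_a = S * beta_a and the two-way transmittance
    T^2(R) = exp(-2 * integral of alpha from 0 to R)."""
    alpha = lidar_ratio * np.asarray(beta_aerosol, dtype=float)  # extinction [m^-1]
    optical_depth = np.cumsum(alpha) * delta_r                   # crude rectangle integral
    return alpha, np.exp(-2.0 * optical_depth)

beta_a = np.array([2.0e-6, 2.0e-6, 1.0e-6, 0.2e-6])  # toy profile [m^-1 sr^-1]
alpha_a, T2 = two_way_transmittance(beta_a, delta_r=100.0)
print(alpha_a)
print(T2)
```

The point of the sketch is that everything here hinges on the assumed value of `LIDAR_RATIO_SR`, which is exactly the problem the next technique removes.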
So when we are trying to determine particle sizes or particle scattering coefficients, we have to either assume something or get another independent measurement. One of the more clever solutions to this problem is called high spectral resolution LiDAR. In this case, we solve the problem by separately measuring the molecular and the aerosol scattering. If you look at the spectrum of a backscattered LiDAR signal in frequency space, you see that it has a broad component and a narrow component. The narrow part is aerosol scattering, and the broad part is molecular scattering. Why is that the case? It's because the molecules are randomly moving back and forth much more rapidly than the larger aerosols. The larger aerosols have less motion, so they have a narrower Doppler-shifted frequency spectrum compared to the molecules. So if we can tune our laser to measure over here, we can measure the molecular scattering, and if we tune it here, we can measure the aerosol scattering. That's the idea behind high spectral resolution LiDAR. One way it's implemented is by measuring the center channel and then using something like an iodine vapor filter to block out the center channel and measure only the wings, where you get the molecular scattering. That's shown here; this is a paper from 1999, so it's not new. It's a very simple high spectral resolution LiDAR. The laser light goes out, the backscattering comes into the telescope and goes through a field-of-view-limiting iris, and then we split the beam into two different channels. Part of the light goes through an absorption cell to knock out the center, and part of it goes straight to a detector to get the total signal. By combining the two carefully, we can separately determine the molecular and aerosol scattering. Modern systems are a little more complex than that, and I'll show you an example in just a few slides.
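Here is a hedged sketch of the two-channel algebra behind that separation: a total channel plus a filtered channel that passes a known fraction of the broad molecular spectrum while blocking most of the narrow aerosol line. The transmission factors `f_mol` and `f_aer` are invented calibration constants for illustration, not values from the 1999 paper.

```python
def hsrl_separate(s_total, s_filtered, f_mol=0.7, f_aer=0.01):
    """Separate molecular and aerosol backscatter from two HSRL channels.

    Model (per range bin):
        s_total    =         s_aer +         s_mol
        s_filtered = f_aer * s_aer + f_mol * s_mol
    where f_mol is the filter's transmission of the broad molecular spectrum
    and f_aer is its small leakage of the narrow aerosol line.  Solving the
    2x2 linear system gives both components.
    """
    s_mol = (s_filtered - f_aer * s_total) / (f_mol - f_aer)
    s_aer = s_total - s_mol
    return s_mol, s_aer

# Toy check: 3 units of aerosol signal plus 1 unit of molecular signal.
s_mol, s_aer = hsrl_separate(s_total=4.0, s_filtered=0.7 * 1.0 + 0.01 * 3.0)
print(s_mol, s_aer)
```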
Okay, so we can also measure gases in the atmosphere. So far we've really just talked about measuring particles, cloud particles and aerosols. This is a technique called differential absorption LiDAR, or DIAL. In this case, we rely on a tunable laser that can be tuned onto an absorption line and off the absorption line. If these two wavelengths are very close to each other, then we can assume that all of the Rayleigh scattering is the same, and the only significant difference between the two signals is the absorption by the gas. So if we write the LiDAR equation for the on channel and the off channel, many parameters cancel out, and what we're left with in this ratio... I guess I didn't write the final equation, but we're left with the pieces of the equation that relate to the scattering coefficients and the extinction coefficients. I'm going to skip this slide because it's too complicated for the time that we have. The ratio of the on signal to the off signal essentially gives us this tau ratio, where tau is just a Beer's law kind of factor: e to the minus the path integral of the number density times the absorption cross section of the absorbing gas. This other term is the extinction, which we can separately calibrate out. So from the ratio of the two signals we get an equation that is essentially e to the minus two times the absorption optical depth of the target gas. If we know the cross section, which we learn from spectroscopy, then the only unknown in this equation is the number density, which means we can both detect the presence of a gas and measure its concentration; we can measure the range-dependent concentration of a target gas. One of the gases in the atmosphere that is most important for radiative balance is water vapor.
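The standard range-resolved retrieval that follows from that on/off ratio can be sketched like this; the cross section, bin size, and signal values are all invented for illustration.

```python
import numpy as np

def dial_number_density(p_on, p_off, delta_sigma, delta_r):
    """Range-resolved DIAL retrieval of absorber number density:

        n(R) = 1 / (2 * delta_sigma * delta_R)
               * ln[ (P_on(R) * P_off(R + dR)) / (P_on(R + dR) * P_off(R)) ]

    where delta_sigma is the on-line minus off-line absorption cross-section.
    Everything that the two channels share (beta, geometry, particle
    extinction) cancels in the double ratio.
    """
    p_on = np.asarray(p_on, dtype=float)
    p_off = np.asarray(p_off, dtype=float)
    ratio = (p_on[:-1] * p_off[1:]) / (p_on[1:] * p_off[:-1])
    return np.log(ratio) / (2.0 * delta_sigma * delta_r)

# Toy profiles: a uniform absorber makes the on-line channel decay faster.
delta_sigma = 1.0e-26  # differential cross-section [m^2], invented
delta_r = 100.0        # range bin [m]
n_true = 1.0e22        # absorber number density [m^-3], invented
bin_od = n_true * delta_sigma * delta_r          # one-way optical depth per bin
p_off = np.array([1.0, 0.9, 0.8])                # off-line signal, arbitrary shape
p_on = p_off * np.exp(-2.0 * bin_od * np.arange(3))  # extra two-way absorption
n_retrieved = dial_number_density(p_on, p_off, delta_sigma, delta_r)
print(n_retrieved)  # recovers ~1e22 in each bin
```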
And this is the layout for a DIAL LiDAR that was developed at my university recently for measuring water vapor. It uses diode lasers, which is very exciting, because now we can do atmospheric LiDAR with very small, low-cost lasers. Limited range, of course, but we can cover most of the boundary layer and much of the troposphere, maybe five or six kilometers, and that's where most of the water vapor is. The way it works is that we have diode lasers on the absorption line and off the absorption line. In the early prototypes, we tuned one laser back and forth between the two wavelengths, and then we decided to just operate one laser at each wavelength. They're controlled by a diode laser tuning controller and get transmitted one at a time through an optical amplifier and out the telescope. The backscattered light comes down through a beam splitter, and we do the same kind of trick where we have a detector for the near-field channel and a detector for the far-field channel; these are photon-counting detectors. By detecting at the two wavelengths that the diode lasers are operating on, we can generate a ratio, and that ratio tells us the water vapor profile. So here are examples of data over three days and nights, with height on the vertical axis and time on the horizontal. The top panel is the backscatter signal, a conventional backscatter LiDAR signal showing clouds and aerosols down below. The bottom panel is the actual water vapor density, in grams per cubic meter, derived from the on and off signals up here. You see that there's a lot of structure that we don't see in the scattering signal; the scattering signal carries no information about the water vapor, but in the water vapor signal, of course, we see a lot.
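The profile retrieval behind that bottom panel is the range-resolved version of the DIAL ratio: the derivative of the log of the off/on ratio gives the local number density. Here is a minimal synthetic sketch; the cross section, range grid, humidity profile, and backscatter shape are all invented for illustration, and real processing must deal with photon noise and averaging.

```python
import numpy as np

# Hypothetical parameters for a water-vapor DIAL sketch
delta_sigma = 3.0e-27                  # on minus off cross section, m^2
r = np.linspace(0.0, 6000.0, 61)       # range grid, m (100 m bins)
n_true = 3.0e23 * np.exp(-r / 2000.0)  # synthetic humidity profile, m^-3

# Forward model: the off channel carries some arbitrary backscatter profile;
# the on channel adds two-way absorption by the target gas.
tau = np.concatenate([[0.0], np.cumsum(
    0.5 * (n_true[1:] + n_true[:-1]) * np.diff(r))])  # trapezoid of n dR
backscatter = 1.0 / (1.0 + (r / 1000.0) ** 2)         # cancels in the ratio
n_off = backscatter
n_on = backscatter * np.exp(-2.0 * delta_sigma * tau)

# Retrieval: n(R) = (1 / (2 * delta_sigma)) * d/dR ln(N_off / N_on)
n_retrieved = np.gradient(np.log(n_off / n_on), r) / (2.0 * delta_sigma)
worst = np.max(np.abs(n_retrieved[1:-1] - n_true[1:-1]) / n_true[1:-1])
print(worst)  # small finite-difference error in the interior bins
```

Note how the backscatter profile drops out entirely; that is why the scattering image and the water vapor image in the data can look so different.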
So here's a low-level mass of water vapor that changes over time, and over here the air is much drier. This is dry; this is moist. This kind of information is really important for both short-term weather forecasting and for climate modeling. And then here's the layout of a recent system that combines the two techniques we just talked about: a high spectral resolution LiDAR and a DIAL. The blue part is a water vapor DIAL, basically the same one that I showed you, and the green part is a high spectral resolution LiDAR that is actually measuring temperature in the atmosphere. This is a really exciting development that lets us do thermodynamic profiling with a single LiDAR that has four channels, really four wavelengths: these two doing high spectral resolution LiDAR near the oxygen band, and these two out at about 830 nanometers for water vapor absorption. And this shows that we get very similar data when we operate this system next to a Raman LiDAR, which is the classic, accepted way of measuring water vapor. And what do I have here? I forgot. Oh, this is the aerosol backscatter. Because we're doing high spectral resolution LiDAR, we can obtain the backscatter coefficient without having to make an assumption. Remember the problem with too many unknowns? This removes that extra unknown and gives an automatically calibrated backscatter coefficient, so now we can measure PM 2.5 and all the other particles much more accurately. Okay, we're just about out of time, so I'm going to wrap up with just a couple of things. Another very popular way of measuring gases in the atmosphere is with Raman scattering. All the LiDAR I've talked about so far has used elastic scattering; this uses inelastic scattering, which means that the frequency of the scattered light is different from the frequency of the transmitted light. So for a given gas, there is a fixed Raman shift.
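Because the Raman shift is fixed in wavenumber for each gas, you can predict where the receiver channels must sit. A small sketch, using the standard Raman shifts of nitrogen (about 2331 cm^-1) and water vapor (about 3657 cm^-1) from spectroscopy tables:

```python
# Standard Stokes Raman shifts in wavenumbers (from spectroscopy tables)
RAMAN_SHIFT_CM = {"N2": 2331.0, "H2O": 3657.0}

def raman_wavelength_nm(transmit_nm, gas):
    """Stokes-shifted return wavelength for a given transmit wavelength."""
    nu_transmit = 1.0e7 / transmit_nm            # transmit wavenumber, cm^-1
    nu_shifted = nu_transmit - RAMAN_SHIFT_CM[gas]  # shifted down by fixed amount
    return 1.0e7 / nu_shifted

for gas in ("N2", "H2O"):
    print(gas, round(raman_wavelength_nm(355.0, gas), 1))
```

With a 355 nm transmitter, this lands near 387 nm for nitrogen and 408 nm for water vapor, which is exactly where the receiver channels sit in the Raman LiDAR layout described next.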
So you can design your receiver to measure that Raman-shifted light. Of course, the Raman scattering is very weak, so these systems tend to use very high-power lasers at high repetition rates. Just to give you an idea of how this is done, I show you here that from the telescope we split the signal into 355, 387, and 408 nanometer channels: 355 for aerosols, 387 for nitrogen, and 408 for water vapor. The 387 and 408 channels are both Raman channels; the nitrogen channel is used to calibrate, and the water vapor channel determines the water vapor. The 408 is the inelastic channel, and 355 is the transmit wavelength. We double all of this because there are long-range and short-range channels again. And this is just showing you an example that you can then derive the water vapor profile versus time with this method. Notice that it gets noisy during the daytime, because the scattered sunlight competes with the Raman-scattered signal. Okay. So once again we come to this idea of coherent LiDAR. We already saw this plot, where we talked about generating a local oscillator, mixing it with the backscattered light, and getting a difference, or beat, frequency on the detector. I'm going to end by showing you some examples of wind LiDAR. We had a question earlier about wind LiDAR, so I will take the time to show you this even though we're pretty much out of time; I think we're probably okay to take another minute or two. This is a wind LiDAR that I helped to build at NOAA many years ago, but it illustrates the point. In this case, we were operating out near 10 microns, but this can be done at many other wavelengths. We have two lasers, and they go through acousto-optic modulators. The purpose of the acousto-optic modulator is to impose a frequency shift on the transmitted light relative to the local oscillator. These beam splitters pick off a piece of the transmitted beam to serve as the local oscillator.
Then we raise the frequency of the transmitted light by 80 megahertz with these AOMs. We go through an electro-optic modulator, just for polarization control. In this case, these lasers operate with RF-discharge optical amplifiers; just think of each one as a gain stage, and the more times I pass through the gain stage, the more gain I get. So I go through one gain stage, then through another bunch of gain stages, and when we come back off of this one, we go through a quarter-wave plate. Going through a quarter-wave plate twice makes it a half wave, and the half-wave shift rotates the plane of polarization, which means we can now use polarization splitting to separate the transmitted light from the received light. We usually implement this with circular polarization; that's what this quarter-wave plate here is for. You send out one polarization and you get back the other. So the light goes out the telescope, and the backscattered light comes back with its polarization rotated, so it gets passed by this beam splitter and lands on the detector, mixed with the original local oscillator. The transmitted beam was shifted by 80 megahertz relative to that local oscillator by these AOMs, so the difference frequency of the mixture no longer sits at zero frequency; instead it sits at 80 megahertz and can be processed with RF electronics. And then these are plots of measurements of winds in these canyons. Here are mountains on one side of a valley, and we were sitting in the valley, scanning the winds around it. Again, we use the Doppler shift to determine the direction of the wind: these are winds blowing toward the LiDAR from the south, and these are winds blowing toward the LiDAR from the north. You can see the different patterns shift over time and with angle. This idea has been used to create much smaller, lower-cost systems.
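The arithmetic that turns the beat frequency into a wind speed is short enough to write out. This is a sketch under stated assumptions: a 10.6 micron CO2-laser wavelength (the talk only says "near 10 microns"), the 80 MHz AOM offset from the talk, and a sign convention where motion toward the LiDAR raises the beat frequency.

```python
AOM_OFFSET_HZ = 80.0e6   # offset imposed on the transmit beam by the AOMs
WAVELENGTH_M = 10.6e-6   # assumed CO2-laser wavelength ("near 10 microns")

def los_wind_from_beat(beat_hz):
    """Line-of-sight wind speed from the measured beat frequency.

    The detector sees f_AOM + f_Doppler, with f_Doppler = 2*v/lambda
    (convention here: positive v means motion toward the LiDAR).
    """
    f_doppler = beat_hz - AOM_OFFSET_HZ
    return f_doppler * WAVELENGTH_M / 2.0

# A 10 m/s wind toward the LiDAR shifts the beat ~1.9 MHz above 80 MHz
v_true = 10.0
beat = AOM_OFFSET_HZ + 2.0 * v_true / WAVELENGTH_M
print(round(los_wind_from_beat(beat), 6))
```

The offset is the whole point of the AOMs: without it, winds toward and away from the LiDAR would fold onto the same near-zero frequencies and be indistinguishable.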
This is a commercial Doppler LiDAR. The guy who helped me build that previous one actually started this company, Halo Photonics, and now he runs it. They put these out at airports, for example, doing constant monitoring of winds, looking for dangerous winds that could cause an airplane to crash. This one operates at 1550 nanometers, which is an eye-safe wavelength, so we just don't have to worry about it; it just runs all the time. That's all I have to say for now. I see some hands up, so let's go to answering your questions. Okay. Thank you so much, Professor Shaw. So far we have two questions from two participants. The first one is from Susanne. Susanne, I think you are already unmuted, so you can ask first. Thank you very much, and thank you for the very nice presentation. We have seen the transmittance of the atmosphere a few times in the equations, and I was wondering how you calculate it, because I suppose it is very difficult and complex to calculate, and maybe it includes absorption and scattering by the molecules. Right. One way, and I'm going to go back to a slide that I showed near the beginning, is by looking at the scattered light signal itself. You can see that the background transmittance is the only thing affecting the signal up here; down here, we need to know this underlying transmittance, but it's too complicated to sort out. So if you have a LiDAR that is powerful enough to see far away, where you have just clean air, you can back out the atmospheric transmittance from the signal in that region. That's one way: use the signal in a region where there are no clouds or aerosols. But if it's only a short-range LiDAR, then that becomes a problem; we can't really do it.
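The clean-air trick in that answer is essentially the classic slope method: in a homogeneous region with constant backscatter, the log of the range-corrected signal falls off linearly with range, and the slope gives the extinction (and hence the transmittance). A minimal sketch on synthetic data, with invented values for the extinction and backscatter:

```python
import numpy as np

alpha_true = 1.0e-4                 # assumed homogeneous extinction, m^-1
beta = 1.0e-6                       # constant clean-air backscatter (arbitrary)
r = np.linspace(3000.0, 8000.0, 50) # far-field "clean air" region, m

# Simplified elastic LiDAR equation: P(R) ~ beta / R^2 * exp(-2*alpha*R)
signal = beta / r**2 * np.exp(-2.0 * alpha_true * r)

# Slope method: ln(P * R^2) = ln(beta) - 2*alpha*R, linear in R
slope, _ = np.polyfit(r, np.log(signal * r**2), 1)
alpha_est = -slope / 2.0
print(f"{alpha_est:.2e}")
```

With the extinction in hand, the two-way transmittance to range R is just exp(-2 * alpha * R), which is the quantity the question asked about.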
And really, that's the problem I was talking about earlier; let me find it. That's what this problem is all about. The atmospheric transmittance is this extinction term, right? This is the absorption and this is the scattering. So that's exactly the problem: in the end, with a single-wavelength scattering measurement, we have two unknowns. We have the extinction, which is essentially the transmittance you're asking about, and we have the scattering coefficient that we want to measure. In that case, you have to get some other information. One way is to use a solar radiometer, looking at the sun, to determine the extinction along the path. That will not give you a range-dependent number, but it will give you a path-averaged extinction, and that can be used to bound the retrieval. The other solution, of course, is high spectral resolution LiDAR, where you essentially remove the transmittance problem by separately measuring aerosols and molecules. So yeah, really good question. Does that make sense? Thank you very much. Okay. The second participant, Hugo, is from Japan and wants to ask some questions. Hugo, you can unmute and ask, please. Thank you once again for this very informative talk. Atmospheric LiDAR is one of the main research themes I'm trying to investigate. As you mentioned, for the extinction we have too many unknowns, but the extinction, for example with a Raman scattering LiDAR, can be determined directly from the backscattering signal from the molecules, correct? But that requires a very high-power laser. Is there any reason that no one has done chirped pulse amplification for Raman LiDAR? Oh, that's an interesting question. I don't know the answer to that, but it's a very interesting idea. Maybe nobody's done it; I don't know.
With that kind of power density, it might be really difficult to do; I can't think of how to do it without burning up my components. That's the trade-off, right? The chirp is done with fairly low-power elements, and yet we need high power for the Raman signal. So yeah, that's the trade-off. I'm curious, Hugo, where are you in Japan? Right now, I am in Sapporo, in Hokkaido. Oh, wonderful. Are you at the university there? Yes, I'm actually on the research team of Dr. Tomohito Yamada, and he is very concerned with understanding the hydrological cycle and global climate modeling; for that, it's better to have smaller errors on the things that are known while trying to determine the things that are still unknown and very complex. Yeah, I spent part of my sabbatical at Hokkaido University about six years ago, and I didn't meet him, but I wish I had, because that would have been fun. So you should maybe email me; I would like to communicate with you some more. Japan is a favorite place of mine, so I'd love to come up and visit you sometime. Thank you very much, Dr. Shaw. Yoroshiku onegaishimasu. Okay, any other questions? Okay, I think this is the end; no more questions from participants. So, any final remarks from Joseph Nivela, please. Yeah, sure. Well, thanks very much, Joe. Really very well presented, even for people who aren't experts. I just want to say, I had some questions, but they're all dumb questions. Yeah, that's all right; you can send them to me by email. I had to move on fast, so I apologize. I was just interested in whether you actually put these things on airplanes or not, because I can imagine the uses. People do fly LiDARs on airplanes, and in my talk tomorrow we will see some airborne LiDAR work. Okay. All right. Well, anyway, thanks very much again. And tomorrow you're going to give your last lecture, on insect and fish LiDAR.
I think Montana has no shortage of insects. Or fish; very beautiful fish. And then also, John How will be speaking right after you, on photon-counting compressive LiDAR. I will try a little harder tomorrow to stay on time, so I don't use up John's time. Yeah, we're kind of a little bit loose here; you can't get too far off schedule with only two talks. For those who don't remember, John How is the ICL president, a good friend of mine and a wonderful guy, and I'm looking forward to hearing from him. Right, and a collaborator, I guess, right? You guys have published together. Yeah. So it's really great. Okay, good. So anyway, thanks very much again, and I look forward to seeing everybody back here at five PM.