Hello and welcome to lecture 2 of module 1. In the previous lecture, that is lecture 1, we learnt about electromagnetic waves and microwaves, and the properties of waves such as amplitude, frequency, phase and wavelength. We also briefly touched upon the superposition of waves: for identical waves whose amplitude and frequency are the same, we saw how waves interfere constructively as well as destructively. In remote sensing, electromagnetic waves are characterized by their wavelength location within the electromagnetic spectrum. We have seen the electromagnetic spectrum earlier, and by now we know that the portions of the spectrum used in remote sensing lie along a continuum, a continuum characterized by magnitude changes of many powers of 10; hence it is very common to use logarithmic plots to depict the electromagnetic spectrum. We have also seen that microwaves belong to the long-wavelength part of electromagnetic radiation. From basic physics, we know that waves obey the general equation you see on the screen, c = νλ, where c is the speed of light, ν is the frequency and λ is the wavelength. Wavelength and frequency are inversely related, isn't it? So, the higher the wavelength, the lower the frequency, and so on. Now, in lecture 1, if you remember, we briefly touched upon the idea of active and passive microwave remote sensing, as in what they are. In this lecture, let us try to understand it a bit further using radar, which stands for radio detection and ranging. You may have noticed a radar speed gun, which is used in law enforcement to measure the speed of moving vehicles; it can be handheld or mounted on a vehicle. Now, when a vehicle with a siren, for example an ambulance, passes by, you may have noticed a difference in the sound of the siren. This is due to the Doppler effect.
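To make the inverse relation between wavelength and frequency concrete, here is a small sketch in Python. The value of c is the standard physical constant; the two example wavelengths are my own illustrative choices, not values from the lecture slides:

```python
C = 3.0e8  # speed of light in vacuum, m/s

def frequency_hz(wavelength_m):
    """Frequency nu = c / lambda for an electromagnetic wave."""
    return C / wavelength_m

# A 5 cm microwave versus a 500 nm visible-light wave:
f_microwave = frequency_hz(0.05)    # ~6e9 Hz (a few GHz)
f_visible = frequency_hz(500e-9)    # ~6e14 Hz, five orders of magnitude higher
```

Doubling the wavelength halves the frequency, which is exactly the inverse relation stated above.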
Say you are a stationary observer standing on one side of the road, and the vehicle with its siren on is moving towards you. Then the siren sounds higher in pitch, isn't it? And as soon as the vehicle crosses you and moves ahead, the pitch of the siren drops. Of course, we are taking the example of sound waves here; nevertheless, the difference in the sound of the siren is due to something known as the Doppler effect. The Doppler effect is a change, or an alteration, in the observed frequency of a wave due to the motion of the source, the observer, or both; here we started with the example of a stationary observer and a moving source. Now let us come back to what is displayed on the screen, corresponding to a radar. With a radar gun, how it works is that there is a change in the frequency of the returned radar signal, caused by the Doppler effect, and this change is what is measured. Radio signals are sent out as a narrow beam through the transmitter: the transmitter sends out the signal, it hits the target and bounces back, and the receiver receives the same signal after it bounces off the target. Due to the Doppler effect, the frequency of the return signal will be different, and this difference is used to calculate the speed of moving objects. These examples are discussed for you to understand that in active remote sensing, there is always a transmitter and a receiver. The time taken between the transmission of a pulse and when the reflected signal reaches the receiver is measured by a radar and used to estimate the range, that is, the distance. For example, in the case of Doppler weather radars, the moving targets can be aircraft, a swarm of insects, or clouds.
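As a rough sketch of the radar-gun idea (with my own illustrative numbers, not figures from the lecture): for a target moving towards the radar, the two-way Doppler shift is Δf = 2·v·f₀/c, so the speed can be recovered from the measured shift:

```python
C = 3.0e8  # speed of light, m/s

def target_speed_mps(f_transmit_hz, doppler_shift_hz):
    """Radial speed of the target from the measured Doppler shift.
    The factor 2 accounts for the two-way (out-and-back) signal path."""
    return doppler_shift_hz * C / (2.0 * f_transmit_hz)

# Illustrative example: a 24 GHz speed gun measuring a 4.8 kHz shift
v = target_speed_mps(24e9, 4800.0)   # 30 m/s, i.e. 108 km/h
```

Note how tiny the shift is relative to the carrier (a few kHz against tens of GHz), which is why the receiver must compare the return against the transmitted frequency rather than measure it in isolation.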
So, we have an antenna; the transmitted pulse travels to the target, hits it, gets reflected from the target, and travels back to the antenna, where it is detected. From this we get to estimate the range, that is, the distance from the antenna to the target. When we start our module on active remote sensing, we will learn in detail how microwaves propagate in the atmosphere, and more details about Doppler weather radar data processing shall also be dealt with. But for now, these small examples are for you to understand the concept of a radar, which is part of active microwave remote sensing. Alright, now before we move on to passive sensors, let us quickly try to understand the energy of electromagnetic radiation. According to particle theory, electromagnetic radiation is composed of photons or quanta, and the energy of a quantum is given as E = hν, or E = hc/λ. Remember, microwaves come under the category of long-wavelength radiation: λ is high, so the energy is low. Hence, a larger area needs to be viewed to produce a detectable signal at the antenna; we will come to that shortly. So, coming on to passive sensors now, since we have already discussed active sensors and active remote sensing. An example of a downward-viewing spaceborne instrument that operates on passive microwave remote sensing is a radiometer. A radiometer is a passive sensor, and radiometers on board satellites are downward-viewing instruments that operate on passive microwave remote sensing. So, what does a radiometer typically see? It is displayed in the form of a small caricature here: this is supposed to be a satellite, and this is the sun. We have the rays of the sun striking the earth, and then we have rays moving towards the satellite.
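The low quantum energy of microwaves can be checked directly from E = hc/λ. A small sketch with the standard constants (the two wavelengths are illustrative choices of mine):

```python
H = 6.626e-34  # Planck constant, J*s
C = 3.0e8      # speed of light, m/s

def photon_energy_j(wavelength_m):
    """Energy of one quantum, E = h*c / lambda."""
    return H * C / wavelength_m

# A 5 cm microwave photon versus a 500 nm visible-light photon:
ratio = photon_energy_j(500e-9) / photon_energy_j(0.05)
# ratio = 0.05 / 500e-9 = 1e5: the microwave photon carries about
# 100,000 times less energy, which is why a larger area must be
# viewed to collect a detectable signal at the antenna.
```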
So, a downward-viewing spaceborne passive radiometer senses the upwelling electromagnetic energy in the microwave region that emanates from the surface and reaches the top of the atmosphere after something known as attenuation. Here, the absorption and scattering properties of the atmosphere and the background emissivity are highly important, because they vary with frequency as well as polarization. By now, I am assuming that you do understand what is meant by polarization. Shown here towards your right side is an example of a handheld radiometer. This does not mean that all radiometers are of this size; no, there exist radiometers which are so huge and heavy that they need to be mounted on trucks, that is, truck mounted. This is just one example. And as we discussed earlier, the microwave energy reaching the sensor can be emitted by the atmosphere, reflected from the surface, emitted from the surface, or transmitted from the subsurface. Passive microwave remote sensing has wide applications in meteorology, hydrology and oceanography. We will see a few of these examples in detail as part of the tutorials, but for now, I hope you understand the distinction between active sensors and passive sensors. Now, moving on, this is the right time to make a distinction between imaging radars and non-imaging radars. Shown here is an example of a non-imaging radar. Non-imaging microwave sensors can be altimeters or scatterometers; they are profiling instruments that measure in one direction, as opposed to the two-dimensional representation produced by imaging sensors. When you use a digital camera to capture a digital image, you are getting data in the form of a two-dimensional image, isn't it?
So, an altimeter or a scatterometer is a profiling instrument that measures in one direction, and shown here is an example of data from a non-imaging sensor. Whatever you see as red circles or green circles are the data collected by a non-imaging sensor, for example a radar altimeter. The frequency with which we get data is indicated here: green shows 1 Hz and red shows 20 Hz. Remember, this is sample data from the Jason satellite. Now, the fundamental principle of satellite altimetry is that it helps us to estimate water surface elevations from space, which means you get an estimate of the water level in, say, a river or reservoir from radar altimetry. A radar altimeter measures the distance from the satellite to the surface of the water: the sensors on board the satellite transmit microwave signals towards the earth, and these result in echoes received from the target surface. The two-way travel time taken by the signal is measured and then converted to a one-way distance, that is, the range between the satellite and the target surface, because we know the speed of light. Having seen how a non-imaging radar output looks, let us come to imaging radars. Shown here towards your right side is an image from ALOS PALSAR, just to make a clear distinction that in microwave remote sensing you can get data in both of these forms. We will learn how each of these data are captured and processed as part of this course, but for now, just note that we can have imaging sensors as well as non-imaging sensors. So, now let us try to understand what is different, as in what is special, in microwave remote sensing when we compare it with optical remote sensing.
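The altimetry computation just described can be sketched in a few lines. The orbit altitude and echo time below are illustrative numbers of mine (roughly Jason-class), not values from the lecture, and real altimetry also applies atmospheric and geophysical corrections that are omitted here:

```python
C = 3.0e8  # speed of light, m/s

def range_m(two_way_time_s):
    """One-way satellite-to-surface distance from the echo's round-trip time."""
    return C * two_way_time_s / 2.0

def water_level_m(sat_altitude_m, two_way_time_s):
    """Water surface elevation = satellite altitude minus the measured range
    (corrections for the atmosphere, tides, etc. are ignored in this sketch)."""
    return sat_altitude_m - range_m(two_way_time_s)

# A ~1336 km orbit and an echo returning after ~8.9 ms:
level = water_level_m(1.336e6, 8.9e-3)   # range = 1335 km, so level ~ 1000 m
```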
So, in optical remote sensing, the visible and infrared regions of the electromagnetic spectrum are used for sensing, whereas in microwave remote sensing, the microwave region of the electromagnetic spectrum is used. We have already seen that microwaves can penetrate through clouds, which offers day-and-night, all-weather sensing, whereas the visible and infrared regions get blocked by clouds, and hence all-weather sensing is not possible there. Another difference is that polarimetry and interferometric phase are very much applicable to microwave remote sensing, whereas the same is not applicable to optical remote sensing. Moving on, let us look at a few satellites that operate in the microwave region. We have the NASA-ISRO Synthetic Aperture Radar mission, abbreviated as NISAR, which is proposed to be launched in the year 2023. NISAR is a dedicated U.S. and Indian mission, in partnership with ISRO, for environmental monitoring and to study hazards. Also, we have RISAT-1, which stands for Radar Imaging Satellite; it was built by ISRO and uses a C-band synthetic aperture radar for earth observation. RISAT is aimed at disaster management and agricultural monitoring. Please remember that owing to the all-weather ability of synthetic aperture radar satellites, they also find application in military surveillance. Then we have ERS-1, which stands for European Remote Sensing Satellite; it carried a C-band SAR which was mainly used for environmental monitoring. Then we have Envisat, which is the successor mission of ERS, launched in the year 2002, and we have Sentinel-1, which is a constellation of two polar-orbiting satellites operating in the C-band. JERS-1 stands for Japanese Earth Resources Satellite; this carried an L-band SAR. And as you can see, we have ALOS-1 and ALOS-2, where ALOS stands for Advanced Land Observing Satellite.
RADARSAT is a Canadian remote sensing earth observation satellite program; the imaging frequency used was C-band. Next we have TerraSAR-X, a German SAR satellite mission which used an imaging frequency in the X-band, and we have TanDEM-X. The primary goal of TanDEM-X, flying together with TerraSAR-X, was to generate a highly precise digital elevation model, abbreviated as DEM. What a DEM is and how it is useful in this course will be explained in subsequent lectures, but for now I want you to be familiar with these satellite names as well as the bands and the different terminologies which are slowly introduced as part of the initial lectures. So, all well and good. We discussed passive sensors, active sensors, the difference between microwave remote sensing and optical remote sensing, and a few satellites that operate in the microwave region, but then the big question remains: what are the applications? Let us try to look at a few applications that are relevant for this course, starting with precipitation. Precipitation is a key variable that drives the atmosphere's general circulation through latent heat release, and accurately quantifying its spatial as well as temporal variability is essential for applications in many disciplines, one of which is water resources. What you see in front of you is data from a Doppler weather radar (DWR), which is used for measuring precipitation: shown towards your left side is a reflectivity image, and shown towards your right side is a velocity image. We shall be seeing details about DWR in future modules, hence I shall reserve my explanations about them for the upcoming modules, but for now I want you to visualize the different images obtained from active microwave remote sensing.
Now, measuring precipitation is also very much possible from space, because rainfall measurements from microwave sensors have a very long history, mainly as they have a better physical connection to precipitation processes when compared to visible and infrared sensors. The radars as well as radiometers on board satellites enable us to get a better idea about precipitation, and shown here is the global precipitation rate from IMERG, which stands for Integrated Multi-satellitE Retrievals for GPM. Now, coming on to a slightly different application, that is land subsidence: vertical displacement of the earth's surface, land subsidence, can also be measured using InSAR, interferometric synthetic aperture radar. Subsidence can be monitored with very high precision: one combines the information from two radar images acquired over a length of time and creates something known as fringes, which help to measure the amount of displacement. Shown in front of you are examples of images generated from the Sentinel-1 satellite, showing coherence towards the left and phase towards the right. I know that a few terms may sound new to you; no worries, they will be discussed as part of upcoming modules. I just want you to be familiar with the applications that are possible using active sensors as well as passive sensors. So, we discussed precipitation, and here we discussed land subsidence. Now, the ability of microwave remote sensing to capture soil moisture depends on the very sharp contrast in dielectric constant between dry soil and water at low frequencies, say 1 to 5 gigahertz. And the sensitivity of the dielectric constant to soil moisture increases with decreasing microwave frequency. Shown here in color is sample data of soil moisture over India from the SMOS mission, that is, the Soil Moisture and Ocean Salinity mission, which was launched in the year 2009.
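The fringe-to-displacement idea can be sketched with the basic repeat-pass InSAR relation (a simplified sketch; the only assumed input is the Sentinel-1 C-band wavelength of about 5.6 cm):

```python
import math

def los_displacement_m(phase_change_rad, wavelength_m):
    """Line-of-sight displacement from the interferometric phase change.
    Because the signal travels out and back, one full 2*pi fringe
    corresponds to half a wavelength of ground motion."""
    return phase_change_rad * wavelength_m / (4.0 * math.pi)

# One full fringe in a Sentinel-1 (C-band, lambda ~ 5.6 cm) interferogram:
d = los_displacement_m(2.0 * math.pi, 0.056)   # 0.028 m, i.e. 2.8 cm
```

This is why each colour cycle (fringe) in the phase image corresponds to a fixed increment of displacement, and counting fringes gives the total motion.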
Now, this is the first mission to provide L-band measurements. That was another example, measuring soil moisture. Now, coming to measuring water levels from space: as I mentioned earlier, we have already discussed radar altimetry, isn't it? Shown here in green as well as in red are the tracks, or paths, taken by the SARAL and Jason satellites over the Rana Pratap Sagar Dam; this is towards your left, and towards your right side you can see the comparison of SARAL water levels with respect to in-situ water levels. So, measuring water levels from space is possible using microwave remote sensing. Now, coming on to agricultural crops. Synthetic aperture radar images have an innate property known as texture. It is very similar to how you feel a cloth and understand its texture; in a similar manner, texture is an innate property that is more prominent in radar images, and it contains highly useful information about the structural arrangement of surfaces and their relationship to the surrounding environment. Studies indicate that a texture-based approach can be followed for the classification of SAR images. Alright? Shown here are microwave images as well as, for comparison, images captured in the optical region, just to mention that classification of agricultural crops is also very much possible using microwave remote sensing. Alright? So, moving further, let us see flood mapping. You know, accurate mapping of floods is highly relevant for emergency management, for post-disaster reconstruction, for flood prevention, etc. In this regard, synthetic aperture radar (SAR) imagery can provide valid backscattering measurements of inundated areas, that is, areas under water, through the cloud cover, because, as we mentioned earlier, microwaves have the ability to penetrate through clouds. Shown here are images for representation purposes only. So, moving on, let us try to understand digital elevation models, or DEMs.
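As a toy illustration of what "texture" means numerically (this is my own minimal example, not the actual texture measures used in the SAR classification studies mentioned), a sliding-window variance separates smooth patches from rough, speckled ones:

```python
import numpy as np

def local_variance(img, win=3):
    """Texture proxy: variance inside each win x win window (valid positions only).
    Homogeneous areas give low values; rough or speckled areas give high values."""
    h, w = img.shape
    out = np.empty((h - win + 1, w - win + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = img[i:i + win, j:j + win].var()
    return out

flat = np.full((5, 5), 10.0)                      # smooth, uniform patch
rough = np.arange(25, dtype=float).reshape(5, 5)  # patch with varying values
# local_variance(flat) is all zeros; local_variance(rough) is everywhere positive
```

Real SAR texture analysis uses richer statistics (for example, grey-level co-occurrence features), but the principle is the same: neighbouring-pixel variability carries information about surface structure.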
So, DEMs are digital representations of the terrain, and DEMs can be generated by many methods, one of which is interferometric SAR. Shown here is the CartoDEM produced from Cartosat-1 stereo images. Now, there exist various hydrologic as well as hydrodynamic models that are capable of simulating floods; these models rely on numerous inputs, and this is where microwave remote sensing is helpful. We will be learning more about these as part of tutorial 12, but for now, this section was to give you an overall summary of the different ways in which microwave remote sensing can assist hydrology and water resources engineering. So, to summarize: as part of this particular lecture, we tried to understand more about active sensors and passive sensors, we saw different satellites that use microwave remote sensing, and we briefly touched upon the different applications that are of interest in hydrology and water resources engineering. I hope that you found this lecture useful. Thank you.