Hello everyone, welcome to the next lecture of the course Remote Sensing: Principles and Applications. In the last lecture, we discussed the spectral characteristics of a remote sensing system. In this lecture, we are going to discuss the other two characteristics, that is, the temporal and radiometric characteristics of a remote sensing system. We are also going to look a little deeper into the data characteristics of whisk broom and push broom scanners. When I first introduced the characteristics of a remote sensing system, I told you about four important resolutions: spatial, spectral, temporal and radiometric. We covered spatial resolution and spectral resolution in detail. Now we are moving on to temporal resolution. Temporal resolution, in general, indicates how frequently we can get data over a particular region of interest. For some applications like weather forecasting or prediction of hurricanes, we may need data once every 30 minutes. Data here may mean images or any other form of remote sensing data; so images may be required once every 30 or 15 minutes. For certain applications like crop monitoring, we may need data once every week. For some other applications, such as urban sprawl monitoring or mapping of urban growth, we may require perhaps one good quality image every year covering the urban area, and so on. So based on our needs, the frequency with which we will need the data will vary. Similarly, every remote sensing system has an inbuilt characteristic of how frequently it can give us data, and that is known as temporal resolution. Temporal resolution is in general determined by the orbit in which the satellite is. For example, if the satellite is in a geostationary orbit, it will be constantly looking at one particular region of the globe and hence can provide an image once every 15 or 30 minutes, depending on the image collection characteristics.
So in general, satellites in geostationary orbits have the highest temporal resolution. Weather monitoring satellites are mostly launched into geostationary orbits, as they are required to look at the same region continuously. For satellites in near polar orbits, which revolve around the earth from north to south, the frequency with which we can get the data depends on the orbital height, that is, the height at which they are circling the earth, and also the swath width. We came across the term swath width earlier: it is the total extent in the across-track direction covered by the satellite. These two together define the temporal frequency with which we will get the data. This is given in this particular slide. Temporal resolution in general means how frequently we can get the data. For example, Landsat satellites provide data once every 16 days; these are in near polar orbits. Satellites in geostationary orbit, such as INSAT-3D, give data once every 30 minutes or so. Now we will look at this particular point: I told you that for satellites in near polar orbit, the temporal resolution is characterized by the orbital parameters and the FOV, that is, the swath width. To see how these define it, let us take two examples, Landsat and MODIS. Both of them are at pretty similar orbital heights, around 700 kilometers; say I take Landsat 8. Both are in similar orbits about 700 kilometers above the earth, but for Landsat 8, or for that matter the previous Landsats like 7 or 5, the swath width is roughly 185 kilometers. MODIS is at a similar orbital height, but it has a very wide scan angle: it can scan up to 55 degrees. So it has a wider FOV, and it can collect data with a swath of approximately 2300 kilometers. For Landsat, the swath width is just 185 kilometers.
Landsat's FOV, if I remember correctly, is only around 15 degrees. For MODIS the FOV is 55 degrees and its swath width is 2300 kilometers. So what will happen is this: when the satellites revolve, they go like this, say from north to south, and come around like this. The next orbit may be somewhere here. As the satellite revolves, the earth is rotating from west to east underneath it, so when the satellite finishes one orbit, the earth would have moved a certain distance, and the satellite will come in over a different region. Like this, it starts covering different parts of the globe. That is shown here: this may be orbit number 1, orbit number 2 may be here, orbit number 3 may be here, and so on. There will be a certain ground distance between two successive orbits, of the order of a few thousand kilometers at the equator. But the swath is just 185 kilometers for Landsat. So this is just 185, and this is 185. You can see from this schematic example that there exists a large gap between two orbital paths relative to the swath width covered. On the other hand, for MODIS the orbit will look more or less the same, but its swath is of the order of 2300 kilometers. So this may be the swath width, and for the second orbit it can overlap with it. Essentially, due to its very wide swath, MODIS will be able to cover the land mass, or whatever feature is on the earth's surface, almost up to its next orbit. Say one orbit goes like this; let us imagine, just for explanation's sake, two satellites going in parallel at a certain distance from each other.
Now, if this sensor can image all the area here and this sensor can image all the area there without any gap, then the two combined can cover the whole region between them, which means they will provide gapless data between them, right? Similarly, you can imagine that when MODIS goes around the earth over multiple orbits, due to its wider swath it can cover a large area of the globe in one go, and hence it can cover the entire globe every one day, or every two days at maximum. So the temporal resolution, or temporal frequency, of MODIS is 1 to 2 days: we will get at least one image once every 2 days in the case of MODIS because of its wide swath. On the other hand, for Landsat, due to its very narrow swath width, we will not get images with that high a frequency; we will get an image only when the satellite overpasses the area. Only when the satellite goes over your region of interest will it acquire an image, because of its narrow FOV and smaller swath. But for MODIS, even as it flies along, due to its wide scan angle it can scan regions that are far off on either side of it and cover the large region underneath. Landsat cannot scan like that: Landsat's scan is restricted to a very small angle, and its swath width is just 185 kilometers. So based on the orbital height and the FOV, the angle of scan, the temporal resolution of systems will vary. Based on our applications, we may have to combine data from several satellites. For example, Landsat provides data only once every 16 days, so if we need such data more frequently, we may combine Landsat with Sentinel-2, which provides data once every 5 or 10 days depending on whether you are using one satellite or two. So we can combine multiple satellites in order to get frequent data. Combining data from multiple satellites has its own issues, but that is one way.
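The swath-versus-revisit argument above can be sketched numerically. This is a back-of-the-envelope estimate, not an orbital-mechanics model: it assumes a ~99 minute orbital period for a ~700 km orbit and compares the swath width to the equatorial spacing between successive ground tracks.

```python
import math

EQUATOR_KM = 40075          # Earth's equatorial circumference, approx.
ORBIT_PERIOD_MIN = 99       # rough period for a ~700 km orbit

# ~14.5 orbits per day, so successive ground tracks at the equator
# are separated by roughly 2750 km
orbits_per_day = 24 * 60 / ORBIT_PERIOD_MIN
track_spacing_km = EQUATOR_KM / orbits_per_day

def days_to_cover_equator(swath_km):
    """Days until daily swaths tile the gap between adjacent tracks."""
    return math.ceil(track_spacing_km / swath_km)

print(days_to_cover_equator(2330))  # MODIS-like swath: about 2 days
print(days_to_cover_equator(185))   # Landsat-like swath: about 15 days
```

The result matches the lecture's numbers to first order: a MODIS-like swath closes the inter-track gap in about 2 days, while a 185 km swath needs on the order of two weeks, close to Landsat's 16-day repeat cycle.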
But in general, what I wanted to say is that temporal resolution means how frequently we can get data from a system. The temporal resolution of geostationary satellites is pretty high, of the order of a few minutes, 30 minutes, 1 hour and so on; they have the highest temporal frequency. For satellites in near polar orbit, the temporal resolution is fixed by the orbital characteristics, especially the orbital height and the path in which the satellite goes, plus the scan angle, the FOV. The next topic that we are going to cover is radiometric resolution. We already got introduced to this concept when we discussed how digital images are formed: I told you that the incoming energy is sampled and quantized, where quantized means converted to an integer number based on some scaling factor. The range of integers used depends on how much quantization we give to the system. Some systems have 8-bit quantization, so the DNs can vary between 0 and 255. Some systems have 10-bit quantization, so the DNs can vary between 0 and 1023, and so on. So the number of quantization bits we provide for every band of data, for every pixel, determines how many different gray levels we can get. Maybe first we will see an example, and then we will come back to the theoretical concepts related to radiometric resolution. In the figure given in this slide, we have the same image at different quantization levels; that is, for representing the incoming signal in each pixel, here we use 8 bits of memory, so it can take 2 to the power 8 values, 0 to 255. Here it is 6 bits, so 2 to the power 6; here it is 1 bit, so 2 to the power 1, just 2 levels; and so on. So each image is an image of the same area represented with a different quantization level.
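The slide's comparison can be mimicked by requantizing an 8-bit image down to fewer bits. A minimal sketch, where the tiny sample array is made-up data standing in for a real image:

```python
import numpy as np

def requantize(img8, bits):
    """Map 8-bit DNs (0..255) onto 2**bits gray levels (0..2**bits - 1)."""
    levels = 2 ** bits
    # widen to uint16 so the intermediate product does not overflow
    return (img8.astype(np.uint16) * levels // 256).astype(np.uint8)

img = np.array([[0, 40, 100], [150, 200, 255]], dtype=np.uint8)
print(requantize(img, 8))  # full 0..255 range preserved
print(requantize(img, 1))  # collapses to just two levels, 0 and 1
```

At 1 bit, the six distinct input DNs collapse into only two output values, which is exactly why the 1-bit image in the slide shows almost no detail.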
Here we can see the number of different gray shades. Normally, all digital images are nothing but different gray shades; they contain just different gray shades. The number of different gray shades we can see in the 8-bit quantization image is pretty high: we can see certain features, this bright spot here, two tank-like structures, a lot of tiny dots, and so on. As the number of quantization bits reduces, the amount of information we can get from the image goes down drastically. The number of gray levels that our eyes can see reduces a lot; here, actually, it is just black and white, only 2 levels: either it is 0, no data, or it is 1. Here we have 4 different levels of data, like this. So the number of gray shades contained in an image is determined by the number of quantization bits we use. The higher the number of quantization bits, the more gray levels we can see, so finer changes in gray level show up pretty clearly in the image. I repeat: all images basically contain different shades of gray; they are gray tone, black and white images. Then how do we get a color image? We get it by mixing gray shade images from different bands. We know the 3 primary colors: blue, green and red. Each will have its own gray shade image; let us say all are quantized at the 8-bit level, 0 to 255 in each band. Based on the object's reflectance in these 3 bands, say ρ_blue, ρ_green and ρ_red, the DN values will vary: the higher the reflectance, the higher the DN. So when we produce a color image on our display systems, we assign the DNs in the blue band to the blue display channel.
The display system will render those as different shades of blue; similarly, the green band is displayed in different shades of green, and the red band in different shades of red, and all of these combine to give us a color image. This is a true color image, or natural color image, because we are using blue to blue, green to green and red to red in the display system. In remote sensing it is also possible to form images with what is known as false color. In remote sensing we can take images in different bands: NIR band, SWIR band, thermal band and so on. Thermal we can leave aside; let us talk about the reflectance bands, NIR, SWIR and so on. We cannot see NIR directly, so that image will be in different shades of gray, but we can assign some color to it, say different shades of red; then the data acquired in the red band we can display in different shades of green, and so on. This is what is called a false color composite. So basically, all remote sensing images essentially contain different shades of gray, and color images are produced by how we combine and display them; maybe we will look a little deeper into this in one of the later classes. But how many shades of gray we are using is going to determine the amount of detail contained within an image, how much fine detail we are able to see, okay. So this is generic information about the number of quantization bits. But this alone does not tell us everything about radiometric resolution. If you look at the concept of resolution, I told you it is about whether we are able to clearly distinguish, or clearly resolve, two features. Spatial resolution means: are we able to resolve two nearby features spatially? Spectral resolution means: are we able to resolve two features spectrally, by looking at narrow absorption bands and so on?
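The band-to-channel assignment described above can be sketched by stacking single-band gray-shade arrays into display channels. The 2x2 band arrays here are hypothetical DNs, not real sensor data:

```python
import numpy as np

# Hypothetical single-band gray-shade images (DNs 0..255)
blue  = np.array([[10, 20], [30, 40]], dtype=np.uint8)
green = np.array([[50, 60], [70, 80]], dtype=np.uint8)
red   = np.array([[90, 95], [99, 99]], dtype=np.uint8)
nir   = np.array([[200, 210], [220, 230]], dtype=np.uint8)

# True (natural) color: red band -> red channel, green -> green,
# blue -> blue; channel order here is R, G, B
true_color = np.dstack([red, green, blue])

# Standard false color composite: NIR shown as red, red as green,
# green as blue, so NIR-bright targets like vegetation appear red
false_color = np.dstack([nir, red, green])

print(true_color.shape, false_color.shape)  # both (2, 2, 3)
```

The only difference between the two composites is which gray-shade image feeds which display channel, which is exactly the point made in the lecture.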
Radiometric resolution means: are we able to resolve two features using the radiance values that came in? Take, for example, a tar-coated road surrounded by a pavement with a cement cover on top of it. They will have different shades of gray: one will be black, one will be a little grayish in color. Just by looking at the reflectance, by collecting data, are we able to identify them as two different features? That is what radiometric resolution is: just by looking at the recorded radiance values, are we able to distinguish two features? One important aspect of radiometric resolution is the number of bits we use. Based on the number of quantization bits, we will have many different gray levels in an image; as the number of quantization bits increases, we can see the different features present more clearly. That is one aspect. But another important aspect is how accurate our sensor is in collecting the data. All sensors will have a certain level of noise produced within the system; no system is perfect. The system itself generates some level of noise, which causes some spurious data to be created and stored within the system itself. Some noise will always be there. Let us say some signal is coming from the ground, from two different features. I will take an example: let us say one number is 102 and the other is 100. These represent the spectral radiance of two different features. The difference between them is just 2 watts per meter square per micrometer per steradian, a very small difference. Let us say the system has a noise level of 3 watts per meter square per micrometer per steradian. This is just a rough example; these may not be the real numbers or real values, I am using them just to drive home the concept.
So the system itself has noise which is larger than the difference between these two features. What will happen is that there is always a high chance that these two numbers will be recorded as one and the same feature; that is, we will not be able to distinguish them. Now let us say the signal is the same, but the noise of the system is just 0.5. In this case, the noise of the system is quite low in comparison to the difference in radiance between these two features, and hence even when the noise gets added, whether in the positive or the negative direction, there is a high chance that we will record the radiance from these two features as two different objects, and based on the gray levels we use, they may get different DNs altogether. So the DN and the accuracy of the system, that is, how much noise it produces or how fine a difference between two signals it can detect, combined together determine the radiometric resolution of a sensor. One important concept to note is the noise equivalent delta L, NEΔL, the noise equivalent differential radiance. I said all systems will have a noise level, not necessarily constant, maybe varying: at different radiances, the noise level will vary. The NEΔL expresses what the difference in radiance between two features should be so that the signal it creates inside the system is larger than the noise produced within the system. I repeat: let us say two features are there, having some difference in radiance between them, like in the example we just saw, where one feature has 102 and another has 100, so the difference in radiance between them is 2. The system, due to the noise within it, will produce its own radiance.
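The 102-versus-100 example can be simulated. This sketch adds Gaussian sensor noise to each feature's radiance and counts how often the brighter feature still reads brighter; the noise sigmas 3.0 and 0.5 mirror the illustrative numbers in the lecture, and Gaussian noise is an assumption for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def fraction_separable(L1, L2, noise_sigma, trials=10_000):
    """Fraction of trials in which the noisy reading of the brighter
    feature (L2) still exceeds the noisy reading of the darker one (L1)."""
    a = L1 + rng.normal(0, noise_sigma, trials)
    b = L2 + rng.normal(0, noise_sigma, trials)
    return float(np.mean(b > a))

# noise (3.0) larger than the radiance difference (2): often confused
print(fraction_separable(100, 102, 3.0))
# noise (0.5) well below the difference: almost always separable
print(fraction_separable(100, 102, 0.5))
```

With noise of 3, the two features swap order in a substantial fraction of trials, so they cannot be reliably told apart; with noise of 0.5, they are separated in essentially every trial, which is the lecture's point about the NEΔL.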
If the difference between them is more than the noise within the system, then these two will be identified as two different objects. But if the difference between them is lower than the noise within the system, they will be treated as one single object, because they have very similar radiance values. So if the noise of the system is very low, that is, if its NEΔL is pretty low, then objects with smaller differences in radiance can be imaged as two different features. If the NEΔL, the noise in the system, is high, then objects or features with smaller differences in radiance may not be imaged as two different features; they may be imaged as one single feature. This depends purely on the accuracy of the system, that is, the noise content of the system, how accurately the system collects the data. Now let us combine this with the number of quantization bits we use. Let us get back to the example: radiance values of 102 and 100 from two features, and a noise of 0.5 watts per meter square per micrometer per steradian. These two will be detected by the sensor as two different features; that is fine. But let us say I am going to use just 2-bit quantization: 2 to the power 2, 4 levels, 0, 1, 2, 3, that is all I am going to use for my DN levels. If I use such a low number of gray levels, such a low number of quantization bits, there is a high chance both of them will get the same DN value. That is, the sensor has detected them as two different features because the noise of the system is pretty low, fine; but when the quantization happens, because of the very low number of quantization bits given to the system, these two may still get the same DN and be saved in the image as one feature, because having the same DN means we will see them as just one and the same feature. I will just go back to this example.
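This quantization effect can be sketched directly: even when the detector separates 100 from 102, a coarse quantizer can assign both the same DN. The linear scaling from an assumed radiance range of [0, 120] to the available DN levels is a made-up calibration, chosen just to illustrate the idea:

```python
def to_dn(radiance, bits, l_min=0.0, l_max=120.0):
    """Linearly quantize a radiance onto 2**bits DN levels.
    The [l_min, l_max] range is an illustrative assumption."""
    levels = 2 ** bits
    dn = int((radiance - l_min) / (l_max - l_min) * levels)
    return min(dn, levels - 1)  # clamp the top of the range

# 2-bit quantization: both radiances land in the same DN bin
print(to_dn(100, 2), to_dn(102, 2))   # 3 3 -> seen as one feature
# 8-bit quantization: the two radiances get distinct DNs
print(to_dn(100, 8), to_dn(102, 8))   # distinct -> seen as two features
```

So the detector's low noise made the two radiances distinguishable, but the 2-bit quantizer threw that distinction away, exactly as described above.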
Now here, just look at whatever portion is white: everything appears white, with one particular DN value, but this block actually contains a lot of fine, distinct features, right? Everything got stored here as one DN value because of the very low level of quantization we have used. So essentially, the number of quantization bits works in combination with the accuracy of the system, and together they finally decide our ability to distinguish different features by looking at the DN values. There are two parts: one is the accuracy of the detector elements themselves, and second comes the quantization. Both of them combined give us the radiometric resolution, that is, how accurately we will be able to identify two different features. Similar to the NEΔL concept for radiance, in thermal remote sensing we have the concept of NEΔT: how different the temperatures of two objects should be in order for them to produce a radiance difference that is higher than the noise inside the system itself. This is the noise equivalent delta T, the noise equivalent differential temperature. So these relate to the accuracy of data collection, while the number of quantization bits tells us how fine, or how many, gray levels we will use. If we combine them, the NEΔL or NEΔT determines the accuracy with which the data is observed by the system, and the number of quantization bits determines the precision with which we record the data. Even if the accuracy is high, as in the same example I keep going back to, the same data, when expressed at lower numbers of quantization bits, yields less information from the image: here we are decreasing the precision of the data.
So these two combined, the accuracy of data collection, the NEΔL or NEΔT, together with the number of quantization bits, determine the radiometric resolution of the system: how accurately we are able to distinguish two features radiometrically, that is, by looking at the radiance values themselves. Again going to the same example: here you can distinguish two or three tank-like structures in the 8-bit and 6-bit quantization images, but you are not able to distinguish them in, say, the 4-bit, 2-bit and 1-bit quantization images; they are not visible. This affects our ability to distinguish features. So till now we have covered all 4 important characteristics of a remote sensing system: spatial, spectral, temporal and radiometric resolution. But depending on the way the data is collected, the data may have certain distortions or certain characteristics. By distortions I will mostly be talking about geometric distortions: how accurately the position of each point on the ground is imaged. Based on the way we collect the image, whether with a scanner, a push broom sensor or a normal photograph, the geometric accuracy of the ground points will vary. Now we will look at some characteristics of the data collected by these remote sensing sensors. First, we are going to talk about the characteristics of data collected by a whisk broom scanner. The first thing that will happen is that an image collected by a whisk broom scanner will undergo what is known as tangential scale distortion and resolution cell size variation. These two distortions are inherent in the data collected by whisk broom scanners. What are these things? What is tangential scale distortion?
I told you that when whisk broom scanning happens, the satellite moves like this while the sensor, the scanner, scans like this in the across-track direction. I also told you that at every given constant interval of time, delta t, the data coming in from the scanner is sampled and quantized; that also we know. So the sampling time delta t is fixed: collect data once every, say, 6 microseconds or 3 milliseconds depending on the scanner type. Let us say a sample is collected every 6 microseconds. The sensor may start from here; it moves continuously like this, continuously collecting data from the ground, but that data is sampled every 6 microseconds, producing that many pixels on the ground. That concept we know: it is what we came to know as GSI or GSD, the ground-projected sample interval or ground-projected sample distance. Now, just think: the time between two samples is fixed, and the angular distance moved by the scanner within a given time interval is also fixed; that is, it sweeps a small angle delta theta for every delta t of time. That is given in this particular slide; maybe you can have a look at it. For every time interval delta t, the scanner moves through an angle delta theta, and from this point it keeps on moving. This delta theta is fixed for a given delta t: let us say the scanner starts from here; after time delta t it would have come here, having covered an angle delta theta; after another delta t it would have come here, again covering the same angle delta theta. So the angular distance covered and the time taken between samples are both fixed. But now let us assume the ground is flat.
Now the scanner is collecting data and the ground is flat. Consider the distance between samples on the ground: every delta t seconds a sample is collected, and the angular distance delta theta between two samples is fixed, but because the ground is flat, the ground distance covered within each delta t becomes larger and larger as the scan angle increases. The ground distance between two samples keeps on increasing. That is one thing: the GSI keeps increasing as the scanner moves away from nadir. At nadir the GSI is basically what is fixed for the system, but as the scanner moves away from nadir and scans at an angle, then for the same sample time delta t, the ground distance, the GSI, keeps varying; it keeps increasing as the scanner moves away from nadir. Similar to the GSI, the GIFOV also undergoes distortion. I told you what the GIFOV is: it is the projection of a single detector element on the ground. If a single detector element is looking at nadir, that is fine: if it is a circle, it appears on the ground as an enlarged circle; if it is a square, it appears as an enlarged square. But let us say this detector element is now looking at an angle away from nadir. When you project it onto a flat horizontal surface, the circular detector element will be projected on the ground as an ellipse because of the larger scan angle. So two things happen: first, the GIFOV increases as the scan angle increases; similarly, the GSI also increases as the scan angle increases. That is, a circular GIFOV element gets stretched as it moves away from nadir. This is the IFOV, okay.
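The scan-angle dependence described above can be sketched with the standard flat-ground geometry: at scan angle theta off nadir, the cross-track GIFOV grows roughly as H·IFOV/cos²θ and the along-track GIFOV as H·IFOV/cosθ, with the cross-track GSI growing like the cross-track GIFOV. The numbers here (H = 700 km, IFOV = 0.1 mrad) are illustrative, not the parameters of any particular sensor:

```python
import math

H_KM = 700.0        # assumed orbital height
IFOV_RAD = 1e-4     # assumed 0.1 mrad instantaneous field of view

def gifov_km(theta_deg):
    """(cross-track, along-track) ground-projected IFOV at a scan angle,
    assuming a flat ground below the sensor."""
    c = math.cos(math.radians(theta_deg))
    return H_KM * IFOV_RAD / c**2, H_KM * IFOV_RAD / c

print(gifov_km(0))    # at nadir: ~0.07 km in both directions
print(gifov_km(55))   # MODIS-like edge of scan: noticeably elongated
```

At a 55 degree scan angle the cross-track footprint is roughly 3 times the nadir value and the along-track footprint roughly 1.7 times, so the circular nadir cell becomes the stretched ellipse described in the lecture.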
The IFOV is fixed for a system, but as the scan angle moves away from nadir, the same IFOV angle covers a larger ground area; this is the enlargement of the GIFOV. Similarly, I also told you that the sampling interval delta t is fixed, and for the same sampling interval the sensor covers larger ground distances between two samples at larger scan angles. These two concepts, the change in GIFOV and the change in GSI, are important characteristics of a whisk broom scanner. We will see these concepts in a little more detail in the next lecture. So, as a summary, in this lecture we have seen the concepts of temporal resolution and radiometric resolution, and we also started discussing the data characteristics of the whisk broom scanner. Thank you very much.