This is the Lesson 1 instructor lecture: Working with Remotely Sensed Data. In this first lesson, we will focus on the following objectives. First of all, you should have a clear understanding of the basic principles of remote sensing. This includes electromagnetic radiation, the electromagnetic spectrum, and the meaning of irradiance and radiance: irradiance is the light coming from the sun illuminating the earth, and radiance is the light reflecting off of it that gets captured by a sensor. You need to have an understanding of the atmospheric absorption and scattering processes that happen as light is transmitted through the earth's atmosphere. And once again, you need a clear understanding of ground irradiance, that is, the solar irradiance, the light coming from the sun bouncing off the ground, versus sensor radiance, which is the light that arrives at the satellite or aerial sensor and is captured as remotely sensed data. You also need a clear understanding of reflectance curves, that is, how the different types of land cover features reflect light. Then you need to be very clear about the two types of remote sensing: passive remote sensing and active remote sensing. We will focus on passive remote sensing, particularly multispectral imagery from the Landsat satellites. You should also be aware of National Agriculture Imagery Program, or NAIP, imagery, which is aerial imagery typically on the order of two feet to about one meter resolution on the ground, whereas Landsat is moderate resolution and captures imagery at 30 meters spatial resolution. We will also look at some active remote sensing sensors like LiDAR. LiDAR has its own energy source; it beams energy down onto the earth and makes measurements of the returns. Similarly, radar is used on both aerial and satellite platforms. 
Both of these comprise active remote sensing, because they have to have their own power sources to illuminate the study area with electromagnetic energy, catch the reflections, and make sense of what kind of surface is being imaged. We will also be exploring publicly available remotely sensed data sources, including USGS EarthExplorer, USGS GloVis, the LandsatLook viewer, Google Earth Engine, and Amazon Web Services, which is the engine behind the Esri Living Atlas. And we will look at visualization of remotely sensed data, particularly multispectral data; at what unsupervised classification is, and you folks will be doing a lab activity with unsupervised classification; and at how remotely sensed data is displayed and visualized on a computer monitor. So let's take a look at the field of remote sensing and how it connects with the other fundamental sciences and with geospatial science and technology in general. We can see that remote sensing is rooted in the physical sciences and is based on physical measurement, but it also has a great overlap with GIS, with geographic information systems and geographic information science, in that GIS is basically computer science where every data element has a location on the earth, a latitude and longitude, and a time associated with it. Remote sensing is also related to cartography, because remotely sensed data can be converted to thematic maps, which can then be used for GIS modeling or for decision making. We can also see that remote sensing has an overlap with the social sciences, in that when you're looking at remotely sensed imagery you're not just looking at a physical landscape but at a socio-economic landscape as well. And it has an overlap with the biological sciences, in that remote sensing's physical measurements can be used to quantify biological systems like forests, agriculture, and vegetation on the ground. 
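Before we get into the wave concepts in the next part of the lecture, here is a minimal numeric sketch of the relation between wavelength and frequency, c = f × λ, that we will rely on. The function name and the endpoint wavelengths are just illustrative choices, the rough limits of the visible band:

```python
# The wave relation: speed of light c = frequency f * wavelength lambda,
# so f = c / lambda. Shorter wavelength means higher frequency.
C = 3.0e8  # speed of light in meters per second (approximate)

def frequency_hz(wavelength_m):
    """Frequency in hertz for a given wavelength in meters."""
    return C / wavelength_m

# Rough endpoints of the visible band: 0.4 um (violet/blue), 0.7 um (red).
f_blue = frequency_hz(0.4e-6)
f_red = frequency_hz(0.7e-6)
print(f"0.4 um -> {f_blue:.2e} Hz")  # about 7.5e14 Hz
print(f"0.7 um -> {f_red:.2e} Hz")   # about 4.3e14 Hz
```

Note how the shorter (blue) wavelength comes out at the higher frequency, which is exactly the inverse relationship described below.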
So here is an overview of the electromagnetic spectrum. You can see that the electromagnetic spectrum ranges from long wavelengths, on the order of 100 meters or longer, which would be radio waves, then FM waves, on the order of meters, then microwaves, then infrared as we come to shorter wavelengths, and then we come to this region over here which is visible light to us. Our biological detectors are coupled to electromagnetic radiation only in this particular region, which ranges from a wavelength of 400 nanometers, or 0.4 micrometers, to about 700 nanometers, or 0.7 micrometers, and which is what we deem visible light. Most of the remote sensing we will look at will be in the visible region and will include the infrared region as well, and then when we look at a little bit of radar remote sensing, we will look at microwaves too. Here are some fundamental concepts about waves. It turns out that electromagnetic radiation is a wave, very much like a wave on the surface of water or a wave propagating through a string. We can see from this diagram over here that one complete wavelength is one complete cycle: the distance from a peak to a peak, or from a minimum, a trough, to a trough. We can also see that if the wavelength decreases, the frequency increases; the wave oscillates more frequently when the wavelength, given by the Greek letter lambda, is shorter, as opposed to when the wavelength is longer and the frequency is lower. So wavelength and frequency are inversely related to each other: when one increases, the other decreases. Also, all electromagnetic waves travel at the speed of light, given by the symbol c, which is equal to 3 × 10⁸ meters per second, or 186,000 miles per second, which is very, very fast, in vacuum and in air. The frequency is defined as the number of cycles per second, where one cycle per second is one hertz; that's the 
unit of frequency. Wavelength is given by the Greek letter lambda and is the length of one wave cycle, so it has units of length. It turns out that for all waves in general, and therefore for electromagnetic waves as well, the speed of the wave is equal to its frequency times its wavelength. This is a very fundamental relationship that relates, once again, the speed of the electromagnetic wave c with its frequency and its wavelength; given any two, we can solve for the third quantity. So once again, frequency and wavelength are inversely proportional to each other, such that as one increases the other decreases, and vice versa. As an electromagnetic wave moves through space or air, it has two mutually perpendicular components: an electric field component and, at 90 degrees to it, a magnetic field component, and both of these components are oscillating. Once again, a reminder that electromagnetic waves in the visible part of the spectrum have a wavelength range of 0.4 micrometers, which is violet or blue, to 0.7 micrometers, which is where we have red, and beyond that we have the near infrared and the infrared region. So once again, an emphasis on basic definitions. Light incident from the sun is known as irradiance, such that you have a solar irradiance at the top of the atmosphere; the light comes through the atmosphere and hits the ground, and then you have the surface irradiance. Then you have reflection off of the surface, which is known as surface radiance, or reflectance, and this light moves back through the atmosphere and reaches the sensor. The light reaching the sensor is known as the sensor radiance and, in the case of a satellite, is also known as the top-of-atmosphere reflectance: the reflectance from the earth that arrives at the top of the atmosphere, where the satellite camera captures the information. So this diagram highlights the remote sensing process and 
also highlights the need for atmospheric correction of remotely sensed data. We have the solar irradiance, the light coming from the sun that is incident on the top of the atmosphere, known as the top-of-atmosphere irradiance. Then the light works its way through the atmosphere, and several absorption and scattering processes occur to it, and then we have the surface irradiance, the light that falls on the surface after it has traveled through the atmosphere. The reflected light is known as radiance, so you have the surface radiance, or surface reflection, which once again works its way back up through the atmosphere to the sensor. As the surface radiance, or surface reflectance, works its way up through the atmosphere, more information is lost due to absorption and scattering processes, and the information that is captured is the sensor radiance, or top-of-atmosphere reflectance; both of these terms mean the same thing. So it turns out that the surface reflectance, or surface radiance, is what is needed for the process of remote sensing, such that the reflective surfaces can be characterized or identified properly. Therefore atmospheric correction is necessary to add back the information that was lost from the surface radiance as it made its way through the atmosphere to the camera. If that information is added back in to the sensor radiance, we recover the surface radiance, or surface reflection, and that is the desired product for land cover mapping and for identifying the surfaces being imaged. So let's focus in on solar irradiance on earth and the separate phenomenon of thermal emission from the earth. In this graph we can see the light that is incident on the earth, and if we just focus in on the region of visible light, we can see the bell-shaped curve with the visible region right in the middle of it, and this is the light that is 
arriving onto the earth. If you look at the graph on the right, that is the earth cooling off: the earth still has geologic activity and lots of heat within it, in the very cold environment of space, so it gives off thermal radiation that peaks at a wavelength of about 9.7 micrometers. This thermal emission phenomenon from the earth is independent of the sunlight falling on it, and for remote sensing purposes we are interested in making measurements of the radiance, the reflection that occurs when the solar irradiance is incident upon the earth. So this is a recap of the ideas we have discussed up to this point, once again very fundamental and very important: the idea of sensor radiance versus surface radiance, or surface reflection. Sensor radiance is also known as top-of-atmosphere radiance, and that is what is captured by the camera. The whole idea is to recover the surface radiance from the sensor radiance by accounting for the atmospheric effects that occurred as the light traveled back from the surface of the earth to the sensor. So please do bear in mind that the surface reflectance product is ideally the most appropriate for land cover mapping, if you have it available. As light moves through the atmosphere, it goes through several scattering processes. The most significant of these is Rayleigh scattering, which happens off of gas molecules, and as we will come to see, Rayleigh scattering depends inversely on the fourth power of the wavelength; in other words, the shorter the wavelength, the more the Rayleigh scattering, and therefore blue light scatters more than the other colors in the visible spectrum. Mie scattering is scattering that happens off of smoke and dust, and then non-selective scattering is what happens off of water droplets. So why is the sky blue? 
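Before answering, the inverse-fourth-power dependence just described can be made concrete with a short calculation. The two wavelengths below are just representative blue and red values, not from any particular sensor:

```python
# Rayleigh scattering strength varies as 1 / wavelength**4, so shorter
# (bluer) wavelengths scatter much more strongly than longer (redder) ones.
def relative_rayleigh(wavelength_um):
    """Relative Rayleigh scattering strength (arbitrary units)."""
    return wavelength_um ** -4

blue = relative_rayleigh(0.45)  # representative blue wavelength, in um
red = relative_rayleigh(0.65)   # representative red wavelength, in um
print(f"blue/red scattering ratio: {blue / red:.1f}")  # about 4.4
```

So blue light is scattered roughly four times more strongly than red light over this small wavelength difference, which is the non-linear effect discussed next.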
It is blue because of Rayleigh scattering. Once again, Rayleigh scattering is proportional to the inverse fourth power of the wavelength. What that means is that as the wavelength decreases, the Rayleigh scattering increases in a hurry: the smaller wavelengths are scattered non-linearly, to a much greater extent, so among the visible wavelengths, violet and blue are scattered way more than the longer visible wavelengths. This preferential scattering of the blue part of the electromagnetic spectrum is what makes the sky appear blue to us. Therefore the blue band is not very good for land remote sensing, because you end up getting a little fuzziness in it due to the Rayleigh scattering, but it is important for coastline and shallow-water remote sensing. The most significant atmospheric process is atmospheric absorption, which occurs as the light moves through the atmosphere. This absorption occurs due to the different gases, the different molecular species, that exist in the atmosphere. From these graphs, which show absorption, where one means complete absorption and zero means no absorption, we can see that the main absorbers in the atmosphere are H2O, which is water; carbon dioxide, and notice that carbon dioxide absorbs very strongly in the infrared part of the spectrum, right at the end; oxygen and ozone, which also absorb radiation; and nitrous oxide, which absorbs radiation as well. The sum total effect is that you have regions of complete absorption: notice that at less than 0.3 micrometers all the light is absorbed, so the atmosphere is opaque to us there, whereas from 0.3 to roughly 0.7 micrometers you can see that we have a region of transparency, a window where we can look through the atmosphere. It makes sense to deploy remote sensing sensors in these open window regions only, and then we end up having other successive regions where there is great absorption, and these 
are the regions where we do not put sensors for remotely sensed data; sensors go only where you have windows. To highlight atmospheric absorption and its impact on the solar irradiance onto the earth, on this graph you can see the solar radiation falling on the top of the atmosphere, which is like a skewed bell-shaped curve, and then the light that arrives at the surface of the earth, the surface irradiance. You can see that substantial absorption has taken place in the light as it moved through the atmosphere and hit the earth. And once again, as the light continues back from the earth after reflection, after becoming radiance, and arrives at the sensor, you have more of this atmospheric absorption taking place, and we have to remove these atmospheric absorption effects so that we can recover the surface reflection, or surface radiance, from the sensor radiance, or top-of-atmosphere radiance, that arrived at the camera. So let's focus in on the idea of surface reflection, the surface radiance, and take a look at this graph that gives you the percent reflectance versus the different wavelengths of light reflecting off of the earth. Here we are looking at surface reflectance, or surface radiance, the light reflecting off of the earth, and we can see the major land cover classes that are visible to the human eye on satellite imagery mapped over here. What is significant is this dark curve for grass, which ends up being similar for all vegetation: low in the red, a peak in the green, which means vegetation is reflecting the green and absorbing the red and the blue, and then highly reflective in the near infrared. So a peak in the green and high reflectivity in the near infrared will end up being a sign of vegetation. The high reflectivity in the near infrared comes from the internal cell structure of healthy leaves, which scatters the near-infrared energy back out rather than absorbing it; the plant does not need that energy, and if vegetation were to absorb all of that heat it would shrivel away, so nature has designed healthy vegetation to be highly reflective in the near infrared region. Notice that there are other absorption features as we get to longer wavelengths, but the behavior of the reflectance for vegetation from 400 nanometers to about 800 nanometers is significant enough for applications in mapping, as we will come to see. Also note that the reflectance of water is very low in the infrared, so one way to identify water is by its very low reflectivity in the infrared part of the spectrum, which is about 700 to 900 nanometers. And then we can see urban, rocks and soils, which would be concrete, sand, asphalt, and so forth; they have somewhat equal reflectance across the blue, green, red, and near infrared, and that is a distinguishing spectral characteristic of urban, rocks and soils. So once again, recapping: the three major land cover classes would be vegetation, which can be broken up into forests or agriculture pretty easily; water, which has a very low reflectivity in the infrared, whereas vegetation has a very high reflectivity in the infrared; and urban, rocks and soils, which have a similar spectral signature in that they are roughly equal across blue, green, red, and near infrared. So once again, let's focus back in on the reflectance curve and zoom in on the region of greatest interest to us insofar as imaging by the Landsat satellites is concerned. Here we are looking at this reflectance curve that goes from 0.4 micrometers, which is 400 nanometers, to about 0.9 micrometers, which is 900 nanometers. You can see that the near infrared region begins around 0.7 micrometers, and once again, in recap: your blue band is roughly between 0.4 and 0.5 micrometers, the green band is between 0.5 and 0.6 micrometers, and the red band is between 0.6 and 0.7 micrometers. So it 
turns out that the major land cover classes are pretty readily discernible in moderate-resolution Landsat multispectral imagery. These are vegetation, which will typically have a peak in the green and a high infrared shoulder, a high near-infrared reflectivity; and the healthier and denser the vegetation, the greater the reflectance in the near infrared, such that forests might typically have a greater near-infrared reflection than grasslands or agriculture. Water has low to zero reflectivity in the near infrared, such that if you look at water in the near infrared Landsat bands it will typically be very dark, very low reflectivity. And urban, rocks and soils are roughly constant across blue, green, red, and near infrared. Here's a reminder of the wavelength ranges for blue, green, red, and near infrared, and it is left as an exercise to the students to correlate these to the Landsat bands, especially for Landsat 5, Landsat 7, and Landsat 8. Now, this basic reflectance curve is a very important tool for remote sensing, and it allows for the interpretation of multispectral imagery: if you are displaying a multispectral image on a computer monitor, knowledge of this reflectance curve will help you interpret the image and see what kinds of features are being highlighted by the assignment of particular color guns to particular bands, and we are going to get into this idea on the next slide. So as we begin to get into multispectral image interpretation and display, this reflectance curve is once again a reminder. You can see the three major land cover classes: water being very low in the near infrared; urban, rocks and soils roughly constant across blue, green, red, and near infrared; and the classic vegetation reflectance curve over here, with the peak in the green and then the large peak in the near infrared. So I want you to consider the two questions posed over here, and hopefully you can come up with the right answers. So 
once again, this is a reminder that a multispectral image basically consists of a stack of bands of imagery, each band corresponding to a particular wavelength range. Let's say band one is blue, which is the case for Landsat 5 and Landsat 7; then each pixel has a digital number that shows the brightness of the blue reflectivity being received by the camera. Since Landsat is an eight-bit image, it basically means that if no signal, no energy, is coming in, then the sensor is going to register black and give you a digital number of zero, and if the sensor is receiving the maximum possible energy it can measure, then it will give you a digital number of 255, which is absolute white, and all the other shades of gray in between are a measure of the reflectivity of blue light arriving at the sensor. For an eight-bit image, this range is chopped up into two raised to the eighth power, which is 256 values, ranging from zero to 255. Similarly, if we had a twelve-bit image, then this entire region, from no signal, completely black, to the sensor being saturated, completely white, would be broken up into two raised to the twelfth power, which is 4,096 finer segments, and the image would be said to have greater radiometric resolution. So having covered the fundamental ideas, let's focus in on how a multispectral image is displayed on a computer monitor and how we can interpret the image. The basic idea over here is that the primary, the fundamental, colors are red, green, and blue, and by combining different intensities of red, green, and blue you can create any other color. So you have just three color guns available to you on a computer, whereas in multispectral imagery you may have 
multiple bands, ranging from four to seven, eight, or more bands, up to 36 on MODIS and up to 220, let's say, on a hyperspectral image. So you can only assign three color guns to three bands; that is all you can do. You are therefore assigning color guns to reflectivity arriving at particular wavelengths, and in order to be able to interpret the image, you need to know the connection between the reflectance curve and the particular wavelength that is being imaged in a particular band. We will do exercises in this lesson where these ideas of band-to-color-gun assignments and interpreting the resulting image will become clear to you as you work through the activities. So always know the sensor band characteristics before working with that sensor; this is very important. In other words, band one may not be blue in all the sensors available to you. For example, the earlier satellites of the French SPOT program did not have a blue band at all, and their first band was green instead of blue. Bottom line: always check the sensor characteristics, what wavelengths are assigned to the particular bands, and know your reflectance characteristics well, such that you can then interpret the imagery being displayed on the computer monitor. So once again we have this graphic that brings together the reflectance curves of the major land cover classes and the bands in which the Landsat Thematic Mapper, which was deployed on Landsat 5, does its mapping. You can see the regions where the blue band does the imaging, then green, red, near infrared, a middle infrared, and yet another middle infrared band, and all of these bands are placed where we have atmospheric windows available for the transmission of light through the atmosphere. So you should be familiar with the idea of natural and false color composites. You have natural, or true, color when the red color gun is assigned to the red band, the green 
color gun is assigned to the green band, and the blue color gun is assigned to the blue band; then you see the colors in the imagery as you would in true or natural color. But if you start switching the color gun assignments to different bands, you get false color composites. So please watch the following short NASA video to review the idea of natural and false color composite viewing on a computer monitor. In this course we will begin with a primary focus on the Landsat satellite constellation. Landsat is the world's oldest moderate-resolution satellite program with continuously available, radiometrically and geometrically calibrated multispectral imagery. This is very important: a lot of trouble goes into making Landsat imagery radiometrically pristine, such that you can very effectively compare imagery of different vintages, including imagery taken a long time ago. Global moderate-resolution imagery has been available since 1972; the early Landsats had an 18-day revisit cycle, which very quickly became the 16-day cycle that worked better, and from the Thematic Mapper instruments onward the imagery is 30-meter. Landsat data archives became available free of charge in 2008, and understanding the downloading, processing, and display of Landsat data serves as a very useful introduction to working with remotely sensed data in general. The principles you're going to learn with Landsat imagery apply to other sensors as well; it is just that the other sensor characteristics will be different, and you will need to be aware of them, of the different wavelength bands being imaged, before you start working with a sensor. But the reflectance characteristics of land cover features always remain the same. Students will be expected to have a good working knowledge of the Landsat missions and their sensor characteristics, so please note the similarities and differences between the Landsat 5 Thematic Mapper, the Landsat 7 Enhanced Thematic Mapper Plus (ETM+), and the Landsat 8 OLI, or 
Operational Land Imager, sensors. This is a very good graph that plots the spatial resolution versus the wavelengths of the different bands for Landsat imagery and for SPOT imagery, the second-oldest and second-best-known environmental mapping satellite program, launched by the French. This is a very good diagram by which you can very quickly look up sensor characteristics. You can see that the initial Landsats had a 120-meter thermal infrared band, which then became a 60-meter thermal infrared band in the ETM+ mission, and so forth. You can see that for Landsat 1 through 5 the Multispectral Scanner bands are pretty much the same, that in the Thematic Mapper series of sensors on Landsat 4 and 5 the spatial resolution was 30 meters, and you can see the different bands for ETM+ on Landsat 7 as well. So this is a very good reference chart, and you should be able to read it and make sense of it. And it turns out, folks, that Landsat imagery and other publicly available remotely sensed data are available at the sites I have listed over here. Please sign up and get an account on the USGS EarthExplorer website and on USGS GloVis, the Global Visualization Viewer portal, and also please request trusted tester access to Google Earth Engine, as you were supposed to do in Lesson 0, by clicking the sign-up icon on the top left of the Google Earth Engine main page. It is preferred that you sign up for Google Earth Engine with a Gmail address. The Lesson 1 lab activity will involve unsupervised classification in ArcGIS Pro. This is an activity you should have done earlier in Geography 480, in which case this will be somewhat of a review for you; if you have not done it earlier, you can catch on pretty quickly. We will do unsupervised classification to begin with, using Landsat multispectral imagery in ArcGIS Pro, and we will be using the classification tools 
available under the Imagery tab, as shown in the graphic below, extensively in this course. For the first activity we will just begin with the Classify part over here, but later on we will be getting into the segmentation tools, the Training Samples Manager, and labeling objects for deep learning in successive exercises. So this lecture covered the basic ideas that you need to be familiar with in Lesson 1, and if you have any questions or comments, please post them on the Lesson 1 discussion forum. Thank you.
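As a preview of the lab, here is a minimal sketch of what unsupervised classification does under the hood: a simple k-means loop that groups pixels by spectral similarity with no training data. The band values, cluster count, and function names are all illustrative assumptions, not the ArcGIS Pro implementation:

```python
import numpy as np

# Synthetic surface reflectance for three cover types, bands ordered
# (green, red, NIR). Values are illustrative only: vegetation has a green
# peak and a high NIR shoulder, water is dark in the NIR, urban is flat.
rng = np.random.default_rng(42)
vegetation = rng.normal([0.10, 0.05, 0.50], 0.02, (100, 3))
water = rng.normal([0.06, 0.04, 0.02], 0.01, (100, 3))
urban = rng.normal([0.25, 0.25, 0.25], 0.02, (100, 3))
pixels = np.vstack([vegetation, water, urban])

def kmeans(data, k, iters=20):
    """Cluster rows of `data` into k spectral classes (plain Lloyd's loop)."""
    # Simple deterministic initialization: k evenly spaced rows.
    centers = data[np.linspace(0, len(data) - 1, k).astype(int)].copy()
    labels = np.zeros(len(data), dtype=int)
    for _ in range(iters):
        # Assign each pixel to its nearest center (Euclidean distance).
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return labels, centers

labels, centers = kmeans(pixels, k=3)
print("cluster centers (green, red, NIR):")
print(np.round(centers, 2))
```

The recovered cluster centers land near the three synthetic spectral signatures, and the analyst's job, as in the lab, is then to attach land cover labels (vegetation, water, urban) to the unlabeled clusters using the reflectance curves discussed in this lecture.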