Let me begin today's class with a three-dimensional image that is a hologram, a 3D virtual image of an object. There are different types of holograms, and all of them work by recording the interference pattern of light waves caused by an object and then recreating that scene when illuminated. Now looking at this, I do have a question: can an interference pattern recreate the three-dimensional surface of the earth? Well, that is what we are going to find out as part of today's lecture. So welcome to the second lecture of module 6. Before we move ahead, a quick recap of what we learnt in the previous lectures. In the last class, we were trying to understand Young's double slit experiment indirectly by using the analogy of sound. We learnt that when two sources, which can be either two receivers or two transmitters, are placed very close to each other, the path length difference shall change as a function of the look angle. Path length is the path travelled by a wave, and the look angle is, simply put, the angle with which the radar is looking at the surface of the earth. We learnt that for a wave, a single measurement of phase is not of much use, is it not? But when we compare two or more phase measurements, they can provide invaluable information about the path length difference. Again, when two sources, each of which can be a receiver or a transmitter, are close to each other, the path length difference shall change as a function of the look angle, which is given by the following relationship: delta L equals d sin theta L, where theta L is the look angle, delta L is the path length difference and d is the separation between the two sources. Also, we learnt that the phase difference can be written using this relationship: delta phi, the phase difference, equals delta L, the path length difference, divided by lambda, the wavelength, into 2 pi.
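The two relationships from the recap can be sketched in a few lines of Python. This is a minimal illustration, not part of the lecture material; the numeric values in the example (a 10 m separation, a 30 degree look angle, a 3 cm wavelength) are assumptions chosen only to show the arithmetic.

```python
import math

def path_length_difference(d, theta_l):
    """Path length difference between two sources separated by d (metres),
    for a look angle theta_l in radians: delta_L = d * sin(theta_L)."""
    return d * math.sin(theta_l)

def phase_difference(delta_l, wavelength):
    """Phase difference in radians for a path length difference delta_l:
    delta_phi = (delta_L / lambda) * 2*pi."""
    return 2 * math.pi * delta_l / wavelength

# Illustrative values: d = 10 m, theta_L = 30 degrees, lambda = 3 cm (X band)
dL = path_length_difference(10.0, math.radians(30.0))   # 5.0 m
dphi = phase_difference(dL, 0.03)
```

Note that because the wavelength is so much smaller than the path length difference here, the phase difference runs through many full cycles, which is exactly why the wrapped phase discussed later in the lecture matters.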
Again, we shall get constructive interference when the phase difference is 0, 2 pi, 4 pi or any integer number n of 2 pi's. Now if we consider only those angles which give a peak in the interference pattern, we can rewrite the expression as theta L, that is the look angle, equals sin inverse of n lambda by d. Notice this is a maximum, a peak, which occurs if theta L, lambda and d, the separation between the two sources, satisfy this relationship. And remember, in the last lecture we also learnt what happens to the interference pattern when the frequency is changed and when the distance between the two sources is changed. So by now we have an understanding that the phase information of a wave can be used to compute a two-way distance; shown here is a wave to convey the meaning of phase. Phase difference is the difference in phase angle of two waves. By looking at the screen in front of you, you can understand that a complete cycle is defined by 2 pi, isn't it, 360 degrees. For a two-way distance the phase can be written as phi equals 2 pi by lambda into 2r, and the 2r, remember, is for the two-way distance. For a distance in one direction only, it is going to be phi equals 2 pi by lambda into r; for a two-way distance it is 2r because the wave has to travel the path twice. For passive remote sensing it is going to be r, isn't it? And for active remote sensing, like radar remote sensing, it is going to be 2r. So now we are clear on the fundamental aspects. Then let me ask you a question: is it possible to extract topographical information from synthetic aperture radar? Because in the beginning of the lecture I showed you a hologram, which gives you a three-dimensional virtual representation of the earth surface. Now I am asking you: if two single look complex images from synthetic aperture radar are captured over the same areal extent, can they be used in some way to generate topographical information, okay?
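The peak angles and the two-way phase can likewise be written out as a small sketch. The function names and the example wavelength and separation (3 cm and 12 cm) are mine, chosen only for illustration.

```python
import math

def peak_look_angles(wavelength, d):
    """Look angles (radians) giving constructive interference:
    theta_L = asin(n * lambda / d) for every integer n with |n*lambda/d| <= 1."""
    ratio = wavelength / d
    n_max = int(1.0 / ratio + 1e-12)
    # Clamp guards against tiny floating point overshoot beyond +/-1.
    return [math.asin(max(-1.0, min(1.0, n * ratio)))
            for n in range(-n_max, n_max + 1)]

def two_way_phase(r, wavelength):
    """Phase for an active (radar) measurement over range r:
    phi = (2*pi/lambda) * 2r, the factor 2r being the two-way travel."""
    return 2 * math.pi / wavelength * (2 * r)

angles = peak_look_angles(0.03, 0.12)   # lambda = 3 cm, d = 12 cm
```

With lambda/d = 0.25 there are peaks at n = -4 ... 4, nine in all, the middle one (n = 0) being the broadside direction at theta_L = 0.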
Elevation information. The answer is yes, absolutely yes, and we can obtain high resolution topographical information from synthetic aperture radar interferometry. Interferometric SAR is popularly abbreviated as InSAR, which will be part of today's lecture. So firstly, let us try to understand what exactly interferometry is; that is a new terminology, isn't it? Simply stated, interferometry is the technique by which the coherent properties of electromagnetic waves are utilized to measure earth surface changes, even to the order of centimetres; think about it, even to the order of centimetres. To measure the distance between two points travelled by a wave, we count the number of whole wavelengths between the two points, isn't it, and of course we need to know the fraction travelled by the wave as well. Say for a wave we know the phase but we do not know the number of full or whole wavelengths travelled along the path; then the measurement of phase alone is not useful, it does not tell us much. Let me reiterate: say we are considering an X band radar system operating at 3 centimetre wavelength; then the uncertainty in distance would have to be much less than 3 centimetres for the phase measurement to make sense, isn't it? Now, in an interferometric SAR or InSAR, the phase difference between two synthetic aperture radar images of the same area acquired at different times is used to create an interferogram, which in turn is used to infer ground surface displacements, or the range change, between the two time periods. Now we will try to understand this through the schematic in front of you.
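The ambiguity described above, that phase alone cannot tell us the number of whole wavelengths travelled, can be demonstrated numerically. This is a toy sketch of my own: two targets whose two-way paths differ by exactly one wavelength produce identical measured (wrapped) phases.

```python
import math

def wrapped_phase(r, wavelength):
    """Measured phase for two-way travel over range r, wrapped into
    [0, 2*pi): only the fractional wavelength count survives."""
    return (2 * math.pi / wavelength * 2 * r) % (2 * math.pi)

# Two ranges differing by lambda/2 = 1.5 cm: the two-way paths then differ
# by one full 3 cm wavelength, so the wrapped phases are indistinguishable.
p1 = wrapped_phase(100.0, 0.03)
p2 = wrapped_phase(100.015, 0.03)
```

The two phases agree even though the ranges differ, which is precisely why a single phase measurement cannot resolve distances beyond one wavelength of ambiguity.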
Towards the left side you see a single antenna SAR, wherein the return echo can be from anywhere on this circle; say we are trying to measure the return echo from this point over here. So it can be from anywhere on this circle. Towards your right side you see the schematic for interferometric SAR, InSAR, wherein two antennas are used, antenna 1 and antenna 2, and they are used to capture information from the same point, okay. So the return echo can only be from the intersection of the circle in red and the circle in blue. In interferometric SAR, the phase difference between two SAR images of the same area acquired at different times is used to create something known as an interferogram. And using an interferogram we are trying to infer the ground surface displacement between the two time periods, okay, fine. But then, why learn about interferometry in this course? Now, InSAR is ideal, as I mentioned before, to measure surface deformation; think about the amount of information that will be available to us using InSAR in the event of natural hazards like landslides or earthquakes. Of course there are other techniques, like the global positioning system, GPS, or spirit levelling etc., which can provide us with elevation information, but they are labour intensive and they cannot cover large areal extents, whereas InSAR imagery can precisely give you surface deformation information over larger areal extents, over a broader region of interest, alright. So what you see towards your right side is one product obtained from interferometric techniques which is indispensable, I would say, for hydrologic and hydrodynamic modelling. They are called DEMs, or digital elevation models.
So interferometric techniques result in the generation of digital elevation models, which are created at high vertical precision from space, making them invaluable for hydrologic and hydrodynamic modelling. What exactly digital elevation models are and how they are useful in hydrology will be covered as part of an upcoming lecture, but for now understand that the colour differences show the elevation difference, okay, elevation difference with respect to a reference surface. And what you see here is an InSAR image of Alaska, from a USGS poster, alright. Let us try to understand a bit more about InSAR, interferometric SAR, hoping that you remember the example which we discussed in the last class about sound waves interacting. I am going to use the same example here; assume the two sources sending coherent electromagnetic waves are two antennas in space, say antenna 1 and antenna 2. The baseline, or interferometric baseline, refers to the separation distance between the two sources, okay, between the two antennas. The two antennas are separated by a distance known as the baseline, and the interference pattern can represent the phase difference which is measured from different directions, is it not? And in the case of a synthetic aperture radar onboard an aircraft or a satellite, they are going to have a side looking geometry, so the interference pattern is also going to be generated accordingly, okay; what is shown here is just for representation purposes, so that in the same example you can now replace the sources of sound waves by antenna 1 and antenna 2, two sources, okay, moving on. So in this diagram you see the baseline distance, that is the separation distance between the two sources, antenna 1 and antenna 2. The figure shows the baseline component along the x axis as well as the component along the y axis, is it not?
So note that there are different means of denoting the separation distance between the two satellites: either we can use the separation distances along the x axis and along the y axis, or we can use the baseline distance and an angle alpha to denote the separation between the two antennas. The third way is to use the perpendicular and parallel components of the baseline, indicated in diagram C. Remember, right now we are speaking about the separation distance between the two antennas; there will also be a time difference between the overpasses, which is referred to as the temporal baseline. And this temporal baseline can be days, or it can be at a sub daily time scale, that is, the time the aircraft takes to fly over the scene again, the temporal baseline, okay. So the separation distance between the two sources, that is the two antennas, is referred to as the baseline distance, and the time difference between the overpasses is referred to as the temporal baseline, okay. So in today's lecture I am going to introduce you to a few new terminologies, because that is required for us to understand InSAR, interferometric SAR, alright. So moving forward, the different types of radar interferometry: broadly speaking, there are two main classes of interferometric radars, which are separated based on the geometric configuration of the baseline vector, and they are cross track interferometry and along track interferometry, okay.
In cross track interferometry the antennas are separated in the cross track direction, and in along track interferometry the antennas are separated in the along track direction, as seen in the schematic in front of you. Remember, the underlying principle of interferometry is that two or more synthetic aperture radar measurements are taken and combined to give more information than what was originally available, and these measurements can be taken either by a single antenna or by different antennas, okay. That brings us to our next terminology; I will give this some time to sink in. You can note that the separation in cross track interferometry is along the cross track direction and the separation in along track interferometry is along the along track direction. Now hold that thought: the measurements can also be made in a single pass, a repeat pass or multiple passes. Just to make things clear, single pass interferometry means the measurements are made using one single flight over the scene, measurements made using a single flight over a scene. Single pass interferometry is possible in two modes. Mode one is wherein a single platform carries two antennas, with transmission from one antenna and measurement of the return echoes by both antennas, mode one, okay. There can also be a second mode, mode two, wherein the transmission and receiving of signals alternate between the two antennas.
An example is TanDEM-X, which is a twin satellite of TerraSAR-X, a German Earth observation satellite; together they form a single pass SAR interferometry configuration, TanDEM-X, moving on. Now, for TanDEM-X there are different data acquisition modes; remember the different modes of SAR, similar to that there are different data acquisition modes for TanDEM-X. Again, these are easy to remember terminologies, but I want you to be familiar with them because they will be required throughout this course. The data acquisition modes for TanDEM-X are the bistatic acquisition mode, wherein the transmitter and receiver are separated by a distance, and the monostatic mode, where they are not separated by a distance, that is, a single device is responsible for both transmission and reception, okay. The schematic is shown here for the bistatic acquisition mode and for the monostatic acquisition mode. Monostatic means the transmitter and receiver are not separated by a distance; a single device is responsible for both transmission and reception, alright.
While we discussed single pass interferometry, please be aware that there can also be something known as repeat pass interferometry, repeat pass. See, having two instruments on board the same platform, the same satellite, can lead to weight restrictions or an increase in cost, because of which an alternative can be chosen in the form of repeat pass interferometry. An example I can give you is Biomass, which is an earth observing satellite planned for launch in the year 2023 by the European Space Agency; it shall use repeat pass interferometry. So the difference is that here the two measurements are made using different overpasses of the same instrument, different overpasses of the same instrument. Now, we have discussed single pass interferometry and repeat pass interferometry; there can also be multi pass interferometry, which is possible when multiple passes of the same instrument are used to collect information, multiple passes, okay. So now let us understand a little bit about cross track interferometry, for which a detailed geometry is shown here. Here, assume this is the ground surface, okay, and assume X is an elevated point on the ground surface. Here you can see two antennas, antenna 1 and antenna 2; their distances from the point X are represented as range 1, that is R1, and range 2, that is R2. The look angle is shown here, that is theta L. Say our aim is to estimate the height of the point X from a reference surface; remember, the reference surface can either be an average terrain height or it can be a datum like MSL, mean sea level, alright. So now we know what theta L is, what alpha is, what the baseline distance is, and what R1 and R2 are, isn't it?
Towards your right side are given different across track InSAR configurations, just for your understanding: we can have a single pass with two active antennas, a single pass with only one active antenna but two receiving antennas, a dual pass using a single antenna, and a multi pass using a single antenna. So these are just different across track InSAR configurations. We have just understood the geometry of cross track interferometry; assume that the surface is not changing across the temporal baseline. We know that theta L is the look angle, and R1 and R2 are the range distances from antenna 1 and antenna 2 to the point X, which is an elevated point on the ground surface. And we know that the phase difference measured between the two sources for a given look angle shall be related to the path length difference, which I am going to denote as delta R; this is going to be R2 minus R1 for single pass interferometry, but for repeat pass interferometry, wherein the two measurements are made using different overpasses of the same instrument, delta R can be represented as 2 into (R2 minus R1), okay. Again, I can write the phase difference as delta phi equals 2 pi delta R by lambda, and this in turn I can write as k multiplied by delta R, where k is nothing but the wave number, okay. Remember that we can also represent the path length difference in terms of the geometry as delta R equals the baseline distance B multiplied by sin of (theta L minus alpha), okay, moving on. Now remember that our aim is ultimately to use the signals of synthetic aperture radar image pixels, take their phase difference, and then estimate the look angle, that is the direction to that pixel; and once we know the look angle of a pixel, we can estimate the true ground range and height. Now, please remember that we are measuring the relative phase difference here, not the absolute phase difference.
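The phase relations just stated can be collected into a short sketch. The function names and example numbers are mine; the single pass versus repeat pass factor of two follows the lecture's definition of delta R.

```python
import math

def interferometric_phase(r1, r2, wavelength, repeat_pass=False):
    """Interferometric phase difference delta_phi = k * delta_R, with
    wave number k = 2*pi/lambda.  delta_R = R2 - R1 for single pass,
    and 2*(R2 - R1) for repeat pass (each antenna measures two-way)."""
    delta_r = (r2 - r1) * (2 if repeat_pass else 1)
    k = 2 * math.pi / wavelength
    return k * delta_r

def path_difference_from_geometry(b, alpha, theta_l):
    """Path length difference from the baseline geometry:
    delta_R = B * sin(theta_L - alpha)."""
    return b * math.sin(theta_l - alpha)
```

For example, with a 3 cm wavelength, a 1.5 cm range difference between the two antennas already amounts to half a cycle (pi radians) of single pass interferometric phase, which shows how sensitive the phase is to geometry.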
So the return echoes from the earth surface shall have a relative phase difference, which can be represented as an image called an interferogram, which you see on the screen in front of you. It looks like this: it is a colourful image which shows the phase value of each area on the ground surface. So instead of using variations of black and white, interferograms are coloured, wherein the lines of equal colour are called fringes, okay; and on an up slope the fringes shall be closer together, whereas on a down slope the fringes shall be farther apart, okay. Now, once we have obtained the interferogram, what next? The next step is to convert the phase differences into look angle and then to height above a reference surface, and this is performed so that the final product is in a standard geographical coordinate system. For this, the phase that a perfectly flat reference surface would produce is simulated; this is called the flat earth interferogram. So remember, we have a measured interferogram and a flat earth interferogram, and we subtract the latter to create a flat earth corrected interferogram, okay. Please remember that even now we do not know the absolute phase difference, okay, and to estimate the topography of the terrain the absolute phase difference is required; it is essential. Now, the fringes are in cycles of 2 pi: every time the phase reaches 2 pi, it is going to be reset to 0, isn't it? Let me reiterate: the fringes are in cycles of 2 pi, and every time the phase reaches 2 pi, or even goes a small amount above 2 pi, it is going to be reset. Such an interferogram is called wrapped, because whenever the phase reaches 2 pi it wraps again to 0, wrapped, okay. The concept of a wrapped image is shown here.
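The wrapping and the flat earth subtraction can be illustrated with a toy sketch. This is not a real InSAR processor, just my own minimal illustration of the two operations on lists of phase values.

```python
import math

def wrap(phase):
    """Wrap a phase value into [0, 2*pi): whenever the phase reaches
    2*pi it starts again from 0, as in a wrapped interferogram."""
    return phase % (2 * math.pi)

def flat_earth_correct(measured, flat_earth):
    """Subtract the simulated flat earth phase from the measured phase
    pixel by pixel and re-wrap, leaving the topography-related fringes."""
    return [wrap(m - f) for m, f in zip(measured, flat_earth)]
```

So a phase of exactly 2 pi wraps to 0, a phase of 7.0 radians wraps to 7.0 minus 2 pi, and a pixel whose measured phase equals its flat earth phase is corrected to 0.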
In other words, at any instant when the phase value exceeds 2 pi, the phase resets to 0. As only the relative phase difference is being measured, and not the absolute phase difference, we need a way to estimate the absolute phase difference, which means we need to do phase unwrapping, phase unwrapping, okay. In phase unwrapping we assume that the terrain is continuous, such that every time the phase is encountered at a value greater than 2 pi, even by a small increment delta, the phase shall be reassigned as 2 pi plus delta. And this process is carried out till the next cycle is reached, that is 4 pi; again, for any further increment delta, the phase will be reassigned as 4 pi plus delta, and so on. So you see the difference between the concept of an unwrapped image and the concept of a wrapped image, isn't it, okay. So you have understood the basic terminologies that are required for processing a single look complex image. These are the steps that you will be following as part of the tutorial, that is, for InSAR processing. We start with single look complex or SLC data, a raw unprocessed SAR image. We perform something known as co-registration; see, in InSAR there are the terminologies of master image and slave image, and these refer to the different images being used for co-registration, such that the master image always remains unchanged, okay. After co-registration, the next step is the generation of the interferogram; after that it is phase unwrapping, and finally it is to infer the pixel heights, okay, pixel heights. Let us try to quickly understand how to interpret an interferogram, because a few slides before I just showed you a colourful interferogram; we did not actually learn how to interpret it. So assume this figure represents how land surface displacement is represented, and assume each fringe, or each colour cycle, is say 18 millimetres of range change, okay.
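The continuity assumption behind phase unwrapping can be sketched in one dimension. Real interferogram unwrapping is a two-dimensional problem with its own algorithms; this is only a minimal one-dimensional illustration of the idea, assuming that jumps between neighbouring samples larger than pi are wrap artefacts to be undone by adding or subtracting whole 2 pi cycles.

```python
import math

def unwrap(wrapped):
    """Simple 1-D phase unwrapping.  Assuming the terrain (and hence the
    phase) is continuous, any jump between consecutive samples larger
    than pi is treated as a wrap, and a full 2*pi cycle is added or
    subtracted to keep the unwrapped profile smooth."""
    out = [wrapped[0]]
    offset = 0.0
    for prev, cur in zip(wrapped, wrapped[1:]):
        jump = cur - prev
        if jump > math.pi:
            offset -= 2 * math.pi
        elif jump < -math.pi:
            offset += 2 * math.pi
        out.append(cur + offset)
    return out
```

Wrapping the ramp 0, 2, 4, 6, 8 into [0, 2 pi) turns the last sample into about 1.72, and unwrapping recovers the original ramp.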
To interpret an interferogram, we have to count the number of fringes between two points in the interferogram, where one fringe is one complete cycle. For example, say you are focusing on this area within the circle: you count the number of fringes between two points. So from blue to blue is going to represent 18 millimetres, and two fringes are going to be 18 plus 18, that is 36 millimetres. Again, in a region with an up slope the fringes will be close together, they will appear squeezed together, and in a down slope region the fringes shall appear stretched apart, okay, stretched apart from one another, okay, interpreting an interferogram, okay. Let me leave you with two more terminologies for this lecture. You know, can we take two SAR complex images captured from two different flight paths and simply subtract the phase values of image one from the phase values of image two to produce an interferogram? Is that how it is done? You pick two SAR complex images and subtract the phase of image one from the phase of image two; will that give an interferogram? Then my question is: what happens in the areas of radar shadow? Remember, when we discussed image distortions in detail, we covered a term known as radar shadow, that is a silent zone, a silent zone wherein you get no measurement from the radar shadow region, isn't it? So we need some means to discriminate meaningful measurements, where the phase differences carry information about the geometry rather than noise, okay; that is, the phase differences should make sense, they should not be all noise. In radar interferometry we use an expression to estimate the coherence of an image, coherence of an image; I am introducing you to one more new terminology, that is the coherence of an image. Say we have two single look complex images, n is the number of pixels in a window, and let I1 and I2 be the SLC complex pixel values.
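The fringe counting rule is simple enough to state as one line of arithmetic. The 18 mm per fringe figure is the lecture's assumed example value, not a universal constant; in general one fringe corresponds to half a wavelength of range change for a repeat pass system.

```python
def displacement_from_fringes(n_fringes, range_change_per_fringe=0.018):
    """Range change (metres) implied by a fringe count, where each
    complete colour cycle is assumed to be 18 mm of range change,
    as in the lecture's example."""
    return n_fringes * range_change_per_fringe
```

So one blue-to-blue cycle gives 18 mm, and two fringes give 36 mm of range change between the two points.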
For estimating coherence, we assume that across a small window the surface properties shall remain constant, okay. This is the expression that gives you the coherence: the sum over n of I1 into the complex conjugate of I2, the whole divided by the square root of the sum over n of mod I1 squared into the sum over n of mod I2 squared. Now, before I move forward, please understand that the phase angle of the coherence gives the phase difference, and the magnitude gives a measure of the meaningfulness of the measurement, okay, how meaningful the measurement is. As the magnitude gives us the quality of the correlation, it is also sometimes called the degree of coherence, okay, alright. Now before I move forward, let me quickly display a set of stereo images in front of you, okay, stereo pairs of images showing the basalt cliffs of Argentina, from a NASA site. To view the stereo pairs, as it is written, cross your eyes slightly until a third white dot appears between the two; I will give it some time. Are you able to see the new centre image, which is in 3D? This cross-eyed image was generated using topographic data from the Shuttle Radar Topography Mission, SRTM, plus Landsat 7 images. Here you look at two perspectives of the same scene, one with each eye, isn't it, and in the process the image shall appear to be shifted slightly depending on its elevation, giving you a view of the earth surface in three dimensions. The reason I am showing you this image is that while we discussed InSAR, we did mention that it is used to create high resolution ground topography; but please note that there are other means of extracting topographic information from synthetic aperture radar imagery, and one such method is known as stereo SAR radargrammetry.
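The coherence expression can be written out directly over a small window of complex pixel values. This is a minimal sketch using plain Python complex numbers; note that the standard estimator uses the complex conjugate of the second image in the numerator, which I have made explicit here.

```python
def coherence(i1, i2):
    """Coherence of two single look complex (SLC) windows, assuming the
    surface properties are constant across the window:
    gamma = sum(I1 * conj(I2)) / sqrt(sum(|I1|^2) * sum(|I2|^2)).
    The phase angle of gamma gives the phase difference; its magnitude,
    between 0 and 1, measures how meaningful the measurement is."""
    num = sum(a * b.conjugate() for a, b in zip(i1, i2))
    den = (sum(abs(a) ** 2 for a in i1) *
           sum(abs(b) ** 2 for b in i2)) ** 0.5
    return num / den

# Two identical windows are perfectly coherent: |gamma| = 1
g = coherence([1 + 1j, 2 + 0j], [1 + 1j, 2 + 0j])
```

A window of pure noise, by contrast, would drive the magnitude towards 0, which is exactly how radar shadow regions reveal themselves as meaningless phase.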
I will leave you with the last new term of this lecture, that is stereo SAR radargrammetry. Shown here is the stereo SAR imaging geometry, wherein there are two antennas with a baseline separation between them; R1 and R2 are the range distances to an elevated point X, the ground surface is shown as the brown coloured line and the reference surface as the black coloured line, and h is the height of the point X above the reference surface. This reference surface, remember, can be an average terrain height or it can be a datum like the mean sea level. The look angle is also given here, theta L. Now, if you watch closely, you can note that the same point on the surface of the earth, that is X, is projected to two distinct locations on the reference surface, that is X1 dash and X2 dash; the same point X is projected onto two distinct locations on the reference surface. The difference between these two points is proportional to the height of the point above the reference surface, and by measuring R1 minus R2 we can estimate the surface height. Let us try to understand this through a derivation. Please note that the two images from the two antennas are acquired with a separation known as the baseline. We know what R1 is, we know what R2 is, we know the baseline distance B and the look angle theta L. We can use the cosine rule and write the expression R1 squared equals R2 squared plus B squared minus 2 into R2 into B into cos of (90 degrees minus theta L plus alpha). We are using the cosine rule here; you know R1 and R2, the range distances of antenna 1 and antenna 2, we know the baseline, we know theta L, and we know alpha, the cosine rule. Now what I will do is rearrange this expression a little. Since cos of (90 degrees minus theta L plus alpha) equals sin of (theta L minus alpha), I get sin of (theta L minus alpha) equals (R2 squared plus B squared minus R1 squared) divided by (2 into R2 into B). So theta L is going to be alpha plus sin inverse of this expression, is it not?
To estimate the height h, we assume that the baseline angle alpha is known, such that we can write h equals H minus R1 cos theta L, isn't it? Now remember, R1 is known, and capital H is nothing but a flight parameter, the platform altitude, which means you can estimate small h once you know H, R1 and theta L; and theta L, the look angle, can be estimated once you know R1, R2 and the baseline. So what we discussed was the stereo SAR imaging geometry. Let us try to summarize what we learnt as part of today's lecture. Today we understood InSAR, that is interferometric SAR, and we covered a few terminologies that are required for us to understand InSAR. We understood how InSAR can give us topographic information about the earth surface. We saw the different modes of acquisition of data: single pass interferometry, repeat pass interferometry and multi pass interferometry, and we also understood that the baseline is the separation distance between the two antennas. Further, we also tried to understand the coherence of an image. We tried to interpret an interferogram, and we also understood the different InSAR processing steps. We finally understood that InSAR is not the only method that can give us topographic information from SAR; we also have stereo SAR radargrammetry. So let me hope that you could understand what was covered as part of today's lecture, and I will meet you in the next class. Thank you.
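The two-step derivation, first the look angle from the cosine rule and then the height from the platform altitude, can be sketched as follows. The angle convention here matches the derivation above (cosine rule with the angle 90 degrees minus theta_L plus alpha between B and R2); the numeric test geometry is my own constructed example, not from the lecture.

```python
import math

def look_angle(r1, r2, b, alpha):
    """Look angle from the cosine rule applied to the triangle formed
    by the two antennas and the target point X:
    R1^2 = R2^2 + B^2 - 2*R2*B*cos(90deg - theta_L + alpha)
    =>  sin(theta_L - alpha) = (R2^2 + B^2 - R1^2) / (2*R2*B)."""
    s = (r2 ** 2 + b ** 2 - r1 ** 2) / (2 * r2 * b)
    # Clamp guards against tiny floating point overshoot beyond +/-1.
    return alpha + math.asin(max(-1.0, min(1.0, s)))

def target_height(platform_altitude, r1, theta_l):
    """Height of the target above the reference surface:
    h = H - R1*cos(theta_L), with H the platform altitude."""
    return platform_altitude - r1 * math.cos(theta_l)
```

Given a consistent triangle (for instance R2 = 850, B = 100, alpha = 0 and R1 chosen so that sin(theta_L) = 0.5), the recovered look angle is 30 degrees, and a platform at H = 1000 m looking at a target 800 m away at a 60 degree look angle puts that target 600 m above the reference surface.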