Hello and welcome to Tutorial 7, Handling Active Microwave Data from Satellites. First, let us see what active microwave data is. Active microwave sensors transmit short, coherent pulses of electromagnetic radiation at microwave frequencies, as you see in this diagram. As these pulses travel away from the instrument, part of them is reflected back to the antenna due to backscattering from targets within the sample volume, and the time difference between transmission and reception generates a map of received power across all the sampled volumes. Shown here is a diagram that explains active microwave remote sensing and how the data is collected. In active remote sensing, characteristics of the ground surface are mainly derived from what are known as backscattered values. You can see the reflected microwave signals from the built-up area and from the clouds, and the transmitted signals, shown in blue, from the instrument. Moving on, shown here are a few satellite missions relevant to measuring variables important in hydrology and water resources engineering. I have subdivided water into groundwater, soil moisture, surface water and others, so that you can see what is available in the saturated zone, the unsaturated zone and at the surface. There are missions like the Tropical Rainfall Measuring Mission, abbreviated as TRMM, the Global Precipitation Measurement mission, abbreviated as GPM, TROPICS, etc., which help us study precipitation. They carry active as well as passive sensors operating in the microwave region of the electromagnetic spectrum. Now say you want to study water levels in rivers and lakes from space: there are missions like Jason, SARAL and SWOT, to name a few, which operate in the microwave region to give you water levels from space.
Again, if you want to measure soil moisture, you have missions like ASCAT, which stands for Advanced Scatterometer, SMOS, which stands for Soil Moisture and Ocean Salinity, and SMAP, which stands for Soil Moisture Active Passive. Moving on, there are missions like GRACE that help one study terrestrial water storage. So this was just to give you a summary. In this part of Tutorial 7, we will learn in detail about data from GRACE, which is used to estimate terrestrial water storage, and about data from altimetry missions like Jason and SARAL. Firstly, GRACE stands for Gravity Recovery and Climate Experiment; that is the abbreviation. This mission provides unprecedented information about the monthly change in terrestrial water storage. As you can see in the diagram, GRACE consists of twin spacecraft, one here and one there, which fly about 220 kilometres apart and send microwave signals back and forth. Suppose the twin satellites pass over a region, say the Himalayas or the oceans, where gravity increases or decreases: the distance between the two spacecraft changes, and this change in distance allows the satellites to map the Earth's gravity field. Given here are the spatial resolution and latency period of GRACE. Let us try to understand how to work with GRACE data. Before that, if you go to the website, you will find that GRACE observations are available at three different processing levels: Level 1 is the satellite tracking data, Level 2 is the global gravitational Stokes coefficients, and Level 3 is global grids of the change in terrestrial water storage. As part of this tutorial, I will show you how to work with Level 3, that is, gridded data showing terrestrial water storage.
It should also be noted that the Central Ground Water Board of India records seasonal in situ well hydrograph data for more than 30,000 well locations at a pan-India scale, and in the literature we find many studies that validate the measurements from the GRACE satellites against such in situ data. So let us try to work with GRACE data in Python. As before, I have created a notebook and named it Tutorial 7, followed by the name of the course. First, we import all the necessary libraries for this exercise. Let me start with import os: this module gives you functions for interacting with the operating system, and it is part of the Python standard library. We have already seen what NumPy and Matplotlib are; pandas is used for data manipulation and analysis. For plotting, I am importing matplotlib.pyplot, and I am importing matplotlib.dates as well as gridspec. Matplotlib represents dates as floating-point numbers, and gridspec provides a grid layout to place subplots within a figure. The colorbar, as the name suggests, helps you visualize the mapping from scalar values to colors, and datetime handles dates and times. netCDF4 is a Python interface to the netCDF libraries; netCDF stands for Network Common Data Form. We have the GRACE data in .nc, that is netCDF, format, and hence we require this. We have already seen SciPy. Next, xarray helps us deal with multidimensional arrays in an easy manner. In cartopy.crs, crs stands for coordinate reference system; Cartopy is a package used for geospatial data processing, for example to produce maps. If you are interested in creating animations, you can include matplotlib.animation, which helps in creating live animations. Finally, shapely.geometry allows you to work with geometric objects without the need for any graphical package.
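The import cell described above can be sketched as follows. Treat it as a minimal template rather than the exact notebook cell: the map, netCDF and animation libraries mentioned in the lecture are listed in the comments, since they are only needed for the later steps and must be installed separately if missing.

```python
import os                        # interacting with the operating system (paths, files)
import datetime                  # date and time handling

import numpy as np               # numerical arrays
import pandas as pd              # data manipulation and analysis
import matplotlib.pyplot as plt  # plotting
import matplotlib.dates as mdates      # Matplotlib stores dates as floats
from matplotlib import gridspec        # grid layout for subplots within a figure

# Also used later in this tutorial; install them first if they are missing:
#   import netCDF4                 # Python interface to the netCDF libraries
#   import xarray as xr            # labelled multidimensional arrays
#   import cartopy.crs as ccrs     # coordinate reference systems / map projections
#   import matplotlib.animation    # optional, live animations
#   from shapely.geometry import Point  # optional, geometric objects
```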
We shall be using a few of these for the present exercise; the additional libraries are shown here so that you are familiar with their existence. Remember, as before, if you do not have any of these libraries installed, please go to the command prompt, as was shown in one of the earlier tutorials, and make sure they are installed before importing them for this exercise. Once we have all the necessary libraries installed, let us open the dataset. I am going to the specific location where a sample netCDF file of GRACE was downloaded and saved, and I am going to specify the path of the dataset here. The file has a lengthy name, as before, ending in .nc. Let us see what the data contains. You can see all the details given in the file: it gives gridded surface mass anomalies derived from spherical harmonic coefficients, the processing level is 3, that is Level 3 data, and then you see information about the geospatial latitudes and longitudes as well as the variables and dimension sizes. So all the details regarding the .nc (netCDF) file are present here. Now let us try to visualize the data, because I want to see how it looks. As before, I am going to use plt.figure with figsize=(12, 6); we have already covered these functions, hence I will not repeat them. I am going to specify the axes, and I also need to add the projection. You see I have added an empty figure. Now let us fill it with the coastlines: I am going to use ax.coastlines(), and you see the coastlines have been added. Next I want to add the grids, so let me use ax.gridlines(); you see the gridlines have been added. Now I want to open the .nc file, so I am going to use the function xr.open_dataset() and copy the path in quotes.
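The opening step can be sketched as below. Because the actual GRACE file name and path are specific to your download, this sketch first writes a tiny synthetic stand-in file (same structure: an lwe_thickness variable on a time/lat/lon grid, with made-up values) and then reopens it with xr.open_dataset(), exactly as done in the lecture.

```python
import numpy as np
import xarray as xr

# Build a tiny stand-in for a GRACE Level-3 file; the real file carries
# lwe_thickness on a lat/lon/time grid, but these values are synthetic.
ds = xr.Dataset(
    {"lwe_thickness": (("time", "lat", "lon"), np.random.rand(2, 18, 36))},
    coords={
        "time": np.array(["2020-01-16", "2020-02-15"], dtype="datetime64[ns]"),
        "lat": np.arange(-85.0, 90.0, 10.0),
        "lon": np.arange(-175.0, 185.0, 10.0),
    },
)
ds["lwe_thickness"].attrs["units"] = "m"

ds.to_netcdf("grace_sample.nc")             # write the .nc file to disk
grids = xr.open_dataset("grace_sample.nc")  # open it, as in the tutorial
print(grids)                                # dimensions, coordinates, variables
```

Printing the opened dataset shows the same kind of summary described above: dimension sizes, coordinates, data variables and their attributes.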
Now that I have opened the dataset, let us see what it contains. Specifically, I want to see the liquid water equivalent thickness, lwe_thickness, plotted in the figure below. lwe_thickness is the name of the variable present in the netCDF file, which you saw when the file was opened; that is why I am writing grids.lwe_thickness. I am going to add the transform: PlateCarree is a projection, and cbar_kwargs is used to customize the colorbar, here the number of ticks in the plot. So now we get to see the dataset; if you look at the legend, each color represents the liquid water equivalent thickness in metres, and the time is also mentioned. Similarly, when you look at the GRACE data, if you want to see for which latitudes and longitudes this information is present, that can also be viewed. To extract specifically the latitude and longitude values, you type grids.lat and grids.lon; remember, grids is the name I gave to the variable when I opened the file, so whatever name you give at that point is what you use here. So now that we have opened the data, let us move further. Just to be clear: we have used xr.open_dataset(), picked out the variable lwe_thickness, plotted it with the transform specified, and viewed the data, and now I know what the netCDF file contains in terms of data, latitude and longitude. Moving further, I am going to use the xr.open_mfdataset() function, another function that helps me open netCDF data; the path is as before, as you can see. You get to observe the array shape, the size in bytes, the dtype float64 and the units, metres of liquid water equivalent thickness, because I have extracted only that variable.
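The plotting call described above can be sketched like this. To keep the sketch self-contained it uses a synthetic lwe_thickness slice and Matplotlib's non-interactive Agg backend; the Cartopy projection, coastlines and transform lines from the lecture are shown as comments, since they additionally require cartopy to be installed.

```python
import matplotlib
matplotlib.use("Agg")            # non-interactive backend, safe for scripts
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

# Synthetic stand-in for one time slice of grids.lwe_thickness
lwe = xr.DataArray(
    np.random.randn(18, 36),
    coords={"lat": np.arange(-85.0, 90.0, 10.0),
            "lon": np.arange(-175.0, 185.0, 10.0)},
    dims=("lat", "lon"),
    name="lwe_thickness",
    attrs={"units": "m"},
)

fig = plt.figure(figsize=(12, 6))
ax = plt.axes()
# With cartopy installed, the lecture's version would be:
#   import cartopy.crs as ccrs
#   ax = plt.axes(projection=ccrs.PlateCarree())
#   ax.coastlines(); ax.gridlines()
#   lwe.plot(ax=ax, transform=ccrs.PlateCarree(), cbar_kwargs={"label": "LWE (m)"})
lwe.plot(ax=ax, cbar_kwargs={"label": "liquid water equivalent thickness (m)"})
fig.savefig("lwe_map.png")
```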
As before, I can explore the latitude, longitude and time for which data is available; the grid mapping is WGS 84. Say I want to display the data for a particular section of latitudes and longitudes, which means I want to slice the data. I can use grace_lwt.sel() to select the latitudes between 25 and 30 and the longitudes between 70 and 80, and plot it. The result is plotted here as a spatial plot; as before, you can see the legend, whose colors are nothing but liquid water equivalent thickness in metres. Similarly, for your individual work you can extract the data specific to your own study region using this function and then move further, say to generate a time series or a sequence of spatial plots to be shown as an animation. Everything is possible using these simple libraries and commands. Now that we have seen how to deal with GRACE data in Python, let me take you to the next topic, satellite altimetry. The history of satellite altimetry started in the late 1960s, and it has seen missions like TOPEX/Poseidon, Jason-1, -2 and -3, Envisat and SARAL, to name a few. Shown here are the names of the missions, the altitude, the frequency in gigahertz (to be specific, the microwave frequency at which each operates), the sampling rate, the spatial resolution and the temporal resolution. These are just a few examples of satellite altimetry missions, which have the potential not only for measuring the dynamic topography of the sea surface but also for applications such as monitoring inland water bodies. So let us try to understand what the term altimetry means. Shown here is the basic concept of satellite altimetry: the fundamental principle is that it estimates the water surface elevation by measuring the distance from the satellite to the surface of the water.
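The slicing step above can be sketched as follows. The variable name grace_lwt and the coordinate names lat and lon follow the lecture; the gridded values themselves are synthetic, so the sketch runs without the GRACE file.

```python
import numpy as np
import xarray as xr

# Synthetic liquid-water-equivalent field on a 1-degree global grid
grace_lwt = xr.DataArray(
    np.random.rand(180, 360),
    coords={"lat": np.arange(-89.5, 90.0, 1.0),
            "lon": np.arange(-179.5, 180.0, 1.0)},
    dims=("lat", "lon"),
    name="lwe_thickness",
)

# .sel with slice() subsets by coordinate VALUE, not by index position
# (the coordinate must be sorted in ascending order for this to work):
subset = grace_lwt.sel(lat=slice(25, 30), lon=slice(70, 80))
print(subset.shape)   # prints (5, 10): 5 latitudes in [25, 30], 10 longitudes in [70, 80]
```

From a subset like this you can call subset.plot() for a spatial plot of your study region, or reduce over lat/lon to build a time series when a time dimension is present.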
For this, the sensors on board the satellite transmit microwave signals at a rate of up to, say, 2000 pulses per second for high-precision altimeters. These signals are transmitted towards the Earth's surface, as shown in the diagram, and the two-way travel time taken by the signal is measured. Here R denotes the range, c is the speed of light and Δt is the two-way travel time between the satellite and the target surface on the Earth:

R = c · Δt / 2

Because we know the speed of light, dividing by two converts the two-way travel time into the one-way distance between the satellite and the target surface. The resulting value is called the range, which is also marked in this diagram. The height of the reflecting surface above the reference ellipsoid is then calculated as

h = H − R_corrected

where h is the height of the reflecting surface above the reference ellipsoid, H is the altitude of the satellite above the reference ellipsoid, and R_corrected is the range measurement corrected for errors, say propagation and geophysical corrections. Further, say you need to compare the information from altimetry satellites with in situ gauge measurements, as I mentioned earlier. Here we need to correct for local undulations, as given by the equation

WSH = h − geoid (over land), or WSH = h − mean sea level (over the ocean)

where WSH stands for water surface height, and the geoid and mean sea level are the local undulation corrections applied for land and ocean respectively. This is the fundamental principle with which satellite altimetry works to give you water levels from space. Now let us see where to get satellite altimeter data from.
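Under the definitions above, the computation can be worked through with made-up numbers as below; the travel time, satellite altitude, correction and geoid values are purely illustrative, not taken from any real altimeter pass.

```python
c = 299_792_458.0          # speed of light, m/s

# Illustrative values (not from any real pass):
delta_t = 0.00896          # two-way travel time, s
H = 1_347_000.0            # satellite altitude above the reference ellipsoid, m
geoid = -45.0              # geoid undulation relative to the ellipsoid, m
corrections = 2.4          # propagation + geophysical corrections, m

R = c * delta_t / 2.0          # range: one-way satellite-to-surface distance
R_corrected = R + corrections  # corrected range
h = H - R_corrected            # height of reflecting surface above the ellipsoid
wsh = h - geoid                # water surface height over land (geoid datum)

print(f"range R           = {R:,.1f} m")
print(f"height h          = {h:,.1f} m")
print(f"water surface WSH = {wsh:,.1f} m")
```

Note that whether a given correction is added to or subtracted from the raw range depends on its sign convention; here a single positive correction term is added for illustration.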
You can download the data from the MOSDAC website, that is, the Meteorological and Oceanographic Satellite Data Archival Centre, for SARAL, and from the Copernicus Hub for the Sentinel-3 dataset. Depending on which satellite data you want to use, you can go to the respective website, create an account and then download the data. How to download data from the Copernicus Hub was already covered as part of a previous tutorial, so I hope that by now these terminologies are clear to you. Now let us try to visualize how the data looks. Altimeter data looks something like what you see on the screen in front of you. Altimetry data consists of mainly two types of variables: satellite-based variables and model-based variables. The satellite-based variables, like range or altitude, are recorded by the satellite, while the model-based variables, like the corrections applied to the range, are based on numerical weather prediction models; so it is a combination of both. Shown here is a data visualization from the Jason-2 satellite: the ground scanning velocity, the pulse footprint size and the virtual stations are displayed. A virtual station is nothing but the intersection of the satellite orbit with a water body; wherever the orbit intersects a water body, we call that intersection a virtual station. As I mentioned before, altimetry data has satellite-based as well as model-based variables, each with a different spatial and temporal resolution, and you see different colors used to denote the satellite-based and the model-based data. As an example, Jason-2 data is available at a frequency of 20 Hz, whereas its corrections are available at 1 Hz; this is just to show you how the data looks. From here, let us understand further how to visualize the data from the Jason-2 satellite as well as from the SARAL satellite.
This portion will be covered in Part 2 of this section. I will see you in the next class. Thank you.