Okay, thank you very much. I will start sharing the screen now; let's hope this works. You should be seeing the full screen right now, I hope. Yes. Okay, welcome everyone. As Alec mentioned, today I'm going to give you an overview of data assimilation and inverse modeling of atmospheric trace constituents. Since I know this is an area with which many of you are not familiar, I'm going to spend a little bit of time at the beginning introducing the subject matter, and then go through some examples of the use of data assimilation and inverse modeling in this area of research.

As many of you are aware, much of the motivation for this work comes from the fact that human activity has profoundly changed the composition of the atmosphere, with implications for climate as well as for air quality. In an air quality context, it is now well established that air pollution is a major cause of human mortality globally. The images here, from the Cohen et al. study, show this unfortunate reality. The top panel shows deaths in 2015 due to exposure to fine particulate matter, PM2.5: small particulates less than 2.5 microns in diameter. It shows the percentage of total deaths in each country in 2015 attributable to PM2.5 exposure, and if you can't see the numbers clearly, this ranges from about three to more than nine percent of total deaths. Cohen et al. estimated that in 2015 there were about four million deaths globally associated with that particulate exposure. For ozone the picture is less grim but equally unfortunate; note that the scales here are different.
The numbers here are deaths per hundred thousand people, and Cohen et al. estimated that in 2015 there were about a quarter of a million deaths globally due to ozone exposure. So, from an air quality perspective, to develop effective air quality regulation strategies we need reliable estimates of emissions of pollutants on policy-relevant scales. In addition, there is a need to understand how emissions of pollutants will impact the global atmosphere, and how those changes in the global atmosphere will in turn influence local air quality. This information is needed everywhere, because every country is impacted by this problem.

In North America, for example, we've seen significant improvements in air quality as a result of air quality regulations, but despite those improvements there are still large numbers of people living in regions where ozone abundances, for example, exceed the air quality standard. The plot on the top left shows satellite measurements of NO2. NO2 is a precursor gas to ozone: it leads to the production of ozone in the atmosphere. NO2 itself is not emitted; it is created from nitric oxide (NO), which is emitted from high-temperature combustion. NO very quickly forms NO2, which very quickly forms ozone, so if you have NO2 you will make ozone. This shows satellite observations from the OMI instrument of the vertically integrated column abundance of NO2 over North America in 2005 and in 2013, and you can see the red colors, indicating high abundances of this ozone precursor gas, decreasing significantly by 2013. So there has been a very dramatic reduction in the abundance of NO2 in the atmosphere over North America as a result of air quality regulations over the past two decades. Associated with this reduction in NO2, since NO2 is a precursor gas, we see reductions in ozone. This plot shows urban trends in surface ozone; the arrows indicate the direction and the magnitude of the trend.
Downward-pointing arrows indicate a negative trend, and you can see ozone is decreasing across most of the urban areas of the United States, and southern Canada as well. However, according to the US EPA, in 2019 there were over 74 million Americans living in regions where ozone abundances exceeded the air quality standard. In Canada it is about eight million people, a comparable fraction of our population, still living in areas where ozone is an issue. So despite the improvements there is still a long way to go, and there is a critical need for top-down, as we call them, inverse modeling estimates of how emissions of pollutants are changing in time, so that we can better target the various pollutants that are important for improving the health situation. Much of the talk I'm going to give today will be focused on inverse modeling of these pollutants.

Another key issue that we face as a global society is the dramatic rise in greenhouse gases in the atmosphere: the fact that human activity is significantly perturbing the global carbon cycle, which of course is leading to dramatic changes in the climate system. The figure here on the left shows the canonical picture, if you will, of the CO2 record from Mauna Loa in Hawaii. The red curve shows the monthly mean CO2 concentration as measured at Mauna Loa, and you can see the strong seasonal cycle. In boreal summer, across much of the northern hemisphere, we have significant vegetation growth; the leaves come out on the trees, and there is strong uptake of CO2 due to photosynthetic activity, which leads to the drop in CO2 that we see in the atmosphere. In winter, photosynthesis is suppressed and we have net respiration, net emission of CO2 to the atmosphere, and that drives this seasonal cycle; effectively, it looks like we are watching the biosphere breathe as we go through the seasonal cycles.
The black line is the deseasonalized curve, showing the steady increase in CO2 over time due to human activity; the increase that you see here is really being driven by fossil fuel emissions to the atmosphere. Now, as bad as this is, we are somewhat fortunate in that only about 50 percent of the CO2 that we emit to the atmosphere remains in the atmosphere; the rest is taken up by the oceans and the biosphere. How that partitioning is changing is shown here on the right. The top part of the panel shows the emissions as a function of time, from the 1800s to the present, and you can see the significant increase over time due to fossil fuel emissions; there is also land use change, which contributes CO2 emissions to the atmosphere. As I said, not all of that CO2 remains in the atmosphere: some of it is taken up by the ocean, indicated by the blue colors here, a fraction is taken up by the land, and the residual that remains in the atmosphere is indicated by the light blue.
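Deseasonalizing a record like the Mauna Loa curve can be done in several ways; one simple approach is a centered 12-month running mean, which cancels the annual cycle and leaves the trend. Here is a minimal sketch on synthetic data; the trend and seasonal amplitude below are invented for illustration, not the real Mauna Loa values.

```python
import numpy as np

# Synthetic monthly CO2 record (illustrative, not real Mauna Loa data):
# a steady anthropogenic rise plus a seasonal "biosphere breathing" cycle.
months = np.arange(240)                           # 20 years of monthly data
trend = 340.0 + 0.15 * months                     # ppm, assumed linear rise
seasonal = 3.0 * np.sin(2 * np.pi * months / 12)  # ppm, annual cycle
co2 = trend + seasonal

# Deseasonalize with a 12-month running mean: averaging over a full year
# sums the sinusoid over one complete period, which cancels it exactly.
kernel = np.ones(12) / 12.0
deseasonalized = np.convolve(co2, kernel, mode="valid")

# The smoothed series tracks the underlying trend closely (to within the
# half-month offset of the even-length window).
residual = deseasonalized - trend[5:5 + deseasonalized.size]
print(f"max |smoothed - trend|: {np.abs(residual).max():.3f} ppm")
```

With a real record one would typically also fit and remove a mean seasonal climatology, but the running mean captures the idea behind the black curve in the figure.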
If you look at the land sink, you can see there is significant variability over time: large seasonal variations due to the biosphere, but also interannual variations, also due to the land biosphere, that drive variations in atmospheric CO2. So if we want to make reliable projections of how atmospheric CO2 concentrations will evolve in the future, it is really critical that we are able to model how the biosphere component will change with time, because this biosphere component is highly variable and is sensitive to changes in climate. El Nino and La Nina variations will lead to changes in the biosphere; long-term changes in climate will affect the biosphere sink, which in turn will feed back on atmospheric CO2 levels and, as a result, on global warming. So it is critical that we model how the biosphere changes in time, and unfortunately, at the moment, our terrestrial biosphere models do not do a very good job of simulating these biosphere fluxes.

This figure shows CO2 fluxes in boreal summer from five different terrestrial biosphere models. Ignore the units; I didn't plot them here, and the color bar doesn't matter. The main point is to focus on the spatial patterns and the intensity of the colors. It is during boreal summer that we have significant photosynthetic uptake across the northern hemisphere due to vegetation growth, and so you see blue colors, indicating a sink of CO2, across much of the high-latitude northern hemisphere. But as you look across the five models, you see a fairly large spread in the fluxes: one model suggests a much weaker uptake of CO2 than another, for example. Spatially the models all seem to be consistent, which is great, but the magnitude of the uptake varies dramatically. So a key objective of the inverse modeling community is to use atmospheric CO2 observations to try to obtain robust
estimates of regional sources and sinks of CO2 that can be used by process-based models like these to improve our understanding of terrestrial biosphere processes, and hopefully, in doing so, to make more reliable projections of how atmospheric CO2 levels may evolve in the future. We also want to go after anthropogenic emissions: there is growing interest in using atmospheric CO2 measurements in an inverse modeling context to quantify anthropogenic emissions, from fossil fuel combustion for example, and to do that we need to be able to reliably estimate what the biosphere is doing so that we can tease out the human signal. Both of those issues are driving the inverse modeling work that we are doing as part of the community.

Now, a challenge we face in doing this is to develop an effective observing system that will provide the information needed to constrain these fluxes robustly on regional scales. To give you a sense of the challenge of developing this observing network, I have this animation here: a model simulation of CO2 from the Environment and Climate Change Canada carbon assimilation system. There is no assimilation in this particular animation; this is just the model simulation of CO2. The white colors represent high levels of CO2, the dark colors lower levels. In spring, coming out of winter, you can see CO2 being exported from the continental regions in Asia and North America and transported around the world; you see the strong synoptic signatures, so effectively you are looking at "CO2 weather" here as the emissions are dispersed globally. As we move into summer, you start to see dark colors emerging: this is the uptake signal as the biosphere starts to draw down CO2, and that signal also gets exported globally. Now, in the context of the inversion, it is the gradients that you see here, the transport patterns acting on the
sources and sinks of CO2, that provide the information we need for our inversion analysis. As you can see, the system is incredibly variable, so the challenge we face is to develop an observing system that is able to capture this incredibly varying system, with sufficient spatial and temporal resolution to resolve much of the structure you see here. Much of the early inverse modeling work, starting in the late 80s and early 90s, relied on a surface observing network that was fairly sparse and could not really provide the kinds of constraints needed to estimate these fluxes on regional scales. This is what has driven the investment in new satellite technology: over the last 20 years, space agencies around the world have invested significant resources in developing space-based observations to sample this state more densely and hopefully provide the information needed to estimate these fluxes on policy-relevant scales, for both carbon cycle applications and air quality.

So here is a picture of the NASA Earth fleet. This is a very NASA-centric view of the world, unfortunately, but similar investments are being made by other space agencies around the world. The satellites listed here are not all for atmospheric composition, of course; they cover all of Earth science. There is the SMAP mission here, for example, which Fabio talked about yesterday, the Soil Moisture Active Passive mission that was launched in 2015, and the Landsat missions. I'm going to focus on a couple of the missions here: I'll talk a bit about the Terra mission, in particular one instrument on Terra, and there is the Aura mission, which also had a number of composition instruments on board, many of which are still operational. Moving into the future, we have instruments like TEMPO, which is a geostationary
instrument that will be launched next year, as well as GeoCarb, another geostationary instrument, and I'll also talk a little bit about the Orbiting Carbon Observatory, OCO-2, which was launched in 2014.

So how do we make these measurements of trace gases from space? Clearly, from space we cannot make in situ measurements; we can only rely on the radiation field emanating from the atmosphere. To walk through how we do this inverse analysis, estimating the trace gas concentrations given the measured radiation field, I'm going to use MOPITT as an example. MOPITT, the Measurements Of Pollution In The Troposphere instrument, was launched on the Terra spacecraft, as I just pointed out, in December of 1999. It was supposed to be a five-year mission, and it is still operating well; this December it will have been 22 years in orbit. The instrument was actually designed here at the University of Toronto, with Jim Drummond as the PI, and this was the first instrument designed specifically to measure pollution in the troposphere: the lowest roughly 10 kilometers of the atmosphere, where convective overturning drives a lot of the weather that we are familiar with.
To measure carbon monoxide (CO), MOPITT measures radiation in two parts of the spectrum: in the near-infrared, around 2.3 microns, where the satellite looks at the absorption signature of CO in reflected solar radiation, and in the thermal infrared, around 4.6 microns, where we are looking at thermal emission from CO in the atmosphere. The retrieval is based on a Bayesian inverse modeling approach, in which we have a set of measurements y, the measured radiances; these would be the level 1 data that Fabio talked about yesterday. Given those measured radiances in the different parts of the spectrum, we want to infer the atmospheric concentration of CO that would have produced them. So x in this case is the retrieved profile, and these retrieved profiles are referred to as level 2 products; we go from a level 1 product to a level 2 product when we do the inversion.

As I said, for the MOPITT retrievals, and for many of today's retrievals from these atmospheric composition instruments, we take a Bayesian approach to the optimization. Going back to Colin's lecture on Tuesday, consider a linear model that relates our observations to the state of interest, where x is the state we are trying to quantify given the observations y, and we have an observation operator H that maps the state into the measurement space. If we assume a linear model, and further assume Gaussian errors, which Colin talked about on Tuesday, the maximum a posteriori (MAP) estimator is the state that maximizes the conditional PDF, and it is obtained by minimizing a cost function, assuming that we have Gaussian error
statistics. What you are looking at in the cost function,

J(x) = (y - H x)^T R^{-1} (y - H x) + (x - x_b)^T B^{-1} (x - x_b),

is an innovation, the observations minus the projection of our model state into observation space, recalling the nomenclature that Colin introduced on Tuesday. The information coming from the observations is constrained by our prior knowledge of the system: x_b is the background, or a priori, estimate of the state, B is the background error covariance, and R is the error covariance of the observations. If we minimize this cost function we get a solution for the state, our analysis, and I have listed it in two equivalent forms, because which one you use really depends on the particular problem you are working on. If your state space is much smaller than your observation space, so n is much smaller than m, then

x_hat = x_b + (B^{-1} + H^T R^{-1} H)^{-1} H^T R^{-1} (y - H x_b)

is the preferred form, because the matrix being inverted is n by n rather than m by m. This form is used very often in the satellite inverse modeling community, because our state space typically consists of maybe 100 levels of the trace gas of interest, while the measurement space can contain tens of thousands of elements in spectral space. For the atmospheric inversion problem, which I will talk about as well, the state space is usually much larger than the observation space, and then the form you may be more familiar with,

x_hat = x_b + B H^T (H B H^T + R)^{-1} (y - H x_b),

is often preferred. And of course the analysis error covariance matrix is given by (B^{-1} + H^T R^{-1} H)^{-1}.

So what do the retrievals actually look like? I'm going to show you an example of this approach applied to the retrieval of MOPITT data, starting with the level 1 radiances as y and ending with a profile of carbon monoxide as x. So here's an example
from a paper by Jim Crawford et al. from 2004. Remember, MOPITT was launched in 1999, so here we are seeing the first set of observations from MOPITT being used by the community. In this paper they compared MOPITT data to aircraft profiles over the North Pacific from a NASA aircraft campaign called TRACE-P, a mission NASA flew over the North Pacific in spring of 2001 to measure the export of pollution from Asia across the North Pacific to North America. They took advantage of the fact that MOPITT was in orbit at the time to isolate particular overpasses of the satellite that coincided with the aircraft measurements, so that they could compare the carbon monoxide measured by MOPITT with the carbon monoxide measured by the aircraft flying over the North Pacific. The profiles shown here are for coincidences over the North Pacific around 20 degrees north. The plot shows carbon monoxide as a function of altitude in pressure units; the mid-troposphere, about five kilometers, is around the 500 hectopascal level. If we focus on the aircraft profiles, there are two here, from the P-3B and the DC-8. As we move up from the surface we see a significant increase in carbon monoxide (the units are parts per billion) and then a decrease back to more background conditions. What you are seeing is a pollution plume that has been lifted off of Asia: in spring, cold fronts come down from Siberia and lift the Asian pollution into the mid-troposphere and out over the North Pacific. So you can see this fairly localized pollution plume over the North Pacific, and both aircraft measurements pick up that signal very nicely. The thin black line shows the average of the aircraft profiles, and you can see very nicely this pollution plume that is fairly localized in altitude. The
MOPITT measurement is the thick black line, and when the first set of MOPITT data became available there was a lot of confusion about what we were actually seeing with MOPITT, because at first glance this MOPITT profile does not match what the aircraft saw at all. We expected that MOPITT would smooth the atmosphere, that it would not provide measurements at high vertical resolution; but if you take the aircraft profile and smooth it, you get something that looks like the thick dashed line here, which still does not look like the MOPITT profile. So there was a lot of confusion initially in the community about how to make sense of these remote sensing observations. Keep in mind that MOPITT was the first instrument to take this sort of Bayesian inverse modeling approach to provide space-based observations of pollution in the lower part of the atmosphere.

To make sense of why the MOPITT profile looks the way it does, compared to the aircraft profile that clearly shows the pollution plume signature, we need to think about the smoothing influence of the retrieval, of the inversion we are conducting. Going back to our analytical expression for the analysis: the analysis is our background guess plus the increment from the inversion. Using Colin's nomenclature, we multiply the innovation by the gain matrix, which we can write as K, and that increment is added back to the background to give our estimate of the analysis, x_hat = x_b + K (y - H x_b). We can take this a little further: we have a model that relates the observations y to the true state, our linear model H, which maps the true state into the observation space. But of course our instrument has errors, so we have an observation error epsilon that
we have to account for: y = H x_true + epsilon. We can substitute this observation model for y into our expression for the analysis, and if we call the product of K times H, the gain matrix times the observation operator, A, then the result should look a little familiar to many of you:

x_hat = x_b + A (x_true - x_b) + K epsilon.

We can think of this as a linearization of the analysis around our background state, where A is the sensitivity of the analysis with respect to the true state. As I said, A is just K times H, the gain matrix times the observation operator, but we can think of it as the sensitivity of the analysis to the true state. If we now substitute our expression for the gain matrix back into the expression for A, a little bit of algebra shows that A equals the identity matrix minus the analysis error covariance matrix times the inverse of the background, or a priori, error covariance matrix:

A = K H = I - A_hat B^{-1},

where A_hat = (B^{-1} + H^T R^{-1} H)^{-1} is the analysis error covariance. Now, if A is indeed the sensitivity of the analysis with respect to the true state, ideally we would like A to be the identity matrix, which would mean we have perfect sensitivity to the true state. If that were the case, then for a state vector x with n components, the trace of A would be n, and that would be the total degrees of freedom for signal: it would say we have perfect sensitivity to each element of our state vector in the inversion analysis. In reality that is often not the case, and that is critical to understanding why the MOPITT retrievals look the way they do. So let's go back to that example and look at the MOPITT retrievals. What we have here is the same plot I showed you before, but now I've added the averaging kernels of the MOPITT retrievals. Actually,
let me go back and walk you through one more thing. A is the sensitivity of the analysis to the true state, so this matrix is going to be an n by n matrix, but it is not symmetric, and each row of the matrix tells you the sensitivity of a given element of the retrieved state to all of the components of the true state. Keep that in mind. What's plotted here is the averaging kernel matrix; in particular, each line is one row of that averaging kernel matrix. If we look at the line with the open circles, that is the averaging kernel for the surface level of the profile retrieval, and it tells us the sensitivity of the retrieved carbon monoxide at the surface to CO at all of the levels in the state vector x; in this case, a seven-element state vector. You can see that the sensitivity of the surface retrieval to CO at the surface is quite small: the averaging kernel is quite small there. The retrieved surface carbon monoxide actually has greater sensitivity to CO in the mid-troposphere, not at the surface. If we look at the 850 hectopascal retrieval level, the open triangles, that also has low sensitivity at the surface, and low sensitivity at 850 hectopascals itself, with the sensitivity peaking around 500 hectopascals, right in the middle troposphere around five kilometers.

What this means is that when you look at a MOPITT retrieval of the profile of carbon monoxide, the estimate that you are getting near the surface is not really a true reflection of the CO at the surface, because the instrument sensitivity to CO there is quite low; to a large degree you are seeing the background information reflected in the inversion rather than actual information from the true state itself. So the averaging kernel provides a way of characterizing the smoothing influence, the sensitivity of the inversion to the actual state. And I should add
one more interesting point here, which I'll get to later but which is worth mentioning now. The reason the sensitivity is low near the surface is that these retrievals rely on radiance measurements in the thermal infrared part of the spectrum, around 4.6 microns, and near the surface it is very difficult to discriminate between thermal emission coming from the ground and thermal emission coming from the atmosphere just above the ground. Only in instances where you have significant thermal contrast between the ground and the air above it can you discriminate thermal emission from the atmosphere versus thermal radiation from the ground. So for these thermal infrared instruments, sensitivity near the ground is very problematic, and in this particular case we are looking at a retrieval over the North Pacific in early spring, where conditions are quite cold, so there is very little sensitivity to carbon monoxide near the surface.
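To make the machinery above concrete, here is a minimal numerical sketch of this kind of Bayesian retrieval. Everything in it is invented for illustration (the dimensions, covariances, and observation operator are toy values, not MOPITT's): it computes the MAP analysis in both equivalent forms, builds the averaging kernel A = K H, checks the identity A = I - A_hat B^{-1}, and takes the trace of A to get the degrees of freedom for signal.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 7, 50            # state levels (a toy 7-level CO profile), radiance channels
x_b = np.full(n, 100.0)  # a priori CO profile (ppb), illustrative

# Correlated background errors (20 ppb, exponential decay between levels).
B = 400.0 * np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 2.0)

# Toy observation operator: columns scaled so the lowest levels (index 0 =
# surface) contribute weakly, mimicking poor thermal-IR surface sensitivity.
H = rng.normal(size=(m, n)) * np.linspace(0.2, 1.0, n)

R = 25.0 * np.eye(m)     # independent radiance errors

# Synthetic "true" profile with a mid-tropospheric pollution plume, and
# synthetic radiances y = H x_true + noise.
x_true = x_b + np.array([0.0, 5.0, 40.0, 80.0, 40.0, 5.0, 0.0])
y = H @ x_true + rng.normal(scale=5.0, size=m)

Binv = np.linalg.inv(B)
Rinv = np.linalg.inv(R)

# MAP analysis, n-by-n form (preferred when n << m, as in satellite retrievals).
A_hat = np.linalg.inv(Binv + H.T @ Rinv @ H)   # analysis error covariance
K = A_hat @ H.T @ Rinv                         # gain matrix
x1 = x_b + K @ (y - H @ x_b)

# Equivalent m-by-m form (preferred when the state is larger than the obs).
K2 = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x2 = x_b + K2 @ (y - H @ x_b)
assert np.allclose(x1, x2)                     # the two forms agree

# Averaging kernel: sensitivity of the analysis to the true state.
A = K @ H
assert np.allclose(A, np.eye(n) - A_hat @ Binv)  # A = I - A_hat B^-1

dofs = np.trace(A)                             # degrees of freedom for signal
print(f"DOFS = {dofs:.2f} out of {n} levels")

# Smoothing a "true" (e.g. aircraft) profile into retrieval space:
x_smoothed = x_b + A @ (x_true - x_b)
```

The last line is the operation used to compare high-resolution profiles with a retrieval: apply the averaging kernel to the candidate true state and add the background back, so both quantities live in the same smoothed measurement space.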
Given the smoothing influence of the retrieval, and the fact that we can think of the retrieval as a linear expansion around the background state, to properly compare the aircraft data to MOPITT we need to account for the smoothing influence of MOPITT and the biasing influence of the background information that went into the MOPITT inversions. We can take the same linear expression we have for the retrieval and ask: what if the aircraft profile were the true profile in the atmosphere? We subtract the background state from the aircraft profile, apply the averaging kernel, and add the resulting increment back to the background state. In doing so, we transform the aircraft profile into the instrument space, the MOPITT space, so that we can make a proper comparison between MOPITT and the aircraft profile in the MOPITT measurement space. If you take the aircraft profile, the thin black line, and apply this expression, substituting the aircraft profile for the true state, you get this thick gray line. Now this looks much more like what MOPITT actually saw, the thick black line. This is telling us that MOPITT is indeed seeing the pollution plume, but because of the smoothing influence of the retrieval and the low sensitivity near the surface, it cannot isolate the signature in altitude; the information is there in the retrieval nevertheless, but in a very smoothed representation. So we can now compare the two, and they look much more similar than the raw MOPITT profile and the thin black aircraft profile did. One of the conclusions of the Crawford et al. paper was that these results provide insight on the sources of variability, both
real and artificial, in satellite observations, and that understanding these sources of variability is important if MOPITT observations are to be quantitatively useful. And indeed, MOPITT has provided the longest continuous record of carbon monoxide measurements from space, and the measurements have been fantastically useful, but there was a significant learning curve in the community initially to understand how to interpret these remote sensing measurements, because of the significant smoothing influence of the inversion in going from the measured radiances to the concentration profiles of carbon monoxide.

This applies to all the instruments and all the trace gases. For example, here is a retrieval of ozone from TES, the Tropospheric Emission Spectrometer, an instrument on the Aura spacecraft, which was launched in July 2004. There were four instruments on board, of which TES was one. TES unfortunately is no longer operational; it was designed as a five-year mission, worked wonderfully from 2004 onward, and was decommissioned in 2018. The instrument itself is an infrared Fourier transform spectrometer that measures radiation in the thermal infrared part of the spectrum, from 3 to 15 microns; ozone is estimated from its thermal emission around 9.6 microns, where there is a strong ozone emission band. Now, if you look at the averaging kernel for the ozone retrieval, you get something that looks like this. This is one measurement on the eastern seaboard of North America, at 30 degrees north, 87 west, on the 15th of August 2016. Each line is one row of the averaging kernel matrix. The retrieval here was done on a 65-level grid, so the averaging kernel is a 65 by 65 matrix, with each row indicating the sensitivity of the retrieval on a given level to the ozone on all of the other levels. So if we plot the
individual rows of the averaging kernel matrix, we get something that looks like this. In this paper, Mark Parrington, a postdoc in our group at the time, color-coded the various retrieval levels: all of the levels in the lower troposphere, between 1000 and 500 hectopascals, are color-coded in red; moving up in altitude, the levels between 500 and 150 hectopascals are color-coded in green; and the levels in the stratosphere, from about 16 to about 25 kilometers, are color-coded in blue. The first thing to note, as with the MOPITT retrieval: if we look at one of the lower tropospheric retrieval levels, one of these red lines, we see that the sensitivity to ozone near the surface is quite small. Again, this is a thermal infrared measurement, so it cannot really discriminate between variations in thermal emission coming from the ground and variations in thermal emission coming from ozone just above the ground; we just don't have that sensitivity. But we see that the sensitivity peaks around 800 hectopascals, so around one to two kilometers is where we actually start to get sensitivity to variations in ozone, where the instrument can start to detect the changes in the radiation field associated with changes in ozone at those levels. However, the fact that these red lines all look the same and essentially fall on top of each other, and that they correspond to all of the retrieval levels from the surface to about five kilometers, tells us that we cannot really discriminate between a change in the radiation field due to a change in ozone at two kilometers and a change in the radiation field due to a change in ozone at four kilometers. So even though the retrieval provides a profile on a 65-level grid, the measurement itself is highly correlated, and the averaging kernel captures that correlation structure to some extent. Instead of getting 65 independent pieces of information for the 65-level retrieval, for this
particular retrieval we obtain about four independent pieces of information, despite the fact that it's a 65-level grid spanning the whole atmosphere. If we look only at the lower part of the atmosphere, the troposphere, so the first 10 to 12 kilometers, this particular retrieval provides about one independent piece of information.

And for this particular day, if you look at the ozone retrievals across the globe, this plot is showing the degrees of freedom for signal, that is, the number of independent pieces of information that each retrieval provides on ozone, as a function of latitude from the southern hemisphere to the northern hemisphere. We see that for the whole 65-level profile we typically get between two and four independent pieces of information from the retrieval. If we look only at the tropospheric component, the lowest 10 to 15 kilometers of the profile, we see that in the northern hemisphere in boreal summer we're getting maybe one to one and a half pieces of information, whereas in the winter hemisphere, where it's colder, and in the southern hemisphere in particular, where there's mostly ocean, we're getting very little information, less than 0.5 degrees of freedom for signal.

So there are significant smoothing effects associated with the retrieval, and it's important to keep that in mind. This smoothing really reflects the physics of the retrieval itself; it's not something we can work around, it's a fundamental aspect of the retrieval. But there are ways to get more information on the vertical structure, to improve the inverse problem and get a better constraint on the overall state, and one way of doing that is by combining information from different parts of the spectrum. This is something that has been pioneered very nicely with MOPITT. If you look at the MOPITT retrievals, you can retrieve carbon monoxide from the thermal infrared part of the
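The degrees of freedom for signal (DFS) quoted here is simply the trace of the averaging kernel matrix, and a tropospheric DFS is the partial trace over the tropospheric levels. A short sketch, again using an invented synthetic averaging kernel rather than real TES output:

```python
import numpy as np

n = 65
idx = np.arange(n)

# Synthetic averaging kernel with broad, overlapping rows (hypothetical values)
A = np.exp(-0.5 * ((idx[None, :] - idx[:, None]) / 8.0) ** 2)
A *= 0.8 / A.sum(axis=1, keepdims=True)

# Degrees of freedom for signal: the trace of the averaging kernel
dfs_total = np.trace(A)

# Partial trace over the lowest 20 levels, standing in for the troposphere
dfs_tropo = np.trace(A[:20, :20])

print(f"total DFS: {dfs_total:.2f}, tropospheric DFS: {dfs_tropo:.2f}")
```

With broad rows like these, the total DFS comes out far below the 65 retrieval levels, and the tropospheric portion is close to one, mirroring the numbers quoted for the real retrieval.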
spectrum, as I mentioned, at 4.6 microns, but I also said that MOPITT measures around 2.3 microns as well, in the near infrared part of the spectrum, where it's looking at backscattered solar radiation. So you can retrieve CO in the thermal infrared (TIR) part of the spectrum as well as in the near infrared (NIR) part of the spectrum, and you get very different information; the smoothing effects are very different depending on where you are in the spectrum. But you can also combine the TIR information with the NIR information, and that too will change the performance of the retrieval. So let's take a look at how that works in the five minutes or so that we have remaining.

If we do the inversion using only the thermal infrared retrievals, we get averaging kernels that look like these; again, each line here is a given row of the averaging kernel, corresponding to a different retrieval level. If we look at this cyan color here, this is for the 700-hectopascal retrieval level, and you see that it peaks around 700 to 800 hectopascals, there's no sensitivity near the surface, and then it falls off quickly with altitude. The 300-hectopascal retrieval level peaks at around 300 hectopascals, and then the sensitivity falls off with altitude as you go down toward the surface, which is great. For this particular retrieval, the trace of the averaging kernel suggests we get about one and a half pieces of information.

If we use just the near infrared channel, looking at backscattered solar radiation, we really can't say anything about the vertical structure of carbon monoxide, because you're really looking at the absorption signature of the CO molecules in the atmosphere in this reflected solar radiation measured by the satellite. So when you look at the averaging kernels, they're all fairly uniform with altitude, because there's very little vertical information that you can get from these retrievals. You can see
closer to the surface, because you're looking at backscattered radiation near the ground, but you can't really say anything about changes in carbon monoxide as a function of altitude; you can really just capture the extinction signature of the CO molecules. So the averaging kernels are fairly uniform, and the degrees of freedom for signal here is about one, so there's essentially one piece of information: think of it as a column-integrated estimate of the CO molecules in the atmosphere. So the retrievals are very different depending on where you are in the spectrum, because that changes what the inversion provides.

Now, you can also combine the TIR and the NIR data, as I mentioned, and if you do that, the averaging kernels look very different. If you compare the surface-level retrievals: with the TIR, which is in black here, you see very little sensitivity near the surface; the black line has its peak sensitivity near 700 hectopascals and then falls off, with very little sensitivity at the surface. So the TIR surface-level carbon monoxide retrieval has essentially very little sensitivity to carbon monoxide near the surface. With the NIR there is more sensitivity, but as I said, you can't discriminate between variations in CO in the vertical. However, when you combine the two, and this is the surface-level retrieval, you can see the sensitivity is quite high. By bringing the two together, you can use the NIR information to provide constraints on the CO variations in the lower part of the atmosphere and the TIR information to get some constraints on the mid to upper troposphere, and in doing so you can now get close to two pieces of information: you can discriminate variations near the surface as well as variations in CO aloft at higher altitudes.

And the impact of this can be seen very nicely here in this comparison of MOPITT with aircraft profiles. These MOZAIC profiles are aircraft profiles over New Delhi; there's a typo here, my
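The gain from combining spectral bands can be illustrated with a toy optimal-estimation calculation: stack the TIR and NIR Jacobians into one measurement vector, and the joint averaging kernel A = (Kᵀ Sε⁻¹ K + Sa⁻¹)⁻¹ Kᵀ Sε⁻¹ K picks up degrees of freedom that neither band provides alone. Every number below (Jacobian shapes, prior covariance, noise level) is invented for illustration; these are not MOPITT's actual weighting functions.

```python
import numpy as np

n = 20
p = np.linspace(1000, 100, n)                 # toy pressure grid, hPa

def wf(peak_hpa, width_hpa):
    """Broad Gaussian weighting function in pressure (stand-in Jacobian row)."""
    return np.exp(-0.5 * ((p - peak_hpa) / width_hpa) ** 2)

# Hypothetical Jacobians: TIR channels peak in the mid/upper troposphere,
# while the NIR channel has near-uniform column sensitivity (no vertical structure).
K_tir = np.vstack([wf(700, 150), wf(500, 150), wf(300, 150)])
K_nir = np.ones((1, n))

S_a_inv = np.eye(n)                           # prior covariance = identity (toy choice)

def dfs(K, noise=2.0):
    """Degrees of freedom for signal of an optimal-estimation retrieval."""
    fisher = K.T @ K / noise**2               # K^T Se^-1 K with Se = noise^2 * I
    A = np.linalg.solve(fisher + S_a_inv, fisher)   # averaging kernel
    return np.trace(A)

print(f"TIR only : {dfs(K_tir):.2f}")
print(f"NIR only : {dfs(K_nir):.2f}")
print(f"TIR + NIR: {dfs(np.vstack([K_tir, K_nir])):.2f}")
```

Because the two bands constrain different directions of the state vector, the stacked-Jacobian retrieval always carries at least as much information as either band alone, which is the essence of the multispectral MOPITT product.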
apologies for that. So this is over New Delhi on the 3rd of September in 2004. If we focus on the TIR retrievals: the aircraft profile is shown here as a red dotted line, and you can see high carbon monoxide levels near the surface, reflecting the pollution near the ground, which fall off with altitude and then increase aloft. This C-shaped structure is typical of the influence of convection: at this time of year you tend to have a monsoon circulation over South Asia, and you get lofting of this pollution from the surface and then convective outflow aloft, and that drives this C-shaped structure.

When you look at the MOPITT retrievals using just the TIR data, you get the black line, so the aircraft measurements look nothing like the MOPITT measurements, and that is, as I mentioned on the last two slides, a reflection of the smoothing influence of the retrieval. With the near infrared only measurements there's even more smoothing in the retrieval, so you get a fairly uniform profile over much of the lower to middle part of the atmosphere, as opposed to the structure that you see here. However, when you combine the TIR and near infrared, you get the black line; this is the MOPITT retrieval combining the information from those two different parts of the spectrum, and now you see the C-shaped structure showing up very nicely in the MOPITT retrieval. MOPITT is actually picking up the surface pollution here with this multispectral inversion, compared to using only the TIR or only the near infrared information.

And if you take the aircraft profile and transform it into the instrument space, using the averaging kernels and the prior or background profile, you get the red line here. So not only does the MOPITT retrieval by itself look very much like the aircraft profile, but also, when you transform the aircraft profile into MOPITT space, they're very consistent. So this is a beautiful example of how bringing additional information from different parts of the
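The transformation of the aircraft profile into instrument space mentioned here is the standard smoothing operation x_compare = x_a + A (x_aircraft − x_a), with x_a the retrieval prior and A the averaging kernel. A simplified linear sketch follows; all profile values and the averaging kernel are invented for illustration, and for the real MOPITT product the operation is typically applied in log space of the mixing ratio rather than linearly.

```python
import numpy as np

# Toy 10-level CO profile (ppb), surface first: high near the ground,
# a minimum in the mid-troposphere, enhanced aloft (the "C shape").
x_aircraft = np.array([400., 320., 240., 180., 150., 140., 160., 190., 210., 200.])
x_prior    = np.full(10, 150.0)            # hypothetical retrieval prior

# Hypothetical averaging kernel: broad rows, little sensitivity at the surface
idx = np.arange(10)
A = np.exp(-0.5 * ((idx[None, :] - idx[:, None]) / 2.0) ** 2)
A *= 0.7 / A.sum(axis=1, keepdims=True)
A[0] *= 0.3                                # suppress surface sensitivity (TIR-like)

# Smooth the in-situ profile with the retrieval operator before comparing it
# to the satellite product, so both carry the same vertical smoothing.
x_compare = x_prior + A @ (x_aircraft - x_prior)

print(np.round(x_compare, 1))
```

After this transformation, the sharp surface enhancement in the aircraft data is damped toward the prior, which is why a smoothed in-situ profile (the red line on the slide) is the fair benchmark for the satellite retrieval.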
spectrum together can give you more constraints on the atmospheric state, and you can now start to constrain the vertical structure of carbon monoxide very effectively. In this paper by Helene Worden, she showed just how transformative this multispectral retrieval is in terms of the information it provides on surface-level CO. Keep in mind that the surface-level averaging kernel, the row of the averaging kernel for the surface-level retrieval, shows high sensitivity near the surface; these are the averaging kernels again. Now if you look at the surface-level carbon monoxide, and this is a five-year average of CO near the surface from the MOPITT retrievals, you see the benefits of this multispectral retrieval. On the left is the retrieval using only the thermal infrared data, and on the right is the multispectral retrieval. On the left there's some structure over the major pollution regions in Asia, but on the right this stands out very nicely: the red colors represent high levels of carbon monoxide, and you can see the Indo-Gangetic Plain showing up very clearly, along with the Pearl River Delta and much of eastern China showing up very strongly. So there's significant information on surface-level CO that we're able to get from the multispectral retrieval.

And so now, with this kind of information, we can take what we call a level 2 MOPITT data product. We did an inversion to go from the level 1 satellite radiances to level 2 carbon monoxide profiles, and now we can take this level 2 profile and do another inversion, using an atmospheric model, to infer the surface emissions of carbon monoxide to the atmosphere, given the observations that we have available to us. I'm using MOPITT here as an example of how we do this; it is also done for CO2, which I'll talk about, for methane, and for many other trace constituents. So the next part of the talk will focus on taking these satellite measurements and bringing them into
the atmospheric models to do the inversion in the atmospheric models and infer these emissions on policy-relevant scales. So I think it is a good time to stop for the break, Alec. Okay, yeah, wonderful. Okay, thank you very much for the first part, and now it's the break; let's continue at seventeen hundred Central European Time, in nine minutes. Thank you.