Good afternoon, everyone, and thank you for joining us here today. We have a great turnout already, with over 100 people logged in online. So welcome to the spring meeting of the Committee on Solid Earth Geophysics. My name is Matt Richard; I'm one of the members of this committee, and we are pleased to present this session today on novel geophysical data sets for environmental applications, moving from building signals to societal benefit. Before I begin the presentations today, I'll tell you a little bit more about the Committee on Solid Earth Geophysics. On this slide you'll see some information on the mission of this committee. The committee is composed and funded by four federal agencies: the US Geological Survey, the Department of Energy, NASA, and the National Science Foundation. It uses its convening functions to support community discussions, like the one we're having today, about interactions between solid earth geophysics, federal agencies, and societal benefits, encouraging greater understanding of a range of geophysical topics and areas of emerging interest and research. Our meeting resources, webinars, and videos produced under the auspices of the committee are all available online. Now I'll tell you a little bit about today's meeting and our goals. Environmental seismology, geodesy, and geoelectrics are growing fields that use geophysical sensors in novel ways to learn about environmental conditions. These disciplines extend the well-known discipline of environmental geophysics by using large sensor networks, continuous time series, and high-performance computing to develop new ways of looking at conventional data, oftentimes finding useful signals in other people's noise. Some links among these disciplines already exist and are underway, but the next steps are for the solid earth geophysics community to nurture these interdisciplinary fields. 
We'd like to use today's workshop to help us understand how we can engage new potential users and ensure that we are addressing societal problems in fields such as oceanography, hydrology, atmospheric science, and geomorphology. We may need to have discussions about how we can meet these needs by modifying instrument networks, creating new facilities for data sharing and discoverability, and helping to operationalize these discoveries. So today we're going to begin our first session with overview talks, followed by an array of shorter talks on the latest developments in environmental seismology. And our last panel will discuss examples of using satellite imagery for hydrological applications. I have a few announcements before we begin. This session is being recorded and will be available on our website within a few days. In addition to questions from the committee members, we plan to take questions from the audience. Simply click on the Q&A button at the bottom of your screen, type your question in the box, and click send; any questions you submit may be read aloud and included in our video recording. In the interest of time we will skip committee introductions; bios of our committee members are located on the Academies' website, and the link can be found in the chat. I do want to thank all of our speakers for all the work they put into their presentations today. I really look forward to hearing from all of them. And I will now start our first session and introduce our first speaker. So I'm pleased to introduce Kristine Larson from the University of Colorado as our first speaker. Kristine and Sarah will each have 20 minutes to speak, plus five minutes for Q&A, and then after both talks are completed we'll have another 20 minutes or so for group discussion. So we'll begin with Kristine. Kristine has an AB degree in engineering from Harvard and received her PhD at Scripps Institution of Oceanography. 
She's been at the University of Colorado as a faculty member since 1990. Kristine has a great Twitter account called Fun with GPS, which well reflects all of the fun she's had with GPS over her career, earning numerous awards, including the Creativity Prize of the Prince Sultan Bin Abdulaziz International Prize for Water in Saudi Arabia. She's also a member of the US National Academy of Sciences. Her research has focused on all sorts of aspects of using GPS: measuring plate motions, measuring ice sheet speeds, using a GPS receiver as a seismometer, and many other environmental applications that we'll hear about today. So now, without further ado, I'll pass it on to Kristine. Okay, share. Can you guys see that? Yes, it looks great. Okay, let's get the show on the road. All right, thank you very much for the invitation to speak. I'm going to give you a short interview, excuse me, a review of hydrogeodesy and GPS, and then I'm going to talk a little bit about the themes of this meeting: how can we help the different communities work together? I'm first going to start out by telling you what I mean by GPS interferometric reflectometry, which is the technique my group developed. Just a nomenclature thing: GNSS is the generic term that includes GPS. I'm going to talk about applications of this technique and then, like I said, I'll talk at the end about how we can make this more available to people. So, GNSS-IR, or GPS-IR: well, it's essentially making each GPS site into a radar, a bistatic radar in particular. It's a little bit different than regular radar in that I use very low elevation angles, and I'm using forward scatter, or what a GPS person would call multipath, from planar surfaces near the antenna. And as Matt was saying about finding signals in noise, this is very much noise for geodesists. The other distinctive thing about this technique is that I'm using normal GPS/GNSS instruments; nothing has been changed. The antenna is pointed toward zenith. 
I'm not doing anything to optimize it for reflections, which means I can use data from other networks. I'm using a geometry slide to show you how this works. Basically, everyone using GPS, including everyone using GPS on their phones, is getting a direct signal from multiple satellites; here I'm just showing one signal. But there are also signals coming down that reflect off planar surfaces below the antenna. So I'm showing the geometry of an antenna above some generic planar surface: it could be ice, it could be snow, it could be water. And you can see that this reflected signal, shown in red (these are my little symbols for elevation angle), travels a longer path than the direct signal. The direct and reflected signals are going to interfere, so you're going to get some kind of interference pattern from that. Sometimes it'll be constructive interference, sometimes destructive, and that interference pattern is what I'm going to be using. And this is multipath; it's been around forever. The two things I can tell you from this technique are how high the antenna is above the planar surface, and something about the kind of surface: is it wet, is it dry? Bare soil looks different than, say, snow. So it's basically GPS as a radar. I also wanted to at least show you the observables we use. Geodesists use carrier phase data; navigation people use pseudorange. I use signal power data, the signal-to-noise ratio. This is a typical time series of signal power for a single satellite track at a site in Boulder, Colorado, with elevation angle along the x axis. The boring, normal power signal is shown in black; that's the part I don't care about. What's shown in red is the real signal. And if I take the black part out by doing some kind of polynomial fit, you see I have these two sine waves. 
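To make that processing step concrete, here is a minimal sketch in Python using purely synthetic data; the antenna height, elevation-angle range, and SNR trend coefficients are all assumed values for illustration, not the speaker's actual processing code. A low-order polynomial removes the direct-signal trend, and a Lomb-Scargle periodogram of the residual, taken against the sine of the elevation angle, recovers the reflector height from the interference frequency.

```python
import numpy as np
from scipy.signal import lombscargle

# Assumed/synthetic values for illustration only.
WAVELENGTH = 0.1903   # GPS L1 carrier wavelength, meters
H_TRUE = 2.0          # antenna height above the planar reflector, meters

elev = np.deg2rad(np.linspace(5, 25, 400))   # low elevation angles
x = np.sin(elev)

# Synthetic SNR: a smooth direct-signal trend plus the interference term.
# The reflected path is longer by 2*H*sin(e), giving a phase of
# 4*pi*H*sin(e)/lambda, so SNR oscillates in sin(e) with frequency 2H/lambda.
snr = 50 + 200 * x - 150 * x**2 + 4 * np.cos(4 * np.pi * H_TRUE / WAVELENGTH * x)

# Remove the "boring" direct-signal part with a low-order polynomial fit.
residual = snr - np.polyval(np.polyfit(x, snr, 2), x)

# Scan candidate reflector heights; the periodogram peak gives the height.
heights = np.linspace(0.5, 6.0, 2000)
ang_freqs = 4 * np.pi * heights / WAVELENGTH   # angular frequencies in sin(e)
power = lombscargle(x, residual, ang_freqs)
h_est = heights[np.argmax(power)]
print(f"estimated reflector height: {h_est:.2f} m")
```

Run on this synthetic arc, the periodogram peak lands at the assumed two-meter height; real retrievals average many satellite tracks and apply quality control, but the detrend-then-spectral-peak idea is the same.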
These are those interference patterns I was talking about. The frequency of the interference pattern tells me the height of the antenna above the reflecting surface, and the amplitudes, for the most part, tell us the kind of surface it is. So that's the data I'm using, quite different from what geodesists use, but it's the same instrument. So, I'm going to just jump in and assume you want to measure environmental signals, without motivating why you'd want to do that. But why would you want to do it with a GPS receiver? I want to supplement other sensors. Many environmental sensors have small footprints, a meter by a meter, some only a few centimeters by a few centimeters. Many satellites measure the same kinds of parameters, but they have very large footprints, kilometers or tens of kilometers by tens of kilometers. I can use instruments installed by geodesists, which makes the data inexpensive, and you need in situ sensors for satellite validation and assimilation. So our first result, which some of you are aware of, was measuring soil moisture for the site in Boulder, Colorado. It started in 2007. What's shown in the time series in dark blue with these bars is just precipitation; it's a reality check that it rained on those days and how much it rained. What's shown in the gray region is the range of five soil moisture sensors that were actually buried in the soil, and what's shown in the colors are different GPS satellites, and you can see that those are very highly correlated, with a correlation coefficient of something like 0.9. So this was our first demonstration that said, you know, hey, maybe I can use this GPS antenna right here as a radar to do soil moisture. We followed that up by showing that we could measure snow the next year, and vegetation water content the year after that. And once we had algorithms that allowed us to derive those quantities regularly and quickly, we asked if we could start our own water sensor network, called PBO H2O. 
We were planning to use the plate boundary observatory; the locations of the CONUS sites are shown in cyan. Of course, the purpose of the plate boundary observatory had nothing to do with water sensing; it was supposed to measure the deformation rates shown on the right. It was a beautiful network with outstanding equipment, and it was an opportunity. So, in the fall of 2012, I believe it was, we posted our first results; we only had 25 stations. We measured soil moisture every day. It was not a real-time network; it was near real time: we posted these every morning. The quality of these data is shown here; this is a site in eastern New Mexico. We measure very shallow soil moisture, which is shown in the blue dots here. If there was a met sensor, we provided that data to people, not because we use it in the algorithm, just as a reality check. We also downloaded and provided to the community NLDAS precipitation records, again to give context to these records. Ultimately, as we optimized our algorithms and also our quality control, we were able to do this for 150 sites in the western US. We also measured snow depth originally, and ultimately we improved our algorithms to also provide snow water equivalent, because that was requested. We did this at about 220 sites. I'm going to show you one in, excuse me, in Idaho. Here is just a single year where we validated the GPS retrievals of snow depth by taking photographs of a pole with a camera and digitizing those numbers; very good agreement in this particular year. But once you have the algorithms, you can develop climate records, and we developed them for that site. So PBO H2O began here in 2012, and we ran it as a network for almost six years. But once we had the algorithms, we also went back and analyzed the data starting at the beginning of the plate boundary observatory. And you might have noticed that there were sites in Minnesota. 
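The snow retrieval itself is simple height differencing, and the snow water equivalent conversion scales depth by snow density. A toy sketch of that arithmetic, with entirely hypothetical numbers (the bare-ground reflector height, the daily retrievals, and the snow density are all assumptions for illustration):

```python
# Toy sketch of snow depth / SWE from GNSS-IR reflector heights.
# All numbers are hypothetical; real processing averages many satellite tracks.
H_BARE = 2.50                              # reflector height over bare ground, m
daily_heights = [2.50, 2.31, 2.10, 1.95]   # reflector heights as snow accumulates, m

# Snow depth: the reflecting surface rises toward the antenna as snow piles up.
snow_depth = [round(H_BARE - h, 2) for h in daily_heights]   # meters

RHO_SNOW, RHO_WATER = 250.0, 1000.0        # kg/m^3; snow density assumed
swe = [round(d * RHO_SNOW / RHO_WATER, 3) for d in snow_depth]  # m of water

print(snow_depth)   # [0.0, 0.19, 0.4, 0.55]
print(swe)          # [0.0, 0.048, 0.1, 0.138]
```

In practice the snow density is the hard part (it evolves through the season), which is why snow water equivalent was added later than snow depth.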
Minnesota is not part of the plate boundary observatory, but it is a source of free data (the Department of Transportation in Minnesota provides it), so I added it just so there'd be snow data for people in Minnesota. We also measured vegetation water content changes. I've left out the details; I'll just give you a big, broad view of it. Again, we took advantage of the fact that PBO was a crustal deformation experiment that lasted for over a decade to look at vegetation changes, over nine years in this case. Basically, the story is that if it was an extra-wet year, it's dark blue; if it was an average year of vegetation, it's shown in cyan; and if it's shown in red, that was drought. And you can see the development of the very significant three-year drought in California between 2012 and 2014. But you can also see there was a very significant drought in 2007 as well. So this was all validated: we did ground experiments to validate our algorithms, we published papers on how we did it, et cetera. After we had PBO H2O going, I worked with some other colleagues on some other applications, so these weren't explicitly part of PBO H2O. We were able to measure tides at a site that Jeff Freymueller installed to measure crustal deformation rates. The distinctive thing here is that, because this is a GPS sensor, it's measuring water levels in ITRF. That will mean something to the geodesists. This is not the best tide gauge, but it is the only one that measures in ITRF by its very nature. This happens to be in Alaska. I showed you that data earlier: this is the kind of data we use for a rising or setting GPS satellite in Boulder. The antenna there is about two meters tall, and the frequency here is very similar to the frequency here because, you know, the antenna-to-surface distance didn't change in the six hours when the satellite was in view. This is Jeff Freymueller's site, and this is what the data look like to me when you have a very significant tidal signal. 
So it has this very high frequency signal here, and a lower frequency: low tide, high tide. You don't have to do one value per day; you can do multiple per day, excuse me, per track. These were our initial results. This is a two-week period, and for that time and place there was a tide gauge about 30 kilometers away. Richard Ray did the comparisons between the two. And you can see here on the y axis, sea level, this is plus or minus eight meters, and the GPS can very reliably retrieve sea level there in ITRF. We then went back, Richard Ray, myself, and also Simon Williams, and looked at a longer time series at Friday Harbor, Washington, where there was a co-located tide gauge. And Richard was interested not just in how well it agreed with daily averages over a year, where that was better than two centimeters, but also in the 10-year record, where the RMS difference was 1.3 centimeters. Now, this was the early days of reflectometry, when we only used GPS signals, and I'll talk later about what you can do with the new signals. And I also was motivated by some, well, I don't know what you want to call it: I saw that there were GPS receivers running in the middle of the Greenland ice sheet. I mean, I knew why they were there, to measure ice sheet motion, but they intrigued me, and the data were archived at UNAVCO. I worked with John Wahr on this. This is what the site looked like. The antenna was about three and a half meters above the ice when they installed it; this was 2011. We were able to measure snow accumulation there for almost nine years. I'll just point out to you that a very large melt occurred in 2012, but there's, you know, smaller melt each summer; this is in the accumulation zone. The first question you'd have is: was that very big melt real? Well, we have some photographs anyway. You can see that when they installed it, the monument was flush with the snow/ice surface; by 2012, yes, in fact, there had been very significant melt. 
We can do better than just photographs. Thanks to the late Konrad Steffen, who provided a firn record to compare with our results, and also Michael MacFerrin, who gave us one of his firn records, we know it's accurate. It was in situ data for free, basically, because it had been installed to measure ice sheet speeds. Some newer things I've been doing since, well, since last year: I noticed there had been a large earthquake in the Shumagins, and I'd always been interested in possibly trying to detect tsunami waves, but you know, there had been no event where I thought I could do that. We used a GPS site that had been installed for the plate boundary observatory. If you know the region of the site's location, you'll notice it's not on a dock; it's 70 meters above sea level. So you can do this technique from quite a height. Most people would recognize the value of these data for measuring coseismic displacements; this happens to be a vertical displacement of about 35 centimeters. Here I'm using GPS data to measure tides before and after this tsunami. We don't have in situ records to compare with, but there is a tsunami model put together by Thorne Lay and colleagues at the University of Hawaii, also from China and Japan, and it had about a 30-centimeter amplitude near my site, not my site, our site. What's shown in gray are the tsunami wave models, and what's shown in blue and green are the GPS estimates. So there's good agreement between what could be a real-time GPS system and tsunami models for this earthquake. A few weeks later, Hurricane Laura hit. I thought, well, let's go see if there's a GPS site near there. In fact, not just GPS but GNSS; again, that's the generic term. It means it not only tracks GPS, it tracks the European system, which is called Galileo, and the Russian system, which is called GLONASS. The site is on a canal. There's a NOAA tide gauge, and also there's a wind sensor there. 
So we knew from some previous work that you can measure storm surges with GPS, but here we had all the in situ measurements to really demonstrate to people that we could do it. So here I'm showing the record of the GPS retrieval compared to the tide gauge, which is in blue. You can see the time series is very much filled in, and that's because we have multiple constellations, which is the best way to do this. And you can also see that it operated during these high wind speeds; particularly here, when the speeds went down as the eye crossed over the site (it happened to be in the eye of the hurricane), we were able to get the peak. It does not work very well above 30 meters per second. So, a tide gauge during Hurricane Laura. Now I want to spend about five minutes talking about how we did all this, basically from 2007 to 2017. I want to acknowledge my colleagues. These were the primary colleagues and students I worked with, all basically brilliant, and they contributed in a lot of different ways to making this all work. I also want to talk about serendipity a little bit. Jay Fine was our program manager and happened to come to my Bowie Lecture, which I think helped him see what we were up to. Dara Entekhabi was the PI for SMAP, and SMAP was very important in supporting the development of this technique. We succeeded because we had data from the plate boundary observatory, and Barack Obama, I will just say: this technique was developed as the stimulus funding hit the NSF. So even though we weren't funded with stimulus funds, certainly it helped that new money was available to support new ideas. So where are we today? Well, PBO H2O was funded originally in 2012, it was renewed, and we operated it through the end of 2017. We were very careful to archive our work at places where people could find the data: a soil moisture archive in Europe and the National Snow and Ice Data Center here in the US. 
Willis is hosting our archive website. And we were very fortunate that NASA agreed to port all of our code from PBO H2O so it could be run by JPL. It isn't being run now, but they do have the code if they wish to turn it on. And I would say that while I continue to get emails about soil moisture, really it's the cryosphere and oceanographic applications I'm a little more interested in now. So here are my other thoughts; I only have about three minutes to go. I think that NSF and NASA do a great job of supporting PI-driven research, and this is an example of some of that: you know, you have an idea, you write a proposal, write papers, develop algorithms. At the same time, some agencies are driven by missions, and a lot of the ones that, you know, help our communities are driven by missions, and sometimes the PI-driven research doesn't get to those agencies. I'd be curious to know what people think about how to get new ideas to an operational agency; it's very difficult, in my view. And this is not just geodetic facilities; it's the same problem with seismic facilities. When I tried to archive my GPS seismology results, they were supportive, but you're not really what they do, so they're not quite sure what to do with you, and that can make it difficult. And just a generic rant: no one reads the literature. I cannot tell you how many times I get emails from people telling me they've discovered something that we published in 2013. That drives me nuts, but I don't have a solution to that. And I'd say a little bit that success was a double-edged sword for PI-driven research like this, because if you really succeed, it means an operational agency should take it over. And if they don't, what does that mean? So how are you going to make your work available to people? Are you going to keep analyzing the data? That's what we did for PBO H2O, and I think it was a good model for that, because it was a network. 
If you're going to provide code to people, how bulletproof are you going to make that code? And resources are limited; it's a lot different to write a PI proposal for 100K a year versus operational proposals. So I retired; Matt didn't mention that I retired from my faculty position. And I've been thinking about how to make this kind of work available to people. A lot of people don't want to know what you're doing; they just want the results. I call that the Amazon experience. So because I'm semi-retired, I decided to teach myself how to make a web app. I watched a lot of YouTube videos. I had to teach myself Python and Flask and JavaScript, and I will not say more. But it does reflectometry on demand. You can see examples, you can analyze data from the archives, and you can upload your own data. It returns the results both in graphical form, with helpful links and photographs, and, if you just want the numbers, it gives you the numbers. But there are other people, particularly other scientists, people who might be on this phone call or Zoom call, who might be willing to run the code themselves. They might not be interested in all the details, but if you make it easy for them to use, I think they're more likely to use it. I just have one or two more slides, so if I'm getting a reminder to stop, I will. We have started an open-source Python project. It's currently on my website, and it's being supported by NASA. It will support, and already does support, snow and water levels; we're going to add soil moisture. So it's there, it's on PyPI, and UNAVCO is going to make Jupyter notebooks for the community. So I'm pretty excited about that. It's a different way of providing the data to the community; you can analyze all the data or not, but this is another way of reaching people. So I just want to say my hope is that, by helping to build this open-source software, I'm going to be able to engage the community in a different way. 
Not the way we were doing PBO H2O, but also engaging the community that needs the software to make this happen. And I will stop there. Great. Thank you so much, Kristine. We have a couple minutes for a question, and I think Jeff Freymueller has a question. Yeah, thanks, Kristine, great talk. I wondered if you might be able to say a few words about how you got from "hey, we can measure this thing with our GPS" to "here is a product that people want to have," right? I mean, your work is a great example of something that started out sort of curiosity-based and was turned into a product that was beneficial to other communities. How did you work with hydrologists and others to, you know, really focus the work and refine things so that you were producing the products they wanted? Well, it was a big team experience, but I'm going to particularly point to Eric Small. He's, quote, our hydrologist, and he knew what hydrologists wanted, and he made sure we provided it. I always remember him, well, no, I won't tell that story. Anyway, he's a great colleague. I would also, again, reference Dara Entekhabi and SMAP. They were looking for data, and they told us exactly the kind of data they needed. We weren't the only people providing data, but it meant we had very clear goals for how accurate we were going to make that data. We had a lot of support, I will admit, as well; we had a lot of support from NASA, because they wanted SMAP to be a success, and they wanted people like ourselves providing data. NSF was interested in providing soil moisture data in general. So I think if you, a geodesist or solid earth geophysicist, want to make your data products available to these new communities, you have to have someone on your team from that community. And Eric Small played that role, I would say, yeah. 
And for tide gauges, for example, you know, I found myself trying to estimate tidal coefficients in a hotel room in Sweden while I was jet-lagged once, you know, up in the middle of the night. And I was like, why am I doing this? Richard Ray is a world expert; I'll work with Richard Ray. So there, again, working with the expert in tides is to me a better use of my time: I'll work on the algorithms and let the experts do the analysis that is the product. Another example would be Matt Siegfried and David Shean. They used this technique on ice sheets, and, you know, they're cryosphere scientists, I'm not, and they're the ones who make sure that the results make sense. Long-winded answer; I hope that was okay. Thanks, actually very helpful. Thank you, Kristine. We do have one quick question here from Rosemary Knight that I'll ask you before we move on. The question was: how large was the gap in time and funding between having code that was great for your group to use and having bulletproof code? Should I turn my presentation off? I'm not even sure how to do it. But to repeat the question: how large was the gap in time and funding between having code that was great for your group to use and, as you called it, bulletproof code? I'm not sure I'm there yet. I thought I was doing pretty well, Rosemary, and then I talked to a new colleague a few years ago. I mean, I think I had code that a grad student could run at that time, and he told me, no, I needed to make it so the code could be run by an 18-year-old work-study student. And I was just like, wow, that's quite the bar there. But you know, I'm there for snow and water levels. I'm not there yet for soil moisture; we have put that last, because soil moisture is by far the hardest thing to do, and we want to make it bulletproof. So how long did it take me? Well, that's hard to say, because I had to teach myself Python. 
And also, you know, I think we've written this code three or four times. Two or three years? I've been doing it part-time, so I don't know that I have a good answer. But I am getting better at writing it; I will say that it's a little faster each time. Great, thanks so much, Kristine, and thanks to all of you who asked questions; we will come back to those at the end of Sarah's talk. So, why don't we move over and have Sarah get ready. I'm glad to introduce our next speaker, Sarah Kruse, who is a faculty member at the University of South Florida, where she has led the Near Surface Geophysics Research Group since 1996. She has an undergraduate degree from the University of Wisconsin-Madison and a PhD from MIT in tectonophysics, and she has also been a faculty member at Eckerd College and the College of William & Mary. So, we're very excited for Sarah to be our next speaker. Thank you. Hi, thanks, Matt. Can you see the slide? Yes, it looks good. Okay, so I'm Sarah Kruse from the University of South Florida, and I'm going to follow as Kristine did and talk about research that's active today and also how we can move forward most effectively with this in the future. And I'm going to give a little bit more of an overview talk; near-surface geophysics is a very broad field, and I'll just try to get some examples across to a general audience. Okay, so what do we mean by near-surface geophysics? The top few hundred meters of the earth. Near-surface geophysics differs from solid earth geophysics primarily in the sense that we actually have more tools available: instruments that can sense important parameters in the top hundred meters but that don't penetrate to greater depths. So I'm going to give some examples of science being done now with these near-surface geophysical methods, applied to cryosphere problems, groundwater hydrology, and volcanic hazards. 
I'll make a point about how this field can fill critical gaps in measurement scales, and also how we can apply it to diversity, equity, and inclusion issues. Alongside these examples, I'm going to make suggestions about what we could do if we had more resources directed towards near-surface geophysics, for example, a near-surface geophysics center. And the reason I bring this up is that the recent CORES Earth in Time report recommended the funding of a national near-surface geophysics center. So I would point out that we are all familiar with the value of national facilities like IRIS, UNAVCO, and CTEMPs, and in terms of cyberinfrastructure we have CUAHSI and OpenTopography. These provide incredibly useful tools for collaboration. Thirty years ago, if you wanted to do an experiment with an ocean bottom seismometer, you had to be affiliated with Woods Hole or some other oceanographic institution. Now anyone in the United States can write a proposal for an experiment using ocean bottom seismometers. We do not have something similar for the use of near-surface geophysical equipment and expertise in the United States right now. So I'm going to borrow more from this CORES report. They propose 12 important questions to be addressed in the next decade, and they point out that for seven of these questions, near-surface geophysics can make significant contributions. So I'm going to use this alongside my slides as a framework for illustrating the fundamental scientific importance of much of this work. I've shortened the questions up, but I'll point out for the different studies where they're relevant to these key critical questions. So what can near-surface geophysics do? We have a suite of instruments that we can use. Now, the solid earth geophysicists in the audience will be very familiar with seismics and gravity and magnetics. 
But there are a number of other methods, mostly but not all electrical, that penetrate near-surface depths and allow us to widely expand this toolbox to look at a range of geological processes. These methods include ground penetrating radar, electrical resistivity tomography, electromagnetic methods (which are our sort of near-surface version of MT), nuclear magnetic resonance, induced polarization, and self-potential. And I want to point out that the discipline of near-surface geophysics isn't just a box of different kinds of equipment on a shelf. It is: how do you use these data? What can we actually map in terms of earth materials from these physical measurements? What are the uncertainties in what we can say about earth materials and processes? How do we integrate these methods and invert the data for the parameters that we're interested in? So it's a huge field, with an enormous amount of intellectual capital directed at resolving these critical questions with these tools. So I'm going to start my talk with a couple of examples from the cryosphere, focused on the issue of permafrost. Permafrost melting causes infrastructure damage and all sorts of other changes associated with climate change. There are a number of fundamental questions that we can look at using near-surface geophysics in regards to the cryosphere. A couple of these: the first one, how does liquid water affect snow, glacier, and permafrost dynamics? And the fourth one, what is the distribution and thickness of ice-rich permafrost deposits? So the first study I'll show is an example addressing these questions with both ground penetrating radar and nuclear magnetic resonance. This study from Terry and co-authors integrates GPR and NMR data to look at permafrost thaw processes. The background image is a ground penetrating radar profile. 
GPR is great because it provides high resolution stratigraphy and detailed imaging of contacts in the subsurface. The problem is that the GPR response reflects the integrated matrix and water composition of the earth material. So if we want to fully interpret this in terms of cryosphere processes, we want to know where the liquid water is. And this is where the NMR comes in. The NMR, when it's set out, can provide a depth profile of the total water content in the different layers, which lets us essentially calibrate our GPR image to answer cryosphere problems. So integration is really important for the science. With these green boxes, I'm trying to show what maybe we could do better in the future. Ground penetrating radar systems cost between 50 and 100K and are relatively widely available because there are a lot of commercial applications. NMR is much more specialized and much more expensive equipment, starting above $300,000, and there are actually very few researchers in the United States that have access to NMR equipment. So this key component of the integrated study is really only accessible at present to a few researchers. The next example I want to show is a study that combines electrical resistivity tomography and self potential methods to address cryospheric issues. In this paper by Voytek et al., they're identifying hydrologic flow processes on arctic hillslopes. The figure on the bottom shows the results of earth resistivity profiles. The blue zones are wetter soil zones, but the beauty of combining the resistivity data with the self potential data is that self potential data is sensitive to groundwater flow while the resistivity is sensitive to the presence of groundwater. So by combining the two data sets, you can learn about both where the water is and where the water is moving. So what can we do for studies like this?
Well, self potential electrodes tend to be pretty flaky, especially for long term monitoring; we want to understand their responses better, and we would perhaps like to develop more robust electrodes. So this is an example of where equipment development could really help us out. I'm going to turn now to a couple of examples from groundwater hydrology. The examples that I'm going to show address two profoundly important questions. The first is saltwater intrusion in coastal aquifers, and if there are any coastal water managers in the audience, they recognize how critically important this issue is. The second is important to all of us who eat food, and it's related to subsidence from groundwater withdrawal for agricultural purposes and how we have to balance that against our food security needs. Both of these studies come from Rosemary Knight's Environmental Geophysics Group at Stanford. The first study uses electrical resistivity tomography; the second uses airborne electromagnetics. The first study is remarkable because it's a watershed scale study. This is a resistivity study collected along the coast of Monterey Bay in California. The water is out in front; we're looking at the land in the background. And this is a resistivity profile spanning almost 40 kilometers of the coastline to 300 meters depth. Reds show zones of saltwater; blues show zones of freshwater. And the important thing here is that the saltwater intrusion pattern is much more complex than what one might expect simply from the locations of river inputs. So we have a lot to learn about saltwater intrusion and management on the watershed scale. The next example is in the Central Valley of California, which is responsible for a massive amount of U.S. food production. This slide shows InSAR data collected between January 2015 and September 2019. Over this approximately four year interval, the zones shown in red subsided up to 1.3 meters.
Blue indicates little subsidence, red maximum subsidence. This remarkable subsidence is related to over-pumping and the decline in water levels. The subsidence itself has an impact on infrastructure, including the aqueducts that move the surface water. So to deal with this combined issue of subsidence and the need to grow food in this valley, Rosemary Knight's group collected airborne electromagnetic data along these white lines shown at right, in this area of maximum subsidence in the southern Central Valley. And the results are remarkable. On the aquifer scale, the airborne electromagnetics (and this is only a subset of the data) show us, at low resistivities, the presence of clays; higher resistivity zones represent sand and gravel. Now, these profiles actually represent the material below the ground surface, but they're plotted just above the ground surface. The images show interfingerings of clay layers. These are important because groundwater pumping produces compaction in these clay layers, which contributes to the subsidence. And most significantly, on this scale you can identify zones of natural recharge. You could identify zones that would be good sites for managed recharge. You could identify zones where pumping would cause minimal amounts of subsidence. So watershed scale mapping can answer critical questions about water management. So how could we do even better than these remarkable data sets? Well, one thing is that in terms of electrical resistivity tomography, many researchers acquire data in different ways. I think the community needs some data quality metrics, or some idea about how we can characterize the noise and uncertainty in these data sets. We also need open source software that would help us with experiment design. In terms of airborne electromagnetic data, we need an accessible repository for these large data sets so that other researchers can access and work with this data.
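As background on why those coastal resistivity sections separate saltwater from freshwater so cleanly: the standard link is Archie's empirical law, which ties bulk resistivity to pore-water resistivity and porosity. Here is a minimal sketch; the porosity, cementation exponent, and water resistivities are illustrative textbook values, not values from the Monterey Bay study.

```python
def archie_bulk_resistivity(rho_water, porosity, a=1.0, m=2.0):
    """Archie's law for a clay-free, fully saturated material:
    rho_bulk = a * rho_water * porosity**(-m).
    a (tortuosity factor) and m (cementation exponent) are
    illustrative defaults here."""
    return a * rho_water * porosity ** (-m)

# Seawater (~0.2 ohm-m) vs fresh groundwater (~20 ohm-m) in the
# same 30%-porosity sand: two orders of magnitude in bulk resistivity,
# which is why saline zones plot "red" and fresh zones "blue".
print(archie_bulk_resistivity(0.2, 0.3))   # ~2.2 ohm-m, saline
print(archie_bulk_resistivity(20.0, 0.3))  # ~222 ohm-m, fresh
```

The same conductivity contrast is what makes contaminant plumes visible to these methods, as comes up later in the Q&A.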
I'm going to move on to an example of near surface geophysics being applied to volcanic hazards. The hazards I'm going to focus on are those associated with eruption explosivity and ash plumes. The photo in the upper right shows an Icelandic volcano erupting. This is not the eruption this week, but one that you may recall from 2010, and I won't even dare to try to pronounce the volcano's name. This eruption produced a massive ash plume, and you may recall that it closed down airports in Europe for many days in 2010. The map at the bottom shows the ash plume as this red cloud, and the blue dots show the locations of airports where flights were canceled at the moment of this screen snapshot from the internet. So here are some science questions we can try to address with near surface geophysics in regard to volcanic hazards. I'll talk about the top one: do eruption and transport models explain the distribution of ash deposits from violent eruptions? And, for example, the bottom one: could expansion of the monitoring toolkit we use improve our ability to do eruption forecasting? The study I'll show is one we did with ground penetrating radar, on a small basaltic cinder cone in Nicaragua called Cerro Negro. Cerro Negro is important because it's just northeast of the city of León, which is the second largest city in Nicaragua, and the prevailing winds carry ash from eruptions of Cerro Negro over the city of León. Cerro Negro has erupted some 23 times since 1850. So what we would really like to understand, in terms of its eruptive history, in order to make hazard forecasts and design plans for this area, is whether eruptions are more likely to produce massive plumes like this photo from the 1968 eruption, or to distinguish those from eruptions that produce smaller amounts of material that may just flow or roll down the flanks of the cinder cone itself and not affect León.
The only way we can do this is by using ground penetrating radar on the flanks of the volcano, and I realize this figure may not project well through Zoom, but in the upper right of these figures we illustrate schematically what the radar data show. On parts of the cone, we can see clearly that layers truncate against older deposits out on the plain surrounding the cinder cone. There's only material that flowed or avalanched down the side of the slope and then stopped at the toe. On other parts of the cone we imaged strata that continue below the surface well out onto the flanks and out onto the plain of the volcano. This material would have been deposited from high ash plumes stretching out to the horizon. So we can distinguish material deposited from high ash plumes from material from lower energy parts of eruptions or events. Oh, this work is from Courtland et al., distinguishing deposits of low energy and high energy scoria eruptions. Okay, so how could we do better? Well, GPR profiles give us the thickness of these volcanic deposits, but the hazard assessors who are modeling the ash plumes are actually interested in a different quantity. They want the kilograms per meter squared of ash that has landed on the surface, because that is what their models account for and forecast. So to convert a layer thickness into the mass of ash contained in that layer, we need to know the porosity of the layer. And GPR is, unfortunately or fortunately depending on the science you want to do, highly dependent on the water content of the material. In order to try to get at the porosity of the layer we need to correct for the water content. To correct for the water content we need other methods, maybe NMR or seismics, and we could do this better if we had access to state of the art multi-offset GPR systems.
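The thickness-to-mass conversion described here is simple once porosity is known; a hedged sketch follows, where the porosity and grain density values are illustrative, not from the Cerro Negro study, and where, as the talk notes, the porosity estimate itself would first require correcting the GPR response for water content with other methods.

```python
def ash_mass_loading(thickness_m, porosity, grain_density=2600.0):
    """Mass per unit area (kg/m^2) of a dry tephra layer:
    h * (1 - phi) * rho_grain.
    grain_density default is an illustrative basaltic value."""
    return thickness_m * (1.0 - porosity) * grain_density

# A 10 cm layer with 45% porosity:
print(ash_mass_loading(0.10, 0.45))  # 143.0 kg/m^2
```

This is the quantity, mass per unit area, that the ash-dispersal modelers forecast directly, which is why the porosity gap matters so much.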
So this is an example of a problem where we could go from the geophysics more directly to the physical parameter of interest to the hazard modelers, if we could integrate methods and if we had broader access to equipment. Okay, the next point I want to make is that near surface geophysics can fill critical gaps in measurement scales. This figure comes from Burke Minsley at the USGS. He's plotted on the vertical axis the depth of investigation, and on the horizontal axis the spatial coverage of the methods. This afternoon we'll hear about GRACE results; at the right, GRACE can give us information about properties deep in the earth, but only on very large length scales. The gold standard of understanding the earth perhaps comes from borehole data, direct measurements, and borehole data can give us information at depth, but only at the spot where the borehole was drilled. There are enormous gaps in these scales of observation that can be filled with both ground and airborne geophysics. An example of using near surface geophysics to bridge some of these scale gaps is a study where remote sensing data were validated by near surface geophysical data collected on the ground. This is a study by Schaefer et al., where they verified measurements of permafrost thickness determined from remote sensing data by collecting data on the ground within the satellite footprint. Another example of scale gaps that can be filled by near surface geophysics can be seen in this plot from Stephen Moysey. On the horizontal axis, we again have the spatial scale of the process or the measurement, but on the vertical axis we're now looking at the temporal scale of the process. We can image processes over a range of temporal scales, from fractions of seconds to multiple years, with logging sensors in the ground, but these are limited to the single point of installation.
If we want to get information about all of these processes that fall in this larger space, with larger scales of measurement and longer temporal periods, we either need networks of sensors or geophysical monitoring. Another example of this is a 2D time lapse resistivity survey that was collected after the BP oil spill on a remote coast in Louisiana. These are resistivity profiles collected perpendicular to the shoreline. Oh, and this is published by Heenan et al. in 2014. The top profile shows the resistivity data collected seven days after the BP oil spill. This is the resistivity to about two meters depth along a 25 meter profile. The bottom profile shows what the resistivity of the terrain looked like 152 days after the oil spill. So what we are looking at is the natural attenuation of the oil in the sediments after the spill. Really useful studies like this, though, are hard at present to translate into studies that require 3D coverage in complex terrain, sometimes to test our conceptual models of what's happening, for example in a critical zone study here. We need data on different scales and in three dimensions. Airborne EM is quite expensive. Resistivity systems require cabling that can make them very impractical to deploy in complex environments. So here's a potential example of instrumentation of the future. This example comes from a white paper submitted by Slater and Zhang to the NSF in a call for public comments last summer. This example shows what we could do on a remote hillside where it would be difficult to install cabled sensors. You can put in electrodes that have clocks calibrated with GPS, drive current in selected electrodes with a specific waveform of current injection, and record the potentials, the voltages measured at other electrodes that you've installed in the ground.
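Whatever the hardware, each four-electrode measurement (current pair A, B; potential pair M, N) still reduces to an apparent resistivity through the standard half-space geometric factor. A minimal sketch, with illustrative electrode positions:

```python
import numpy as np

def apparent_resistivity(A, B, M, N, I, dV):
    """Apparent resistivity (ohm-m) for current electrodes A, B and
    potential electrodes M, N on the surface of a half-space.
    Positions are (x, y) in meters; I in amps, dV in volts.
    k = 2*pi / (1/AM - 1/BM - 1/AN + 1/BN) is the geometric factor."""
    d = lambda p, q: np.hypot(*(np.asarray(p, float) - np.asarray(q, float)))
    k = 2 * np.pi / (1 / d(A, M) - 1 / d(B, M) - 1 / d(A, N) + 1 / d(B, N))
    return k * dV / I

# Wenner array (A-M-N-B, spacing a) over a homogeneous 100 ohm-m half-space:
# the theoretical voltage is dV = I * rho / (2*pi*a), so we should get rho back.
a, rho, I = 5.0, 100.0, 0.5
dV = I * rho / (2 * np.pi * a)
print(apparent_resistivity((0, 0), (3 * a, 0), (a, 0), (2 * a, 0), I, dV))  # ~100.0
```

In a GPS-synchronized nodal deployment like the one described, the same reduction would simply be applied to every recorded source-receiver pair after the fact.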
So if this looks like a seismic nodal array, it's because it would be like an active seismic nodal experiment where some of the nodes can function as sources. This would be an absolute game changer for near surface geophysics and all these studies that benefit from imaging the resistivity of the subsurface. So, finishing in one minute. Okay, we'll jump forward: we need community input into instrument development. So, one last point. The CORES report recommends that we should improve diversity, equity and inclusion in the geosciences. Most students have their first encounter with hands-on geophysics through near surface geophysics, and studies show that this first encounter can be transformative. You can collect data of your own that has meaning to you, on a relevant project. It encourages you to stay in science. This is a picture of a student from Dickinson College collecting ground penetrating radar data over unmarked graves in an African American cemetery. But most institutions lack access to even basic geophysical instrumentation, especially institutions without large endowments. We need broader access, we need training for instructors and students, and we need teaching materials. I've just tried to show examples of applications of near surface geophysics to a number of different fields. I want to stress that these are a small set; I could have spoken for hours and hours, because we can address a whole range of problems with a whole range of instrumentation and the methods to use it. Near surface science would be accelerated by investments in broader access to both basic and high-end equipment, and in training. I think transformative research will involve integrated methods and multi-scale data. We'll need new methods and equipment development; we'll need new techniques, machine learning and inversion, to deal with very large data sets.
We need open source software, we need data quality metrics, and we need data repositories to do this important transformative work. I want to thank all the people that contributed to this presentation. I won't say their names out loud, but it was really a community effort; many of the ideas presented here came from these people. So thank you for listening. Thank you so much Sarah, that was a great overview with so many great examples. I don't see a question for you at the moment, unless someone can point me to one, so I'll go to the Q&A, where we'll start with some questions. Maybe this one is for you, Sarah, I don't know. It's a question that asks: is remote sensing technically possible to measure parts per billion or low levels of groundwater and drinking water pollution from heavy metals and PFAS? Is there a technique that we have that can do this? Parts per billion of what? Parts per billion is for ppb and ppt level groundwater and drinking water pollution. Okay, it depends on the kind of pollution. If the contaminants are higher conductivity or lower conductivity than the background groundwater, we do extremely well at sensing these kinds of contaminant plumes, because many of these near surface techniques are basically measuring the conductivity of the ground. So the answer is some yes, some no. Yes, and I see Rosemary Knight has put in the question and answer, she says: possible yes with NMR if the contaminant is adsorbed to the solid. Yes. Thanks Rosemary. Thank you Rosemary. All right, so we have about 15 minutes left in this session, and I'm going to use this as a chance to open it up to any general questions that anyone has for either of our speakers or for the topics that they have raised. We have a couple of questions here in the chat that I think will get us off to a good start. One is from Matt Vouch.
The question is: research to operations seems to be a critical gap across a number of areas, not just NSF funded work. I know that NASA's disasters program focuses heavily on this approach. Are there parallel efforts within NSF that could leverage this approach, not only with GNSS-IR data, but also other types of geophysical data in a more unified way? Do either of you want to take a chance at that? That sounds like a good question. I guess I feel like certainly NSF could maybe do more to help people like myself get things into operational use, but I would say NASA has programs for doing that already, as you noted. But I will say that as a GPS person, NASA can't quite decide if they like GPS. I'll just say that, because we're not a NASA satellite. So it's a problem. If it were a NASA satellite product, maybe it would be looked on differently by NASA, for example. Is that okay to say? I guess the other agency, NOAA, is not one of your sponsors. Is that right, Matt? No, NOAA is not, but they might be on the call. So, you know, NOAA is another operational agency. What are they doing to take on these new discoveries in solid earth geophysics? Because they're an important part of our community for hydrogeology. They need soil moisture data, they need snow data. So I think it's a problem, and I don't think NSF is really the problem. I think maybe they could enable more discussion with these operational agencies, perhaps. So, a couple of questions here for you, Christine. One is from Dinesh. He asks: I'm working with soil moisture data with GNSS-IR and I'm having difficulty finding in situ data for data validation. Where could we find in situ data? Well, I guess my question would be, did you put in your own sites? Are you trying to do this without having your own data? So we put in our own sensors.
Who would believe us if we didn't go out and make our own measurements? And that's why working with hydrologists was so important. Similarly, for snow depth and SWE, we had to do our own validation. But if you're looking for other soil moisture data, there is a US Climate Reference Network, and possibly you could put a GPS site near there if you wanted to have a reliable source of data. Does that help? Christine, thank you. All right, here's another one for you. This is from Yangdu. Could you talk about the limits of the GPS retrievals for soil moisture, vegetation water content, and SWE, and the prospects for the next generation overcoming those limitations? The limitations for retrievals of soil moisture are the same ones we started with. It's very shallow, surface soil moisture; either you're not interested in that or you are. It was perfect for SMAP, but I can't tell you what's going on a meter deep, right? So it's a very shallow surface soil moisture measurement. For vegetation water content, the limitations: it could use some more terrain modeling to perhaps get the units right; right now it's a normalized quantity. And for snow depth, it's really the same problem as for soil moisture: I need planar surfaces. So if you plan to put your GPS site in a forest, surrounded by tall trees, it isn't going to work. You need to be sensitive to that. I mean, we set up validation sites, and people wanted to put the GPS in the middle of a field, and I said no, no, put it in the north so I can get some nice southern planar surfaces. So know when your technique will work and won't work. That's why I love the ice sheets, because they're just empty. There are no people. Occasionally there'll be a tent, but the ice sheets are a very good place to use this technique because there are no roads, no cars, no parking lots, and things like that. So those are kind of the limitations.
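For anyone curious where the planar-surface requirement comes from: GNSS-IR works because ground multipath makes the recorded SNR oscillate versus the sine of the satellite elevation angle, with frequency 2H/lambda, where H is the antenna height above the reflecting surface. A purely synthetic sketch of that retrieval (all values illustrative, not from any real site):

```python
import numpy as np

# Multipath from a planar surface at reflector height H makes the SNR
# oscillate as sin(4*pi*H*sin(e)/lam), i.e. with spectral frequency
# 2H/lam in the sin(elevation) domain.
lam = 0.244            # GPS L2 carrier wavelength, m
H_true = 1.5           # illustrative reflector height, m

# Synthetic SNR arc sampled evenly in sin(elevation), 5-25 degrees.
x = np.linspace(np.sin(np.deg2rad(5)), np.sin(np.deg2rad(25)), 4096)
rng = np.random.default_rng(0)
snr = np.sin(4 * np.pi * H_true * x / lam) + 0.3 * rng.standard_normal(x.size)

# Dominant spectral frequency f (cycles per unit sin(e)) gives H = f*lam/2.
dx = x[1] - x[0]
npad = 16 * x.size                      # zero-pad for finer frequency sampling
spec = np.abs(np.fft.rfft(snr - snr.mean(), n=npad))
freqs = np.fft.rfftfreq(npad, d=dx)
H_est = freqs[np.argmax(spec)] * lam / 2
print(f"estimated reflector height: {H_est:.2f} m")
```

If the surface isn't planar, like a forest floor broken up by trees, there is no single coherent oscillation to recover, which is exactly the limitation described above.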
And near surface geophysics can expand your spatial coverage and tell you something about the moisture a meter down. I would also say, in 20 minutes I couldn't begin to talk about everything; the people that weigh the water are doing some very exciting things, where they use GPS and InSAR to essentially weigh the water. Look at Hurricane Harvey; look at loading from snow and from soil moisture. When you talk about deeper sources of water, you can use other kinds of geodetic data, and that would be another talk right there. And I think those techniques have a lot of promise because they're not limited like mine, where I have to have planar surfaces. They're weighing the effect of the load of that water on the crust. And so I think they give a bigger view of what's going on in a watershed, which might be more useful to hydrologists. There's a tension: I think some people don't care about in situ data; they want the big picture, they want the satellite to tell them what's going on. You need both kinds of data. And that's what Sarah is saying: you need different temporal and spatial scales, and one instrument is not going to do it all for you, most likely. Great. Thank you, Christine. All right, I think this question is for both of you, but maybe Sarah can take the first shot. It's from Justin Sweet, and he has a question that says: I'm interested in the implications of facility support raised by both speakers. Christine mentioned issues with facilities needing to accept data outside their domain and comfort zone. Sarah mentioned the lack of a national facility for near surface geophysics. I would love to hear thoughts from both speakers on what they would like to see in terms of geophysical facility support for PI science in the future, and how they see that support impacting the science PIs are able to do.
Well, I'd like IRIS not to add instrument responses to GPS positions when they archive my data, if there are any seismologists on the call. That's what I kind of meant by out of the comfort zone. To a seismologist or to a seismology archive, sure, let's add an instrument response. But to those of us who are geodesists, a position is a position; there ain't no instrument response. So, when you're talking to people in a different field, you can easily make these kinds of mistakes. And when I was working with hydrologists, you know, I banned local time. I told them I never ever want to hear about local time again; there's GPS time, and we will all live in GPS/UTC time, and they look at me funny. And, you know, to the person who's trying to find in situ data: I feel your pain. I spent years trying to find environmental data sets for validation, and it can be difficult when you don't know how the game is played at these facilities. Geodesists know how to get data out of geodesy archives, but it can be very, very difficult for non-geodesists, and the same is true when I try to get data out of hydrology archives or seismology archives. So, your question, though, is more how do we help fix this. I don't know. I mean, I have found, by making this open source package, I'm trying to make it easier for people: you don't have to be a geodesy expert to use my technique. I think that one of the reasons people didn't apply this technique is because there was just too much for them to have to learn. They had to learn about orbits and what RINEX files were, things they shouldn't have had to do just to measure soil moisture. And so if you can write code that lowers the barrier, I think that's one thing we can do. The facilities, I don't know. Sorry to interrupt, Christine. I mean, that's something a facility could help with, right? A facility needs to be much more than, you know, a warehouse for equipment.
We need to expand the user base so people can do these integrated studies, and that needs to include training and, you know, open source methodology software, just like you're developing. It's not supported. I mean, the technique I developed is not supported by any facility; I'm supporting it. Well, I think Justin's question was aspirational. So this is our aspirational answer, right? This is what would help do transformative science; this is the kind of thing we need going forward. And I don't think there's any one single answer, right? These national facilities are organized in different ways. And I think the fact that, for near surface at least, this process is being supported by the CORES report means this is an ideal time for the community to gather and think about, okay, what would serve this multi-method, integrated-systems approach that we do best? It probably isn't going to be the exact model of any of the existing facilities, but it's a really exciting time to think about what would work best. I do want to say one nice thing. When I would have these discussions with hydrologists, and we were writing this code, they noticed that I deleted the input files as soon as I used them, these being the GPS RINEX files. And the hydrologist was like, why aren't you archiving them? And I'm like, why should I? That's, you know, not my job. And he was perplexed. So I think UNAVCO and IRIS do an extremely good job of making that data available. You don't have to talk to them. It's an FTP command.
I'm thinking about what's outside their comfort zone. It's like when you go to IRIS but you're giving them something funny, right? And similarly when you go to CUAHSI, but they're not used to what you're doing. So I agree that maybe these facilities need to be more than just data and equipment archives, I agree. I would also argue that their structure, which is, you know, bottom-up community governance, gives you that opening to go to them and say, look, there's a gap here in what you do. And that is, I think, one of the good things about these NSF funded facilities. You're absolutely right, they are run by their own communities. It's just, my only concern is you're outnumbered. They do a really good job of supporting their main customers, and, like I said, they do a good job; it's just that they're doing a lot with very little money. Well, and Christine, this is one of the fundamental problems of interdisciplinary, transdisciplinary science, right? There's not a single box we fit into, and to even do this we have to get people talking to each other. Right. Absolutely. Absolutely. Good. About three minutes left, so we're going to the lightning round. But these are some very rich questions here. So Sarah, since we're talking about the facilities, I'll ask a couple of questions here related to facilities. One question says: for building a near surface geophysics center in the US, it seems very challenging to handle various types of methods in a geophysics center. How would you envision this challenge? And then another one from Rosemary was: in addition to getting the near surface equipment and software out there through a central facility, how do you ensure that the equipment and software are correctly used? So maybe, Sarah, you can talk a little bit about your vision, or thoughts about answers to these types of questions of how a facility might operate.
I don't think the template is out there yet. I think this is something that we need to talk about, because this is a challenge beyond, for example, what IRIS faces; they have seismographs, and they have MT systems and ocean bottom seismometers, right? So it's a much more limited equipment pool. I guess I feel like, as one person, I shouldn't say this is what I would do. In fact, I'm not sure what I would do; I would want to talk to lots of people. But I do think that we need to address the integration of multiple methods, as hard as it may be, because that is what's going to move the science forward. Did I weasel out of that? A great lightning round answer to that. I did see a question here about using cheaper GPS sensors, and I'll just say, one way to grow this so more people could use GPS or GNSS reflections is to basically use your cell phone, or that type of chip; you know, it's a $5 chip. And that person is absolutely correct; that might be something we can do to make it available to more hydrologists: use cheaper sensors. And I don't know if you have any thoughts on this question from Jorge, which is, how do we make this equipment available not only to students but also to entrepreneurs, in a way that they could make use of without facing obstacles? Is there a way to involve entrepreneurs in any of this work? Well, I feel like I've seen that IRIS interacts with seismograph manufacturers in ways that are productive for both the research community and the commercial manufacturers. I think this is one thing that's missing from the near surface community: we don't have an organizational structure that would say, look, there are 50 of us; if this piece of equipment got developed, we would all use it, even if it was too expensive for individuals, because we would borrow it from some shared facility, so it would be worth your while to develop it.
As for entrepreneurs using equipment from a national NSF facility, I really don't know what NSF's philosophy would be on something like that. There are these SBIR programs for small businesses to get funding to advance technology, so that might be a route for something like that. I mean, a colleague of mine did have one of those; not an SBIR, but an NSF sponsored entrepreneurship program whose name I don't recall; the previous director of NSF was big on that. So maybe. I'm going to ask two last questions here that I hope will be relatively short. One is from Miguel Valencia, and it's a question of how do you track groundwater plumes using ERT or SP, and can you use any of these methods to monitor deep injection wells at 2,000 feet? You'd want to be using magnetotelluric methods, and the answer is yes, people do this; it's a big industry, and in geothermal exploration they do a lot of monitoring with electromagnetic methods. Yes. Great. All right, next question, from Stephen Hernandez for Christine: I was wondering if Dr. Larson has observed any coseismic changes in snowpack or soil moisture with her technique; I'm thinking something like low-grade liquefaction or an avalanche triggered by strong shaking. That's an easy one: no, I haven't. Great. All right. So, I see there's quite a comment here from Ross Henderson, and we're going to go a little bit over just to answer this, as it's a good one. NSF currently has a decadal call for proposals called Signals in the Soil, and NSF has a grand challenge using distributed acoustic sensing, DAS. So what are the prospects for increasing the sensitivity of soil signals to nanoscales for functional genomics of the soil microbiome, using helically wound dark fiber? Folks at the University of Colorado have done some work on this sort of DAS. Any comments on DAS or these other questions?
I don't know enough about DAS to answer that question. Great. All right. Thank you so much, everybody; we are over time. We are now in a break period, and we'll be back in 12 minutes, at 1:30 Eastern time. So enjoy your break. Good afternoon, everyone. Welcome to panel two of this afternoon's meeting. My name is Jessica Warren; I'm an associate professor at the University of Delaware and also a member of the Committee on Solid Earth Geophysics, and I will be moderating today's session. This session is going to focus on the latest developments in environmental geophysics, with five exciting short talks from Danica Roth, Claire Masteller, Marine Denolle, [name unclear], and Wenyuan Fan. Each speaker will present an eight-minute talk with another two minutes of question and answer, before group discussion at the end. Moving to the talks: Danica has agreed to give a brief introduction on environmental seismology at the start of this session, and Wenyuan, who will give the final talk, is going to take a few minutes at the end to synthesize all of the talks before the group discussion. To begin, we will have Danica Roth speaking. Danica is an assistant professor at the Colorado School of Mines. Her primary research interests center on understanding the coupling of surface processes with external variables such as climate, biology and anthropogenic influences, in order to better relate process mechanics to landscape form and evolution across scales. And with that, I will ask Danica to take it away. All right, can everyone see my screen? Yes, and hear you. Great. All right, great. Okay. So I wanted to start this talk by reading part of the John Muir quote that inspired its title, because I think it does a really great job of describing why seismology is such an intuitive tool for studying rivers and other surface processes.
So he describes the sounds of a river over various topographies and says that anyone who has learned the language of running water will see its character in the dark. And if you've ever stood by a waterfall, you probably know what he means here: we can feel and hear the elastic waves that are generated by rivers, and by many other processes that act over a wide range of spatial and temporal scales on the Earth's surface. A lot of these are challenging to study with traditional monitoring techniques, because they're either happening in places that are hard to observe, or because they're stochastic and can occur very suddenly in regions that just aren't instrumented. So as we learn to understand this language of seismic waves, we're finding that environmental seismology offers us some unique advantages and opportunities. It gives us continuous and high resolution records for process monitoring and event detection. It's useful for characterizing properties of the materials that the waves travel through. And seismic signals are also a direct link to energy transfer to the ground, which means they offer potential insights into fundamental energy budgets, as well as coupling and cascading of processes. So here are just three examples from rivers. In one recent study, seismic data was used to identify a glacial lake outburst flood and to infer that, by mobilizing channel-stabilizing boulders, these floods may actually be the primary driver of long-term fluvial erosion in the Himalaya, rather than precipitation. Seismic records have also identified seasonal transitions in sediment supply regimes driven by typhoon-induced landsliding in Taiwan. And a couple of studies have pointed to low-frequency ground tilts as measuring precipitation loading, or possibly fluctuating pressure on stream banks caused by turbulent eddies. So these studies highlight some of the big-picture insights that we're gaining from fluvial seismology.
But for the rest of my talk, I'm going to be focusing on the more complex and challenging question of measuring and predicting sediment transport rates. This is basically one of the holy grails of fluvial geomorphology: sediment transport underpins all questions of erosion, deposition and landscape evolution. It has direct impacts on ecology, hazards, water quality and a range of other fields, and our models are pretty bad at predicting it, because sediment transport is really nonlinear and stochastic. Our models are calibrated with empirical data from laboratory flumes or real rivers, but because of the nonlinearity, we really can't just extrapolate to predict what's happening at high flows like this, when the majority of sediment transport actually happens, and when physical sampling is really dangerous and logistically challenging. So seismology really gives us an alternative approach for capturing continuous, high-time-resolution data using instrumentation that's relatively cheap, easy to install and external to the channel. I want to start by exploring kind of a best-case scenario at one of the most extensively instrumented streams in the world, where we actually have independent measurements of discharge, precipitation rates and sediment transport, which is measured by this line of impact sensors installed under steel plates in the stream bed, just out of sight in this photo. So we wanted to use this data to identify the signatures of these three distinct processes in the seismic spectra we recorded outside the channel. And what we learned from the study was that, if we have enough independent constraints, as we did here, process rates are actually recoverable from the spectral data. So this uses least-squares regression of the power spectral density at each frequency to identify the spectral contributions from each process: the power per unit precipitation, discharge or sediment transport.
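As an illustration only (this is my sketch with purely synthetic data and made-up coefficients, not the study's actual processing), the frequency-by-frequency least-squares decomposition and its inversion for transport could look like this:

```python
import numpy as np

# Synthetic example: decompose seismic power at one frequency into
# contributions from precipitation (P), discharge (Q) and sediment
# transport (Qs). All numbers below are invented for illustration.
rng = np.random.default_rng(0)
n = 500                               # hourly samples
P  = rng.gamma(2.0, 1.0, n)           # precipitation rate
Q  = rng.gamma(3.0, 2.0, n)           # water discharge
Qs = rng.gamma(1.5, 0.5, n)           # sediment transport rate

# Build a synthetic PSD from known "power per unit forcing" coefficients
true_coeffs = np.array([0.8, 2.1, 5.3])
psd = np.column_stack([P, Q, Qs]) @ true_coeffs + rng.normal(0, 0.1, n)

# Least-squares fit at this frequency: psd ~ a*P + b*Q + c*Qs
A = np.column_stack([P, Q, Qs])
coeffs, *_ = np.linalg.lstsq(A, psd, rcond=None)

# Invert: with P, Q and the fitted coefficients known, estimate Qs
a, b, c = coeffs
Qs_est = (psd - a * P - b * Q) / c
```

In practice this fit is repeated at every frequency, and only the frequencies where each process contributes significant power are used in the inversion.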
And then we can invert that regression with measured discharge, precipitation and seismic power to actually estimate the transport reasonably well. In cases where we don't have independent measurements of sediment transport to calibrate seismic data, which is most cases, we can still gain qualitative insights. These are photos and hydrographs from a study where we looked at the impacts of two floods following a dam removal in Taiwan, where a large amount of material trapped behind the dam was transported in the first flood, and essentially nothing was left to move in a later, very similar flood. If we plot the seismic amplitude against the water depth, we find that the flood with high transport rates showed a large amount of hysteresis, clockwise in this case, in the relationship between seismic amplitude and flow depth. This is actually a really common observation in sediment transport studies, where we often see higher transport on the rising limb of a flood than at equivalent flow strength on the falling limb. And a number of studies like ours have interpreted similar hysteresis in the seismic data as a result of sediment transport adding to the signal on the rising limb of the flood. So in this study, we defined a hysteresis metric from the normalized area in this curve at each of seven stations along the river, and we found that it actually tracked the downstream advection and dispersion of the pulse of sediment that was released from the dam. So you can see the pulse moving downstream here and spreading out between the first and second flood. So it looks like hysteresis scales with sediment transport rates, which is a conclusion that several studies, including ours, have drawn. It turns out we were wrong, though. Going back to our very well instrumented site in Switzerland: this is the hysteresis in seismic data compared to hysteresis in actual sediment transport data for each of these five flood events, and they don't match.
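As a toy illustration of a loop-area hysteresis metric (my own sketch; the study's actual metric may differ in normalization and sign conventions), the signed area of the amplitude-depth loop can be computed with the shoelace formula:

```python
import numpy as np

def hysteresis_area(depth, amplitude):
    """Signed, normalized area of the loop traced by (depth, amplitude)
    over one flood. Positive here means a clockwise loop, i.e. higher
    seismic amplitude on the rising limb than on the falling limb."""
    x = (depth - depth.min()) / np.ptp(depth)
    y = (amplitude - amplitude.min()) / np.ptp(amplitude)
    # Shoelace formula over the closed loop; the raw sum is positive for
    # counterclockwise loops, so negate it to make clockwise positive.
    return -0.5 * np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y)

# Synthetic flood: depth rises then falls; when amplitude leads depth,
# the same depth sees higher amplitude on the rising limb (clockwise).
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
depth = 1.0 - np.cos(t)
amp_leading = 1.0 - np.cos(t + 0.5)   # clockwise loop, positive area
amp_in_phase = 1.0 - np.cos(t)        # retraces itself, zero area
```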
These little loops here are color-coded to show either clockwise or counterclockwise hysteresis, and the gray regions are showing hysteresis going in opposite directions in four out of five events, meaning the seismic signal is higher when sediment transport is lower, and vice versa. So what's causing that hysteresis? Water turbulence. Seismic power is very sensitive to both bed roughness and flow velocity, which means it's fundamentally influenced by the movement, or just rearrangement, of sediment on the bed, which can happen without producing a net flux. So without additional data we can't distinguish changes in seismic power due to changes in sediment flux from changes due to rearrangements of the turbulent boundary layer, which poses a major challenge for trying to interpret this kind of hysteresis quantitatively. But it could also be an opportunity to gain new information about turbulent flow and bed dynamics during floods, which is otherwise really challenging to study. Work on this front is being advanced by theoretical models that are closing the gap between our seismic observations, flow characteristics and sediment transport theory. Inversion of observed spectra to recover sediment transport rates requires a lot of site-specific parameters and model validation, so these models still remain largely unvalidated in real settings, because independent data is just really hard to obtain. We really need more controlled studies with well-constrained parameters and variables before these approaches will become broadly applicable to real rivers. And so on that note, I wanted to end on some very new work that my colleagues and I have just started using a distributed acoustic sensing, or DAS, system, which I know someone was just asking about. For anyone who's unfamiliar with DAS, it's essentially measuring strain at discrete points, or channels, along a fiber optic cable, which you can just see going into the water here.
So we zigzagged it up the creek and then back down the bank here, and we sampled at 20 kilohertz for 15 minutes. The channels are set at two-meter increments, so we go around tugging the cable to identify the channels associated with key physical locations, which lets us align the spatial spectrogram with the landscape and pinpoint hydraulic features like these rapids, which you can see in the spectrogram here. We're seeing some interesting spectral evolution, or gliding, as the channel geometry changes along the stream here, which is really similar to what a few previous studies have observed in river spectra that are evolving in time, for example this spectrogram from our dam removal study. But this DAS data is just the isolated signal of water turbulence: there's no sediment transport happening in the channel, and the channel geometry and bed surface here are really well known. We hope that this kind of short-time, high-spatial-resolution data can help us examine the connections between the spectrum and fluid-ground coupling, to explain some of the spectral shifts that are observed in more dynamically evolving rivers with fewer independent constraints. And I guess I will just leave my last slide up here to leave time for questions, but to quickly summarize: there are a lot of opportunities we're advancing on, but there are also a lot of important challenges and needs that future studies can help us address. So I'll leave the rest of my time for questions. Great, thank you Danica. Do we have any questions from the panel? And participants should also add questions in the Q&A. I'm going to lead off with a question asking you to talk a little bit more about the challenges: you know, whether you are always going to have to do some ground truthing, or whether you think this will get to a point where you don't have to instrument every river to be able to make a prediction. I'm very hopeful that that will be the case.
I think right now we just don't have enough data to really explore the whole parameter space. Rivers are really heterogeneous; I mean, just among different kinds of rivers, there are some that are on bedrock and there are some where the bed is made out of boulders, so attenuation is a huge issue. And there are also different kinds of transport that happen in rivers, and different stream geometries. So I think as we get more data that's paired with well-constrained environmental variables, and it's really easy to measure something like stream discharge, then hopefully, once we can constrain that and the site geometry and things like that, it'll give us a better tool to use these theoretical models, maybe to actually invert measured seismic data to estimate sediment transport rates. Great, thank you. And then we have one question from Rosemary Knight in the questions list. Her question is: as you look at the link between your seismic data and sediment load, is there an observable change as you transition from a dilute concentration of particles to a slurry, as that changes the moduli? I think that's a great question. Unfortunately, we don't have data on when and where the actual sediment concentration is changing, so that would be a great subject for a controlled seismic study. One of the papers that I cited in my examples, Chao et al. 2015, I think they were looking at changes in suspended load versus bed load seasonally; I don't think they looked at slurries, but it's similar. Great. And then we have one last question, which is going to come from Mark Bain on the panel. Hey Danica, great talk. So, not knowing anything about how hard it is to measure sediment transport, I'm just curious in terms of practicality.
Like, if you have to go out and put out seismometers, what's the benefit relative to measuring sediment transport directly? And how close do you have to put a station to a river system, or can we use existing stations to make a lot of these measurements already, without having to add additional ones? Yeah, so I can share my screen, because I actually have a supplementary slide. Whoops. Are you seeing my... It's just white, unfortunately. Okay, are you seeing that now? Yeah. Okay, so these are some of the ways that we can calibrate our data by measuring sediment transport directly. A lot of the sediment transport data that goes into theory comes from lab flumes. If you're trying to measure it in the channel, you basically go out with ladders and a basket on a stick. In-stream, there are sediment traps, and there are basket samplers attached to larger instrumentation. You cannot do this in a flooding river, or you very much might drown, and a lot of instrumentation gets washed down rivers or gets damaged in large floods, especially if bed load is moving. So it's a pretty major challenge. The place where we measured it is in Switzerland, so it's backed by Swiss funding, and this is automated basket samplers that move into the flow, plate geophones, a whole setup. And there are a few places where these are installed in the US, but it is a major undertaking. With a seismometer, I mean, we've rented instruments from PASSCAL, and ours were installed very close to the channel, but people have also done studies where they're hundreds of meters away from the channel, and I think it depends on what you're trying to get at. The attenuation question is still an open one, so on distance from the channel I don't have a clear answer. But the ease of use and convenience is, I would say, orders of magnitude better. Great, thanks. Great, thank you Danica for a great talk.
So we are now a little bit over time, so we are going to move on to our next speaker, which is Claire Masteller. Claire is an assistant professor at Washington University in St. Louis. Her research focuses on the role of sediment transport and erosion mechanics in driving landscape evolution across a range of spatial and temporal scales. And so, Claire, you can take it away please. Can you see my screen and hear me? Yes to both. Okay, excellent. Hi everyone, and thanks so much for the invite to present in this session. I'm going to talk about some of the recent progress that's been made in using environmental seismology on rocky coastlines. So, about half of the world's coastlines are rocky, and given this, the geomorphology community is really interested in understanding the conditions under which these coastlines form and erode. We're particularly interested in assessing how sensitive these coasts might be to a changing wave climate, because we know that wave height and variability have both increased over the last 50 years, and will continue to do so in the future. So we really want to understand how this will impact coastal erosion hazards and landscape evolution. To get at this, we first need to figure out when cliffs erode, and erosion on rocky coasts isn't steady. Instead, it's dominated by these infrequent, large collapse events. And a spoiler here is that we still don't really know when and why cliffs collapse. So this question presents a very clear outstanding challenge for prediction of future coastal change. Why haven't we cracked this code? First, our observations are really varied, which is to say that bigger storms do not always guarantee a bigger cliff collapse. In addition, the tensile strength of cliff rocks far exceeds the stresses applied by winds, waves and tides, so any individual weather event is going to have a really hard time breaking cliff rock.
And because cliff erosion is so episodic, we really don't have great constraints on modern rocky coast erosion rates. And then finally, these nearshore environments can be incredibly challenging to instrument, which really limits our ability to constrain energy at the cliff face. So, enter environmental seismology. There's an exceptional opportunity here to use seismic methods to work towards establishing links between environmental forcing, sea cliff strength and cliff erosion. And because we can capture cliff response to environmental forces at very high resolution, and also resolve the timing and size of individual erosion events, we can start to discern the causes of these collapses. So we need to start by disentangling different aspects of our seismic signals to better isolate those potential drivers, and we typically do this through comparison of our seismic data sets to other environmental data sets. In this example, you can see that we have tides and waves and wind all being integrated into this power spectrum. So let's start to break it apart, and we'll start with wind. Wind energy is persistent on rocky coasts, but it's also very weak: energy fluxes from wind are about two orders of magnitude less than those of waves or tides. And so there is some correlation between wind intensity and the number of small failure events, but that correlation remains pretty weak. We can also look at the influence of tides. In general, tides act to amplify wave signals: as the tide rises, wave breaking shifts closer to the cliff face, which in turn increases cliff shaking, and then potentially enhances cliff erosion. We can of course then look at the waves themselves. Ocean swell can load and unload the nearshore at a frequency consistent with the wave period, but the influence of that nearshore deflection on cliff erosion rates remains completely unexplored.
Most of the work thus far has really focused on the influence of wave impacts, which are these high-frequency events that result from direct contact between wave and rock. In general, average ground motion increases with increasing significant wave height, but for individual waves, the type of wave influences the intensity of that ground motion quite a bit, with breaking waves tending to do the most work. So, in addition to looking at the integration of all of those environmental forces, we can also capture cliff failure events. It is possible to detect, locate and estimate cliff failure volumes from seismic data. However, it's really worth stressing here that, depending on your environment, these can be exceedingly rare, and in particular in very high energy environments, or in areas with very hard or massive rocks, they can become really tough to identify or rare to observe; but it is possible to see them, and people are working on it. One thing that all of these studies have in common is that we're measuring how much the cliff shakes, and so what we're interested in is: what is the role of that shaking, in and of itself, in driving rock erosion? In one of the seminal applications of environmental seismology to rocky coasts, Pete Adams noticed that ground motion decays with distance from the cliff face as seismic waves attenuate. Based on that observation, Pete hypothesized that this repeated flexure, this repeated shaking, of the cliff allows for the development of rock damage near the cliff face, which preconditions the rock to erode under more moderate environmental forcing. One caveat of this hypothesis is that it has yet to be tied explicitly to cliff rock properties, or to cliff retreat, and we also haven't been able to observe a damage zone present at the cliff face. So some of my recent work has tried to start nibbling further on this question, and I'm going to show you some very preliminary results of that.
So we got our hands on four different rocky coast seismic data sets, Orkney in the UK being one that I instrumented myself as part of my postdoc, and we compared them to assess the potential signature of damage via that repetitive shaking across all of our sites. It's worth pointing out here as well that, while there are a number of these types of data sets floating around, they're hard to find and they've never been directly compared before. So, first things first, we wanted to see how cliff displacement scaled with wave height across our sites. We found in general that hourly displacement increases with hourly significant wave height, but with varying degrees of sensitivity, and Orkney is far and away our most sensitive site in terms of differences in ground motion relating to variable wave height. What we found is that sensitivity is nicely explained by the average position of wave breaking across our sites: wave breaking on Orkney happens very, very close to the coast, which leads to a much more faithful translation of differences in wave height into differences in ground motion. We also wanted to take a closer look at ground motion attenuation, following the work of Adams, to see if we could see similar patterns across our sites. What we found for three of our sites is that, once we correct for wave breaking distance, the ground displacement at each site follows a one-over-the-square-root-of-distance attenuation pattern consistent with surface waves. However, attenuation at our Orkney site is enhanced compared to all of our other sites. So this implies potentially an increased role of body waves at this site, or potentially an additional signature of rock damage, which leads to this additional loss of energy as the seismic waves move landward.
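As a minimal sketch of how an attenuation exponent like this can be estimated (synthetic numbers here, not the study's data), one can fit a power law A(r) = A0 * r^(-n) by linear regression in log-log space, where n of about 0.5 is consistent with geometric spreading of surface waves and a larger n implies extra energy loss:

```python
import numpy as np

# Hypothetical displacement amplitudes at increasing distance inland
# from the cliff edge; here built exactly from a 1/sqrt(r) decay.
r = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # distance inland (m)
A = 3.0 * r ** -0.5                           # synthetic amplitudes

# Linear fit in log-log space: log A = log A0 - n * log r
slope, intercept = np.polyfit(np.log(r), np.log(A), 1)
n_exponent = -slope          # attenuation exponent n
A0 = np.exp(intercept)       # amplitude at unit distance
```

With real data the residuals and the wave-height dependence of n_exponent would carry the interesting signal, as in the Orkney comparison above.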
When we break this apart to look at attenuation as a function of wave height, we see that we get increased attenuation with increasing wave height, which is a pattern that we don't observe at any of our other sites. Taken together, we suggest that the potential for rock damage processes may be higher at our Orkney site, due to this increased sensitivity of the coastline to variable wave conditions and this enhanced attenuation. However, we, like those before us, have yet to connect these observations explicitly to cliff material properties or cliff erosion, but we are working on it. We plan to do further analysis of these attenuation patterns; we've also collected two active seismic surveys to look for evidence of this damage layer; and we're thinking about how we might apply ambient noise techniques to explore temporal changes in cliff properties. So this is all very much still a work in progress. These observations have informed a broader landscape evolution story, but unfortunately I don't have time to talk about that today. What I can tell you is that the sensitivity of these coastlines is largely controlled by local uplift rates and nearshore morphology, and Orkney is potentially our most sensitive site to these future changes in wave climate. What that ultimately means for erosion, we still can't say. So, just to wrap up here: we've made a lot of progress towards separating signals related to distinct environmental forces on rocky coasts and monitoring failures, as well as starting to assess the potential of different failure mechanisms across sites. There are still a lot of challenges, including really linking those environmental forces to erosion. We've also only done a limited interrogation of the influence and evolution of cliff material properties, and we haven't really looked at precursors to failure at all, but there's a lot of opportunity here to develop early warning systems.
And we have a small number of data sets, these environments are really noisy, we're dealing with directional sources, and failure is rare, so erosion is hard to measure. I will leave you with a video from Orkney to show you the sort of drivers of the signals that we're dealing with. And yes, this is a waterfall that is blowing back uphill. And I'll stop there. Thank you for a wonderful talk, Claire. So do we have any questions from the panel or in the Q&A? There is, I'm going to say now, a great question from Emily Brodsky in the Q&A, but I'm going to save that for the general discussion, because it's directed at both of our first two speakers and I think it can easily seed the discussion. So I'm going to hold on to that. And there's also a previous question for Danica that we will come back to in the discussion section. So I'm actually going to lead with a question for Claire, and then somebody else should jump in. My question is: you mentioned that it's been hard with the seismometers to catch a failure during deployment, and I'm wondering whether there's been any kind of probabilistic modeling, based on the past frequency of occurrence, to try to focus in on a deployment window that would have a higher probability of catching one. Yeah, so that's a great question. A lot of this also ends up being really site specific. The magnitude-frequency distribution of failure sizes is fairly well behaved for individual sites, but unfortunately we still have such sparse monitoring efforts that we haven't really explored how portable those distributions are between sites. Okay, thanks. Okay, and I see no more questions coming up, so to keep us on track we are going to move on to our next speaker.
We will now hear from Marine Denolle. Marine is an assistant professor at the University of Washington. Her research focuses on predicting the dynamics of earthquakes and their ground motions; in particular, she uses seismic signals to characterize the physical processes that control them. So you are now good to go, Marine, as soon as you unmute. All right, thank you. Can you see everything? I can see and hear you. Thank you so much. So thank you so much for inviting me to this session; I'm still learning a ton about the potential for seismology in environmental studies. Today I'm going to show you preliminary work we're doing with graduate students, a postdoc and undergraduates, Laura and Tim. I'm showing you in the background a photo of the Folsom Lake dam before and during the drought in California, which motivated a part of my research program. These changes, which are quite drastic, are something that we're trying to track with seismic waves. So, to tell you a little bit about how we do this: basically, we record ambient vibrations, a technique that was briefly introduced in Claire's talk just recently. By recording these ambient seismic vibrations at two stations, we're trying to understand and image the structure of the earth as a function of time. We do this by cross-correlating the two time series, shown here as this function G. Now, if the earth changes in between two measurements, we can make another measurement of this function, and we can extract earth properties by taking the difference between these two waveforms. If I take a later measurement, the red time series, what we'll notice is changes in these waveforms really late in the coda. So we are able to measure phase shifts, actually in a better way than in the early arrivals, and this phase shift is proportional to the change in velocity; it's a very averaged measurement between the two stations.
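To make that concrete, here is a minimal sketch of one common way such a coda phase shift is measured, the "stretching" method, on synthetic waveforms (an illustration only; real workflows use full cross-correlation processing and dedicated packages):

```python
import numpy as np

# A velocity change dv/v shifts a coda arrival at lapse time t by
# roughly -t*dv/v, i.e. the whole waveform is stretched or compressed.
def stretching_dvv(reference, current, t, trials=np.linspace(-0.02, 0.02, 401)):
    """Grid-search the dv/v that best maps the reference onto current."""
    return max(
        trials,
        key=lambda eps: np.corrcoef(
            np.interp(t * (1 + eps), t, reference), current
        )[0, 1],
    )

# Synthetic coda: a decaying oscillation, then the same signal with a
# known 1% velocity increase (arrivals occur 1% earlier, compressed)
t = np.linspace(0.0, 20.0, 4000)
ref = np.exp(-0.1 * t) * np.sin(2 * np.pi * 2.0 * t)
cur = np.exp(-0.1 * t * 1.01) * np.sin(2 * np.pi * 2.0 * t * 1.01)
dvv = stretching_dvv(ref, cur, t)
```

Because the shift grows with lapse time, the late coda is far more sensitive to small dv/v than the early arrivals, which is exactly the point made above.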
Using this type of measurement to monitor groundwater levels is not new. A team from Germany did it in 2006 to look at volcanic response in Indonesia, and what they found, with the time series shown in blue and red, is that this dv/v measurement, the relative change in seismic velocities, seems to be anticorrelated with the groundwater level, shown here in black. So we took this to a different system and started looking at California. The time series here, shown with a flipped axis, is the change in velocity plotted against the groundwater level that was measured during an 18-year survey in San Gabriel. San Gabriel is one of these urban, managed, confined aquifers in Southern California. It's also where Caltech is, so there's a great, long record of seismic data and digital seismometers. So we used this place as a natural laboratory to basically look at these variations. And you can see that the change in velocity that we measure seems to match quite well the changes in the groundwater that was measured at the well in the center of that aquifer, and there's no phase lag or anything, even with this 30-day average. What we found is that, over successive droughts in California, with this regular pattern of multiple dry years, wet events, dry years, wet years, the groundwater levels are going down and down and down. This was kind of a first-year project for my amazing grad student, and we're scaling this up. Going back to this time series, we also looked at how spatially we could map the changes in velocities, and we just did a crude averaging in between the stations, shown in the blue triangles here. We're also showing the GPS time series; this was our first attempt to do geodesy and seismology together.
This slide is showing you a direct comparison between the change in velocity, the groundwater level and an estimated volume loss during the 2012-2016 drought. We were able to collect the freely available data on the actual managed water levels for this aquifer, and we were able to calculate about the same amount of volume loss that was found during that time. We also found a large wet event in the winter of 2005, where the aquifer actually inflated; that was seen by InSAR, shown here on the top right. We also made this map, where in the blue area the velocity decreases as the water level went up, and the vertical vectors here show the GPS uplift that occurred during that time. And so this slide is basically showing where I want to go with my research program: how can we combine seismic data with GNSS data, with InSAR, with groundwater data? How can we combine all of these to create data products that could be useful for groundwater management? We're also trying to do this in Mexico City, in work by Laura, and Estelle Chaussard has been doing really fantastic work on combining different types of geodetic measurements to look at long-term, hundred-year scales. This is one of the reasons for looking at Mexico City, because the city is in big trouble: they emptied their lake, they're still pumping groundwater for urban use, and you can see in this photo the non-uniform basin subsidence that occurs in the lakebed. Estelle has a paper in review on subsidence in the Mexico City area from 1926 to 2020, and so we reached out to her and asked if she could help us understand our data. The top left figure is showing her hundred-year time series of relative elevation in Mexico City, and you can see this very troubling linear trend as things go down. She's really helped fill the gap from 1995 until today.
The bottom left panel is showing what Laura was showing: the change in velocities. And just to give you a bulk sense, as the land subsides, the velocities go up. We're doing this study over a thirty-year time scale, so this is a moderate velocity change but a very sustained one, a changing of the properties. To achieve large-scale work, we turned to cloud computing. We've been using AWS because some of these data archives are already on the cloud. We found that ambient noise seismology is very well suited for embarrassingly parallel processing, and so the cloud is perfectly suited for this type of work; we've gotten very nice throughput, about two weeks of data per second, for these archives. The cloud also allows us to produce data and make it available, so we developed these tools for this. And now we're back in California, trying to analyze this large-scale data set. The Southern California Earthquake Data Center archive is all online; that's about 100 terabytes of data, which is pretty nice. We are combining this with about 30 terabytes of Northern California data and some of the temporary arrays. We're also deploying these tools, and Julian has been helping on the data workflow part of it. So this is our system, which we will combine with Northern California. Now I'm just going to show you a brief overview of some plots of how the velocity changes as a function of different environmental factors. What's important is that temperature affects velocity drastically. These are two-to-four-hertz waves, so we're in the upper 500 meters. Annually, as you can see for this time scale from 2000 until today, there's a strong effect, maybe a percent of change in velocity, due to the temperature. So we have to deal with this. The water level seems to be the number two, if not sometimes the number one, effect on these time series; it really depends where.
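The "embarrassingly parallel" structure mentioned here comes from the fact that each station pair (and each day of data) can be cross-correlated independently. A minimal sketch of that structure, with hypothetical station codes and random noise standing in for real day-long traces pulled from a cloud archive:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

def daily_cross_correlation(pair):
    """Cross-correlate one station pair for one day.

    A real workflow would read and preprocess two day-long traces from
    the cloud archive; random noise stands in here, to show that each
    pair is an independent task."""
    seed = abs(hash(pair)) % (2**32)
    rng = np.random.default_rng(seed)
    a, b = rng.standard_normal(1024), rng.standard_normal(1024)
    # frequency-domain cross-correlation
    cc = np.fft.irfft(np.fft.rfft(a) * np.conj(np.fft.rfft(b)))
    return pair, cc

stations = ["CI.AAA", "CI.BBB", "CI.CCC", "CI.DDD"]  # hypothetical codes
pairs = list(combinations(stations, 2))

# every pair (and every day) is independent, so the work maps cleanly
# onto as many cloud workers as are available
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(daily_cross_correlation, pairs))
print(f"{len(results)} station pairs correlated")
```

On the cloud, the same map would typically be spread over many machines (e.g. a batch service) rather than local threads; the point is that no task depends on any other.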
So this is a time series from the station NGQ since 2003, together with the reservoir surface elevation that you see in blue. The velocities that you see here in red show this quite strange drop, recovery, drop, and then slow recovery that I was actually not familiar with, because we didn't have big earthquakes in California then. But when the team plotted that against the reservoir level, we could see that there's a somewhat direct correspondence between the reservoir levels and the velocity. This is not loading that we're seeing, because the velocities go down with the water level going up. So we think this is just a saturation problem, but we're not hydrologists, and so here we are giving this presentation to a more expert audience. The Salton Sea again seems to affect these time series; this is the Salton Sea level. So we're looking again at 15 years' worth of data. You can see the annual signals there, and then the velocity constantly seems to be going up. Sorry, you have about 30 seconds. This is a very rich data set; earthquakes also affect these time series. And so what I wanted to conclude with is that I'm trying to put this together as a community, that we want to have a virtual observatory of all these data sets together. With this team of awesome colleagues, we want to combine all these data. Thank you. Okay, thank you for a great talk. I'm going to lead off with a question that is in the Q&A right now. This is from Dean Whitman, and the question is: beyond detecting changes in groundwater stage, can these passive seismic methods be used to map spatial variations in aquifer properties such as porosity? I'm not sure about the porosity aspect, because we do see spatial patterns, but there must also be some other hydrological parameters that control fluid flow that we're not sensitive to. But maybe by doing a refined temporal evolution of the functional forms for the recovery...
You know, if we have a steep recovery or a log function, there may be some models that could help us discriminate among these; porosity is one of them. The idea, also, of thinking about changing porosity as a function of time is to see how these dv/v measurements respond through time together with the groundwater. So if we're not recovering as much, where the velocities no longer match the groundwater level, it could be that the aquifer is no longer responding elastically. So I think these are the analyses we'd like to get into eventually. Okay, there is a question from Rosemary Knight in the Q&A, but we are going to hold that until the end so that we stay on track. Thank you for a great talk, Marine. We are now going to move on to Zhongwen Zhan. Zhongwen is an assistant professor at the California Institute of Technology. His research focuses on seismic imaging of Earth structure, earthquake rupture processes, and the intersection of seismology and environmental science. And so you are good to go. Great, I can see your slide and hear you. Okay, thanks for the opportunity to speak. So today I'm going to talk about submarine environmental sensing with telecommunication fiber optic cables. I hope that by the end of this talk I can convince you that there's a new field emerging that uses pre-existing cables, like the one shown on the left, either on cabled observatories or on transoceanic routes, to monitor the submarine environment. This field is very much in its infancy, so a lot of challenges still need to be overcome. As a seismologist, you know, it's not hard for me to see why we need submarine instrumentation: we always want globally uniform coverage. What's not surprising is that oceanographers care about the ocean just as much. They study ocean waves, internal waves, tidal currents, and so on. Many of these can also be observed on submarine seismometers, but for most of us it's noise.
But they also actually care a lot about seismoacoustic waves. So we know that ocean temperature is really important in understanding global warming. The chart on the right shows the excess energy trapped on Earth, and more than 90% of that is in the ocean. So oceanographers have spent a lot of effort trying to constrain ocean temperature, especially in the deep ocean. One great idea was using acoustic waves to do thermometry. The idea is that ocean acoustic waves trapped in the SOFAR channel can propagate very large distances. If you set up repeating acoustic sources, then you can measure the speed very accurately, and that turns out to be mostly controlled by temperature. This was an idea proposed by Walter Munk in the 1970s. Unfortunately, this idea was halted due to environmental concerns, ironically, and now the most important way to measure ocean temperature is using floats like Argo. Just last year, Wenbo Wu, a postdoc in my group, proposed to revive this idea: instead of using man-made repeating sources, we can actually use repeating earthquakes to do it. So for example, at the top left, you can see two earthquakes' T waves, the acoustic waves from the earthquakes detected by an island station; we call this the T phase. And you can see they are very similar to each other, the red and the blue, but they have a time shift. It turns out this time shift is due to ocean temperature. We measured the Indian Ocean's warming over more than 10 years, and it compared very nicely with Argo data. So we're in a situation where we have a new way of measuring ocean temperature, but the data and the stations we are using are still the standard seismological ones. So the future success of this kind of effort will rely on better sensor coverage around the globe. So I'm going to talk about DAS a little bit first. A few people mentioned DAS already.
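The thermometry idea above reduces to a simple conversion: a fractional travel-time change maps to a sound-speed change, and sound speed in seawater rises by roughly 4 m/s per degree Celsius (a commonly quoted approximation). A hedged sketch with illustrative numbers, not the actual values from the study:

```python
def temperature_change_from_tshift(dt, travel_time, c0=1500.0, dcdT=4.0):
    """Convert a T-wave differential travel time to a path-averaged
    temperature change.

    dt          : travel-time shift between repeating events (s);
                  negative dt means the later wave arrived earlier,
                  i.e. the ocean warmed
    travel_time : total acoustic travel time along the path (s)
    c0          : reference SOFAR-channel sound speed (m/s)
    dcdT        : sound-speed sensitivity, ~4 m/s per degC
    """
    # dt/t = -dc/c  ->  dc = -c0 * dt / t ; then dT = dc / dcdT
    dc = -c0 * dt / travel_time
    return dc / dcdT

# Example: a path with ~2000 s of acoustic travel time where the repeat
# arrives 0.4 s early corresponds to ~0.075 degC of average warming.
dT = temperature_change_from_tshift(dt=-0.4, travel_time=2000.0)
print(f"path-averaged warming: {dT:.3f} degC")
```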
DAS, distributed acoustic sensing, is capable of converting a long cable into many, many sensors. For submarine applications, its most important feature is that equipment is only required at one end, the land end, of the cable; you don't actually need to do anything along the cable. Because of this very nice property of deploying dense arrays in the submarine environment, this field is expanding very fast. These are just some papers already published in many different scenarios, mostly on what I call research-type cables. For example, the one on the right is a recent paper in Japan where they turned a cabled observatory's telecom fiber into a 50-kilometer-long array. This shows a detection of a small earthquake on that cable; you can see nice P and S waves. This is what we seismologists studying the solid earth are very excited about. But at the same time, very interestingly, on the same cable they also observed hydroacoustic waves on DAS; in this particular case they were sent out by man-made air gun shots. This gave me some confidence that maybe we will be able to see earthquake T waves on DAS very soon, and this comes back to the ocean seismic thermometry idea: we can probably revive it using fibers to measure ocean temperature as well. Not only can DAS in the submarine environment measure seismic waves and acoustic waves, it can actually also detect ocean waves. In this experiment we did a few years ago offshore Virginia, on a 40-kilometer-long cable connecting a wind farm, we turned the cable into a couple of thousand sensors. This is just a snapshot of a short section of data on that cable, a five-kilometer section here. You can see there are actually a lot of different patterns: some are more horizontal, some are steeper. It's hard to see what they are, but because we have a very dense array here, we can do this f-k transform to look at their speeds of propagation.
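The f-k transform just mentioned is a 2-D Fourier transform of the channel-by-time data: each propagating wave maps onto a line f = c k, so waves with very different speeds separate cleanly. A synthetic sketch (speeds loosely based on the numbers in the talk; the geometry and frequencies are hypothetical):

```python
import numpy as np

# Synthetic DAS section on a 5 km cable: a seismic wave at 1500 m/s and
# an ocean surface gravity wave at 10 m/s.
dx, dt = 10.0, 0.5            # 10 m channel spacing, 2 Hz sampling
x = np.arange(0, 5000, dx)    # channel positions (m)
t = np.arange(0, 300, dt)     # time (s)
T, X = np.meshgrid(t, x)      # data array shaped (channels, time)
data = (np.sin(2 * np.pi * 0.5 * (T - X / 1500.0))    # 0.5 Hz seismic wave
        + np.sin(2 * np.pi * 0.05 * (T - X / 10.0)))  # 0.05 Hz gravity wave

# f-k transform: 2-D FFT from (x, t) to (wavenumber k, frequency f)
fk = np.fft.fftshift(np.abs(np.fft.fft2(data)))
f = np.fft.fftshift(np.fft.fftfreq(len(t), dt))
k = np.fft.fftshift(np.fft.fftfreq(len(x), dx))

# each plane wave appears as a spectral peak on the line f = c * k;
# read the apparent speed off the strongest peak
peak = np.unravel_index(np.argmax(fk), fk.shape)
speed = abs(f[peak[1]] / k[peak[0]])
print(f"dominant energy travels at ~{speed:.0f} m/s")
```

With this geometry the slow gravity wave dominates the spectrum, so the recovered speed is the ~10 m/s branch; masking that region and repeating would pick out the seismic branch.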
It turns out, as you can see, the blue parts show where you have the biggest energy, and the dashed lines show the speeds. You can see there's one packet of energy that's a seismic wave; this is actually ambient seismic noise propagating on the ocean floor. And the other one is around 10 meters per second; this is actually ocean surface gravity waves. What can we do when we observe ocean surface gravity waves? Well, Marine Denolle earlier showed us how to do seismic wave interferometry: by looking at ambient noise, you can get the Green's function between two sites. It turns out this theory can also be applied to ocean waves. The left figure shows the Green's function of the ocean waves extracted along the cable: you treat one of the sensors as a source and see how the ocean wave propagates along the cable. Also, similar to what Marine did in detecting subsurface change, we can now detect how the ocean water wave speed changes in the ocean. Well, it doesn't really change, except that if there is a current, there is a Doppler effect that slows down or speeds up the ocean surface waves. The figure at the top left shows, over about 100 hours, how the ocean wave travel time is changing, and we can convert that, in the bottom figure, to the ocean current speed along the cable. So you can very clearly see that the ocean current in this particular region is strongly tidally modulated. So this is a start: we can detect ocean currents, something oceanographers are interested in, at very fine temporal and spatial scales that conventional methods cannot achieve. But DAS also has its limitations. It requires dark fiber, and because it uses backscattered light, you can really only use fairly short sections of fiber near both ends. That's still very useful for cable observatories, where you don't have a long cable.
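The conversion from ocean-wave travel-time change to current speed described here follows from the Doppler shift: a current along the propagation direction adds to the wave speed. A minimal sketch with illustrative numbers (not the experiment's actual values):

```python
def current_speed_from_traveltime(dt, t0, c0):
    """Infer an along-cable current speed from an ocean-wave
    travel-time anomaly.

    dt : observed travel-time change relative to the reference (s)
    t0 : reference travel time over the cable section (s)
    c0 : surface gravity wave speed without current (m/s)

    With a current u along the propagation direction the wave moves
    at c0 + u, so t = L/(c0 + u) ~ t0 * (1 - u/c0), giving
    u = -c0 * dt / t0 for small u.
    """
    return -c0 * dt / t0

# Example: 10 m/s waves crossing a section with t0 = 200 s and arriving
# 4 s early imply a ~0.2 m/s following current.
u = current_speed_from_traveltime(dt=-4.0, t0=200.0, c0=10.0)
print(f"current speed: {u:.2f} m/s")
```

Repeating this estimate window by window over the ~100 hours of interferometric travel times is what produces the tidally modulated current time series described in the talk.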
But if you're talking about transoceanic cables, you're talking about very tiny portions at both ends of the fiber, and we also want to use the long-haul cables to do better monitoring of the submarine environment. Well, recently we proposed a new idea: instead of using backscattered light, we use the polarization of the forward-propagating light, the direct wave, to monitor the submarine environment. The reason is that if you bend or, you know, apply stress to your fiber in the submarine environment, it actually causes changes in the light's polarization. It turns out telecom operators also care about polarization for their own telecommunication purposes, so they are actually already measuring polarization in real time. So we applied this new technology to a long cable, a 10,000-kilometer cable connecting Los Angeles to Chile. This is the Curie cable, owned by Google. And we were able to detect dozens of earthquakes, shown as the yellow stars along the cable, in about 10 months. But of more interest to today's topic, we were also able to detect ocean swell events. So in the top figure here, I'm showing a spectrogram of one and a half months of data in the frequency band also called the primary microseism. What you can see are these red blobs of dispersed energy, each lasting a few days, and they come back every few days. It turns out you can also observe the same patterns on coastal stations, where you can see both the primary and the secondary microseism. It turns out that all these packets of energy are due to ocean swell events applying pressure on the cable. The fact that we only observe the primary microseism, around 0.06 hertz, instead of the secondary at 0.12 hertz, on the state-of-polarization (SOP) measurement is telling us that we are actually measuring the ocean bottom pressure rather than the seismic waves generated by the ocean waves. So this is my summary here: I think there is a very promising future for submarine fiber sensing.
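The swell detection above rests on a sliding-window spectrogram of a long polarization record, with energy appearing in the primary microseism band. A minimal sketch of that computation on synthetic data (sampling rate, window lengths, and the synthetic "swell" are all assumptions for illustration):

```python
import numpy as np

def spectrogram(x, fs, win_s=512.0, step_s=256.0):
    """Sliding-window amplitude spectrogram (Hann taper), of the kind
    used to reveal microseism-band energy in long records."""
    win, step = int(win_s * fs), int(step_s * fs)
    taper = np.hanning(win)
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    cols = []
    for i0 in range(0, len(x) - win + 1, step):
        cols.append(np.abs(np.fft.rfft(x[i0:i0 + win] * taper)))
    return freqs, np.array(cols).T  # shape (freq, time)

# Synthetic record: background noise, with a 0.06 Hz (primary microseism
# band) "swell" oscillation switching on halfway through.
fs = 1.0
t = np.arange(0.0, 20000.0, 1.0 / fs)
x = np.random.default_rng(0).standard_normal(t.size) * 0.1
x[t > 10000] += np.sin(2 * np.pi * 0.06 * t[t > 10000])

freqs, S = spectrogram(x, fs)
band = (freqs > 0.05) & (freqs < 0.07)
half = S.shape[1] // 2
first, second = S[band, :half].mean(), S[band, half:].mean()
print(f"0.05-0.07 Hz band energy ratio (late/early): {second / first:.1f}")
```

A real analysis would also inspect the 0.12 Hz band; its absence in the polarization record is what argues for a pressure rather than a seismic response.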
Of course, I'm motivated to do all of this by my interest in solid earth geophysics, but it turns out these methods can also observe many, many ocean processes, so they apply very well to submarine environmental sensing: measuring hydroacoustic waves, ocean waves, ocean currents. I'm sure there will be more interesting applications coming up. There are actually multiple emerging technologies. I focused on DAS and polarization today, but there is also the phase approach, and there is the SMART cable approach; maybe there are even new technologies coming up. And finally, I want to say that to make this new field successful, we need collaborations among seismology, oceanography, but also fiber optics, physics, and the telecom industry, potentially with help from government regulatory agencies, to help us access the fiber cables and assemble these resources together. So with that, I thank you for your interest and would love to take any questions. Great, thank you. Those are really cool applications. So we have a question from Rangan, and then I think we will move on after that, but continue putting questions in the Q&A and we will come back around to those. Thanks. Can you hear me? Yes. Okay, well, thanks for the very nice presentation. My question goes to actually handling the data, because this is all very nice, but we are talking about very long fiber optic cables, and I know it's very data intensive. How are you handling all this data, and what's the future approach to doing this work, dealing with the massive sampling rate and a long cable? Thanks. Right, that's a great question, and it's a very active discussion going on in the DAS community. Just to mention, the polarization approach, because it's an integrated measurement along the cable, actually doesn't produce as much data. But for DAS, you really do produce terabytes of data every day, and frankly, right now, every group is handling it differently.
One of the potential solutions is, like Marine said, to go to the cloud. But I think if we want this to be an effort of a bigger community, then we need facility support to handle the data challenge. Remember that it's both a storage challenge and a data processing challenge at the same time. Thank you. Great, thank you for an excellent talk. We are going to move on to our final speaker of the session. Wenyuan Fan is going to round out our talks for today, and he will give a couple of extra slides at the end of his own talk to give us a summary. He is an assistant professor at Scripps Institution of Oceanography at the University of California, San Diego. His research focuses on using seismic observations collected both onshore and offshore to study earthquakes and environmental processes, including hurricanes, landslides, and turbulent subglacial rivers. And so, Wenyuan, you can go ahead now. Oh, great, thank you. Thank you to the committee for the opportunity, and thank you to the audience; this is excellent. Today we will talk about geohazards, in particular submarine landslides. But before I start the talk, I would first like to acknowledge all my co-authors and collaborators; research is a collaborative effort in nature, and I'm grateful for all the opportunities. So, there are a lot of submarine landslides in the Gulf of Mexico, as you can see from the bathymetry map; slides of various sizes have occurred here. As a matter of fact, the largest submarine landslide along the US margin occurred on the Texas slope. Submarine landslides can damage offshore infrastructure, for example cables, among other things. Here is an example where a submarine landslide in the Gulf of Mexico damaged an oil platform, and because of the difficulty of stopping it, it has been leaking oil ever since.
The hazard really exists, but our perspective on the phenomenon has been rather static: we can identify where the slides are from the maps, but we don't really know when. There is a huge knowledge gap about the basics of submarine landslides, including where and when, not to mention why. So here I want to argue that seismology, in particular marine seismology, can help a lot. What we're trying to do is to understand where and when, basically time and location, and as seismologists, this is what we do. And I'm going to say something controversial yet brave: we're actually the best at doing location and timing. So here, using a novel surface wave detector, we identified 85 seismic sources in the Gulf of Mexico that generate coherent transcontinental surface wave arrivals, and I will try to show you that these are submarine landslides. Now, this is what I mean by coherent transcontinental surface wave arrivals: the dots represent the seismic stations, and the red square here shows the submarine landslide that we detected. The color of the dots represents the surface wave arrival time, the arrows show the propagation directions, the thin lines are great circle paths, and the ellipse shows the location uncertainty. How do we get this? We take advantage of locally highly coherent intermediate-period surface waves: as they propagate across a subarray, they reach the different stations at slightly different times. From the station separations, we are able to resolve the propagation direction and also the centroid arrival time, a very simple technique. If we do this independently at each subarray and then piece it together, we get a wavefield, and with the wavefield we are able to locate the seismic source. And we're doing this differently from traditional approaches; it's basically a hybrid approach, combining array processing and inversion, which is partly model-free and partly model-dependent.
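The subarray step described here amounts to a plane-wave fit: given station positions and arrival times across a small aperture, solve for the slowness vector, which gives the propagation direction, apparent speed, and centroid time. A minimal least-squares sketch on synthetic data (the geometry and numbers are hypothetical, not the study's):

```python
import numpy as np

def plane_wave_fit(coords, arrivals):
    """Least-squares plane-wave fit across a subarray.

    coords   : (n, 2) station positions (km; x = east, y = north)
    arrivals : (n,) surface-wave arrival times (s)

    Solves t_i = t0 + s . x_i for the slowness vector s, giving the
    centroid time t0, the apparent speed 1/|s|, and the direction of
    propagation.
    """
    G = np.column_stack([np.ones(len(arrivals)), coords])
    m, *_ = np.linalg.lstsq(G, arrivals, rcond=None)
    t0, s = m[0], m[1:]
    speed = 1.0 / np.linalg.norm(s)
    azimuth = np.degrees(np.arctan2(s[0], s[1])) % 360.0  # travel direction
    return t0, speed, azimuth

# Synthetic subarray: a 3 km/s surface wave crossing 8 stations,
# traveling toward azimuth 45 degrees, centroid time 100 s.
rng = np.random.default_rng(1)
coords = rng.uniform(-50, 50, size=(8, 2))
s_true = np.array([np.sin(np.radians(45)), np.cos(np.radians(45))]) / 3.0
arrivals = 100.0 + coords @ s_true
t0, speed, az = plane_wave_fit(coords, arrivals)
print(f"t0 = {t0:.1f} s, speed = {speed:.2f} km/s, azimuth = {az:.0f} deg")
```

Repeating this fit over many subarrays yields the direction arrows and centroid times that are then pieced together into the wavefield used for source location.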
There are many advantages to doing detection this way, but I want to point out the most important part: the method allows us to discover unknown unknowns, which are buried in the noise. To understand the example source I showed you, we can look at the waveforms. The most important part of this figure is that the surface wave arrivals lasted over 10 minutes; this is very long and very different compared to earthquake signals. On the right I'm showing record sections from earthquakes traveling through similar epicentral distances, and you can see the durations are much, much shorter. Now, we need to model the physical processes. Submarine landslides, and landslides in general, are usually modeled as single forces, because of the loading and unloading processes associated with the sliding, and because of that we can assume the force histories are boxcar functions. Following that assumption, we can model this event, and our model suggests the event propagated, or slid, towards the northeast. With an empirical scaling relationship, we found that this event likely displaced about 62 million tons of rocks or sediments. And you can see it is located near the edge of the continental slope, where the topography is quite steep, which likely facilitated its occurrence. But I also want to point out that currently, because of a lack of near-field instrumentation, our resolution is not good enough to do a direct comparison with detailed morphological features, so that is future work. This kind of event seems to occur without any precursors or preceding earthquakes, and we identified 10 of them occurring in the northern part of the Gulf from 2008 to 2015. When earthquakes happen, they can also trigger submarine landslides in the Gulf of Mexico: here, an event occurred in the Gulf of California, and soon after its occurrence we observed a source in the Gulf of Mexico.
Now, these two panels share the same color bar. What you observe here is a second source that occurred 1500 kilometers away and 435 seconds later. What this means is that the occurrence of the submarine landslide coincided with the surface waves from the earthquake, and given the paucity of seismicity in the Gulf of Mexico, this indicates the second event was dynamically triggered by the first one, not a random coincidence. What I want to emphasize is that what we're observing here is not your typical chain-reaction hazard in the near field; for earthquakes to trigger landslides, the ground motion usually needs to be really large, but what we're seeing here are two events separated by 1500 kilometers, where the stress perturbation is minimal. In all, we observed 75 dynamically triggered submarine landslides in the Gulf of Mexico, clustering in the northwestern part of the Gulf, where the bathymetry is rather complex. Most of the triggering earthquakes are from the Pacific plate boundaries, and by plotting the triggering pairs in a distance-versus-time plot we might gain more insight. The horizontal axis is the separation distance, the vertical axis is the separation time, and the color represents the mainshock azimuth. To put things into perspective, we can plot them on top of these moveouts. The first thing we learn is that these events were dynamically triggered by the passing surface waves, as indicated by the three-kilometers-per-second moveout line, and they were triggered immediately, or with a very short delay. We do not observe a magnitude cutoff at the lower end, which suggests that maybe the peak dynamic strain was not the only triggering threshold we need to consider here. We do observe an interesting distance limit: all the triggering earthquakes are within 40 degrees of the Gulf of Mexico.
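The moveout test described here is simple arithmetic: a triggered event should appear no earlier than the surface-wave travel time from the mainshock. Checking the Gulf of California to Gulf of Mexico pair from the talk (1500 km, 435 s) against typical surface-wave group speeds:

```python
def surface_wave_delay(distance_km, group_speed_km_s):
    """Expected surface-wave travel time to a given distance; on a
    distance-versus-time plot, dynamically triggered events should sit
    on or just after this moveout line."""
    return distance_km / group_speed_km_s

# The pair from the talk: second source 1500 km away, 435 s later.
# Surface-wave group speeds of roughly 3-4 km/s bracket that delay,
# consistent with triggering during the passing wave train.
d = 1500.0
early = surface_wave_delay(d, 4.0)   # faster, earlier-arriving energy
late = surface_wave_delay(d, 3.0)    # the 3 km/s moveout line
print(f"surface waves arrive between {early:.0f} and {late:.0f} s; "
      f"observed delay 435 s")
```

The 3-4 km/s bracket is an illustrative assumption about surface-wave group speeds; the study's plot uses the 3 km/s line as its reference moveout.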
And this potentially indicates a frequency-dependent triggering mechanism, because the further away the earthquakes are, the less high-frequency ground motion they can generate at the Gulf. In total we observed 85 events: 10 spontaneous and 75 dynamically triggered. The only thing we have more of in the Gulf is oil platforms, shown as the yellow dots. On the left side you can see that a lot of the sites where the submarine landslides occur coincide with active exploration leasing sites. Now, what I want to say is that this is actionable science: understanding the mechanisms of submarine landslides in this region, meaning why and how, can help to mitigate the future potential hazard. So, mechanically, for a submarine landslide to be seismically detectable, it needs to move fast, move as a block, and move along a weak zone. The complex structure of the ocean basin and the thick sediments here really set up the fundamentals. Maybe rapid sediment accumulation has created oversteepened topography, which could lead to overpressurization of pore fluids; gas hydrates are prevalent in the region and may be another contributor; or maybe weak layers or seepage pathways have been there to facilitate sliding surfaces. The dynamically triggered events are definitely due to the prolonged ground motion: this is a basin effect with scattering, so the waves don't get out. The prolonged ground motion causes cyclic shearing, which leads to plastic strain accumulation, effectively reducing the material strength, and eventually leads to the observed slides. Cyclic shearing can also enhance the permeability, which might have contributed as well. Now, what I want to say is that all these hypotheses are testable if we have in situ observations. So even though these are hazards, I do want to say we have a pathway to move forward, and in particular, seismology and marine seismology can help. And I want to say seismology is really at the forefront of making the world a better and safer place.
For example, as Zhongwen has shown you, using global cable networks we can understand earth and ocean processes in unprecedented detail. And Marine has shown us that using passive seismic observations, we are able to monitor subsurface hydrologic systems at very high spatial and temporal resolution. We have also clearly been shown that using seismic observations, we are able to understand the environmental forcing that ultimately changes the shape of the surface of the earth. And lastly, Danica has shown us that seismic signals can be used to understand river systems, in particular bedload transport, which is very difficult and challenging to observe otherwise. And finally, I want to say that innovative seismic techniques can be used to study the environment and ultimately to benefit society. Thank you. Great, thank you for an excellent talk and for giving us that nice summary at the end of the session. This has been a great set of examples that everyone has presented. Before we move to the general discussion, do I have any questions for Wenyuan? I have a question. Go for it, Diego. Hey, Wenyuan, how's it going? Hey, how are you? I'm good. So I guess I have two questions, but I think I want an answer to the first one more than the second one. The first one is: those 10 landslides that you're saying are spontaneous, really what that means is that they're not seismically triggered, but there are other mechanisms for triggering submarine landslides, like storm waves or hyperpycnal flows and other kinds of stuff. So have you tried to look at other mechanisms for those kinds of landslides? Great question. So I checked the WAVEWATCH III models, trying to infer whether there is any correlation with storm activity; that is negative for these 10 events. But other than that, like you mentioned, there are lots of factors that can play into the initiation processes, which are really poorly known.
My current guess is that a lot of them are due to the existence of gas hydrates, because they're very prevalent and they can act across a wide range of water depths. But that is an unconfirmed hypothesis, and I agree with you that many things could happen. Great. And then we're going to have a question from Donna. Thanks; Wenyuan, that was great. I was curious whether any of the submarine landslides, either those you've studied or others you know of, have also been detected on DAS cables, and whether you think there are opportunities for integrating the kind of seismic methods that you used with other kinds of seismology for characterizing these things. Excellent question, and thank you so much for bringing that up; I was hoping to have a chat with Zhongwen afterwards. So, the interesting part is that the ones we have detected are mostly happening in deeper water, and the reason is that to be seismically detectable, they have to be relatively rigid to move as a block, so they have to have some history. The DAS cables tend to be nearshore, on top of unconsolidated sediments, which are much softer. I think there might be mud flows near the coastal areas, but those might not be the ones I have been detecting. And coming back to your notion: if we are able to leverage observations in the deep water over a much larger region, I think that's just excellent. I think Zhongwen might have something to add. Yes, he is about to. Yeah, so first of all, what Wenyuan said totally makes sense: the DAS range may not reach where you can observe these submarine landslides. But of course, there are also submarine landslides near the coast that are totally within the reach of DAS. I just want to bring up another topic, which is that submarine landslides are often what cause cuts of telecom cables in the ocean, and there are actually databases; in fact, the way people first discovered big submarine landslides was through cuts of submarine cables.
So I think there is a great opportunity there, using data on cable damage collected by the telecom industry, and also using the kind of polarization sensing we are doing: maybe a landslide doesn't cut the cable, but deforms it enough to produce a polarization change, so we can monitor submarine landslides. I think there's a lot of opportunity there. Okay, so that was a really nice segue into the general discussion part of this session. Questions can be asked of all of us, and I'm going to circle back to something that Emily Brodsky put in the Q&A after the second talk; that question I think is actually applicable to more than just the first two speakers. So, her comment and question: it seems like this approach is currently a bit like studying earthquakes by seismometers in the 1930s with the first stations. There was a lot of promise to capture qualitatively new information, really high-precision time series which cannot be accessed any other way, but there needs to be a concerted effort to get to the next step. For earthquakes, that was a serious investment in seismic networks. What does the next step look like in environmental seismology? And I don't know who wants to take that first; someone can jump in, or we can do raised hands. More funding. I can add to this: I think the points that Christine and Sarah raised earlier about needing more support for mixed and interdisciplinary data also apply very much here. Strengthening the infrastructure around support for training, instrument use, archiving, and metadata standards, especially for mixed data types and co-located data, as well as collaboration for interdisciplinary research, to use seismic data and connect the relevant fields of expertise, are all pretty critical.
There's, I think, a strong need for bringing together seismic technical expertise with process-based expertise for understanding some of the very complex systems that we're studying here. And there's a big barrier to participation in just using seismic data; that is a hurdle for non-seismic researchers getting into this. Yeah, just to follow up on what Danica said: some of the experience I had in even trying to compile this cross-site comparison is that there aren't really standards or best practices for how to most effectively set up these sites, or how to archive your data, process your data, and make your data available. That meant I basically needed to go from raw data and make sure I could reproduce everyone's methods, and download everyone's environmental data for each of their sites individually, even just to look at something like: does a bigger wave shake the cliff more? So I think establishing some of those community standards could be really helpful to deal with the fact that there are still a very limited number of data sets that we even have to look at. Go ahead, Marine. I arrived a bit late in the game, but the Critical Zone Observatory program was such a cool innovation for transdisciplinary research. And I think I arrived late because I was still learning about the length scales of the processes I was looking into in monitoring the shallow structure. I think we're not yet able to resolve soil moisture, but we're definitely spanning greater length scales than the CZOs were looking into. It was inspiring, and I don't have any perspective on how things could be improved from that program, but something of that scale would be fantastic with seismic data. There are permanent monitoring sites set up within the existing critical zones that have the full infrastructure and all of the other data that one might want, to see what your seismic signals are telling you about.
The last slide I had is because I'm really interested in co-locating these measurements — and, when they are not co-located, in looking back into the past. Part of my goal is to use seismic properties as proxies for these other data types that have not been continuously recorded since the 1990s or so. But co-location is key. Maybe we'll find smart ways to learn from co-located data types, like InSAR and GPS or GNSS, but in future deployments of these large arrays — the future of SAGE and GAGE, for instance — let's think about ways we can co-locate all of these together. — So we're going to go to Wenyuan, and then Zhongwen is going to go next. — Oh, great, thank you. I want to come back to what Emily mentioned in the question. We've gone beyond single stations for a while now, and array seismology has really revolutionized our understanding of a lot of processes — like what Zhongwen showed, using tens of thousands of seismometers along a DAS cable. One real lever we can use to understand these processes is deploying or designing arrays that suit our purpose. Basically, all we measure is a signal at a location and time, and we have control over the seismometer locations — this is great leverage. In understanding these processes, we should think about what we want to know, because different array configurations are suited to studying different processes, and the resolution also depends on the overall footprint. If we keep that in mind, and potentially with the emerging technologies, we do have a very good way to move forward. — And this is actually also a follow-up to what Wenyuan said: at least for the submarine case, whether it's landslides or other processes, these are of interest to oceanographers.
I think there are two resources we should tap into to really make this scale up. Number one is all the historical data that may be sensitive due to Navy or other kinds of usage. Maybe this committee, or a committee like this, or your sponsors have some leverage to say something about that, because there's a lot of very valuable data for studying the ocean. Another one is to really expand the fiber sensing arrays in the ocean. The instruments, it turns out, may not be the biggest challenge; fiber access is the biggest challenge. Again, this is a regulatory domain: the people who own the fibers got permission to use them for telecommunication, and they are not allowed to use them for research purposes. They need the regulatory agencies to say yes, you can do it, or you are required to do it. Again, maybe some of the sponsors of this committee are able to help the research community with that. — So, unfortunately, this committee does not have the ability to declassify Navy data, I will clarify that. But I am going to use my power as moderator: I'll have another question after this, so, Donna, do you want to go? — Yes, sure. Thank you so much for fitting me in. I had a question, particularly for Zhongwen, along the lines of what we were just talking about. Putting aside the access to these data, what is the maximum time series that's available for any DAS cable worldwide, should the data be available? I think in your last slide you showed a time series that continued back to maybe 2009, but I was wondering over what time periods the data actually exist — if I've made myself clear. Thanks. — Right, I don't think there's anything back to 2009. All the DAS experiments so far go back two years, and most of them are short-term experiments.
There hasn't been any real long-term operation of DAS arrays. That's mostly, again, due to access to fiber: only on the research cables, when they are doing maintenance, are you able to put your DAS instrument on and make a few weeks of measurements, and that largely limits what we can do. We can show that we observe this and that, but we cannot show how it can make a real scientific impact on a particular question. It's definitely one of the limitations. And unfortunately, some of the polarization data measured by the telecom operators — they use it and they throw it away; they never save it. — Okay, that answers my question, because we have had these cables deployed for some time, so I wondered whether that information was stored over longer times by the companies, but apparently not. I'm interested in the data at longer time scales, but once they use it they don't need it anymore. Thank you. — Okay, we're going to go to a question from the Q&A. Colton Lynner has a question following on Emily's comment: it seems like the seismic approach up to now has been to catch up to the existing environmental data. What happens when seismic data surpasses the existing environmental data? What should be the mechanisms for funding across directorates and stakeholders to create co-located experiments? Marine? — So, to the latter first: one thing that I found challenging is that our geodetic measurements have ties to NASA, and then the solid earth work that I want to do is more tied to NSF, and to hydrology, and not necessarily geophysics, and I don't know how to join these funding sources. It seems like there should be a common interest, but I don't know how to do this, and so maybe teaching us how to do this early in our careers would be valuable.
I don't know when seismic data would surpass the others; I don't think they're surpassing. I think there are going to be nice proxies that we can learn from each other, and then we'll just need huge computers to store and process. — Anybody else want to comment? — I'll add that I've been involved in a research coordination network looking into DAS usage; my working group is on geomorphology. I think one of the challenges for us is that DAS in particular is very expensive. If we're interested in co-installing the kinds of seismometers that PASSCAL already carries, that's not a problem, and there are stream gauges on most rivers and plenty of other weather data, that kind of thing. But if we're interested in using some of the most high-resolution, cutting-edge equipment like DAS, I think they're in the $200,000 range to purchase one, and rental is pretty prohibitively expensive, especially for some sections. I understand that some sections have had more funding available for these kinds of things than others, so I think data sharing and instrument sharing support is an important thing going forward for use across fields. — Does anyone else have a comment? I wanted to ask you all about this as well, because Danica brought up this RCN idea. I think in geophysics we can look to some really good examples of communities coming together to push for much bigger-scale projects — obviously EarthScope, GeoPRISMS and MARGINS, COMPRES. So is this community reaching the critical mass where it can do that, or moving towards doing that?
And, you know, I realize that I'm asking five assistant professors whose time is maybe not best spent on that immediately, but is there a long-term vision for how to work across fields and how to develop infrastructure — some of these midsize infrastructure projects and things like that? Is the community moving that way? Does anybody have an opinion or comment on that? — There's a metric — — Oh, sorry: Marine, you had wanted to say something before, and then Wenyuan can go next. — That's okay. So, Danica has led a lot of the AGU sessions, and I think a metric for the growth of this community is the success of these environmental seismology sessions at SSA and AGU. There have been a lot of them, and so I think eventually the community is there. That's all I wanted to say. — I want to echo, first, Marine's excellent point: if more people work on this, there will be the base to use the infrastructure. But I also want to point out that we not only need the user base, we need to be able to do something that other direct observations cannot offer. I would say at this moment a lot of our studies are still in the validation stage — for example my own: I think there are submarine landslides, but I have not validated this. Once we reach a relatively more mature stage, where our results are established — like what Christine showed earlier in panel one, where it is already a robust technique offering things that other observations cannot — that is a better way to move forward, with a very clear strategic goal. So I guess the idea is that we not only need to confirm what happened, but we need to offer things that other observations cannot provide. — Okay, thank you, Wenyuan.
I'm going to add in two comments that have come in in the Q&A. The first is from Emily Brodsky, partly responding to Colton Lynner's question. Emily says that it is possible that seismology will motivate a new set of questions that have not yet been asked in allied fields, because they have never had the high temporal resolution of seismology. It will take a concerted effort to explain the capabilities and get these questions asked. I think that's a really nice follow-on from the comment that Wenyuan just made about being in the validation stage. Below that there's a comment from Cindy Ebinger, who points out that industry data and partnerships could be very helpful — connecting with DOE and offshore O&G, which I'm not quite sure what that one means, and the wind farm industry could be critical at this stage. So I'm going to have Danica say something, and then I know that Diego also has a question that he wants to ask. Go ahead, Danica. — Yeah, I wanted to respond to both Emily's and Marine's and Wenyuan's points, speaking from my experience running these AGU sessions. One of the things that I, and a few of us, have highlighted is this interdisciplinary collaboration. I'm seeing a lot of very advanced work happening from seismologists that isn't making it into the process communities, because geomorphologists are watching seismology talks and don't understand what we're seeing. I've been running these AGU sessions for, I think, three years now, and I struggle to understand some of the talks. They look very exciting, and it's hard to take away the true process information, and there's a good chance that we are capturing very exciting new signals — to Emily's point — that have not been picked up yet. The temporal resolution is unbelievable.
I mean, for sediment transport we're talking point measurements with a bucket, basically, compared to sub-second sampling. So I think the information is there, but there's a communication barrier between process-based fields and the seismology community that really needs to be overcome before we can take advantage of that joint expertise. — Great, thank you. Sarah is going to jump in, and I think our other speakers, from both the first session and the upcoming session, should feel free to jump in at this point as well. So take it away, Sarah. — I just wanted to add that there are some analogies here, like your regulatory problems and the issues dealing with fiber optic cable access. We share — well, I think we will share — considerable regulatory issues as near-surface geophysics moves to drone transport of instruments. I think there's a lot of value in community organization, so that it's not just one researcher who wants to access this cable, but the community as a whole talking to regulators in a different way. I don't have the solution on how to do that, but there are a lot of common issues for all of us going forward that will benefit simply from meetings like this and community building, so that we can deal with people outside our subdisciplines on these issues. — Zhongwen? — I absolutely agree with Sarah on that. But I also think, just to come back to Cindy's comment earlier, that environmental geophysics has a fundamental research side, and it also has more related to practical applications
than many other areas — for example, deep earth studies. So I think this is where we need help connecting with all these other agencies with practical interests. I can give you an example: recently we were working on some data for hydrology-related research, and we were able to connect with the Los Angeles Department of Water and Power, and they're still testing to see what we can contribute. But that's a positive step, right? And it would be nice, as was mentioned with offshore wind farms, for DOE and all these agencies or industries with practical applications to see the value here. I think that's when this kind of community will start to take off. — Just to add something in the same vein as what folks have been saying: I think a lot of what we've talked about is still focused on what signals we can actually see, and we need to move towards, given the signals that we can see, what disciplinary questions are we then able to ask. I think that's really the connection that we need to strengthen through these types of partnerships that everyone's talking about. — So, this has been a great discussion. We have maybe one minute left, and Diego has been waiting to ask a question that's going to take us in another direction, so he gets the last minute of the session. I'll just say quickly that there are two remaining questions in the Q&A, but they are speaker-specific, so if the speakers could address them in the break, that would be great. Okay, go, Diego. — Yeah, actually we've drifted in this direction, which is fine. One of the main goals of this meeting was to explore that connection between the seismologists and the users, so to speak, in these other disciplinary fields.
And what I've heard is that seismology often starts out in the periphery and starts to nibble into these other disciplinary fields. So my question, specifically for Zhongwen, because thermometry is well inside physical oceanography: I'm wondering what the response to your work was from oceanographers, and what your thoughts are on getting physical oceanographers to do seismology, instead of seismologists doing oceanography. — Yeah, that's a great question. We had concerns at the very beginning. I think one key issue is that the thermometry approach is measuring something at a different scale than some of their current approaches. It turns out the oceanography community was very welcoming of this new development; they see that it adds value to what they are measuring, which is point-wise, very accurate measurements, whereas we are doing a long-baseline average measurement that averages over some of the smaller-scale complexities that they may not care about for particular problems. And because of that interest, I'm actually quite hopeful that there could be a concerted effort. For example, if we need more hydrophones in the ocean to do oceanography, the same hydrophones can be used for seismology as well. If you put a hydrophone on the seafloor, you need cables to connect the data to land — well, can that cable then be used for that, or for some of the other research? So I'm starting to see that there's actually quite a common interest there, but of course this is still at the very beginning, and we need to break some of the barriers to understand each other's interests and needs. But a very good suggestion. — Great, well, this was a wonderful discussion and a nice way to end. Thank you to all of our panelists for giving your talks — really excellent talks — and also for this great discussion; it was very thought-provoking.
Welcome back, everybody. My name is Steve Nerem, and I'll be moderating this last session. Our final panel will look into some examples of satellite gravimetry and hydrological applications. Our speakers are Matt Rodell and Bridget Scanlon. The speakers will each give a 15-minute talk, with a few minutes for questions, followed by discussion with both speakers at the end. So let's begin with Matt Rodell. Matt is the acting deputy director of Earth Sciences for Hydrosphere, Biosphere, and Geophysics at NASA Goddard Space Flight Center in Greenbelt, Maryland. He's a member of the science team for NASA's Gravity Recovery and Climate Experiment (GRACE) Follow-On, and he is co-leader of the research and applications team for the Mass Change designated observable study. He leads the Global Land Data Assimilation System, and also projects focused on monitoring groundwater storage changes, mapping and forecasting drought and wetness, and detecting climate-related variations in the global water cycle. Matt, I'll let you take it away. — Okay, thanks, Steve. Bear with me for a moment while I get my presentation started; it's a multi-step process here. Okay — full screen, then stop sharing, and slide show. Okay, how does that look? Can you see my full screen, Steve? — Yep, looks good. — Okay, great. So I'll be talking about the application of satellite gravimetry for water resources monitoring — let's see if we can get this to move forward. Here we go. Let me just provide some motivation here. Consider how we monitor the water cycle — I'll be talking mostly about terrestrial water storage in the water cycle, and monitoring that with satellite gravimetry. For ground-based monitoring of the water cycle, we look at several variables. For instance, rain and snowfall: the upper left here shows an example of a SNOTEL site in the mountains in Colorado with various instruments. The top middle is an example of measuring the stage in a river, from which you can infer river flow.
Next are a weighing lysimeter and an eddy covariance flux tower, both of which are used for measuring evapotranspiration. And then we have soil moisture, snowpack, and groundwater monitoring across the bottom. All of these require quite a bit of labor to install the systems or to go out and make the measurements, they're expensive to maintain, and of course you need to have access to the site where you want to make the measurements. As a result, it's basically impossible to have measurement stations everywhere on Earth, continually, in all the places needed to really represent the water cycle across the Earth. This slide shows some examples of the in situ measurement networks that we have. The top left is the Global Telecommunication System meteorological stations, and you can see that the list of variables measured at these does not include groundwater or soil moisture, for example. If you look at some areas, like northern Africa, you can see there are huge distances between these dots — so really not representative of the entire world outside of the well-developed, wealthier countries. The lower left shows the river flow observations that are archived at the Global Runoff Data Centre in Germany. The warmer colors indicate stations where it's been a longer time since a record was made available — for example, for the Nile River, one of the most important rivers in Africa, the most recent observation you can get is from about 1983. When we look at groundwater monitoring, it's okay in the U.S.: you can see the USGS Groundwater Climate Response Network sites in the lower right — pretty dense in the northeastern U.S., and not so dense in much of the rest of the U.S. But that's a lot better than in the rest of the world, in the upper right. These are the stations that are archived by the Global Groundwater Monitoring Network in the Netherlands, and they do not even make the raw data freely available.
So the issues here, as I said, are installing and maintaining these stations, but there are also political issues in countries that don't want to share their data. And even if there are data, there are formatting issues and challenges in making them available in a centralized location. That's the motivation for satellite remote sensing. This shows the current fleet of NASA's Earth-observing satellites. Many of these, circled here, are relevant to hydrology, and I'm going to focus on GRACE and GRACE Follow-On, which you can see in the top middle there. So what makes GRACE and GRACE Follow-On different? GRACE is the Gravity Recovery and Climate Experiment. Most satellite-based observations look down at the Earth and measure various wavelengths of the electromagnetic spectrum, shown in the upper right there. These can be related to various quantities of interest, but they're limited in what they can do: you can't see below the first few centimeters of the snow, canopy, or soil column using this type of measurement. GRACE and GRACE Follow-On are different because they're actually measuring the gravity field — they're not looking down at all. Because of that, they're able to infer changes in terrestrial water storage at all levels, down to the base of the deepest aquifer. The way this works: GRACE and GRACE Follow-On were each actually two identical satellites, one following the other, at an initial orbit altitude of about 485 kilometers above the Earth and about 200 kilometers apart. Every five seconds, a K-band microwave ranging system — and, in the case of GRACE Follow-On, also a laser ranging system — measures the distance between the satellites, with an accuracy about the size of a red blood cell, or even better with the laser ranging.
As these satellites orbit the Earth and come to a mass anomaly — which means there's also a gravitational anomaly — in the top panel there, the first satellite gets pulled forward by this gravitational anomaly, and the distance between the satellites increases. Then, as they pass over the mountain range, the first satellite gets held back and the second satellite speeds up, so the distance becomes shorter, and then things even out again once they pass over. Because these observations are so precise, with this micron-level distance ranging, we can not only infer the changes in the orbits caused by static mass and gravitational anomalies like mountain ranges, but also changes in time — changes in the temporal gravity field caused by mass moving around the Earth. If we account for the atmospheric mass variations, using measurements of surface pressure and atmospheric models, and remove those effects, what's left is changes in the gravity field over the land that are mostly caused by changes in terrestrial water storage. Terrestrial water storage is the sum of all the groundwater, soil moisture, snow, and surface water. On the top here I'm showing a time series based on in situ observations in Illinois: the blue shows the time series of relative levels of groundwater, the red is soil moisture, and the white is snow. These are stacked on top of each other so that the top contour is the total terrestrial water storage, and the changes in the total terrestrial water storage are what we infer from the GRACE and GRACE Follow-On measurements. I'm circling here in yellow a drought in the Midwest first, and then on the right a very wet year when there was flooding on the Mississippi — you can see that there's not only seasonal variability but also interannual variability in this total terrestrial water storage.
To see what we get from GRACE and GRACE Follow-On, look at the bottom right: this is an animation of monthly observations of terrestrial water storage anomalies, meaning relative to the long-term mean terrestrial water storage. At each location, if it's blue that means there's more water than normal, and if it's red there's less water than normal. If you focus on a particular region, for example the Amazon, you can see that there's a seasonal cycle, but there's also this interannual variability. These are equivalent heights of water in centimeters, in the lower right there, all relative to the long-term mean. If we remove that seasonal cycle and then fit a linear trend at each location on Earth using all the data — in this case we used data from GRACE from 2002 to 2016 — we can map the long-term rate of change of terrestrial water storage at all locations across the land. Then the question is: which of these apparent trends in terrestrial water storage are just natural variability, which ones may be caused by human actions or water mismanagement, and which ones may be associated with climate change? One of the first trends we zeroed in on, when we had GRACE data back in 2009, was this big bullseye in northern India. It turns out this is caused by farmers pumping a lot of groundwater to irrigate their crops. They pump the water out of the aquifer, they spread it across the farmland, most of that water evaporates, and you end up with a net loss of water in the region. We estimated based on GRACE that the rate of water loss in the region outlined here is about 19.3 cubic kilometers per year. To put that in perspective, the largest surface water reservoir in the U.S., Lake Mead, holds about 30 or 35 cubic kilometers of water. So in just about a year and a half, northern India loses about one Lake Mead's worth of water, which is a lot.
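The per-pixel processing described here — remove the seasonal cycle, then fit a linear trend to the terrestrial water storage anomalies — can be sketched with an ordinary least-squares fit. This is only an illustration with a made-up function name and synthetic numbers; the real GRACE processing involves additional corrections (for example, for glacial isostatic adjustment) not shown here.

```python
import numpy as np

def tws_trend(t_years, tws_cm):
    """Least-squares fit of an offset, a linear trend, and an annual
    cycle to one grid cell's terrestrial water storage (TWS) anomaly
    time series; returns the trend in cm of water per year."""
    t = np.asarray(t_years, dtype=float)
    y = np.asarray(tws_cm, dtype=float)
    # Design matrix: constant, trend, and annual sine/cosine terms,
    # so the seasonal cycle is accounted for while fitting the trend.
    G = np.column_stack([
        np.ones_like(t),
        t - t.mean(),
        np.sin(2.0 * np.pi * t),
        np.cos(2.0 * np.pi * t),
    ])
    coeffs, *_ = np.linalg.lstsq(G, y, rcond=None)
    return coeffs[1]  # cm per year

# Synthetic monthly series, 2002-2016, with a seasonal cycle and a
# built-in loss of 1.5 cm/yr (all numbers invented for illustration).
t = np.arange(2002.0, 2016.0, 1.0 / 12.0)
y = 2.0 - 1.5 * (t - t.mean()) + 5.0 * np.sin(2.0 * np.pi * t)
print(round(tws_trend(t, y), 3))  # recovers the -1.5 cm/yr trend
```

Fitting the seasonal terms jointly with the trend, rather than subtracting a monthly climatology first, is one simple way to keep an incomplete final year from biasing the trend.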
This opened some eyes when we first published it back in 2009, and India has become much more aware of and serious about the issue. What's interesting is that their not really being aware of this and paying attention was not an issue of data scarcity. This is a map of wells — all the little dots are well locations. The problem is that these data were not publicly available, and so people couldn't do the sort of scientific analysis needed to understand what was going on. Here is the map again of the apparent trends in terrestrial water storage. If you're interested, back in 2018 we published a paper in Nature where we tried to explain all the major trends that we saw in the GRACE data and attribute them either to natural variability, direct human impacts, or possible climate change impacts. Changing gears a little bit here: you may have noticed that the GRACE and GRACE Follow-On data resolution is pretty low — the spatial resolution is low, and the temporal resolution is about monthly. There's also a significant latency: typically we don't get the nominal GRACE and GRACE Follow-On fields until two to five months after real time. To deal with that, one of the things we do — and we do this for many other variables in hydrology — is use something called a land surface model, which is like the land component of a weather or climate prediction system, except more or less optimized for terrestrial hydrology. It's a computer model that divides the world up into a grid, and it may then divide each grid square into subgrid tiles, in this case based on the land cover type. It uses a system of physical equations to solve for the water and energy cycle components — the fluxes and states of the water and energy cycles. The inputs are things like precipitation and solar radiation.
The outputs are things like snow water equivalent, soil moisture, runoff, and evapotranspiration, which are not necessarily easy to measure. As I said, we use these land surface models for data integration. We have a lot of observations from satellites and ground-based networks that are not necessarily exactly what we want to know, or there may be gaps in the observations. So we can integrate the data within the model and fill in these gaps. We do this by inter-comparison and optimal merging of the available fields. We use things like precipitation and solar radiation as inputs to drive the model forward — we call that forcing. We also do a process called data assimilation, where we optimally merge an observation — in this case snow cover — with the simulated snow water equivalent in the model. Snow water equivalent is a much more valuable variable than snow cover from the MODIS sensor on the Terra and Aqua satellites, because snow cover is binary — snow or no snow — whereas snow water equivalent tells you the amount of water in the snowpack available for water resources. Then we can use ground-based observations to validate the output. We've been doing this with the GRACE data for more than 10 years now, and we've been applying it to develop, for example, drought and wetness maps. The upper left here shows a typical GRACE terrestrial water storage anomaly map over the U.S. — this is from May 2014. We assimilate that into the model; the model allows us to spatially and temporally downscale it and also bring the information up to near real time. And so we get outputs — for example, surface soil moisture, root zone soil moisture, and groundwater, across the bottom. What we've done here is used a long-term simulation of the model, going back to 1948, to develop a climatology.
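The "optimal merging" idea at the heart of data assimilation can be illustrated with a minimal scalar update that weights the model estimate and the observation by the inverse of their error variances. This is only a sketch with hypothetical numbers — operational systems use far more sophisticated, multivariate, ensemble-based machinery — but the principle is the same.

```python
def assimilate(model_value, model_var, obs_value, obs_var):
    """Merge one model estimate and one observation, weighting each by
    the inverse of its error variance (a scalar Kalman-style update)."""
    gain = model_var / (model_var + obs_var)
    analysis = model_value + gain * (obs_value - model_value)
    analysis_var = (1.0 - gain) * model_var  # merged estimate is more certain
    return analysis, analysis_var

# Hypothetical numbers: modeled snow water equivalent of 120 mm with
# error variance 400, an observation of 100 mm with error variance 100.
swe, var = assimilate(120.0, 400.0, 100.0, 100.0)
print(swe, var)  # 104.0 80.0 -- the merge leans toward the more certain value
```

Note that the merged error variance (80) is smaller than either input's, which is why assimilation improves on both the raw model run and the sparse observations.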
Then for each location, what we're plotting is the wetness percentile relative to that climatology. So, for example, if a given location is in the fifth percentile, that means it has been drier than that at that location, at the same time of year, only 5% of the time in the past 70 years or so. We developed these about 10 years ago, and we make them available at nasagrace.unl.edu. They're actually one of the inputs to the U.S. Drought Monitor, which you may have seen in the back of USA Today — it's sort of the premier drought map for the U.S. One nice thing about our maps is that they divide things up into three levels — surface soil moisture, root zone soil moisture, and groundwater — and we also account for wetness, not just dryness. One thing we've done recently is develop a forecast capability. On the right there, we're showing on top our forecast for root zone soil moisture, initialized on February 1st and valid for March 1st, and the lower right shows the actual March 1st soil moisture conditions. You can see we did pretty well — it was a little wetter across the mid-Atlantic than we expected, but otherwise a pretty good result. The other thing we're doing now is GRACE data assimilation at the global scale. This animation shows in the top left the model running in open loop, which means before we assimilate the GRACE data; the upper right shows the GRACE and GRACE Follow-On observations, and you can see that the spatial resolution is quite a bit coarser in the GRACE observations. When we assimilate the GRACE observations into the model, you get the maps on the lower right. You can see it maintains the patterns seen in the GRACE observations but also has the higher spatial resolution of the model, which is attributable to higher-resolution inputs such as precipitation, solar radiation, vegetation type, soil type, and so on.
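The wetness percentile described at the start of this passage — where does today's value fall within the historical distribution for the same location and time of year — might be computed along these lines (the function name and data are hypothetical; only the empirical-percentile idea is taken from the talk):

```python
import numpy as np

def wetness_percentile(value, climatology):
    """Percentile of the current value within the historical record for
    the same location and time of year: the percentage of past values
    that were drier (i.e., below the current value)."""
    clim = np.sort(np.asarray(climatology, dtype=float))
    rank = np.searchsorted(clim, value, side="right")
    return 100.0 * rank / clim.size

# Hypothetical 70-year record of March 1 soil moisture at one location.
rng = np.random.default_rng(0)
history = rng.uniform(10.0, 40.0, size=70)
today = 12.0  # current soil moisture, same units as the record
print(wetness_percentile(today, history))
```

A value near the 5th percentile would be flagged as severe dryness on such a map, while one near the 95th would be flagged as unusual wetness.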
So in summary, as I said, it's expensive and difficult to make ground-based observations, and making them everywhere around the world is basically impossible; that's why we use remote sensing to fill in these gaps. The GRACE and GRACE Follow-On missions provide a unique type of observation: there's no other way to measure total terrestrial water storage from space. These terrestrial water storage observations have been used to identify areas of increasing and decreasing freshwater availability, which we've attributed to natural variability, direct human impacts, or climate change. We use land surface models to integrate various types of observations, fill in spatial and temporal gaps, and bring the data up to near real time. We've been doing this with the GRACE and GRACE Follow-On data, enabling us to map and forecast wetness and drought conditions. And as Steve alluded to earlier, we're now studying architectures for the next-generation mass change mission that would be the successor to GRACE Follow-On, which launched in 2018, as recommended by the National Academies' Decadal Survey in Earth sciences in 2017. So thank you for listening. I'd be happy to take questions. Thanks, Matt. That was great. We have time for one question. There was a question in the chat about the spatial resolution of GRACE-FO, if you could answer that real quick. Yeah, so the spatial resolution varies depending on whether you're looking at low latitudes near the equator or near the poles. Generally, it's on the order of about 100,000 to 150,000 square kilometers at mid-latitudes, so you're talking about the size of the state of Illinois, for example. Hey, Matt, why don't you hang around, and we'll have more questions after Bridget's talk. So our next speaker is Bridget Scanlon, and she is a senior research scientist in the Bureau of Economic Geology at the University of Texas at Austin.
Her research focuses on the evaluation of the impact of climate variability and land use change on groundwater recharge, application of numerical models for simulating variably saturated flow and transport, and controls on nitrate contamination in aquifers. So Bridget, I'll let you take it away. Can you see the full screen now? Yes, we see it great. So I'm going to talk about comparing GRACE satellite data with monitoring and modeling data, and I'm going to focus on the US, where we have quite a lot of data for comparison. I would like to acknowledge the other people working with me on this project. Dr. Ashraf Rateb was the postdoc, and it was a collaboration with US Geological Survey people, Don Pool, Lenny Konikow, and Ward Sanford; the NASA folks, David Wiese; and Himanshu Save and Alex Sun at our university. This was funded by USGS and NSF, and we had week-long programs each summer where we sat together and discussed, and I think it benefited both groups: the NASA folks helped us better understand GRACE, and vice versa. So Matt provided a great introduction to GRACE satellite data, and I'm just going to follow up on that. We're talking about water storage, which represents the balance between inputs and outputs, and there's a lot of emphasis on water scarcity these days. Reductions in storage can result from a reduction in input, such as during drought, or an increase in output, related to irrigation pumpage or things like that. So Matt mentioned the traditional approaches that we use for monitoring storage, and groundwater is a big component of total water storage in many systems. So we look at groundwater levels and multiply those by a storage coefficient to estimate groundwater storage.
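The water-level-to-storage conversion Bridget just described is a simple product. The storage coefficients below are illustrative of the ranges she mentions next, not values from any particular aquifer:

```python
def storage_change(head_change_m, storage_coefficient):
    """Change in stored water, as an equivalent water thickness in meters."""
    return head_change_m * storage_coefficient

drop = -10.0  # a 10 m water-level decline
print(storage_change(drop, 0.2))    # -2.0 m for an unconfined aquifer (Sy ~ 0.1 to 0.3)
print(storage_change(drop, 0.002))  # -0.02 m for a confined aquifer, orders of magnitude less
```

The hundredfold spread between the two answers for the same head change is exactly why, as she notes, a well open to both aquifer types is hard to interpret.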
There's a lot of uncertainty in storage coefficients, between shallow, unconfined aquifers near the land surface, where storage coefficients could be like 0.1 to 0.3, and deeper confined aquifers, where they could be a couple of orders of magnitude less. And when we have wells going through both, it's very difficult to figure out what the water level change means in terms of storage. Regional models have been developed by the USGS, state agencies, and many different groups to look at groundwater storage, traditionally by taking recharge estimates and subtracting discharge to estimate the change in groundwater storage. And we think that these are very accurate because they're high resolution, but that's not necessarily the case, because aquifers develop as I show in this slide here, from Lenny Konikow's work. Initially, most of the water that's pumped out of an aquifer comes from storage, so storage declines rapidly, but over time, as those cones of depression expand, you capture more water from surface water or increased recharge. This is important to understand, and on the right I show that when you drill a well near a stream, initially the water is coming from aquifer storage, and then over time you may be capturing some of the streamflow; this will be important when we try to understand the GRACE data later. So Lenny Konikow developed these maps of groundwater depletion in the major aquifers in the US, and this is extensive, covering the period 1900 to 2008. You can see the well-known hotspots of depletion: the Central Valley, minus 145 cubic kilometers over this time; the Arizona alluvial systems, minus 100; the High Plains, minus 340; and the Mississippi Embayment, minus 180; with slight rises in the Northwest in the Columbia Plateau and Snake River Plain, and very little change along the East Coast. So this has been a great framework to look at the GRACE data. So I'm just going to look at three basic questions.
How reliable are the data when compared with groundwater levels or regional models? What's causing the storage changes: is it climate, human use, or both, like Matt was talking about globally? And then, how can we use these results to move towards more sustainable water management? We use different GRACE solutions, from the UT Center for Space Research or JPL, or spherical harmonic solutions, and we look at climate impacts with detailed data in the US, and at human water use. And then we estimate groundwater storage by subtracting those other components, snow, surface water, and soil moisture, from the products that Matt was describing. So we compare the data with regional models that the USGS and other groups have developed in different aquifers, and with groundwater level data from up to 23,000 wells; I won't have time to talk about comparisons with the global models. So this is the trend in total water storage from 2002 through 2017, so over this 15-year period. You can see the reds are declines in storage in the Southwest and South Central US; the yellows are buffer zones with almost no change; and the greens and blues are slight increases in storage in the humid East and in the northern and northwestern regions. And then when we subtract those other components, we end up with the groundwater storage. These numbers are very similar to the numbers we saw for total water storage, just slightly less, related to the soil moisture storage impacts: declines here of almost 30 cubic kilometers in the Central Valley, 40 in the Central and Southern High Plains, an increase of 20 in the Northern High Plains, the Nebraska area, very little change in the Mississippi Embayment, and slight increases here. So if we compare those data with what Lenny Konikow had in his earlier work, you can see that they're similar in the Central Valley between the two, even though the time periods are slightly different.
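The disaggregation step described above is, at heart, a subtraction of the modeled non-groundwater components from the GRACE total. A minimal sketch with invented anomaly values, in centimeters of equivalent water thickness:

```python
def groundwater_anomaly(tws, snow, surface_water, soil_moisture):
    """Groundwater storage anomaly left after removing the other storage components."""
    return tws - snow - surface_water - soil_moisture

# Invented anomalies: total storage is down 5 cm, snow is up 1 cm,
# surface water and soil moisture are each down a little.
print(groundwater_anomaly(-5.0, 1.0, -0.5, -2.5))  # -3.0 cm attributed to groundwater
```

Because it is a residual, any error in the modeled snow, surface water, or soil moisture terms maps directly into the groundwater estimate, which is why the later discussion of component uncertainty matters.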
This is a 15-year time period, and his is the most recent eight-year time period from his work. So the declines are similar in the Central Valley, and there's very little change, or a slight rise, over this period in Arizona, from managed aquifer recharge and Colorado River water. Similar declines, 40 cubic kilometers, in the High Plains, and when you go into detail you can see that this is mostly in the Central and Southern High Plains. But the biggest difference, what was really surprising to us, was the Mississippi Embayment: we have almost no change in storage, and they have 100 times greater depletion in storage from the regional models. So who's right? We have to duke it out. It gets quite complicated. I tried to show some data: this was an older model, and there were only 40 streams within the model, so it underestimated the stream capture I spoke about earlier, and most of the water may not be coming from storage; recharge was estimated from the literature. So there were a lot of issues, and they are now updating the model and improving it. And I think that shows where GRACE can question our thinking and help us reevaluate. So here I'm showing comparisons with the groundwater levels. This is the Northern High Plains, where we saw a trend of about plus 20 cubic kilometers. Is it really a trend? Do we expect it to continue into the future, or is it just interannual variability? Over this time period the linear trend comes up positive, but for different time periods you get different trends. The Central and Southern High Plains show a decline in storage, and this we're well aware of, because there's very little recharge and there's no surface water. And the Tulare Basin in the Central Valley declines; there's very good correspondence between GRACE, in black, and the groundwater levels in many of these systems, a correlation of 0.95 here. And then the Mississippi Embayment: groundwater levels on average increasing and decreasing over time.
So, a good qualitative comparison with groundwater levels; I'm not going to go into the details of all of these systems, but basically there's a good qualitative correspondence between the GRACE time series and groundwater level data for each of the aquifers. We also compared with the regional models. This is the Northern High Plains; the time period of overlap is pretty limited, but you can see generally good correspondence there. In the Southern High Plains, the GRACE data is more dynamic and the model is more uniform. And then there's the big discrepancy that we found in the Mississippi Embayment system, where GRACE suggests flat and the model suggests declines. To evaluate whether it's a true trend or just interannual variability, we compared the trends to reconstructed interannual variability over the past 100 years, based on Humphrey's data, and then we calculate the ratio of trend to interannual variability. Where the trends exceed two to three standard deviations of interannual variability, we feel like they are reliable, and this occurs mostly in the Central Valley and the Central and Southern High Plains region. So, we've looked at the reliability by comparing with water levels and regional models; now I want to look at the causes of the change. Here we are comparing GRACE total water storage with the US Drought Monitor data and precipitation anomalies. And you can see the 2007 to 2009 drought and the 2012 to 2015 drought, with declines in storage corresponding to the US Drought Monitor. This is the Sacramento Valley and the San Joaquin, with very high correlations with the US Drought Monitor, so drought has a big impact on storage there. And then the Arizona alluvium and Northern High Plains show good correspondence with the US Drought Monitor, maybe because of surface water irrigation, and there's very little drought in the humid Eastern US.
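The reliability test described above reduces to comparing a trend against a multiple of the interannual variability. A sketch with an illustrative threshold and invented numbers (units would be, e.g., cubic kilometers over the record):

```python
def trend_is_reliable(trend, interannual_std, k=2.0):
    """Flag a trend as reliable only if it exceeds k standard deviations
    of reconstructed interannual variability (k of 2 to 3 per the talk)."""
    return abs(trend) > k * interannual_std

print(trend_is_reliable(-30.0, 8.0))  # True: depletion well outside natural variability
print(trend_is_reliable(5.0, 8.0))    # False: within interannual noise
```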
And then there's drought in the Southern High Plains and South Texas region, with good correspondence with GRACE. So in the Southwest and South Central regions, drought seems to be pretty important. Then there's land use change, and we focused on irrigation in these different aquifers: the Central Valley, Snake River Plain, Columbia Plateau, High Plains, and Mississippi Embayment. So 2010 was a wet year in the Central Valley and 2015 was a dry year. You would think that irrigation would increase during drought, but actually the total water use decreases during drought because of land fallowing. In the different colors, the light blue represents surface water use and the dark blue represents groundwater. When you go from a wet year to a dry year in the Central Valley, you increase groundwater pumpage, and so you go from 70% surface water during a wet year to almost 70% groundwater. And this is one of the major reasons for the reduction in storage. And I never realized that they pump more groundwater in the Mississippi Embayment than they do in the Central Valley for irrigation. That's why it's really shocking that we don't see total water storage depletion there. But as I mentioned earlier, they are capturing perennial surface water in this humid region, and that can explain it. I'm not going to go into detail on the others, but what can we do to make water resources more sustainable in the future? I think where we have access to surface water, such as in California and Arizona and other regions, conjunctive use of surface water and groundwater is very important, and inefficient surface water irrigation is similar to managed aquifer recharge. When we're using groundwater we need a very efficient system, and managed aquifer recharge is now increasing in different regions. In areas with no surface water availability, like the High Plains, you just have to use very efficient groundwater irrigation and minimize pumpage.
I've tried to show how we compared GRACE data with groundwater monitoring and regional models, and how it helps to constrain the regional models. The bottom line is we need to use everything we can put our hands on to try to understand how these systems are working. This is proving useful in the US, and I think the USGS in the future will use GRACE to constrain their regional models, and it can also help us develop more sustainable water management programs. So, thanks very much. This is the group that was funded, and we had week-long summer sessions to work on this topic for the last couple of years. I'm happy to answer any questions. Yes, we have time for a few questions. I'll read a few questions here from the chat. The first one, from Patricia, is: does GRACE data do equally well for estimating water storage in continental interiors versus the edges of the continent, e.g., the Florida and Texas coasts, and does aquifer shape matter? I can try to address that. The new mascon solutions try to help with the issues with large gradients along the coast and correct for those, but Florida is problematic, and we had differences between the Center for Space Research and JPL data in Florida; Himanshu Save and others have looked at that, so Florida is difficult. The Northern High Plains in Nebraska is an ideal aquifer shape for GRACE; the Central Valley is not ideal because it's north-south and it's small. So the bigger the aquifer the better, and the more rounded the shape the better also. I don't know if Matt wants to add anything. No, I think you covered it. Okay, we have another question here from Rosemary. She says: great work, Bridget. How do you go about getting the estimate that you need for the storage coefficient, to go from water level measurements to change in storage? Excellent question.
So, the storage coefficient doesn't affect the correlation between GRACE storage and groundwater-level-derived storage; it just affects the position of the line in the graph. And so you can back-calculate an effective storage coefficient by correlating median groundwater levels for the aquifer with the GRACE storage. Or you can see what the regional models indicate the average storage coefficient is, and either way we got fairly similar results, but we worked with Ginny McGuire and many other people on these regional models. And, you know, it's a bit of a chore. All right, now I'd like to bring Matt back into the discussion so we can have a kind of group discussion here. As a reminder, you can type your questions into the Q&A at the bottom. But as moderator I'm going to ask the first question of both Matt and Bridget, which is: we've all been involved in these GRACE measurements as applied to hydrology for a long time, and I know early on the hydrology community didn't really accept the GRACE measurements; I think they were so different from what they were used to. I'm wondering now, here we are 20 years later, do you feel like the hydrologic sciences community has embraced this satellite gravity data, or do you feel like we still have to convince them to make use of it? I think most of the convincing was done in probably the first 10 years of GRACE. I remember one of the important things was that Bridget and I worked on convincing the members of the GRACE science team to produce a gridded GRACE product, and there was some resistance at first; some of the geodesists felt like if they put a product out there, it would be misused by the community.
And that's not necessarily incorrect, but I think in order to have people embrace it, which they eventually did, having that sort of standard product made available, without needing to have a geodesist as a collaborator, was one of the keys to getting more hydrologists using GRACE data and getting more comfortable with it. Like I said, there's still definitely some misuse of the data, but at this point I would say that it's definitely accepted by the hydrologic community, and the fact that a mass change mission was so highly ranked in the National Academy of Sciences Decadal Survey in 2017, a lot of that was because of what GRACE did for hydrology. So Steve, I'll just add a little bit to that. I mean, I was one of those skeptics early on. But working with the data, you realize its value. And it's kind of like being a parent: you always want your kid to be something they're not. People always want GRACE to be something it's not. Its forte is large scale; it's ideal for global sea level change. The models are not, you know, and so we need to embrace what the positives are, and not emphasize trying to apply it to a very small aquifer or things like that. And I think the gridded product really helped. Whenever I do studies I try to work with the people who produce the data, whether it's global modelers or regional modelers or groundwater level people, so I work with all of those people to make sure that I'm representing what they're doing appropriately, and I think hydrologists should do that. And what really convinced me is that over the weekend, we were responding to reviewers' comments, and some of the USGS people said GRACE is right for the Mississippi and the regional model is incorrect and outdated. So I felt like we've come somewhere, you know. But anyway, other questions? Well, I have another one then.
So I'm going to take it a little bit outside maybe your areas, but in another year or two we're going to launch the Surface Water and Ocean Topography mission, and the first two letters of that, surface water, are certainly applicable here. Do either of you have a feel for how that will change what you do and might augment the GRACE data? I do think that of the terrestrial water storage components, surface water is sort of the least appreciated by people studying the GRACE data, and least well understood. I think we have better understanding now that in places like the Amazon, surface water is a significant component of terrestrial water storage. And so having a mission like SWOT will enable us to better quantify the changes in surface water storage, and therefore better understand how the component changes of terrestrial water storage contribute to the whole that's observed by GRACE, GRACE Follow-On, and a future mass change mission. So yes, I do think it's going to be valuable. The other part of that is that we like to be able to close the water cycle, meaning the change in terrestrial water storage is equal to precipitation minus evapotranspiration minus runoff. We can observe precipitation fairly well; we do okay with evapotranspiration, not great; and runoff was something that was missing in terms of satellite-based observations. So SWOT will enable us to sort of close the water budget using all satellite-based observations. I think it will be transformative. Last week I was talking to a group from Uganda, and Ashraf had pulled the altimetry data for Lake Victoria, and they're concerned.
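The water-budget closure mentioned just above is the simple balance below; the numbers are invented monthly fluxes in millimeters, purely for illustration:

```python
def storage_change_rate(precip, evapotranspiration, runoff):
    """dS/dt = P - ET - R for a region, in consistent units (e.g., mm/month)."""
    return precip - evapotranspiration - runoff

print(storage_change_rate(80.0, 55.0, 20.0))  # 5.0 mm/month of storage gain
```

With GRACE supplying dS/dt and satellite precipitation and evapotranspiration estimates, the runoff term is the piece a surface-water mission helps pin down.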
They're so dependent on hydropower, and are they vulnerable to climate cycles and teleconnections? So the more data we have on reservoir storage and things like that will help us understand the differences between hydropower and climate forcing; that's just one example, but I'm sure there'll be lots and lots more, so it can be huge. Thank you. I have a couple of questions in the chat here. The first one is: were there cases when the GRACE data was somewhat wrong and the traditional groundwater model was correct? If there was such a case, then what was the reason? I mean, I think for the Central Valley, maybe the GRACE data wasn't as accurate, but it got the general trends, so in that situation maybe you would believe the regional model more than the GRACE data, but they were both fairly similar. In most cases we didn't have regional models that overlap with the GRACE data, because it's a total chore to update those regional models. I mean, it is a nightmare to pull all those data together, so we have very few that corresponded in time, and it's very difficult for them to update them and keep them alive. The Northern High Plains, I think that was a good one. I think most of the regional models are not getting the dynamics of recharge; they simplify the recharge, and they're not getting the dynamics that we see in GRACE. And the global models, I think, overestimated depletion in many of these aquifers; Ashraf did the comparison with the global models. I think we have a ways to go. Modeling is pretty complicated. I'll just add to that that I think in most cases the GRACE data are not wrong; it's a matter of misinterpretation.
And so I would give an example. I worked on a study in the High Plains aquifer, where we have a lot of observations, and the observations didn't match up with what we were observing from GRACE right away. And what we realized is there are different levels: there's the deeper High Plains aquifer and there are these shallow alluvial aquifers, and they're not necessarily both well represented in the observations. But when you do a careful analysis and account for both of them, as well as the soil moisture, then you get a much better comparison with the GRACE observations. And we've seen something similar in India, where most of their ground-based groundwater observations are measuring just the shallow, unconfined aquifer, and they're not measuring the deeper aquifers that are being depleted. And so there are some studies that were actually published that said GRACE is overestimating the depletion in northern India. My response is: no, it's not; you're looking at these shallow well observations that don't tell the whole story about total groundwater change in India. But, Steve, just one last point: Florida was the case where there was a discrepancy between JPL and the Center for Space Research, so it wasn't very reliable; we didn't do a detailed comparison, but I wouldn't trust the GRACE data very highly for that sort of system. All right, I have one more question here. When estimating groundwater changes, how do you remove signals of snow and ice, tree water content, and surface water bodies from the total water storage measurements? So we typically use other observations or models, and this is not going to be perfect. I can say that for the biomass water storage changes, we did a study on that back in 2005 and were able to determine that those are within the uncertainty level of GRACE, so they're pretty small compared to the other components.
But if you happen to be in a region where snow water equivalent and soil moisture, and perhaps surface water, are significant components of terrestrial water storage (and usually one or more of those is a very significant component), then you'd better do a good job modeling them, or else your results are going to be imperfect. And I think that's what we have to accept: in removing those other quantities we do end up with imperfect results, and it's just a matter of accounting for that in our uncertainty bars. Do you want to add anything to that, Bridget? I guess the other thing, Matt, is, just like your 2018 paper, total water storage is an excellent measurement, and I think we could put a lot of emphasis on that to begin with, rather than always jumping to groundwater or something else. So I think we can learn a lot from total water storage. Matt Pritchard has a question. Matt. Yes, thanks for these great presentations. Bridget, I guess I was curious about your work at the Powell Center. I've been involved in the Powell Center a little bit on the volcano connections between the USGS and the remote sensing community. And if you could just say a little bit about, well, I think one of the themes that's come out of today's meeting is really the need for interdisciplinary communication.
And, you know, people learning to speak the same language, and going to sessions outside your field, whether it's hydrologists going to learn about space gravity or whatever, there's a lot of a learning curve, and we're not always communicating with each other. So I guess I'm just wondering if you have any thoughts, based on that experience with the Powell Center, where you were bringing together this sort of interdisciplinary group, on how to break down barriers between groups. What are lessons learned that could be transferred? Do we need to all go on white water rafting trips to be able to make this happen? What do you think? I think so. The Powell group, and actually Matt participated in another Powell group, I guess it's a little bit different. But we had these week-long meetings with the people from NASA and the UT Center for Space Research, and then many USGS folks, the people doing the groundwater level monitoring, the people doing regional modeling, all together, and I think it really helped the conversation, and it helped them understand GRACE a lot better too. So I think that was really good, and then Ashraf did a lot of the data analysis for different groups. It's still very difficult, and I think most of my collaboration with USGS is with some retired people, who seem to have more time to devote to some of these things. We had funding for one postdoc, so everything else was kind of on our own. But the other thing I think about all this large group research that we do these days is, you write a paper and you have everybody and their brother on it; nobody agrees, the co-authors don't even read it. It's very difficult to get critical viewpoints in these sorts of settings until you send it out for review, you know.
So I think I benefited from working with the USGS guys and learned a lot from them, even in the past few weeks. And so it's a lot of back and forth, one on one, talking with the people that are developing the new Mississippi models, talking about their airborne EM and all of this stuff. It's a huge chore, but the Powell Center and that approach really helped. Matt, I don't know what you would say from your experience with Powell meetings. No, we unfortunately didn't do that when I was involved in the Powell Center, I think. I mean, of course, getting these cross-community collaborations going is really important, and that's what I was saying before about these meetings between geodesists and hydrologists: they really helped the geodesists understand what the hydrologists needed, and then the hydrologists became more interested and embraced the data. One of the struggles we have now is research to applications. Scientists can develop a tool, but then actually getting that into operations is so difficult, because typically the people who work on a weather forecast or a streamflow forecast spend 95% of their time just churning out the product, and maybe only 5% of their time is available for research and looking at something new. And so that's one of the struggles we have: we can develop new, useful products that make use of GRACE and other data, but then the uptake is really hard to push along. I'd just like to add one other thing: we've talked about satellite gravimetry, but I think ground-based gravimetry is way underutilized also.
You know, it would help with the storage coefficients in the Mississippi Embayment and that sort of thing with their regional model, but instead they spent a bomb on airborne EM, which is great to have too, but ground-based gravimetry is a real nice supplement to satellite gravimetry. Right, there were a couple of questions at the end of Bridget's talk that I want to return to; they're related. One was: which one was right in the Mississippi Embayment, GRACE or the groundwater model? And the second one was related to that: how do you relate these differences in the Mississippi Embayment to uncertainties in the different data sets that you used? So, the first question, which one is right. The regional model that was being used was developed and published in 2009, so it's quite dated, and they've been updating it since. And the recent data that I've gotten on some of that suggest very low changes in storage in some of those units, so it seems like GRACE is on the right track, and I think one of the issues is under-representation of stream-aquifer interaction in the regional model. And I just found out today they had some problems with cells going dry, and they represented them mostly as confined cells, and so there are a lot of things. It was the first model developed for the region, so I think now they're improving it a lot, and I think GRACE will help constrain some of those results. And what was the second question, Steve? Oh, the second part of that was just about the uncertainties. There's no shortage of uncertainties, and I think it comes down to your conceptual understanding, your conceptual models. That original model had 43 streams in its network; now they have 1,000 kilometers of stream network, and that should help with that.
The airborne EM will help with looking at the connectivity between streams and groundwater and improve recharge modeling. You have to start somewhere, with an initial model, and I think now they're improving it, and I'm hoping that the USGS will use GRACE data to constrain their regional models in the future. And so there was a follow-on question about ground-based gravimetry. The question was: how much use of ground-based gravimetry has been done to look at the GRACE results, to compare them? The USGS in Arizona has done a lot of ground-based gravimetry, and they've done some limited comparisons for the Arizona alluvial system there, but we don't have the networks in place to do a larger-scale comparison between GRACE and ground-based gravimetry. I think we should be moving in that direction, though. I imagine one of the problems is that the ground-based measurements are very sensitive locally, and GRACE is giving you a 300-kilometer average, so it may be very hard to compare a single ground-based measurement to GRACE. Go ahead, Bridget. I was just going to say that it's really cool, what Bridget was saying, that if you have a ground-based gravimeter co-located with a well measurement, then you can use the information to understand, for example, the storage coefficient, which is pretty cool. In terms of difficulties, I would say that I've also seen at least one other study comparing ground-based gravimetry with other observations, and one of the complications is if you have a mountain range nearby: they actually saw that when the mountain range had snow on it, that actually sort of pulled upward on the gravimeter.
So you had the opposite effect: there's more water in the area, but less gravitational pull at the site of the gravimeter, so it requires some careful interpretation. Okay, we're at the end of our time, so I'm going to thank our speakers, Bridget and Matt, those were great, and I'll pass it back to Matt Pritchard for some closing comments. Great, thank you, Steve, and Bridget and Matt, that was a great session; I think there were a lot of good lessons learned, and we all learned some new things. All right, so we're going to have a final discussion, plus a couple of comments from speakers on points that came up earlier today. Let me share my screen: I also tried to put together a little summary of things I heard today, so this is just my hot take on what happened. In the final few minutes we're glad to take any additional thoughts, or things I missed that we should talk about more. I feel it was an extremely successful day, and I thank all the speakers for giving great presentations. I feel we've shown today that there has been amazing progress in applying geophysical data to a variety of environmental applications, improving the spatial and temporal resolution of our ability to measure certain phenomena. Here is just a partial list of some of the things we heard today. I think the other really interesting thing was the sheer variety of types of geophysics used: it's not just environmental seismology or environmental geoelectrics. Each of these techniques has its strengths and weaknesses, but used together they make quite an impressive arsenal.
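As an aside on the gravimetry points raised in the Q&A above (estimating a storage coefficient from a co-located gravimeter and well, and the upward pull of a snow load on a gravimeter), here is a minimal back-of-envelope sketch. The infinite Bouguer slab and point-mass idealizations, and all of the numbers, are illustrative assumptions of mine, not values from the talks:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0     # density of water, kg/m^3
MS2_TO_UGAL = 1e8      # 1 m/s^2 = 1e8 microGal

def slab_gravity_change(water_table_change_m, specific_yield):
    """Gravity change (microGal) at the surface from a water-table change
    in an unconfined aquifer, using the infinite Bouguer slab approximation:
    dg = 2*pi*G*rho*(Sy*dh), about 41.9 uGal per meter of free water."""
    water_thickness = specific_yield * water_table_change_m
    return 2 * math.pi * G * RHO_WATER * water_thickness * MS2_TO_UGAL

def specific_yield_from_pair(delta_g_ugal, water_table_change_m):
    """Invert the slab formula: a co-located gravimeter and well pair
    yields an estimate of the storage coefficient (specific yield)."""
    slab_per_meter = 2 * math.pi * G * RHO_WATER * MS2_TO_UGAL
    return delta_g_ugal / (slab_per_meter * water_table_change_m)

def snow_point_mass_pull(mass_kg, distance_m):
    """Upward attraction (microGal) of a snow load idealized as a point
    mass above the station; it *reduces* the measured gravity even though
    there is more water in the area."""
    return G * mass_kg / distance_m**2 * MS2_TO_UGAL

# 1 m water-table rise with specific yield 0.2 -> a few microGal
dg = slab_gravity_change(1.0, 0.2)
# Recover the specific yield from the paired gravity/well observation
sy = specific_yield_from_pair(dg, 1.0)
# Hypothetical snow load: 1 m water equivalent over 1 km^2 (1e9 kg), 2 km away
snow = snow_point_mass_pull(1e9, 2000.0)
print(f"slab: {dg:.1f} uGal, Sy: {sy:.2f}, snow pull: -{snow:.1f} uGal")
```

The slab and point-mass numbers come out at comparable magnitudes (a few microGal), which is why the snow signal can confound the aquifer signal and careful interpretation is needed.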
And again, consider the different ways these data are measured: ground-based, submarine, airborne, and satellite, with active and passive techniques. So another big take-home for me was not only the diversity of applications but the diversity of types of geophysical data. That prompts thinking about the challenges and opportunities of taking the next step: we've detected a great signal on our geophysical network, or with our geophysical measurements, so how do we then make it useful to society? I heard many different themes around the environmental applications, but the one that best summarized them for me is that we're at a point where there's value in community organization on these topics. Of course there are several separate communities, groups focused on just one technique or one application, but I feel the talks today are evidence that we could accomplish more by working together across a variety of fields of geophysics and talking across many different types of applications. Many comments were made about facilities, and a couple of things we heard more than once: kudos to our existing facilities, but there are always new data sets coming along, and lots of new challenges that we as a community of geophysicists need to think about. In particular, data sets are growing in volume, new data sets may not be of the same type as what we have archived in the past, and there are huge data sets that need to be processed. Another theme that came across was interdisciplinarity, both among different types of geophysics and with other communities of hydrologists, oceanographers, or geomorphologists. As a community, maybe that is something we can engage in more.
Certainly the question of funding came up many times: again, a lot of kudos to existing agencies for funding the PI-led projects that lead to these discoveries. But then the question is where we go next: how do we fund these for operational use? I think there were a lot of question marks about how all of that can happen, and calls for more discussion of best practices and examples that could make that sort of routine societal benefit a little more widespread. Another great question that came up a couple of times was outreach: communicating what types of signals geophysicists are now detecting. Are we so mired in the jargon of being data heads and signal-processing gurus that we can't take it the last mile to the user? Are we failing to help people understand what our capabilities are and how the data can be used? So the question of communication across disciplines is one where things like the Powell Center certainly help, but those are short-lived activities, and we need to find ways to sustain them longer. All right, that's my quick hot take. I'm glad to take any other comments in the chat, but a couple of points came up that I wanted to give people a chance to elaborate upon. So let's see if we can make this happen. Both Ben Phillips and Rebecca Bendick, from NASA and UNAVCO respectively, wanted to offer a couple of discussion points. So let's bring them into the conversation. Ben, maybe I'll have you go first: there was some discussion earlier today about funding some of these environmental applications of geodesy, with case studies showing it's not always easy to get that funding from some program solicitations if they don't think of, for example, GNSS as a NASA instrument.
So I just wanted to give you a chance to respond to that. Thanks, Matt. Let me know if I'm not coming through clearly; I'll cut out the video. Well, first of all, thanks very much to the committee for bringing together this topic, and to all the speakers for really interesting talks. I wanted to echo your comment, Matt, about opportunities across disciplines and across measurement types. In the context of today's presentations, the last session and the first, GRACE and GPS are a fantastic example of bridging scale issues and disciplines: for example, in areas covered in some of the presentations, the Central Valley and the Sierra Nevada, there have been studies integrating GPS and GRACE showing that there is, for example, additional water storage in the basement in the Sierra Nevada that potentially isn't captured in some hydrology models. So I'm really interested in the opportunities for further bridging these communities, getting to hydrogeodesy, and really linking up hydrology and solid earth investigators to make more progress in these areas. And on the GNSS front: fundamentally, from my perspective, and within the NASA mantra of focusing on remote sensing observations from space, GPS and GNSS are satellite-based observations; you don't get anything at your receiver without the constellations up there. So from my perspective, that's an area that's fully responsive, and an area in which NASA has historically made, and continues to make, significant investments, from hardware to analysis to research.
So, of course, we invested in some of the early networks and maintain our global GNSS network, along with analysis systems and products that contribute broadly, from our space missions down to basic research. We have a GNSS science team, which is explicitly interdisciplinary, to leverage and promote advancements using GNSS for our science. We also call for it explicitly through our annual ESI solicitation, where it's one of the relevant data types. So I can't necessarily speak to some historical challenges, but it is certainly a critical and relevant data type for NASA, especially where there's an opportunity to promote advancements in technology or analysis to get at new science. And certainly, as I started to say, to integrate GNSS data with other data types. So that's my perspective. Great, thanks. And in case I didn't introduce Ben properly, Ben is a program director of the Earth Surface and Interior program at NASA. Thanks so much, Ben. All right, I'll introduce our next speaker, Bex Bendick, who is the president of UNAVCO, the NSF-funded facility for geodesy, and hopefully you're going to unmute yourself, Bex. I won't take too much time. I just wanted to say that I thought this was a really interesting discussion, and a bunch of topics were raised throughout the day that we've been talking about a lot inside UNAVCO, because our two highest priorities for this year and the years to come are, first, to aggressively build out data federations that would allow cross-disciplinary data sets to be much more discoverable and, hopefully, much more interoperable.
So the vision is that the days of having to know the special website where you pick up weird data set X are gone: there will be tools leveraging cloud cyberinfrastructure so that you never really need to know which data archive the interesting things you're pulling are coming from, and they should be much easier to find in the future. In concert with that, we really recognize that it's one thing to find interesting data sets and another to be able to use them. There was a lot of discussion about requiring expertise in a particular data format or analysis technique, and inside UNAVCO we really view that as a barrier to the most innovative cross-disciplinary science. So hand in hand with building out these data federations, we're working on tools that would allow data to be much more interoperable and to be processed by people who don't necessarily have the highest level of domain expertise. We really feel that will enable the next generation of exciting discoveries. On that note, I would urge this group to reach out to me or anyone on the data team at UNAVCO, because we're actively looking for interesting use cases, and for input on how we should prioritize federating data. For instance, we already have beta products that let you see the SAR holdings and the GPS holdings, but what should we bring on next: should it be GRACE, or MODIS, or geochemical data? It would be really great to hear from the community. What are your dream scenarios, and what are the interesting use cases we can work towards? Please do reach out; this is a huge priority for UNAVCO. And as we join forces with IRIS, integration of geodetic and seismic data will clearly be seamless going forward.
Great, thank you so much, Bex, that's a great note to end on today. I don't see any other comments in the chat or Q&A, but I will say there was a question from Rosemary Knight about how big this community is and how many people attended today's webinar, since it's not always obvious to the attendees. Deb can correct me, but I saw us max out at around 160, and at least 70 people or so are sticking around to the bitter end. So I think there's a huge amount of interest: there were over 300 people who registered some form of interest in this webinar, so there really is a lot of enthusiasm around this topic that hopefully will carry forward. And yes, Deb is just telling me that my number of 160 was just how many were live at once, not even the total number of viewers, just to give you a sense of the scale of interest in this topic. So anyway, I think we'll end there. Finally, on behalf of the committee, I'd like to thank all the speakers and each of you for attending. As a reminder, the recordings will be available on the website in a few days, and we'll be sending out a short evaluation after the meeting; if you have a moment, please consider providing some feedback so we can continue to improve these meetings, which happen several times per year. Thank you again, and I hope you have a great afternoon.