Welcome to the April COGI seminar. My name is Pedro Arduino and I am the COGI moderator for this webinar. COGI is the Committee on Geological and Geotechnical Engineering, one of the standing committees of the National Academies of Sciences, Engineering, and Medicine Board on Earth Sciences and Resources. COGI was established as the focal point within the National Academies for government, industry, and academia on technical and public policy issues related to earth processes and materials, soil and rock mechanics, responsible human development, and mitigation of natural and human hazards. If you have any questions about COGI, please contact Samantha Maxino from the National Academies; she's the staff director of the committee. This webinar is part of a quarterly webinar series produced by COGI through the support of the National Science Foundation. The webinar will be posted on YouTube, and an announcement will be sent out when it is available. Please watch the chat for messages from us and for the speaker bios as we proceed with the seminar. I would also like to thank Samantha Maxino and Sara Heidrich for helping to organize and produce these and all the COGI webinars. There will be a question and answer session after all the panelists give their talks, and the audience can submit questions at any time using the Q&A tab on the Zoom panel on their screens. We will pose as many questions as possible during our time. First, a small disclaimer: any opinions, conclusions, or recommendations expressed by the panelists or anyone during this webinar are those of the individuals and do not represent conclusions or recommendations of the National Academies of Sciences, Engineering, and Medicine. With that, I would like to introduce the moderator of the group of speakers that we have today for this webinar, Mike Olsen. Professor Olsen is a professor specializing in geomatics at Oregon State University.
He serves as the editor-in-chief of the ASCE Journal of Surveying Engineering, president of the Surveying and Geomatics Educators Society, technical director for the NSF natural hazards research infrastructure RAPID facility, and director of the Cascadia Lifelines Program. His research program centers on advanced geospatial technologies for geohazard assessments and infrastructure management. It is a pleasure to have this group of people today, whom I admire a lot and who in particular are good friends. So Mike, the floor is yours. Awesome. Thank you so much, Pedro, and thank you to everybody for participating in today's webinar. We're very excited to talk about a lot of new geospatial technologies and how they can be used in geotechnical investigations, and really drive home the point that these technologies are available and ready to use now. So I'd like to start by talking about what I call the geospatial life cycle. We've seen a big shift in the engineering community over the past several decades. Prior to this, we were thinking project by project: we would put a lot of energy and investment into building a project, and then a lot of the records and information would kind of disappear. Over time we started to realize that that wasn't the best way to manage our assets, and we've been thinking more in terms of a life cycle approach, looking at the integration between infrastructure projects. And geomatics technologies are really at the heart of effective asset management and maintenance practices, and of integrating all the different disciplines and data sources that need to come together in order to effectively manage infrastructure.
Geospatial technologies are there at the start of the life cycle, acquiring information on what the conditions are, modeling it, and doing the analysis of what needs to be updated in terms of what's going on with the infrastructure and where hazards are located: landslides, rockfalls, all those kinds of things. And then, at the other end, applying the fixes through machine control and construction automation. So there are a lot of different technologies and a lot of different ways that these can be used, and I think that leads to one of the challenges in terms of the technology adoption life cycle. I kind of start this figure with the angry purple dude. Basically, I think this is a situation we find ourselves in a lot when we see a new technology: we're doing something, we're getting frustrated because we're not getting the quality of data that we want. We want something faster, cheaper, better, right? That's what everybody wants. So then we start talking to people, and so-and-so says, hey, this is the technology you should be using, this is going to solve all your problems, and oversells it. And so you're all excited and saying, all right, let's get on board and do it. You try it, and the technology doesn't quite work out the way you hoped; you're in tears, the vendor over-promised and the technology under-delivered, and so on. Then you go to experts and others to help you get it, you gain more experience with it, and you start to realize, oh yeah, this is going to work, this is what I need. But then, oh wait, now I want to do it faster, cheaper, better, now that I know I can do this, right? And we kind of get locked into these cycles.
What we wanted to talk about today is how to deal with all these new emerging technologies, how to make effective use of what's available in practice, and really start to make decisions on what technologies are effective to use in different applications, with some demonstrated case studies, as you'll see from the different panelists in just a moment. With these technologies there's huge variability in terms of accuracy and what you can do. One of the big challenges is that there are fewer opportunities for education in geomatics. A number of years ago, all your civil engineering students would take multiple classes in surveying and measurement science; now that's been shrunk down quite a bit, so people don't really get that kind of exposure to measurement. Understanding terms like uncertainty, precision, accuracy, and resolution: all those things get thrown around and become confusing. And that leads to some of the issues that people face in terms of adopting the technology. The good news, though, is that the technology is becoming more available, and so it's getting easier to use; there are simpler forms and ways to use the technology. For example, when we talk about LiDAR technology, it used to be limited to very bulky platforms, and we've seen downsizing. In fact, the newest iPhones in the Pro series have a LiDAR sensor on them. But the caveat is that while this technology is now more readily accessible, it doesn't necessarily provide the accuracy that we need in engineering-type work, and so we have to be very careful in interpreting and analyzing data coming out of these newer sensors and making sure that they meet the needs of the applications. The other key advantage of these geomatics technologies is the ability to capture the context of the entire scene. A lot of what we do in engineering, we're looking at plan and profile.
We're just seeing bits and pieces of the picture of what's going on, and with geomatics technologies we can really see that full picture and see how observations in one part of the infrastructure relate to what's going on further away. And we get the increased detail to be able to do it. So let me talk a little bit about how we can determine the best tool for the job. With these technologies there's a whole ton of applications and lots of different ways they can be used; this is just a sampling of different types of geotechnical or structural analysis that can be done with current technology that's available for anybody to pick up and use. Obviously, it requires a lot of training and experience to use these effectively in engineering analysis. But we can use them for all sorts of different challenges: sediment analysis, looking at what's going on in terms of slope stability, change detection as rivers migrate through areas, predicting where those changes are going to happen. We can also use them to look at the structure and its connection with the soil and make decisions on what's going on, where the stresses are accumulating, where the deformations are, and in many cases monitor things before catastrophic failure happens. So there are lots of different applications, but the important thing to keep in mind is that every application has a different quality level necessary for it. This plot is from work we did for the Transportation Research Board on mobile LiDAR guidelines for transportation. On the x-axis is the 3D accuracy, and on the y-axis is the point density, or the resolution of the data. And there's a big difference between applications in what you need in terms of the resolution and the accuracy of the data.
The question you may be asking yourself is, well, why don't we just always collect the highest resolution and the highest accuracy? The answer is that oftentimes it becomes very expensive to do that; it can take orders of magnitude more effort just to improve the accuracy slightly. There's a whole lot more work you have to do in terms of your survey control and other things to achieve that, or, in terms of point density, now you've increased your data volume and the processing workflows become very challenging. So what's very important is to identify your applications and identify where on the spectrum they fit in terms of accuracy and point density requirements, as well as the size of the area, and then use that to determine what is most economical in terms of your data collection strategy. The key driver in a lot of this is really the cost. Here's another plot that shows spatial resolution and measurement uncertainty and where different technologies fit on the spectrum. Take GPS technology: there are many different forms of it and many different accuracies associated with it, from your consumer devices, like in your cell phone, that don't really have the best accuracy, usually on the order of several meters, to static GPS, where you can get down to about millimeter-to-centimeter-level accuracy. It's similar with LiDAR. So there's a huge disparity here, and it's very important to get down into those details as you work on adopting these different technologies. Now, where can you go to get some information about it? Well, this webinar is a great start; you'll hear from some experts who've used the technology in a lot of different types of applications.
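To make this requirements-driven selection concrete, here is a small illustrative sketch in Python. The technology names and the accuracy and point-density figures are rough placeholders for the kinds of numbers discussed above, not published specifications; any real selection should use the vendor's and the project's actual values.

```python
# Illustrative screening of candidate sensing technologies against an
# application's accuracy and point-density requirements. All numbers below
# are invented placeholders, not published specs.

# Candidate technologies: (typical 3D accuracy in m, typical point density in pts/m^2)
TECHNOLOGIES = {
    "consumer_gnss":     (3.0,   None),   # position only, no point cloud
    "static_gnss":       (0.005, None),
    "airborne_lidar":    (0.10,  20),
    "mobile_lidar":      (0.05,  1000),
    "terrestrial_lidar": (0.01,  10000),
}

def suitable(required_accuracy_m, required_density_pts_m2):
    """Return technologies whose nominal accuracy and density meet the need."""
    ok = []
    for name, (acc, dens) in TECHNOLOGIES.items():
        if acc > required_accuracy_m:
            continue  # not accurate enough
        if required_density_pts_m2 and (dens is None or dens < required_density_pts_m2):
            continue  # cannot deliver the required point density
        ok.append(name)
    return ok

print(suitable(0.05, 500))   # e.g. an engineering-grade corridor survey
print(suitable(0.01, None))  # e.g. a single high-accuracy control point
```

The economics discussed above would then come in as a second filter: of the technologies that pass, choose the cheapest for the size of the area.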
There's a facility I'd like to highlight: the RAPID facility at the University of Washington, which hosts a wide range of equipment, laser scanning as well as other types of instrumentation like seismometers, that is available for use. It's predominantly focused on natural hazards reconnaissance, but part of what this facility does is also provide training, support, and education on using these technologies effectively in a lot of different applications. So with that, I'd like to turn the time over to our first panelist, from Sixth Sense Engineering, who's going to talk about a lot of the different monitoring applications and innovative work they're doing to monitor a wide range of infrastructure projects. So, thank you for the introduction, and thank you all for attending. My presentation is more of a state of the practice, showing you what you can do with a robotic total station for large infrastructure geotechnical monitoring. First of all, the main purpose of performing geotechnical instrumentation and monitoring for urban construction projects is mostly risk control during construction and design verification, and today it also provides rich and valuable data sets for research, particularly machine learning based research. These are typical scenarios of infrastructure monitoring, as you can see. What we commonly do today is use a robotic total station for ground surface and building deformation monitoring, and sometimes remote sensing gets involved. For most of the underground subsurface work, such as lateral deformation and pore pressure, practitioners are using more conventional geotechnical sensors such as inclinometers and extensometers. My talk today will primarily focus on the robotic total station. The total station, I think, is a widely known survey instrument with a long history of development.
With today's new developments, the robotic total station can do a lot of things: deformation monitoring, but also integrated cameras and scanning, and, most importantly, they allow for remote control, which makes them very suitable for real-time deformation monitoring. So the idea is this. First of all, there are two modes for using a total station: one using targets, that is, survey prisms, and the other reflectorless. Basically, you identify the zone of influence caused by construction and then you install targets, attaching them to buildings or to the road surfaces. Most importantly, you have to establish reference prisms, or benchmarks, whatever we call them, outside the zone of influence. Then, through automatic target recognition, if any of the monitoring prisms move, the total station automatically tracks the small movement and delivers millimeter-level-accuracy monitoring data. We can also work reflectorless, but in this case you only get one-dimensional measurements most of the time, i.e., settlement. Compared to LiDAR, using a total station for reflectorless monitoring produces lower-density points, which allows for much faster processing. Then there's the wireless communication connecting the different components of the monitoring system: usually, through an IoT device, you can remotely control the total station, and all the data are transmitted to a cloud-based server, which allows the engineer to do the data analysis and allows the surveyor to remotely control and reprogram the total station. Here are a few typical configurations for setting up a total station for an infrastructure project: you can put a total station inside a tunnel, along a bridge, or along a highway. Now I'm going to talk about a few case studies real quick.
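The prism-monitoring workflow just described can be sketched in a few lines of Python: compare two epochs of prism coordinates, treat the mean apparent shift of the reference prisms (assumed stable, outside the zone of influence) as instrument or setup drift, and flag monitoring prisms whose corrected displacement exceeds an alert threshold. This is a simplified illustration with made-up numbers and thresholds, not vendor software.

```python
import numpy as np

def displacements(ref0, ref1, mon0, mon1, alert_mm=5.0):
    """Displacements of monitoring prisms between two epochs, corrected by
    the apparent motion of reference prisms assumed to be stable."""
    ref0, ref1, mon0, mon1 = map(np.asarray, (ref0, ref1, mon0, mon1))
    drift = (ref1 - ref0).mean(axis=0)   # apparent motion of stable points
    disp = (mon1 - mon0) - drift         # corrected monitoring displacements
    mags_mm = np.linalg.norm(disp, axis=1) * 1000.0
    alerts = mags_mm > alert_mm          # feed into a real-time warning system
    return disp, mags_mm, alerts

# Two reference prisms and one monitoring prism that settled 8 mm,
# with a 1 mm common apparent drift in x (synthetic values, meters):
ref0 = [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]]
ref1 = [[0.001, 0.0, 0.0], [10.001, 0.0, 0.0]]
mon0 = [[5.0, 5.0, 10.0]]
mon1 = [[5.001, 5.0, 9.992]]
_, mags, alerts = displacements(ref0, ref1, mon0, mon1)
print(mags, alerts)
```

A production system would also weight reference prisms by network geometry and measurement quality rather than taking a simple mean.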
This is a typical scenario using a total station that combines prism mode and reflectorless mode. Think of a tunnel construction underground: the surveyors can install prisms on the facades of the buildings and also identify lots of reflectorless points on the ground surface, without interfering with the traffic. In that way you can achieve real-time monitoring for both the buildings and the ground surface. One example is the SR 99 tunnel project in Seattle, from 2012 to 2017, which I think is a well-known tunneling project. It's not a very long tunnel, but it was constructed by the largest EPB TBM (earth pressure balance tunnel boring machine) at that time; over 160 buildings were within the zone of influence, and nearly 40 total stations were installed to monitor the prisms, which as you can see were attached to the viaduct, since completely removed and replaced by the new tunnel. This very quick video shows you that during that monitoring, from 2012 to 2017, the main Seattle downtown area was covered by this robotic total station monitoring network. Another ongoing project in Canada, near Toronto, is a shallow-overburden railway tunnel constructed under Highway 401, which requires very high density and very high frequency monitoring; at the same time, you're not allowed to interfere with the highway traffic. So total stations were installed with nearly 500 reflectorless points, which provide high-frequency settlement monitoring of the highway. This photo shows one of the total station towers with two total stations, one on top of the other: we can program them so that one station covers a larger area and the other focuses on the active construction zone, each with a different monitoring frequency. And there are a few more slides showing the applications.
These slides show the use of total stations for excavation projects, monitoring the lateral deformation of the shoring structure for the excavation; they can also be used to monitor a bridge during foundation repair. And the last one: there are lots of aging dams across the USA, and many of them require repair and foundation improvement, so that's another application of the robotic total station. Finally, to summarize some advantages and limitations of the technology. As I mentioned earlier, it provides very high frequency measurements, which allows you to connect the measurements to a real-time early warning system. When used reflectorless, it does not require traffic control or closure, and it is also a cost-effective solution for long-term monitoring. One thing I would emphasize here is that it also reduces the exposure of field monitoring personnel to common construction site hazards, and in the last few years it also reduced COVID exposure. Certainly there are some limitations: if the line of sight gets blocked, you're not able to gather data, and the data quality is affected by the weather; it's not going to be high-quality data during the rainy season or the snowy season. You do need a permit or right of entry to install the total station or the prisms at the desired locations. Vandalism is also a concern in urban areas, and that happened a lot in the last two years during the COVID pandemic. And you really have to have stable reference points, so make sure you put points outside the zone of influence. Some of these limitations can be overcome by other technologies that will be introduced by the next two speakers. So with that, let me introduce our next speaker, Ben Leshchinsky, associate professor from Oregon State University. Thank you. Thank you for having me here today, I appreciate it.
We're going to switch gears a bit and go from the urban environment to the coastal environment. Coastal retreat is a problem worldwide: wave attack, just the waves hitting these sea cliffs, can result in a variety of mass wasting events, including scour, overhanging rocks failing, slumps, and sea cliff collapse, and, in the case of coast-adjacent landslides, debuttressing as they advance towards the ocean. Here in Oregon, we have a project where we've been monitoring a series of sea cliffs and coastal landslides for years now, looking at the change over time using a variety of tools. In Oregon we have about 300 miles of coastline, about 100 miles of which are sea cliffs of soft rock. Our beautiful Highway 101 is beautiful because it happens to span those sea cliffs and have an amazing view of the ocean. The problem is, the highway itself is quite close to these areas that are quite unstable. To look at the dynamics of this coastal environment, we've used a variety of tools, including in situ instrumentation like MEMS in-place inclinometers, terrestrial lidar, UAS lidar, photogrammetry, and GNSS surface monitoring, using these techniques together or alone to look at how the coastal environment changes over time. Here is a typical winter day. This is low tide, and pretty stormy as you can see. It's a bit grainy, but there are cobbles down there, and there's a 100-foot piece of driftwood being picked up and hurled into the base of a sea cliff. These are the typical conditions that result in quite significant scour in the winter environment, and quite a bit of that scour results in change. The sites we'll look at quickly span from northern Oregon to southern Oregon, with very different geologies: the northern sites are more prone to failure from scour and sea cliff collapse, while the southern sites tend to be home to very large coastal landslides.
To look at change of sea cliffs, we use terrestrial lidar collected twice a year to evaluate what we can kind of see with our eyes but find difficult to quantify. We see that those mechanisms I've described, collapse, scour, accretion of debris that's failed, occur on a yearly basis, and the retreat is variable in space and time and can span orders of magnitude. At one of our sites, the northernmost one, you can see the color scale here in the change determined from lidar change detection, red being retreat of up to two meters and blue being accretion, or the gain of elevation. We can see big collapses in the north as well as overhang failures to the south, and notably a bit of green and blue here, which actually reflects the advance of the slump down the site. We can also see the typical erosion rates per meter of sea cliff and start to constrain the rates of lateral retreat we associate with coastlines, and use that as a metric to evaluate hazards and risk. Arch Cape, this is a site just adjacent to a tunnel; we can see that typical wave scour occurring at the toe of this site, and it is pretty significant. One of the homes down here in the south actually has a hot tub that's hanging two feet off the sea cliff; erosion is pretty severe here. Spencer Creek, our longest-running site on the central Oregon coast: we can see large changes, slumps, cliff collapse. Another change here, in this notable region of green, is actually a landslide advancing in this location, so we can start to quantify the change of these contrasting processes. Speaking of landslides, you can't just evaluate landslide movements from change in the sea cliff itself; landslides move in a very complex fashion. Developing means of evaluating those movements and how they vary across an actual landslide is one thing we've also looked into.
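At its core, the lidar change detection described here amounts to differencing two epochs of the surface and keeping only the change that exceeds the surveys' combined uncertainty. A minimal sketch, with invented elevations and uncertainty values:

```python
import numpy as np

# Sketch of raster change detection between two lidar-derived DEMs:
# difference the surfaces and mask out change smaller than a level of
# detection derived from both surveys' vertical uncertainties.
# All numbers are illustrative.

def dem_change(dem_old, dem_new, sigma_old=0.03, sigma_new=0.03, k=1.96):
    """Return significant elevation change (m) and the level of detection."""
    diff = np.asarray(dem_new) - np.asarray(dem_old)
    lod = k * np.hypot(sigma_old, sigma_new)   # ~95% level of detection
    significant = np.where(np.abs(diff) >= lod, diff, 0.0)
    return significant, lod

# Tiny 2x2 example: one cell collapses 1.5 m, one accretes 0.5 m,
# one changes by only 2 cm (below the detection threshold).
dem_2020 = np.array([[10.00, 10.0], [12.0, 12.0]])
dem_2021 = np.array([[10.02,  8.5], [12.0, 12.5]])
change, lod = dem_change(dem_2020, dem_2021)
print(change, lod)
```

Negative values map to the "red" retreat/collapse cells in the plots described above, positive values to the "blue" accretion cells.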
We've used LiDAR at the surface, as well as MEMS inclinometers and GNSS surface monitoring, to characterize the motion, using these techniques together to evaluate how these environments are changing. First, we'll take a trip to the Arizona Inn landslide. You can see here the result of a MEMS inclinometer: we have a change of profile in time, where each of these colored lines represents a half-hour increment, and you can see that we can start to develop a velocity profile vertically over time. We can even see some large jumps that occur during big events, both wave events and rainfall. This site happened to be prone to very severe coastal erosion and significant debuttressing of the slide. We have GNSS units placed at these red cross symbols here. Notice that there are Rovers one, two, and three through the main slide area: Rover one is moving the fastest, Rover two a bit slower, and Rover three the slowest, a sign of retrogression driven by very significant coastal retreat, kind of a bottom-up process. We have another landslide in the very south of the Oregon coast called the Hooskanaden landslide, one of our monitoring sites. It's a "tiny" landslide at 20 million cubic meters, an earth flow that advances a teeny bit, two to four meters a year; this highway is repaved constantly throughout the year. Well, in 2019, the landslide decided to move just a bit more: it had a surge event of 40 meters, which just happened to be most severe right where the highway traverses the landslide. It closed the highway for two weeks, and they were constantly trying to rebuild and open the highway during this period. We had a MEMS inclinometer in there; it lasted a good 40 days, rest in peace, and we saw a clear shear zone at about 30 meters in depth. We have rovers still placed on it after they regraded the whole area, and it's still constantly creeping despite the hydrological controls they're attempting. But we also have some data from the actual surge event itself.
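Comparing rover rates like this comes down to fitting a velocity to each GNSS displacement time series. A toy example with synthetic, noise-free data (real series would need outlier handling and uncertainty estimates):

```python
import numpy as np

# Hypothetical sketch: estimate a landslide rover's velocity by fitting a
# line to daily GNSS downslope-displacement readings, the kind of rate one
# would compare across Rovers 1-3 to spot retrogression. Data are synthetic.

def velocity_m_per_yr(days, disp_m):
    """Linear-fit velocity of a displacement time series, in m/yr."""
    slope_per_day, _ = np.polyfit(days, disp_m, 1)  # m/day
    return slope_per_day * 365.25

days = np.arange(0, 100, 10)      # observation days
disp = 0.01 * days                # steady 1 cm/day creep (synthetic)
v = velocity_m_per_yr(days, disp)
print(v)
```

Fitting each rover separately and ranking the velocities (Rover 1 > Rover 2 > Rover 3) is what reveals the bottom-up retrogression pattern described above.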
Here's some imagery from UAS. You can see seeps coming out of the landslide in the main surge zone, which again dislocated Highway 101 quite a bit. Here's ODOT reopening the highway, or at least trying to get construction access, as the landslide started to slow. Right there are our instruments; obviously, they're gone. With that imagery, you can actually start to look at surface displacements, using particle image velocimetry, a hard phrase to pronounce, but PIV. We looked at change between images, and you can see the clear flow of the actual surge itself dislocating the highway; it gives us high spatial resolution of surface displacements. After the surge event was over, we decided to collect UAS lidar and look at change using high-spatial-resolution data. We performed some manual feature tracking to evaluate the kinematics of this earth flow, and here's what we see. Similar to what we saw from the PIV analysis, we can look at lateral surface displacements, the largest being, of course, right where the highway crosses. Vertical surface displacements can also be evaluated: there are pretty large vertical changes stemming all the way from the source area down to the lower bench. In this case, there's pretty significant erosion of the coastal bluff, but it's marginal compared to how this slide is driven by water in particular. In fact, one interesting thing when we started looking at the site years ago was how eroded the sea cliff was at the base. We were pretty surprised that after the surge event, all of a sudden there were two sea cliffs, because the landslide advanced so much that it actually toed out at the beach. So, in summary, there's a variety of monitoring techniques that you can use to get high-resolution data in space and time, and it's often a balancing act, as Mike alluded to. These tools are valuable for coastal environments in particular.
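At its core, PIV finds the pixel offset between corresponding image patches from two epochs by cross-correlation; doing that over a grid of patches yields the displacement field. A bare-bones sketch of that idea, using FFT-based circular cross-correlation on a synthetic patch pair (real PIV software adds windowing, subpixel peak fitting, and outlier filtering):

```python
import numpy as np

def patch_displacement(before, after):
    """Integer (dy, dx) pixel displacement of the scene between two patches,
    found as the peak of the circular cross-correlation."""
    a = before - before.mean()
    b = after - after.mean()
    corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    if dy > h // 2:   # unwrap shifts larger than half the patch
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

# Synthetic test: the "ground" moves 3 px down and 2 px left between epochs.
rng = np.random.default_rng(0)
epoch0 = rng.random((32, 32))
epoch1 = np.roll(epoch0, shift=(3, -2), axis=(0, 1))
print(patch_displacement(epoch0, epoch1))
```

Multiplying the recovered pixel shift by the ground sampling distance of the imagery converts it to a surface displacement in meters.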
It's dynamic, it's always changing, and there are plenty of hazards in an environment like that. In terms of a monitoring plan, redundancy is key. In-place systems like inclinometers are valuable for giving us geotechnical data, but they might not last long in these dynamic environments. It's good to have other data sets to compare and to continue monitoring even if you start to lose instruments. The reasoning being that, as we all know, if something could go wrong, it's going to, and events tend to happen whenever you're not looking; having technologies like this allows you to infill those data gaps. So with that said, I'd like to introduce the next speaker here, Chris Massey. Hi everyone, how are you? My name is Chris, from GNS Science in New Zealand, and I thought we would take things away from the coast and start looking at earthquakes. What I'm going to talk about now is monitoring rock slopes through an earthquake sequence, and I'm going to focus mainly on the Christchurch earthquake sequence, which occurred in 2010 and 2011 and affected New Zealand's largest city. Most of you may have heard of the liquefaction, which is pretty famous around the world as a consequence of these earthquakes. However, as you can see from this map here, the earthquakes also occurred under a relict, extinct volcano, and so there was quite a lot of slope instability. What you can see on the screen now are the different earthquakes greater than magnitude three that occurred in association with each particular earthquake in the sequence, within a certain time period. We started with the Darfield earthquake in 2010, then we have the 22nd of February earthquake, then the 13th of June earthquake, then the 23rd of December earthquake, and then the 14th of February earthquake, which was centered under all this mess over here in the Port Hills.
So you can see we have a considerable number of earthquakes, and there were multiple recordings at the strong motion stations of greater than 1 g in several of these earthquakes. This is an aerial view of the Port Hills of Christchurch. You can quite clearly see the remnants of the volcano here; the center of the volcano is around Lyttelton Harbour, and the liquefied areas are around central Christchurch. But the main issue in the Port Hills was rockfalls and debris avalanches caused by the earthquakes. As for the geology of the Port Hills, it's the dissected slopes of Miocene-age volcanoes, around 11 million years old. It's multiple volcanoes with basalt lava flows, which grade laterally into breccias, scorias, and agglomerates. They're mantled by younger Quaternary-age loess, which is windblown sands and silts, and they are highly variable both vertically and spatially, horizontally. These photographs show some of the rockfalls and debris avalanches that occurred in the Port Hills. Now, people built right up against the toes of these rock slopes, and people also built quite long distances away from the rock slopes: for this house at the bottom left, the boulder travelled 700 meters from the slope at the top left and went straight through the house. The rockfalls and debris avalanches killed five people in the Port Hills and affected 400 properties. As for the inventories that we had to collect to put all this together, to understand what was happening so that we could then inform the decision makers with regards to where to rebuild post-earthquake: we had to collect multiple datasets, multiple epochs of orthorectified photos, repeat airborne and terrestrial LiDAR surveys, and field mapping, so analog surveys, i.e. mapping rockfalls. Then we had to look at ground deformation monitoring and surveys using a whole combination of methods: GNSS and static survey marks.
We installed rain gauges, soil moisture monitoring, ground pore water pressure monitoring, and robotic total stations, as we heard about previously. These are just some examples from the Richmond Hill slope. What you can see here is the difference between airborne LiDAR and terrestrial LiDAR in the number of points that are actually on the surface; obviously, terrestrial LiDAR is much more effective on steep slopes because you get a higher concentration of points from the surface. And what we can do here, over time and multiple epochs of survey, is use these multiple epochs, as Ben and Mike and others talked about, to difference the surfaces. We can look at the differences, which we can then turn into volumes of material that fall off between epochs. In the Port Hills, we carried out around 14 to 15 surveys of these rock slopes during and after the earthquake sequence. We also mapped the geology, and we mapped the cliff-top cracking. We installed GNSS, which is the marker here, as well as having traditional static survey marks that we went out to monitor. We then mapped the crack distributions using the high-resolution aerial photographs, and we put all this together into these engineering geological conceptual models to try to understand how the slopes might perform in future earthquake and non-earthquake events. This is a typical conceptual model. What we're seeing here is cliff-top cracking, slumping, rock and debris avalanches and falls from the face of the cliff between the successive earthquakes, as well as slumping failures behind the bits that fall off. We're seeing significant impedance contrasts within the slopes themselves: we're going from up to 4,000 meters per second shear wave velocity down to 600, and then to a couple of hundred. We're also seeing topographic amplification of shaking because the slopes are relatively steep and narrow.
And so we put all this together, using all these different technologies, to try to forecast what the slopes might do in future events. We can do this statistically, and this is where we use the change-model datasets. We take our 14 epochs of change models for the different slopes, we look at those epochs of change that relate to earthquakes, and we look at where, statistically, those changes are occurring on the face. We can use machine learning algorithms like logistic regression to analyze the areas that fell off against the factors that may contribute to why they fell off and where. And this plot here is essentially telling us that the ground acceleration is the most important factor during earthquakes in generating rockfall from the slopes. The next most important factor was the relative elevation: the higher up the slope, the more material fell off, so material fell mainly from the higher portions of the slope. And the next interesting factor is the percentage of the neighbors that failed, that is, the number of cells adjacent to the one that failed in previous survey epochs. Essentially this is looking at the stress redistribution as material falls from the cliff face. So it's a very powerful way of taking those change models and then starting to really get into the nitty-gritty detail of what's driving them, which we can then feed into our numerical, physics-based simulations. Where we're going with this, then, is the benefit of having these terrestrial LiDAR change models through the sequence; they span from 2010 all the way through to 2016. So we've captured both the effects of the earthquakes on the rockfall rates and also the non-earthquake contribution, which is typically rain events. And what these plots show for each of the main slopes in the Port Hills is that you get the earthquakes coming through.
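The kind of logistic-regression factor ranking Chris describes can be sketched on synthetic data. Everything below is invented for illustration (the predictor names, coefficients, and sample size are assumptions, not the study's values); the point is just that the fitted standardized coefficients recover the relative importance of each factor:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000

# hypothetical per-cell predictors for one earthquake epoch
pga = rng.uniform(0.2, 2.0, n)        # peak ground acceleration (g)
rel_elev = rng.uniform(0.0, 1.0, n)   # relative elevation on the cliff face
pct_nbrs = rng.uniform(0.0, 1.0, n)   # fraction of neighbouring cells that failed

# synthetic ground truth: PGA dominates, then elevation, then neighbours
logit = -3.0 + 2.5 * pga + 1.2 * rel_elev + 0.6 * pct_nbrs
failed = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# standardize predictors so the fitted coefficients are directly comparable
X = np.column_stack([pga, rel_elev, pct_nbrs])
X = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.column_stack([np.ones(n), X])

# fit logistic regression by plain gradient ascent on the log-likelihood
w = np.zeros(4)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w += 0.1 * Xb.T @ (failed - p) / n

importance = dict(zip(["pga", "rel_elev", "pct_nbrs"], w[1:]))
```

With these synthetic coefficients the fitted ranking comes back PGA first, then relative elevation, then percent of neighbours failed, mirroring the ordering reported for the Port Hills slopes.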
You get this massive increase in rockfall rates above the pre-earthquake baseline rates, which we determined by trenching and dating the pre-existing talus piles on the surface of the slopes from before the earthquake sequence. Then, over time, when there are no earthquakes between our terrestrial laser scan surveys, we can track the decay in rockfall rates after the major earthquake back down to the pre-earthquake rates. In this case, it took between two to five years after the 22nd of February earthquake, which was the earthquake that generated the largest ground motions in the Port Hills, for rates to return to background levels. And what we're finding is that immediately after the earthquake the rock mass is damaged, so you only need a relatively small amount of rain to remove the unstable blocks, because the slopes are highly susceptible to failure. But as time goes by and more of the highly damaged rock mass is removed, you need successively more rain and higher rain intensities to move material from the slope, which drives this overall decay in non-seismic rockfall rates. Now, this is the conceptual model that we've put together. Essentially what we're saying is that before the earthquake we have a few non-earthquake-related rockfalls. A major earthquake comes through and damages the rock mass, which increases the likelihood of rockfall from non-seismic sources like rain. And over time, after the earthquake sequence decays, we drop back to the rates from before the earthquake. Now, why is this important? It's important because this drives rockfall risk. And rockfall risk drove the authorities in the Port Hills, the central government, with regards to their decision making over where and when people could be allowed back into their homes, once the risk levels had decreased to an acceptable level.
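The decay of rockfall rates back to background that Chris describes can be captured with a simple exponential-relaxation model. The numbers below are purely illustrative (an assumed baseline, post-quake rate, and time constant, not the published Port Hills values); the sketch just shows how a "time back to near-baseline" of a few years falls out of such a model:

```python
import math

# illustrative numbers only, not the published Port Hills values
baseline = 0.5      # assumed pre-earthquake rockfall rate (m^3/yr)
post_quake = 50.0   # assumed elevated rate just after the main shock
tau = 0.5           # assumed decay time constant (years)

def rockfall_rate(t):
    """Elevated post-earthquake rate decaying exponentially to the baseline."""
    return baseline + (post_quake - baseline) * math.exp(-t / tau)

# step forward until the rate is back within 10% of the baseline
t = 0.0
while rockfall_rate(t) > 1.1 * baseline:
    t += 0.01
```

In the real datasets the decay constant is estimated from the observed per-epoch volumes rather than assumed, but the same recovery-time question drives the risk decisions discussed next.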
It also governed the nature of the infrastructure that was installed. For example, you can't just wait until the rockfall risk decays after five years, because people need infrastructure. But this can help guide decisions: you may want to consider putting in cheap, low-cost solutions and waiting five years before you start building very complicated and costly permanent engineering works. And it's not just for earthquakes that we can use this information and these data sources; we can apply them to landslides all over the world. This is another example, at Fox Glacier. This is a 60 million cubic meter landslide. It's sliding, and the front of the landslide is slumping and generating debris flows that create a big debris fan. We've used terrestrial laser scanning, airborne LiDAR, and structure-from-motion digital surface models derived from tri-stereo satellite imagery captured by the Pléiades satellite to generate change models in the same way that we generated them in the Port Hills. Instead of using terrestrial laser scanning alone, we're using these other geospatial techniques because the scale is so large. But we've also installed GNSS, a tiltmeter, a rain gauge, and weather and soil moisture monitoring on a big concrete block that we dropped into the landslide from the bottom of a helicopter, and that's giving us 30-second data, calculated over 24-hour epochs, for displacement and rainfall. What this is doing is allowing us to link rainfall to movement, which we can then use to get an idea of when these landslides are likely to reactivate and when the debris flows are likely to come down the valley. And that's important because, before COVID times, we had up to 7000 people a day visiting these valleys. So thank you very much for your time. I'd like to acknowledge my co-authors, and for more information, we've just had a paper out in JGR on the Christchurch work, so please feel free to go and have a look.
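The rainfall-movement linkage described for Fox Glacier, 30-second GNSS fixes reduced to 24-hour epochs and compared against rainfall, can be sketched like this. All values are synthetic stand-ins (an assumed creep response of 0.05 mm of motion per mm of rain, invented noise levels), not the actual monitoring data:

```python
import numpy as np

rng = np.random.default_rng(2)

# one week of synthetic 30-second GNSS displacement plus daily rainfall
samples_per_day = 24 * 60 * 2                                # 30 s epochs per 24 h
rain_mm = np.array([0.0, 5.0, 40.0, 10.0, 0.0, 60.0, 2.0])   # daily rainfall (mm)

# assumed behaviour: background creep plus extra motion on wet days
daily_motion = 0.2 + 0.05 * rain_mm                          # mm of motion per day
disp = np.cumsum(np.repeat(daily_motion / samples_per_day, samples_per_day))
disp += rng.normal(0.0, 0.05, disp.size)                     # noise on each fix

# reduce the 30-second stream to one value per 24-hour epoch (last fix of day)
daily = disp.reshape(7, samples_per_day)[:, -1]
increments = np.diff(daily)                                  # movement per day

# link movement to rainfall: correlate daily rain with daily displacement
corr = float(np.corrcoef(rain_mm[1:], increments)[0, 1])
```

In this synthetic setup the correlation comes out strongly positive, which is the kind of rainfall-response signal used to judge when the landslide is likely to reactivate.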
And so, to conclude, I would like to pass back over to Mike, who will bring all of these various threads together. Thank you. All right, thank you, Chris, and thank you to both Ben and Zangwei as well for their excellent thoughts. The goal today was to show you that there are a lot of different applications and ways that these technologies can be used. This was a very quick run through a sampling of different technologies; there are lots of other technologies out there, but hopefully it demonstrated that the technology is mature enough to be used in a lot of different applications and to provide usable information that can be taken all the way to policy decisions, as Chris mentioned in his presentation. Another thing to always keep in mind is that just because something's new and flashy doesn't mean that something older can't be repurposed. Some old technologies can become new technologies in how they're used. For example, at the Geo-Congress there was a lot of discussion about fiber optic technology and how it could be utilized as a monitoring technique; that technology has been around for quite a while, but it's a new purpose and a new way of using it. One of the things I hope you came away with from this webinar today is that these advanced technologies provide a more complete picture and understanding. Sometimes people are hesitant, saying, hey, I'm not going to use this technology because it's going to cost a little bit more upfront. But the value that you get out of that information, and really having the context of the whole broader scene, can make a huge difference and ultimately allow you to catch issues early in the project, while you can still modify the design and do other things to avoid the costly problems that come later on when you hit surprises.
Also, if there's potentially going to be a litigation issue or something down the road, having that really high quality data up front can make a big difference. Another key factor is that it's not just a matter of choosing one technology or another. In a lot of these projects, as was tied together in the presentations so well, there are a lot of different technologies used in concert, because they all have their strengths and weaknesses, and when you use them together you can get the best of both worlds. So it's always important to think in terms of value. It's important to know what you can do yourself versus what you need an expert for, what you need assistance with. And then there's understanding those fundamental concepts of accuracy and resolution and what the differences between them are. So it's important to really think through what you need to do on the project and what's going to deliver the most value. At the end of the day, all these technologies are tools, and so it's important that they're used appropriately and that you have a good handle on where their strengths are and where their limitations are. That's the key to being able to use them successfully. So with that, we thank you for listening to us today and to the presentations, and we look forward to the discussion in the Q&A. I'll turn the time back over to Pedro as the moderator. Thank you, thank you. This was great, and I think we have 15 minutes to take some good questions; there are some good ones in the questions and answers. So first of all, thank you everybody for joining us today, and please stay around, because the round of questions can also be a complement to what the speakers have been talking about.
The webinar will be posted, as I mentioned before; information will be sent about where and when it is posted. Please go to the link provided in the chat to give feedback about these webinars, along with any suggestions, and also look at the questions and answers if you still have some questions you want to ask today. And again I have to mention this disclaimer: any opinions, conclusions, or recommendations expressed by anyone during this webinar are those of the individuals and do not represent conclusions or recommendations of the National Academies of Sciences, Engineering, and Medicine. So with this I will start with some of the questions that hopefully we can answer. The first one I will ask is a general question for Mike: there is a lot that you have shown here that we should be learning on the technical side, but sometimes the problem or the challenges are in the human aspects, and that could be the biggest barrier for adoption. So what is your advice here, what would you advise management and others in order to use these technologies? And how to learn about these technologies could be something you include in your answer. All right, sounds good. Yeah, those are questions that I get regularly as I talk with a lot of people. Sometimes there are technology barriers of just learning how to use it and getting the technology to work and do what you want it to. But oftentimes a bigger barrier is really making the effective case within management as far as why we should adopt it and why it's going to save money down the road. And oftentimes, what I've seen work effectively in practice is starting with some very small use cases and demonstrations. A lot of times in the past, I think, people were relying on vendors coming and showing flashy things, and that's sometimes helpful.
But if you can have smaller use cases where you pilot the technology in a simpler application, so you can wrap your head around it and demonstrate it, that makes it a lot easier for people to visually see the results. I think that's one of the beauties of a lot of these new technologies: they're so visual in nature that it makes it a lot easier to communicate what they're doing and what value and information they provide to the end users, and that can help a lot in terms of getting management on board. There are lots of studies coming out with the return on investment of these technologies, and in particular showing the cost of not adopting them and then hitting an issue down the road. I think that's always something important to bring into that conversation as you weigh the decision of whether it makes sense to adopt a technology or not. I had a follow-up on that question, but I will ask Ben one question here. One of the earlier questions we had is about how to use the same type of technology for forest and vegetation monitoring. That was one of the questions, and also if you can talk a little bit about the software used for your photogrammetry and model generation and analysis. Absolutely. So, I happen to be in a college of forestry, so I only know this first answer through osmosis, but LiDAR is frequently used to do forest inventory, where you would usually get aerial LiDAR or UAV LiDAR, and more recently to actually get an idea of biomass and, in repeated cases, growth of forest stands. It's also pretty valuable for getting an idea of inventory in places where you have sparse data, and I can think of some state forests or locations like that where it's commonly used. So it's pretty valuable for that, but it is intensive data processing.
As a geotechnical engineer, I usually throw away the trees and want the bare earth, but there's a whole group of people who don't want the bare earth, they just want the trees. So, good question. As to the software used to do the PIV: we often use CloudCompare as one example of the software we use to process the photogrammetric data, but there's a variety of packages out there. For the PIV that we did, the particle image velocimetry (I can never remember the full acronym, it's hard to pronounce), we used MATLAB, but Python, which is open source, also has a variety of packages for that; there's a variety of sources for it in the end, but you do need two collections. Thanks, Pedro. Thank you for the questions. Yeah, there will be a couple more later, but let's go to Zangwei. I have several questions for you that are coming from the audience, these are not my questions, related to RTS first, and then I have one on tailings, so they are separate. So the first one is how to protect against vandalism, how to consider temperature, and how far you can go with these tools. If you can give us a few hints on these things, particularly, as I said, temperature, how to protect from vandalism, and how far you can go. Okay, thank you. Those are commonly asked questions about using RTS. Starting from the easiest: temperature is definitely considered, as it will affect the distance measurement. A total station takes angle measurements and distance measurements, so temperature, as well as air pressure, is usually measured at the same time and then taken into consideration in the computation process. In terms of vandalism, unfortunately in the last few years, due to COVID, that happened a lot. From what I heard, in the last two years in Europe about 80 to 90 total stations were stolen. So the manufacturers have come up with devices that allow them to track stolen assets and to lock the device remotely.
That prevents people from reselling or reusing them, but unfortunately it will not prevent people from damaging the equipment. Sometimes people get frustrated when they're not able to remove the total station, but they can still damage it. Maybe you can put up a surveillance camera, but nowadays people can wear a mask. I'm sorry I'm not able to provide a good answer; I can only share some common challenges here. I hope some of the audience can offer a more creative and effective approach, which would be appreciated. So, the third question, about distance: is it about the monitoring distance, how far we can go? What does it mean, can you rephrase the question, Pedro? The question is: with these total stations you can do monitoring, but how far? Can you do monitoring one kilometer out? Okay, so according to the specifications of the total station manufacturers you can go one kilometer, maybe, but practically we usually keep the distance from 100 meters to 150 meters. Number one, the distance measurement accuracy is a function of the distance, but most importantly of the line of sight. For an urban construction monitoring project, if you put a target 1000 meters away from a total station, most likely something will block the line of sight, so practically I would say from 100 meters to 150 meters. Okay, I will ask you the other question later. But I want to ask Chris one question here; it's a more technical question, but I think it's a good one. How critical in your analysis, Chris, is the selection of your spatial interpolation technique, inverse distance weighting, kriging, etc., in order to get an accurate solution when dealing with these types of datasets? Or are these datasets so dense that this is not an issue? Ooh, there are a lot of fish hooks in that one. No, just because they're dense doesn't mean that they're right. That's really important to understand.
I'll use a recent example, where we took a digital surface model from before the Kaikoura earthquake derived from aerial photography, high-resolution, 0.2 meter ground resolution aerial photographs. And we had the same survey done after the earthquake. We corrected those digital surface models for tectonic deformation, and in doing that we then ran what we call a bootstrap, where we take random selections of samples from the different datasets and compare them. And even after correcting for the tectonic displacement, we found, across these billions and billions of points, a systematic offset of about 0.4 of a meter in the vertical. So even though we had billions and billions of data points, there was still a systematic 0.4 meter offset, which we could then correct for: we just moved all the points 0.4 of a meter. So I think it's really important to use different techniques to try to get into the uncertainties I've talked about before within each of the datasets. It might be that terrestrial laser scanning and airborne LiDAR, even though you use them in a complementary way, can actually be used to check the quality of each other's datasets. You can also look at things like areas that aren't moving, where you're very sure they're not moving, and check those against areas that are moving. Then, looking at kriging and the various ways that you actually smooth the data, there are so many different methods out there, and you really have to look at what the results are and what best fits what you're trying to see. If it doesn't look right, then it's not right.
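Chris's bootstrap check for a systematic offset can be sketched as follows. This is a synthetic illustration of the idea, random subsampling of two co-registered surfaces over stable ground to expose a hidden vertical bias, using made-up noise levels rather than the actual Kaikoura survey data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for two co-registered DSMs sampled over stable ground:
# same terrain, independent survey noise, plus a hidden systematic bias.
n = 200_000
terrain = rng.uniform(0.0, 500.0, n)
dsm_pre = terrain + rng.normal(0.0, 0.15, n)
dsm_post = terrain + rng.normal(0.0, 0.15, n) + 0.4  # 0.4 m vertical offset

# Bootstrap: estimate the mean vertical difference from random subsamples.
boot = [
    float(np.mean(dsm_post[idx] - dsm_pre[idx]))
    for idx in (rng.integers(0, n, 20_000) for _ in range(200))
]
offset = float(np.median(boot))

# Remove the systematic component, as in the 0.4 m correction described above.
dsm_post_corrected = dsm_post - offset
```

The spread of the bootstrap estimates also gives a direct handle on the uncertainty of the correction itself, which is the kind of dataset-level check Chris is advocating.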
And I don't think there's any magic single process or methodology that fits everybody; you have to work out what suits you and what suits what you're trying to achieve. Which is, I think, the underlying thread that Mike is putting through all of the work we're talking about: we've got all this technology, but you can't just throw it all at something and hope you get the right answer, you have to think really carefully about what it is you're trying to achieve from it. Is that fair? Yeah. I had another very good question for you, but I would like to go to Zangwei once again. There was a question about monitoring tailings; tailings dams can fail very quickly, you don't see anything coming, and then you have a massive environmental effect, like at Brumadinho or other places. Can you use any of these technologies for tailings, have they been used, or are there some thoughts on that? Yeah, total stations have been used, but for many of the tailings dams I think they use a mix of technologies, InSAR, LiDAR, and RTS, and sometimes separately, as Ben mentioned. It really requires a combination of instruments: you need real-time wireless sensors as well as the LiDAR, which gives you a full picture of everything. As Michael mentioned, each technology has advantages and disadvantages. When you get a full picture, you get high-density data points, but the computational process is longer, which does not allow you to do real-time monitoring. So it's a mix, and I hope this answers your question, or partially answers it. Okay, I have two more questions and we are running out of time, so we may go a couple of minutes over, but here is one for Chris, or maybe Ben, either of you.
So, you know, you can get almost real-time images, and then you can use AI, and the AI system can tell you whether the image is correct or not. But imagine it tells you, in real time, that the image is correct. Is there a connection with thresholds, so that when they are exceeded you can talk immediately about risk and action, "get out of here, you have 10 minutes to go"? And do you have any example of where this has been used? Either of you, if you can answer that. I'm not necessarily aware of, say, LiDAR being used in that type of application, but people have used inclinometers, and most commonly people are using climatic data, like rain gauges and the like, to inform evacuation or early warning systems. But it's an interesting dance, because false alarms have a big impact too, if you make the wrong call. I have heard of systems in Switzerland and Italy being used, mainly with climatic data, but Chris might expand on this a bit more. Yeah, thanks, Ben. I think it depends on what you're trying to do. If it's an individual-house setting, where an individual is on a landslide and they've got cracks at their home, then there's a really low-tech option, which is essentially an extensometer across a crack linked to an alarm, and that can be enough. When it comes to rainfall thresholds, as Ben said, we have to be really careful, because the conditions that triggered, say, one debris flow may be quite different from the conditions that trigger the next debris flow. So what ends up happening is you set the thresholds very low and you have lots of false alarms, as Ben mentioned, so it's a real trade-off. And it depends on what you're trying to monitor. If it's for life risk, then it's really complex; the monitoring tech is a tiny, tiny part of it. The social science is the big part, and that means: what does somebody do when they're notified?
Can the person actually take evasive action? Is there time enough for the person to be notified and take evasive action? That's the complex stuff; the actual tech itself is not that important, to be honest. That's fair from a life-risk perspective; if it's infrastructure, then it's different. Okay, I can throw in a quick comment. For most infrastructure construction projects there is surely a threshold related to the monitoring points. But one of the challenges is that when an engineer defines a threshold, the threshold relates to construction-induced deformation; yet when you put in a high-accuracy instrument, you capture the combined effects: the natural response combined with the construction-induced ground deformation, right? The human eye is not able to capture that deformation, but even without construction the Earth is moving, breathing, moving up and down at millimeter scale. And then when you set a threshold, that threshold is for everything. So having a baseline is very important. Okay, I think we are right on the hour, but I would like to give Mike the last opportunity here to finish. First of all, there have been a lot of kudos about your presentation and your swarm image; people would like to have that image if you can provide it. Okay, so that's very good. But, to finish, can you talk a little bit about the main impediments to using these tools more widely? That's a great question. I saw some of the other questions coming in as well, and somebody alluded to the issue of over-trusting the data. I think one of the biggest barriers that slowed the adoption of LiDAR is that you've got really cool data that visually people look at and say, oh, this looks good, this makes sense.
But it wasn't quite accurate, people didn't quite know what they were doing, or there were issues with the data. There was a classic example in the early days of airborne LiDAR where you would have two flight lines that were offset and weren't adjusted together, and it looked very close to a new fault. People said, oh, we discovered this new fault. Well, it turns out it followed the trajectory of the airplane perfectly. And so I think that's probably one of the biggest impediments: people know enough to be dangerous, start working with the data and making mistakes, and aren't honestly interrogating the data enough. Those mistakes propagate through, and then people hit a point where they don't trust it anymore. I think we've cleared that path with LiDAR; that's moved on. There are more opportunities for people to get trained, get that experience, and really have that confidence. As was mentioned earlier in the webinar, that concert of technologies puts in the checks and balances, the redundancy, all of those basic surveying principles that are taught: check your work, measure twice, all those things that are really the key to ensuring the technology is used effectively. Sure, there are organizational barriers; it's sometimes hard to make the case to invest up front. But once you start looking into the economics and the value of it, those impediments start to fall away. Okay, with that, I think we should close the webinar. I think it has been very good; I have seen a lot of thank-yous and congratulations, so I would like to pass those on to this group. It was excellent, so thank you everybody for joining, and until the next webinar. Have a good one. Thank you. Appreciate the great questions and discussion.