Welcome, everybody, to the webinar today. My name is John Stomatochos. I'm a member of the Committee on Geological and Geotechnical Engineering at the National Academies, and I'm the moderator of today's webinar. COGGE, as we call it, is a standing committee of the National Academies of Sciences, Engineering, and Medicine under the Board on Earth Sciences and Resources. The committee was established as the focal point of the National Academies for government, industry, and academia on technical and public policy issues related to earth processes and minerals, materials, soil and rock mechanics, responsible human development, and mitigation of natural and human-induced hazards. If you have any specific questions about COGGE, you can contact Sammantha Magsino at the National Academies; she's the staff director of this committee.

This webinar is part of our quarterly webinar series produced by COGGE through the support of the National Science Foundation. The webinar is being recorded and will be posted on YouTube, so you'll be able to follow up and watch it there; hit the like button when you do. If you open your chat, you can message us and the speakers. We're hoping to have time at the end for questions and answers with the panelists. You can submit your questions at any time using the Q&A tab on your Zoom panel; we'll collect those, and I'll read them, targeting one of our panelists, at the end, and hopefully we'll get through as many of your questions as possible. As a disclaimer, any opinions, conclusions, or recommendations expressed by the panelists or anyone else during this webinar are those of the individuals and do not represent conclusions or recommendations of the National Academies of Sciences, Engineering, and Medicine. Sammantha Magsino and Emily Bermuditz set up the webinar, and Manny Enriquez is producing it. So let me now begin by introducing our two speakers.
Breakthroughs in augmented reality technologies can give geo-professionals new levels of access to and insight into field sites. Augmented reality technologies are now accessible enough to allow geo-professionals, decision makers, and students to visit field sites virtually and literally walk the sites, view them from above or from within, and superpose data, existing or planned infrastructure, and models onto them. So today we have Ben Rivers, Senior Geotechnical Engineer in the Office of Innovation and Implementation at the Federal Highway Administration, and Keith Lay, Director of Content at Clirio, to discuss how augmented reality could transform the world for geo-professionals. The bios of both of our speakers are going to be posted in a moment in our chat. So let me begin and turn it over to Ben for the first of our presentations.

Very good, thank you, John. And I'm going to share my screen. Excellent, very good. So welcome, everybody, and thanks for joining. I'm going to provide an overview presentation: leveling up your A-GaME and your superpowers. If you didn't know you had superpowers, you do, and augmented reality can enhance those superpowers. And as soon as I say you have superpowers, I want to give you disclaimers. I'm of course with the government, so just letting you know that anything I present here today has no force or effect of law (no superpowers by law), and the US government does not endorse any specific products or manufacturers. We're talking about technologies specifically here: augmented reality, a little bit on virtual reality, and extended reality, which is considered to encompass both of those two. So that's a little bit of background on where I'm headed today in this presentation.
Pretty much everything I'm going to present is centered around the fact that we're headed, in the not-too-distant future, toward digital delivery, and we need digital workflows to do that. Augmented reality really plays a part in both of those realms, enhancing our understanding and that workflow process. We also have the opportunity to use augmented reality to improve our learning experiences, and I'll provide one example on the learning aspect and several examples of how we can apply it in geotechnical engineering practice. So, very fundamentally, I think where extended reality, both virtual and augmented reality, plays a part is in enhancing, or making more effective, our decisions in planning, analysis, engineering, and construction, using or enhancing these five Cs. Collaboration: whether we're with technical people or with non-technical people, this has the ability to provide enhanced comprehension immediately with the visual aspects, as you'll see here. But I think at the very center of what extended reality provides is a communication tool, along with that added comprehension. Coordination can happen as well; I think we can use this effectively in pre-construction meetings and pre-bid meetings too. Before the first example I want to share with you, by the way, just as we're presenting here, I challenge you to think of other applications and how we can use this in our geo-profession, not just related to digital delivery or workflow, though that's certainly where I'm headed here, along with the educational component.
So this is something that our hydraulics counterparts in the Resource Center developed recently using virtual reality. The big difference between virtual reality and augmented reality is that virtual reality cuts you off from the space that you're in and puts you into a virtual environment, so it's completely immersive within that environment, and people can come in as avatars within that environment as well, so there can be collaboration. In this instance, our hydraulics folks used it for enhanced learning. This is the Elwha River valley area, and they used this as a module within their training. And maybe there's one other C involved within our five Cs, maybe a sixth C, in context: you can see this map that orients you to what you're looking at. Let me turn on my laser pointer here. Yeah, so you can see this overview map provides some context for what you are seeing or where you're navigating to. In this case, you have a couple of sites that are specific to learning objectives for this training, and if you click one of those icons, it takes you to one of these dialogue boxes where there's more information. It could be multimedia: photographs, graphics, videos, reference materials, other materials to aid our understanding in this completely immersive environment. But the other aspect of learning, especially in adults, is having the ability to interact, and I think the environment the hydraulics Resource Center folks have created provides that immersive environment in a very interactive way. And you can see here, there are three-dimensional cameras that they use, almost like a street-view application, and once you're in that environment you can look around. I could see us using this for site reconnaissance, or for training where we want to bring people who have not experienced the field into the field virtually. We could actually have a very similar experience in augmented reality; the difference there is that we're not separated, so if we have people within the room, we can also communicate among the people there about the models that we're seeing together. But for site reconnaissance, looking at geomorphology and landforms, painting a picture of what the site conditions are, or walking through a learning module on how we would do a site investigation and site characterization, I think this would be a perfect opportunity to create a model, so I'm very excited about that prospect.

One other example that I want to show (I have one more after this one) is Manning Crevice in Idaho. This is an application where I think you can see value with augmented reality; this is not a very complex problem from the structural side, but from the geology side it is complex. So this is a bridge that was built in the 1930s and was going to be replaced with an asymmetric bridge, a beautiful structure. You see the previous bridge in the background and the new bridge in the foreground, and you're seeing some of the geology and the structure of the rock here. And there's a big ask for the loads for this bridge right at this substructure here at the anchor, the housing that you're seeing there. So not a very complex bridge, a beautiful bridge, with a lot of ask at those locations, and you can see again the structure of the geology here with this substructure and the housing, the anchors here.
So for the investigation, some mapping was done, along with confirming the conditions at those locations of the foundation elements and anchors with televiewer data, which can provide orientation, confirming the conditions of the joints and the structure within that rock, within the mountainside where the anchors will be located and providing resistance. Of course, geologists know stereonets, and some geotechnical engineers might know stereonets, but to non-technical folks this doesn't have meaning, and trying to explain a stereonet quickly is certainly a challenge. Obviously there was analysis involved in this project for the anchor systems: finite element analysis of how these loads were being transferred and the corresponding displacements. But we don't have to rely on the stereonets to convey the complexity of this project. When you see it in an environment that's three-dimensional like this, especially if you're immersed in it and able to walk around the model, you can see it and immediately understand the spatial components of this project. And you can put the joint sets right in the model too, so you don't have to understand stereonets, you don't have to explain to somebody what a stereonet is: you can show the joint structure there, how that plays a part in the orientations of those anchors, and what's at stake in the design. So the idea of augmented reality, or extended reality, is really about improving our comprehension and communication abilities, especially where we have those complex conditions. I'm just going to run through some slides here: you're seeing what you just saw there, but also time-related data, virtual cores with RQD information, and geophysical data, seismic and electrical resistivity, overlaid. Having different ways to actually show the data, and how things vary across the site, is quite powerful with the technology of augmented reality.
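As an aside on what it takes to put joint sets "right in the model": each televiewer pick is essentially a dip direction and dip angle, and those two numbers convert directly into a 3D plane orientation that a renderer can draw. Here is a minimal sketch of that conversion; the east-north-up coordinate convention is my own assumption for illustration, not anything from the project:

```python
import math

def joint_normal(dip_direction_deg, dip_deg):
    """Convert a dip direction/dip pair (degrees) into an upward unit
    normal vector in an east-north-up (x, y, z) frame.

    Dip direction is the azimuth, clockwise from north, toward which
    the plane dips; dip is measured down from horizontal.
    """
    a = math.radians(dip_direction_deg)  # azimuth
    d = math.radians(dip_deg)            # dip
    # Upward-pointing pole to the plane.
    return (math.sin(a) * math.sin(d),
            math.cos(a) * math.sin(d),
            math.cos(d))

# A horizontal joint has a vertical normal (0, 0, 1)...
print(joint_normal(0, 0))
# ...and a vertical, east-dipping joint has a horizontal, east-pointing one.
print(joint_normal(90, 90))
```

A viewer would then draw a disk or plane patch perpendicular to each normal at the logged depth, which is the "joint sets in the model" effect described above.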
One additional case history I wanted to share with you is the US 231 project, which is courtesy of the Alabama Department of Transportation. This is a landslide that unfortunately occurred in February of 2020, shutting both lanes of travel down, and essentially for seven months this was an emergency relief (ER) project. Obviously they were hustling; this is a major corridor out of Huntsville through Lacey Springs, where this was located, but they were able to get it back open in seven months by constructing a bridge, and the foundations of this bridge were very robust drilled shafts. They wanted to experiment with this technology; getting these models done during the project delivery time was not going to happen, but afterwards they were able to show an application of a dynamic digital twin. Since they built this on an active landslide, they have the ability with this digital model to track the performance of those drilled shafts. I'm not showing you the displacements here, but there are shape accel arrays (SAAs) and piezometer data, and you can see all of the data points that are within this baseline model. This is LiDAR with change detection turned on, so you can see where some movements have occurred, just very localized, and the electrical resistivity data is overlaid here too. It's a very powerful tool, even for those who are not technical, to see what's moving and how conditions vary across the site.
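For readers curious what "LiDAR with change detection turned on" boils down to computationally: once both surveys are gridded to the same cells, it is elevation differencing against a noise threshold. A toy sketch with NumPy; the grid values and threshold below are invented for illustration:

```python
import numpy as np

def change_detection(dem_before, dem_after, threshold=0.05):
    """Return per-cell elevation change and a boolean mask of cells
    that moved more than `threshold` (same units as the DEMs)."""
    diff = dem_after - dem_before
    moved = np.abs(diff) > threshold
    return diff, moved

# Two tiny 3x3 "DEMs": one cell settles by 0.2 m, the rest only jitter.
before = np.zeros((3, 3))
after = before + 0.01          # 1 cm of survey noise everywhere
after[1, 1] -= 0.2             # localized movement, like the slide toe
diff, moved = change_detection(before, after, threshold=0.05)
print(int(moved.sum()))        # -> 1 cell flagged
```

Real change-detection products add registration, outlier filtering, and uncertainty handling, but the localized "hot spots" described above come from exactly this kind of thresholded difference.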
So there's a lot of application in this case for asset management. We essentially have a structure that was constructed on an active landslide and is being monitored, and this digital twin is aiding the workflow of the performance management, asset management, and monitoring program. We have data coming in that can be translated through something like DIGGS, a standardized data transfer schema, and fed into that digital twin, where we can look and see what changes have occurred over time. One final case history I wanted to share is courtesy of Nick MacIrus. He did a case study showing that you can take technology that's in our hands, our mobile phones, capture core data, and, with available open-source software provided by major technical companies, generate what you're seeing here. It's scalable as far as the amount of work that's required to do something like this. For this simple example, or case study, I don't know how many core sets he actually did, but it's quite possible to do that, view the cores on a mobile device like you're seeing here, and manipulate them. So there are some other possible applications you can probably think of where we can use photogrammetry like we're seeing here. Just as a wrap-up, a couple of takeaways: we can leverage augmented reality to really enhance our experience, especially on these complex projects or in complex conditions, and by showing these things within three-dimensional or even four-dimensional models, we can show stakeholders, even those who are not necessarily technical, how they relate. You can see just how powerful this is: if pictures are worth a thousand words, look at what you're seeing here, with multiple data sets shown on the same screen, overlapped with geophysical data.
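A quick illustration of the data-transfer idea mentioned above: schemas like DIGGS carry borehole and instrumentation data as structured XML, so a digital twin can ingest readings by parsing the file and pushing them into the model as time-series layers. The sketch below uses invented element names to show the shape of such a workflow; it is not the actual DIGGS schema:

```python
import xml.etree.ElementTree as ET

# Invented, DIGGS-like structure -- illustrative only, not the real schema.
doc = """
<monitoring>
  <sensor id="PZ-1" type="piezometer">
    <reading time="2021-03-01T00:00Z" value="4.2"/>
    <reading time="2021-03-02T00:00Z" value="4.6"/>
  </sensor>
  <sensor id="SAA-1" type="shape-accel-array">
    <reading time="2021-03-01T00:00Z" value="0.003"/>
  </sensor>
</monitoring>
"""

def load_readings(xml_text):
    """Flatten sensor readings into (sensor_id, type, time, value) rows
    that a digital-twin model could ingest."""
    root = ET.fromstring(xml_text)
    rows = []
    for sensor in root.findall("sensor"):
        for r in sensor.findall("reading"):
            rows.append((sensor.get("id"), sensor.get("type"),
                         r.get("time"), float(r.get("value"))))
    return rows

rows = load_readings(doc)
print(len(rows))  # -> 3 readings across 2 sensors
```

The point of a standardized schema is that this parsing step is written once, so every instrument vendor's data lands in the twin the same way.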
So with three dimensions plus these layers, we're multiple dimensions, magnitudes, away from that thousand-word picture now. Eventually, things are moving toward three-dimensional models that we're delivering for our projects, and being able to convey those to all stakeholders is, I think, pretty central, and augmented reality really plays a role in that. So, some applications: fundamentally it's a communication tool, with rapid comprehension just from being able to see these data within three-dimensional, four-dimensional spaces collectively, which is very powerful for public engagement, learning, and site characterization, where we're getting information from the field and bringing it in. If that model doesn't make sense, we know we need to go back out there and understand, so that's the final C, on the confidence side: we can use these models to provide insight into, and confidence in, our understanding. You saw the digital twin example there, but where I think we want to eventually get as a practice, and we're not there yet, is the model as the legal document. The structures folks are closer than we are on the underground side, and we still need to figure out how we represent data within these models for those applications. So with that, John, I will turn it back over to you.

Thanks, that was fantastic. So I'm just going to pass this on to Keith. Keith has some actual demonstrations, and I'm really looking forward to that, and I'm sure everybody else is too. Don't forget to post your questions in the Q&A, and we'll get to those at the end of the talk. So Keith, it's all up to you now.

Hey everyone, thanks for joining us today. My name is Keith Lay, Director of Content at Clirio, Inc. So yeah, I just wanted to give some demos actually using some of this technology and show how it works. We work closely with Ben's team and other teams at FHWA to help create some of the visualizations that you saw in his presentation.
And in terms of delivery, there are actually a couple of different ways that this can be done. You saw in some of Ben's slides that people were wearing a headset. So this is an example of a headset: this is the HoloLens from Microsoft. It's what we call a mixed reality headset. You'll notice that the lens is clear, and as Ben correctly pointed out, there is a difference between virtual reality, where you're wearing a headset that cuts you off from the world and you only see what the computer shows you, versus augmented or, in this case, mixed reality, where you're actually projecting the 3D data into the room, yet you can also continue to see the room and the other people around you. So those are a couple of different ways to implement this. But we don't necessarily even need headsets; we can also use things like our tablets and our phones to participate, which is a little bit more ubiquitous because it's equipment that people might have on hand already. I'm just going to show a few examples of that here. I'm going to share my screen from my iPad and show some examples of this in action. So if you can see my shared screen here, this is actually coming from the iPad that I'm holding in my hand. And this is the model that Ben referenced in Alabama, the US 231 project. What we're seeing here is that we're able to bring a variety of different data types together in a comprehensive way, and also in a three-dimensional way. So, excuse me, this is that LiDAR change detection map that we were seeing earlier. And we actually have the CAD file of the overpass that was built, or at the time of this model was planned to be built, overlaid and georeferenced to that. But where it gets really interesting is that I can actually move this around, and I can go into the subsurface. Now, right now I'm showing you this in kind of a typical three-dimensional mode.
But what I can also do is go into what we call an augmented reality mode, and I'm actually going to drop this model into my space. So now what you're seeing is the view that I'm seeing through my device. Again, right now I'm using my iPad, but the model is actually dropped into my space here, so I can interact with it as an actual three-dimensional model in my space. If I were wearing a headset, this would be even more immersive, because it would be a true three-dimensional hologram that I could actually walk around or walk into. I'm showing you the augmented reality view of that now. And if I have other participants in this meeting, and I'll show you an example of how that works later, you can actually have multiple people join this meeting, either in the same space or in other virtual spaces from around the world, and you would see their avatars in this 3D space. So as I say, we've got the LiDAR map, and we've got the CAD model sitting on top of that. But again, where it gets interesting is when we go into the subsurface and see some of the subsurface information. We can see here the resistivity that Ben was referring to earlier. We can see the actual anchors of that bridge overpass from above, but we can also see additional data showing up here as well, and what we're seeing is some of the boreholes. On the left-hand side of my software here, I have all of the observations that have been added to this model, and we see that we've got some lithology here, we have geophysics, we've got some piezometer data, and we've got some SAA data. I've gone ahead and opened up some of the lithology. These, of course, are boreholes that were actually drilled on the site. But what's important to note is that not only do we have these 3D representations of the boreholes, georeferenced to where they were actually drilled on the site, this is data-driven.
This isn't just a pretty picture of boreholes. As you can see, as I tap my finger on the different parts of this, we actually get the data from that section of the borehole. We can see the elevation, we can see the eastings and northings, we can see the lithology and the sample depth. So this is all data-driven. These were actual borehole logs, and if you're familiar with borehole logs, and I'm sure most of you are, a log is basically just a big spreadsheet of data. We can import that, and the software will automatically translate it into these visual representations. What's important about this is that engineers, as you all well know, are good at creating a 3D model in their own minds when they see the variety of different project data types; that's kind of where the training comes in. But what can be difficult is getting everyone's individual 3D mind map to align. By bringing people into this three-dimensional immersive space and actually sharing that same 3D model in a holographic way, we can get to that place of common operational understanding a lot quicker. So what we have here is the ability to bring a variety of different people together, show them what's happening on a future project or a current project, show them the progress of that project, and also take them to places where it's very difficult to take them otherwise; in this case, we are looking at the subsurface of this area. Again, we can bring a variety of different data types together. As an example, we can switch our base map. This is our change detection base map; I can go to some LiDAR that was taken in 2020, overlaid on top of a map of the area, and I can also switch that with the 2021 LiDAR. So whatever different data types you've got to bring together in a project, we can show those all as different layers. Now I want to quickly switch to another example here. We just refer to these as workspaces.
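To make the "big spreadsheet of data" point concrete, here is roughly what translating a borehole log into georeferenced 3D geometry involves: anchor each depth interval at the collar coordinates and convert depths to elevations. All names and values below are invented for illustration:

```python
def borehole_segments(easting, northing, collar_elev, intervals):
    """Convert a vertical borehole log into georeferenced 3D segments.

    `intervals` is a list of (top_depth, bottom_depth, lithology)
    tuples; returns one segment per interval with top/bottom elevations,
    which is what a 3D viewer would draw and attach the tap-to-inspect
    data to.
    """
    segments = []
    for top, bottom, lith in intervals:
        segments.append({
            "easting": easting, "northing": northing,
            "top_elev": collar_elev - top,
            "bottom_elev": collar_elev - bottom,
            "lithology": lith,
        })
    return segments

# A made-up log: collar at elevation 187.0 m.
log = [(0.0, 3.5, "clay"),
       (3.5, 9.0, "weathered shale"),
       (9.0, 15.0, "limestone")]
segs = borehole_segments(452100.0, 3820400.0, 187.0, log)
print(segs[1]["lithology"], segs[1]["top_elev"])  # weathered shale 183.5
```

Inclined holes add a survey-based deviation calculation, but the spreadsheet-to-geometry idea is the same.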
So this is another project using this technology, in Alaska in Denali National Park. What's happened there is another landslide; you can see the alluvial flow kind of down at the bottom here, and I'll zoom in on that. This cut off basically the only road that connects the east and west sides of Denali National Park. So the plan here is to build a bridge. Again, we see that we've brought in the CAD file of the bridge. We've brought in the brown areas on the left- and right-hand sides; these are DEMs of the hill cuts that will be happening as part of this construction process. And we have the actual bridge itself, and we've got a cute little school bus there driving over it. So as before, I can go into the augmented reality view, and I'm going to drop this into my space. Again, we see this as an actual, true three-dimensional hologram of the mountainside in my space. My home office here is quite small, but depending on how large a space you have, you can blow this up to a very large scale. And in terms of human-scale objects like this bridge, we can show the bridge at one-to-one scale. As an example, when we're giving demos to the project participants, we can get a large room and show this bridge at one-to-one scale, and people wearing the headsets can actually walk the bridge and get a feel for the scale of the bridge and what it's going to look like. As you can imagine, there are a lot of different stakeholders in a project like this. We have the geotechnical engineers, the construction engineers, the environmental engineers, representatives of Federal Highways and the federal parks, as well as the Park Service itself. So we have all of these different stakeholders that are participating and need to have a say in this project.
And by bringing them all together in this virtual environment, and this was especially important during COVID, when this project was getting started, it was not possible to have all of these people travel up to Alaska and then travel to this remote site; it's difficult at the best of times. But by using this technology, we were able to send the headsets out to eight or ten different people around North America. They put the headset on and join the meeting, like a Teams or Zoom meeting at a set time, except now they're actually meeting in a 3D virtual space, where they can see all of these different data types brought together and have a meeting in that metaverse space to discuss moving the project forward. I'm just going to show you here that we can even go into the underground. I'm going to get a little close here and move the model around; let's zoom in on that, there we go. When we go into the underground, we can actually see what's happening there: there's the anchoring that's going to be happening in the rock face, but we also see these boreholes that are kind of coming up, with equipment on the surface. These are thermosyphons that are being brought in to regulate the temperature of the ground where the anchoring is going to happen, to make sure that anchoring stays in place. So we can not only visualize what's happening above ground, we can also go into the subsurface. But where this gets really interesting is that, yes, we're able to visualize all of this project data so that people can view it and meet in a virtual meeting, but it actually becomes a living document, because we're able to give people on site the ability to take three-dimensional scans of the things that they see there. This is the particular device I have in my hand: an iPad Pro, and it's similar with the iPhone Pro.
You may or may not be familiar with this, but these devices actually have LiDAR scanners built into them, so the Apple iPhone and iPad Pros can take 3D scans using the built-in LiDAR scanner. The Clirio software that I'm demonstrating to you now takes advantage of that and allows you to create these three-dimensional scans. And what's happening for this particular project, because it's in a national park, is that when the blasting they're doing is done, it needs to look like natural rock; it can't look like a blasted zone. So they're able to take scans before and after and do comparisons after the project to show the various stakeholders the results of the construction work. This 3D scan was taken literally by pressing a button and moving the device in front of the area, and literally in a minute or two we have a three-dimensional scan of that area. And again, I can go into AR mode, float that in front of me, and go to one-to-one scale. Now it's as if I were standing at the site in Alaska, in Denali National Park, in front of this rock face. So if you have experts who need to weigh in but aren't close to the field, who are many days' travel away, you can have the people on site use the software to take 3D scans, and then you can have a virtual meeting with the expert back at the office. The expert can weigh in as if they were actually standing on site, looking at the issue that has been brought up. So this is another example of how we can use this augmented reality technology: instead of bringing the expert to the field, we can bring the field to the expert. Some other things we can do here: I can bring up a different scan, and we can do some comparisons. These two scans are of totally different areas.
So this comparison doesn't really make sense yet, but later on, after the project's been done and they re-scan the same area once the blasting and excavation are complete, they can compare the before and after scans side by side to see what has changed on the site. So here we're comparing the two scans side by side, or we can go into an A/B mode where we can actually scrub between them. Again, this gives us the ability to move projects forward, make decisions more quickly, avoid downtime on the site, and avoid unnecessary travel. There's obviously still going to be travel, there are still going to be people moving back and forth from remote sites, but this can reduce some of that travel necessity. I was talking about the virtual meeting capability, so what I want to do is switch screens here and show you a quick video. What you're seeing here is an example of one of these virtual meeting sessions, and this is actually a LiDAR scan, sorry, a photogrammetry scan, my apologies, of a project site. We're seeing an example of people: this is being projected as a three-dimensional hologram in the boardroom, and you see the people wearing the headsets. This is the view of what they're seeing, but you're also seeing the avatars of people who are joining this meeting from a remote location, which could be anywhere else in the world. Not only are we able to see the avatars and where the other people are, we can also see their hands and what they're pointing to, and of course we can have a conversation, so they can actually hear each other, and it's spatial audio: if someone's on your left, you hear them on the left, et cetera. So they're having a productive project meeting around this 3D data even though not all of the participants are in the same room. And in fact, the way that I recorded this video is that I'm also a participant in this meeting.
My iPad has joined as a member of this virtual meeting, and I've literally just done a screen recording. So what you're seeing is an actual, at the time, live meeting that was being screen recorded on my iPad. This holographic visualization is projected into the space; the local participants wearing the headsets see this 3D data in front of them as if it were an actual object, and the people participating from a remote location show up as the avatars. We find that we're able to do a lot of this project meeting work in a way where we can get people to a clear common operational understanding of what's happening much quicker than before, and we're able to bring people together where it's either difficult, cost-prohibitive, or in some cases impossible to do so. So I know we have a number of questions, so maybe I'll leave some time to answer those. John, do you want to lead us off?

I do. First, I just want to point out to everybody watching that the sort of stuttering of the videos that we see is a Zoom problem, not a problem of this technology. It's pretty fascinating technology; the unfortunate part for us, where we're seeing these scans kind of jump around a little bit, is that we're using the Zoom webinar software. So maybe that's the next thing that really needs to be improved. And then just a clarification, Keith, on this model: I think you should point out what the scale is, because this is a pipe, and if you look at this photogrammetry you might think, oh, that's just a small pipe. But then you realize that there are probably some stairs down there in the corner, and some people for scale. Doesn't that change your perspective? This is actually quite a large site.
It is, and yeah, you can actually see there are some stairs there, and there are some high-vis jackets at the bottom of the stairs, and then there's kind of a large hut. I would say maybe that hut is six, seven, eight feet tall, for someone to stand under, so that gives you a sense of scale. What we're able to do, because obviously we're in a boardroom, is what's called a tabletop mode. In tabletop mode, we basically just shrink things down to what we call a one-meter sandbox, so everything is sitting in a one-by-one-meter space, but we can go to one-to-one scale on this. So if we had a big enough space, we could literally go to one-to-one scale, and it would be as if you were standing on this site looking at this pipe, looking at this excavation, or we could put you at the bottom of the excavation at one-to-one scale.

So a lot of the questions that have come in are really focused around the application: how much do these devices cost? How expensive are they? What's the entry point for the hardware? What's the entry point for the software? Can these kinds of software and hardware devices be acquired by a small geo-professional firm?

Yeah, so cost ranges. In terms of what I was showing you today, using an iPad or an iPhone to participate, that costs maybe zero, because you may have access to that technology already; otherwise it's just making that purchase through the Apple store. Some of the VR headsets, like the Meta Quest 2, are about $300, so that's a relatively inexpensive entry point into that sort of 3D visualization. The HoloLens is a little more interactive, if you want to have that true HoloLens experience; it's more like about $3,500. So the hardware cost ranges from $3,500 at the high end, down to a few hundred dollars, down to possibly zero dollars. The software we've been looking at is called Clirio, C-L-I-R-I-O, and that's something that you can check out.
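As a side note on the tabletop mode just described: fitting an arbitrary model into a one-meter sandbox is simply a uniform scale factor derived from the model's bounding box, with one-to-one mode being a scale of 1.0. A sketch with made-up site dimensions:

```python
def tabletop_scale(bbox_min, bbox_max, sandbox_size=1.0):
    """Uniform scale factor that fits a model's bounding box (in meters)
    into a cube `sandbox_size` meters on a side."""
    extents = [hi - lo for lo, hi in zip(bbox_min, bbox_max)]
    return sandbox_size / max(extents)

# A hypothetical 300 m x 120 m x 60 m site model shrinks by the same
# factor on every axis, so proportions are preserved on the table.
s = tabletop_scale((0, 0, 0), (300, 120, 60))
print(s)  # 1/300 of true size
# One-to-one "walk the bridge" mode is just a scale factor of 1.0.
```

Using the largest extent keeps the whole model inside the sandbox; scaling each axis independently would distort the geometry.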
The cost of that, I don't wanna get into too much of a sales pitch being that we're informational here, but the cost can range from zero dollars, there's a trial version, up to I think $49 a month for using it in the field. So it's relatively inexpensive to get involved. You can literally cobble together some hardware that you have on site or own personally and a free trial of the software, and you can get started with no money invested, and then you can decide whether you wanna start spending money on licensing software and buying higher-end equipment.

So the software you used for today's examples obviously included the Clirio software. Was there other software required to pull together your demonstrations?

Well, we're talking about a lot of data visualization here, right? So obviously there's a workflow that leads up to the visualization, where the data is already being housed. In terms of your geomatics and GIS team, they would be working with software like Global Mapper or Esri products. They would be working with subsurface data, maybe from HoleBASE or DIGGS. They would be working with photogrammetry software like Pix4D. So you'd have all of this existing workflow that you would already be doing within your organization. The interesting thing, or the ironic thing, is that despite the fact that engineers and geomatics and GIS folks have been working with 3D data for years, they view it in an inherently 2D way by looking at it on their laptop and computer screens. So what we're talking about is taking the data off the 2D screen and putting it into the room as a true three-dimensional hologram, so you can interact with it in a true three-dimensional way.

And is there specific training, online courses or other kinds of courses? One of the questions we got is, how do you get trained to use the software?
Ben, maybe I'll let you answer that in terms of your learning curves.

Yes. Yeah, so the first time that we developed models using the HoloLens was for the A-game, and we developed a very simple model to show non-technical people what we were talking about with the A-game and what the value was. Scott Anderson and a few others from BGC developed the models, and we purchased the HoloLens through BGC. As far as the learning curve, we had the first generation of the HoloLens, and the interaction, or the new models, I think the way you've set them up, is very intuitive. The HoloLens is a Microsoft product, and there were air taps and blooms and other gestures you had to learn; I don't think it's quite as onerous now, but quite honestly, it wasn't that bad. In a day's time, you could get proficient enough to navigate. You learn more and more as you go. Actually, to go back to the picture that you have right here on the screen, and the superpower that we have: at this scale, you're seeing a bird's-eye view, and since the models are scalable within the space, you start to get better insights as you play around with them, but it doesn't take that long to learn the basics.

So here's another question: can you actually measure or draw on these images? Can you use the images to make specific measurements that you wouldn't normally be able to make in the field?

Yes and no. Nothing's gonna replace an expert in the field standing on the rock with their measurement tools, but we can replace some of that virtually. When you're using the HoloLens, as an example, the HoloLens recognizes your hand.
So your hands are actually your tools, and within the Clirio software, you can reach out and drop pins just like you would on a Google map. Imagine you're on a Google map and you drop a pin at each of two locations, and Google tells you the distance. We have a similar thing where you can drop pins on, say, the bottom and top of a boulder or a slag pile or what have you, and it will actually show you the distance between those two pins. So you can do measurements in the 3D virtual space. In terms of accuracy, the listed accuracy on the iPhone or iPad Pro scanner tool is approximately one centimeter. So it's reasonable; it's not survey grade, but it's definitely reasonable for getting an idea of the sizes and volumes of things you're dealing with.

And is it possible to use this actually within the terrain when you're there? Is it possible to see augmented reality superimposed on the real terrain? Have you tried that?

Yeah, it's a tricky nut to crack, actually. With VR, of course, that doesn't work, because you can't see the terrain and you're gonna trip over a rock, so you don't wanna do that. With a headset like this one, where you can see out, it is sensitive to light, so in bright, sunny conditions you're maybe not gonna get a very clear image; there are some challenges there. Also, on site you may have PPE requirements. Trimble, a company I'm sure everyone's familiar with, actually makes a version of the HoloLens embedded in a compliant hard hat, so you can have a hard-hat version of this in the field. They also make a polarized film that goes over the visor to help with the light. So that possibility definitely exists, but there are a few technical challenges to overcome.
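The pin-to-pin measurement Keith describes is, under the hood, just the straight-line distance between two dropped points in the model's 3D coordinate space. A minimal sketch, assuming the pins come back as metric x, y, z coordinates (the function and pin values are illustrative, not Clirio's API):

```python
import math

def pin_distance(p1, p2):
    """Straight-line distance in meters between two dropped pins,
    each given as an (x, y, z) tuple in model coordinates."""
    return math.dist(p1, p2)

# Hypothetical pins at the toe and crest of a scanned boulder:
toe = (2.0, 1.5, 0.0)
crest = (2.0, 1.5, 3.2)
print(round(pin_distance(toe, crest), 2))  # 3.2
```

Given the roughly one-centimeter listed accuracy of the phone-class LiDAR scan, reporting such distances beyond the centimeter is not meaningful.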
So, I have seen other applications, not necessarily with the models that we've seen here, but with a tablet: applications for things like restocking shelves, where they have data that was recorded yesterday and they can see what's there today, and the same thing has been done remotely. With the models that we have created here, I've seen take-offs on volumes. I could see that application, not necessarily with the model types we're generating here, but very similar to the Army training and how they've used it in the field. I think we could see what quantities of earthwork were done yesterday or today, overlapped on the landscape. I think it's quite possible.

Yeah, so that feeds this question: can monitoring data be incorporated into the AR models as ongoing deliverables, for example, following completion of a performance assessment, with continuous ongoing data feeds?

I didn't quite hear that entire question.

Okay, the question was: can monitoring data be incorporated into the AR models as an ongoing deliverable following completion of a performance assessment, with examples for continuous ongoing data feeds? For instance, InSAR and LiDAR data.

Yeah, in fact that Alabama project had LiDAR data that they flew, and there's a time scrubber so you can see the changes over time, change detection. For the data sources, there are multiple data sources that could be included. It's all digital, and all the A-game technologies that we've been promoting recently are all digitally acquired, so that part of it is quite doable. For performance, like you saw with that digital twin in Alabama, I think those types of applications, or monitoring the slope, are quite feasible.

And I just want to note on that: it's not going to be real time, but what we would refer to as near real time, right?
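The earthwork comparison Ben describes, yesterday's surface versus today's, is essentially a difference of two gridded elevation models summed into cut and fill volumes. A minimal sketch, assuming both scans have already been aligned and resampled to the same grid (the function and numbers are illustrative, not from any of the tools mentioned):

```python
def cut_fill_volume(before, after, cell_area):
    """Compare two aligned elevation grids (rows of elevations in m).
    Returns (cut, fill) volumes in cubic meters: fill where the new
    surface is higher, cut where material was removed."""
    cut = fill = 0.0
    for row_b, row_a in zip(before, after):
        for zb, za in zip(row_b, row_a):
            dz = za - zb
            if dz > 0:
                fill += dz * cell_area
            else:
                cut += -dz * cell_area
    return cut, fill

# Two tiny 2x2 elevation grids (m) on 5 m spacing, so 25 m^2 cells:
before = [[10.0, 10.0], [10.0, 10.0]]
after = [[10.4, 10.4], [9.8, 10.0]]
cut, fill = cut_fill_volume(before, after, 25.0)
print(round(cut, 2), round(fill, 2))  # 5.0 20.0
```

The time-scrubber change detection on the Alabama project is the same differencing idea applied across a sequence of flown LiDAR surfaces rather than a single pair.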
So there might be some period of delay, minutes or hours, between that data coming from the field. But yeah, we're seeing some really interesting things coming up in terms of taking stream monitoring, seismic data, and slope data and using an IoT, Internet of Things, approach, using various APIs to connect, so that you start with that base model we were showing earlier, but then you'd have a point on that model which would actually show you what the stream level was within the last hour or something like that.

Have you ever come across cases where this has been applied to mining or tunneling?

Yeah, we use this quite a bit in mining. Again, we have the ability to take those scans. The great thing about the LiDAR scans is they don't need light, or don't need a lot of light, to take rock-face scans. And again, you can bring the field to the expert by taking a 3D scan of a rock face back to the office. The consulting engineer could view that at one-to-one scale as if they were standing there, and then make suggestions on how to move forward, maybe more shotcrete, more anchor bolts, something like that. So yeah, we're definitely seeing this being used in mining.

So I'm gonna do two more questions, with a big one at the end for you. Just a quick question, I think, for Keith: have you worked with developers using versions of the Apple Vision Pro, or do you have plans to use an alternative to the HoloLens?

Yeah, so the HoloLens has kind of been the state of the art. Obviously, Apple has some different ideas coming in the new year, and we're very excited about the Apple Vision Pro. We think it'll be the iPhone moment for AR and VR. It will basically take this from something that some people do to something that everyone does. We'll soon wonder how we lived without it, just like we do with our phones today.
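Returning to the sensor-feed idea Keith sketched earlier in this exchange, a live point on the base model amounts to polling a gauge API and refreshing an annotation. A minimal sketch, where the endpoint URL, JSON field names, and `update_model_point` callback are all hypothetical, not a real service or Clirio's actual interface:

```python
import json
import time
import urllib.request

# Hypothetical stream-gauge endpoint -- illustrative only.
GAUGE_URL = "https://example.com/api/gauge/1234/latest"

def parse_reading(payload):
    """Extract stage (m) and timestamp from a gauge JSON payload."""
    reading = json.loads(payload)
    return reading["stage_m"], reading["timestamp"]

def poll_gauge(update_model_point, interval_s=900):
    """Near real time: refresh a model annotation every 15 minutes.
    update_model_point stands in for whatever call pushes a label
    into the AR scene (assumed here, not a documented API)."""
    while True:
        with urllib.request.urlopen(GAUGE_URL, timeout=10) as resp:
            stage, ts = parse_reading(resp.read())
        update_model_point(label=f"Stage {stage:.2f} m at {ts}")
        time.sleep(interval_s)
```

The polling interval is what makes this "near" real time: the annotation lags the field by at most one cycle plus transmission delay.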
So we are actually going down to Cupertino in the next month or so to test our software on the Apple Vision Pro, so we're really excited about that. I'm an Apple guy from way back, but obviously I work with Microsoft stuff, and it really shows the difference between the two companies. Apple says, okay, we're gonna put cameras pointed at your eyes and have this high-res screen that projects your eyes so people can see them, and Microsoft just says, make it clear. Just make the visor clear.

Okay, so the last question I'm gonna ask each of you, Ben first and then Keith: what's the most exciting technology you see coming down the road that geo-professionals should look for?

So I think we're at a point, especially with digital geotechnical information and data, where DIGGS, I think, is going to be revolutionary, or have a revolutionary place, in being able to transfer data among these applications, from the time that we acquire the data, to processing and analysis, to getting it into the models and looking at the data where a site model makes sense. Throughout that workflow, I think that is going to be a revolutionary point as we go forward.

Yeah, Keith?

Yeah, I think the exciting thing for me is the democratization of this technology.
The fact that anyone in the field can have a 3D LiDAR scanner in their pocket that's easy to use and low cost is just going to open up the floodgates on the use of this technology. You don't need to be trained with specialized equipment to go out to a site to create this 3D data; anyone from the field engineer to the bulldozer driver is going to have the ability to capture issues they see on site, at one-to-one scale, in three dimensions, and have experts who are hundreds or thousands of miles away weigh in and solve those problems. That democratization is really the exciting part for me.

Cool. Well, I want to thank everybody for participating. We had a lot more questions that we didn't get through; I apologize for that, but we're clearly out of time. If you want to see some of this again, visit the YouTube version of the webinar. It'll be posted soon, and you should get a reminder of that from the National Academy in an email after this concludes. I have to finish with the disclaimer: any opinions, conclusions, or recommendations expressed by the panelists or anyone during this webinar are those of the individuals and do not represent conclusions or recommendations of the National Academy of Sciences, Engineering, and Medicine. And so with that, I'll close this webinar and thank our speakers and thank everybody who participated. It was a good one. Thank you. Thank you. Take care.