Okay, hello everyone. Welcome to today's webinar. I'm Mingfang Wu, a research data specialist working for the Australian Research Data Commons, the ARDC, and I'm honored to host the webinar today. Today's webinar is about managing disasters through improved data-driven decision making and operational readiness levels. Our speaker, Dave Jones, is joining us from Washington, D.C., so good evening, Dave. Dave has a very impressive and interesting bio; I'll just read some of it, and Dave will probably talk more about it in his talk. Dave is the founder and CEO of StormCenter Communications, Incorporated. StormCenter has developed a technology called GeoCollaborate, which has been adopted as the All Hazards Consortium's data sharing platform to help the movement of fleets of utility vehicles across the nation to restore power after disasters. Dave spent nearly a decade as an on-air broadcast meteorologist for NBC4 in Washington, D.C., and launched one of the very first TV station websites in the United States. Dave is a past president of the ESIP Federation and was honored to be selected as a Charles S. Falkenberg Awardee by ESIP and the American Geophysical Union. Just to point out, the ESIP Federation is the U.S. Earth Science Information Partners. It spans beyond the U.S. now; from down under, the ARDC, CSIRO, NCI, AMOS, and others are members of ESIP, and we have an ESIP community here in Australia as well. Dave is frequently invited to make keynote presentations stressing the importance of integrated science and the use of observations as well as disparate data sources in the decision-making process to improve situational awareness and decision-making to save lives and property. Okay, Dave is about to start his presentation. During the presentation, if you have any questions, please post them in the chat panel. After the presentation, we will open it up for Q&A. Okay, without further ado, Dave, I'll hand it over to you. Oh, that's great.
Thank you so much, Mingfang and Susanna, for setting up this webinar and for the invitation to present the innovations and advancements that we've been making here in the United States with regard to data sharing across platforms and also across devices for improved situational awareness and decision-making, especially in the area of trust. I'll be talking a little bit more about how we can trust data, access it, and share it. So when disasters occur or threaten, many decision-makers accept and digest data and information that may not be truly vetted or may not even be accurate. The source of the information may be citizens that don't have accurate measuring tools, that may not even know how to measure information, but they want to report it so they can seem like they have the highest totals or the strongest winds or things like that. So people looking to stand out with extreme measurements that may or may not even have been made; all of that goes into whether or not you trust a particular dataset or observation. So my company StormCenter Communications has been involved in creating a new approach to situational awareness and decision-making from disparate sources, so you can access that data and share it across multiple platforms. Private sector operators such as utilities, the food sector, the fuel sector, transportation, and communications all need to share information to make communities more resilient and enable reentry of essential services after disasters hit. So whether it's a wildfire or a tropical system like a hurricane or even an earthquake or flood, communities need to get back up and running quickly so you can keep commerce going and you can get people back to normal in their daily lives. So I'm going to change slides here, and I want to first start talking about the All Hazards Consortium, and you mentioned them in the beginning.
I just want to give you a sort of an idea as to who the All Hazards Consortium is. They're a non-profit organization in the state of Maryland where industry and government decision makers come together for trusted information and assistance in building communities, and these efforts are led by the private sector with government invited to the table. So it's not a situation where government is leading these initiatives; it's really the private sector leading them. And so this is a really different approach, because many initiatives and committees and groups are put together by the federal government, and those initiatives suffer when elections result in political leadership changes. One group comes in and they want to change everything that's been happening in the past. This has always set back operations and the research-to-operations transition, and so that's been very unfortunate. So the All Hazards Consortium does not collect and filter data sets for use in decision making. They really help to build relationships between decision makers so data sets can be understood, so they can be vetted, and so they can be categorized through the ORL levels, which I'll talk about shortly in much more detail. So the ORLs are rapidly evolving as a situational awareness and decision-making standard, if you will. So the All Hazards Consortium has about 45,000 stakeholders nationwide serving executive and operational decision makers across multiple states and industries here in the United States, producing results in disaster management, sensitive information sharing, cyber security, research to operations transition, and solutions development.
So the All Hazards Consortium, I'll draw this purple line in here, really cuts across sectors to put more data to work that is available from the private sector, from businesses, and from federal government sectors, so that it can really speed decision making. So what has the All Hazards Consortium contributed or accomplished over the last year? This is one of the things they've done: they've developed a new state private sector data sharing integration model. This is really an example of the All Hazards Consortium working with the Pennsylvania Emergency Management Agency here in the United States, and they leveraged the All Hazards Consortium Multi-State Fleet Response Working Group's legal framework and governance structure to create a private sector driven, sustainable working group model. So this really goes to what I just said, and that is having the private sector lead this initiative with government being invited to the table. They have also put together a coordinated capability leveraging federal government innovation with the U.S. Department of Transportation's Federal Motor Carrier Safety Administration. They issue waivers whenever disasters happen so truck drivers can exceed their driving limits, get back on the road, and deliver food, fuel, and medical supplies into disaster areas. So advancements in federal government innovation have the Federal Motor Carrier Safety Administration issuing particular waivers for vehicles so they can drive into disaster areas and exceed their normally regulated limits. That's been a very interesting evolution in 2018, along with expediting utilities across the US-Canadian border.
So border crossing is really critical when you have power outages where Canadian utilities need to come into the United States to help restore power in a disaster area, because it's a very large effort to restore power when widespread disasters hit, and frequently U.S. utilities call upon the Canadian utilities to come across the border to help with power restoration. That's been very difficult, with upwards of 12 hours of delay time where those trucks are just sitting at the border because border security, the border patrol, hasn't gotten the word to let these utilities come across. Now they've made tremendous strides in allowing those trucks to come across the border, and in some cases it takes just 45 minutes for them to get across when they're responding to a disaster in the United States. There's also the pre-staging of private sector resources, before Hurricane Florence in particular; that was a hurricane that hit the North Carolina coastline here in the US, and it stalled and caused a lot of flooding. We were able to stage utilities with a private sector partner, Walmart, which is a huge retail organization with a lot of parking lots, and they opened up some of their parking lots for utilities to come in and gather in one location to use as a staging area. That was done in coordination with the North Carolina Office of Emergency Management and their private sector liaison office, so that was a great success story as well. And then there's also increasing decision maker confidence in data-driven decision making, which is what I'm going to dive into in just a minute here when I talk about the operational readiness levels. So the Fleet Response Working Group is part of the All Hazards Consortium, and that working group is the group that has adopted GeoCollaborate as their data sharing platform.
So that platform enables the access of data sets and can share them across disparate platforms and across any device in real time, so we can identify trusted data sets. We can take those trusted data sets and share them through GeoCollaborate into any mobile device, tablet, or computer, to anybody with a web browser, and those devices will get those data sets in real time. So that has led to, well, gee, how can we make sure that we trust those data sets that we're accessing and sharing? And that has led to the creation of a trust framework for data-driven decision making, and then to developing use cases that organize people and information, so we can really leverage the research that's boiled up into data sets that could be useful for operations, in real operational environments. So that's really developing a new operational standard for data sets and also a new approach for disaster solutions. You'll hear me mention the term data-driven decision making; we call that 3DM here. The data-driven decision making can manifest itself in webinars, in workshops, and in real time training sessions where people are trying to identify data sets and leverage those data sets for real time decision making. So I just wanted to show an example of a GeoCollaborate dashboard that was used with Hurricane Florence as Florence was approaching the coastline of the Carolinas. There's a lot going on here, but essentially you see these red states: Maryland, Virginia, Kentucky, North Carolina, and South Carolina are all red. Why are they red? Well, they're red because the governors of those states had issued states of emergency, and in order to issue a state of emergency the governor has to see an immediate threat developing, and that was Hurricane Florence. So when the governor issues a state of emergency, that enables the governor to get more power to stage vehicles and to put the National Guard on alert so they can be prepared to help citizens.
It can also enable them to provide actual declarations, so declarations of states of emergency or the transportation waivers that you see here. This is an executive order issued by the governor declaring a state of emergency. Now, this actual document is available geospatially through this data layer that made North Carolina and Virginia red. So this is the governor of Virginia's declaration of emergency that you can get by clicking on North Carolina or clicking on Virginia and having that document come up. Why is that important? It's important because when utilities are responding across state borders, they're really not authorized to go across state borders and out of their jurisdictions unless there's an emergency happening. So when the governor declares that state of emergency, it allows them to get a hold of this official documentation to have in their trucks, so when they come across the border on the way to Virginia or North Carolina, they can show the authorities, if they get stopped or if they get in an accident, the reason why they're out of their jurisdiction: to prepare to restore power after this disaster. And you can see on the left hand side here are data layers that have been turned on, and you can see a reference right here next to this red dot: 48 hour rainfall forecast from the Weather Prediction Center. It's an ORL1 data set; I'll explain what that means here in just a few minutes. Weather service radar data is ORL1; this is the radar reflectivity that you see down here, those green things. And then weather service watches and warnings, the advisories that are issued by the National Weather Service, are also ORL1, which means highly trusted, reliable, and available 24-7. So I just want to dive into showing some of the partners that have been involved in the whole All Hazards Consortium effort for the Fleet Response Working Group. We have utilities, we have New York City government, we have Walmart, Bank of America, Wakefern Foods, and NASA as a U.S.
federal agency, NOAA, and the FAA. So there are a lot of organizations that are available and very interested in evolving this trust framework for data sharing. And we have the All Hazards Consortium and the ESIP Federation as well. So I just wanted to give you a quick little brief of what the All Hazards Consortium's sensitive information sharing environment enables. It enables establishing this environment for the private sector to share sensitive data for federal and state decision making. It allows the state and private sector to share information together to keep commerce moving and businesses open. And the data driven decision making, or 3DM, and ORL models are really taking off within the federal government and within the private sector here in the mid-Atlantic area of the east coast of the U.S. And it's really being seen as a model within the Department of Homeland Security, which is a federal agency. So we're addressing transportation, food, fuel, utilities, and medical supplies. How do they all coordinate to get back into disaster areas? How can they coordinate with officials to get the right products and services into the right place at the right time to help with response and recovery? And then DHS, the Department of Homeland Security, is establishing a long term sustainability approach for the All Hazards Consortium because they like this model so much. It's all about fast access to data from multiple and disparate trusted data sources. We're also displaying a poster next week at the ESIP Federation meeting in Tacoma, Washington, which is on the west coast of the U.S.
And the disaster life cycle cluster, of which Karen Moe is a co-chair and I am a co-chair as well, is talking about this ORL model, the ORL standard that's evolving, what data driven decision making is, what ESIP webinars we've put on, and what the trusted federal data sources for hazard response and decision making webinar brought to light. We had that just back on June 24th, where we brought some U.S. federal agencies together so they could describe the data sets and data services that they have available right away, so they can be put to use in any sort of decision making environment. So this is the federal government stepping up saying, look, we want to make our data more useful. There was a new law that was just passed, the Geospatial Data Act, making more data available from the government in geospatial formats. And you also see the GeoCollaborate dashboard that's highlighted in the poster that's going to be shown next week, and also the operational readiness levels that I'll talk about next. As a matter of fact, I want to describe what operational readiness levels really are. So ORLs are really creating a standard so decision makers can quickly understand data sets and the trustworthiness of those data sets. All too often in the past, when disasters have happened, officials have used data sets and data sources that have not been trusted, that have not been reliable, and that may even have been put together by a teenager on Facebook, right? Where they put information out there and somebody believes it, somebody takes it, they share it, and then it goes viral and unfortunately becomes part of a decision making process. So the operational readiness levels are designed to assign a trusted readiness level to data sets that are going to be used in official decision making processes by trusted and authoritative sources. And so there is a decision tree that comes with this, the ORL governance, and this is what it looks like.
You have a decision tree here, and you start with whether the data set is from a trusted and vetted source. If it's not, it goes all the way over here to not being ranked, so it won't even get an ORL number. But if it is, you step down here to whether or not it's secure. Is it HTTPS? Is it a secure data set? Is interoperability optimized? Are restoration and redundancy available? So is there backup, and do they have that readily available? Is there no downtime in their requirements? And "no downtime" really means something like 99.7 percent availability. Are change notifications made? In other words, if you change a data set or change its metadata, is there a notification sent out so people understand there's a change and they can alter their training or the materials that they have available for that data for their users? Is the data verified and tested? Is there metadata completeness? We all love to talk about metadata, but how much of that metadata makes that data set complete? How much is unacceptable and needs to be improved? There are a lot of federal agencies here in the U.S. where the metadata standards are high, but what they actually put together is not so great, so there's improvement needed there. There's also the extent and latency for use cases. This is a different color because different use cases have different latency requirements, and when you have different latency requirements, it might open the door for other types of data sets to become an ORL one or two for operational decision making. And the resolution comes into play as well. Whether you're talking about land use or agricultural applications, maybe a Landsat pass, one every 16 days, is useful. But if you're talking about issuing a flood warning and you need to see stream flow gauges, you need updates in minutes to maybe tens of minutes so you can get that warning out. So if all of these are satisfied, you get an ORL one data set.
If some are not satisfied, where there's no change notification, it hasn't really been verified and tested, or some metadata is present and some is missing, then it can come down to an ORL two. So essentially what we're doing is creating a governance structure around operational data sets, and even research data sets that may be useful in an operational setting. Let's say, for example, you're collecting some research data on the flow of a river, and all of a sudden you get a severe thunderstorm and that causes that river to start to flood. If you're gathering data that can be used in decision making, then you should be able to make that data available and useful to a decision maker. So it's kind of opening that door for the research to operations transition, so you can put more of that research data to use. It's not like you're really giving away all your secrets from your research. You're really saying, hey, I have some data and information that can help; I want it to go into this trust network so we can help make decisions and maybe even save some lives. ORL threes, where maybe the data is not verified and tested or has limited metadata, are all determined in this area. And if that's not the case, then perhaps the data is in testing or under a development phase. Maybe there's no metadata, the extent and latency for the use case might be beyond what some people are expecting for making decisions, and the resolution at a temporal, spatial, spectral, or radiometric level might be subpar. So it becomes an ORL four. Why would data that doesn't meet these criteria get an ORL four ranking instead of just being kicked out? Well, some of this research data, like I said, might be useful for decision-making with perhaps a developing drought or stresses on crops, or might be useful for some long-term planning that you need to do related to climate changes and the changes we're seeing in extreme weather or climate impacts.
So the ORL four might be a data set that you want to put on somebody's radar screen and show to them, with the understanding that it's not useful for immediate decision-making but might evolve over the next 12 to 18 months, let's say. And I just want to caveat this process and this governance: it's still evolving. I'll put a little asterisk with the laser pointer on the ORL model decision tree, because we're still getting feedback from groups, and we'd love to start getting feedback from the ESIP community in Australia on this kind of approach, to see if it's useful for decision-makers in Australia, and to see if it's something we can use to leverage research data and maybe accelerate the transition of data from research into operations. You know, you don't have to have every pixel quantified to make data sets valuable for decision-makers, and if that's the case, then the data set might be useful. So this is a little story map that's put together in an Esri environment, just outlining a use case for the ORL model. I'm not going to read this text; you can pause the video and read it yourself. But I want to move on to the next slide, where we identify ORL 1s, 2s, 3s, and 4s. And you can see here that ORL 1s and 2s are considered operational, so users of these data sets might make data-driven decisions with confidence in that data set. The 3s and 4s are considered non-operational, but users may still use those data sets to improve their situational awareness, and they may make some operational decisions if they can justify the value of that data set. But really, these data sets might be used in exercises, might be used in drills, or might be used to evolve products that you might want to show operational decision-makers. So you can say, hey, what do you think of this data set? We put this together; it's got a combination of model output and real satellite data. Do you think this could be useful?
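The decision tree walked through above can be sketched in code. This is a minimal illustration only, not the official governance logic: the criterion names and the exact cutoffs separating ORL 2, 3, and 4 are assumptions based on the talk's description, since the real decision tree is still evolving.

```python
from dataclasses import dataclass, astuple

# Criterion names and the ORL 2/3/4 thresholds below are illustrative
# assumptions; only the overall shape (trust gate, then criteria checks)
# follows the decision tree described in the talk.
@dataclass
class DatasetAssessment:
    trusted_vetted_source: bool    # gate: untrusted data is never ranked
    secure_https: bool             # served over HTTPS
    interoperable: bool            # interoperability optimized
    redundancy_available: bool     # restoration/backup readily available
    change_notifications: bool     # users notified of data/metadata changes
    verified_and_tested: bool
    metadata_complete: bool
    latency_meets_use_case: bool   # update frequency fits the decision window
    resolution_meets_use_case: bool

def assign_orl(d: DatasetAssessment) -> str:
    if not d.trusted_vetted_source:
        return "unranked"                      # falls off the tree entirely
    # Count unmet criteria after the trust gate (field order is preserved).
    gaps = sum(1 for ok in astuple(d)[1:] if not ok)
    if gaps == 0:
        return "ORL-1"                         # operational, highly trusted
    if gaps <= 2 and d.verified_and_tested:
        return "ORL-2"                         # operational with minor gaps
    if d.latency_meets_use_case and d.resolution_meets_use_case:
        return "ORL-3"                         # situational awareness only
    return "ORL-4"                             # in testing/development
```

A fully satisfied data set ranks ORL-1; a verified set missing, say, change notifications drops to ORL-2; anything from an untrusted source is never ranked at all.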
So using GeoCollaborate in a real-time data-sharing environment could create a situation where you're accelerating evolving data sets into operational environments and getting feedback pretty quickly. You're accelerating that loop of operations-to-research transition and then research-to-operations feedback, which is really important. And I want to go on to the next slide here, which just shows you the ORL model criteria. This isn't interactive, but I did highlight a couple of these, like trusted and vetted data source, where there's a little bit more meat on the bone. If we look into secure HTTPS, there's some more information in here: this criterion has to do with how secure the service is when a URL begins with HTTPS rather than HTTP, and how to determine that. So there's more information there. And if we're talking about data shareability: is the data format shareable and interoperable? Is it an ArcGIS REST service? Is it mostly interoperable, where you have a shapefile or a geodatabase or a KMZ or KML? GeoCollaborate really accelerates accessing data in those formats and sharing them out in real time. So this is still evolving, but we're seeing it come together very well with users and how they might want to look at data sets and determine their viability in an operational readiness level approach to decision making. So I've talked about GeoCollaborate quite a bit. I want to tell you a little bit about GeoCollaborate now that you have an understanding of the ORL levels. Your first question here is: what is GeoCollaborate? What is the mechanism to share this data set across platforms? And I first want to start with a little graphic. This graphic really applies here in the United States; it might apply in Australia and other countries.
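Two of the criteria just mentioned, secure transport and format shareability, are simple enough to sketch directly. The tiered scoring below is an assumption based only on the examples given above (an ArcGIS REST service being fully interoperable, and shapefile, geodatabase, KML, and KMZ being "mostly interoperable"); the real criteria sheet may weigh formats differently.

```python
# Assumed two-tier scoring for the shareability criterion, derived from the
# examples named in the talk; formats not listed score 0 (not readily
# shareable without conversion).
FORMAT_SHAREABILITY = {
    "arcgis_rest": 2,   # web service, fully interoperable
    "shapefile": 1,     # "mostly interoperable" file formats
    "geodatabase": 1,
    "kml": 1,
    "kmz": 1,
}

def shareability_score(fmt: str) -> int:
    return FORMAT_SHAREABILITY.get(fmt.lower(), 0)

def is_secure_transport(url: str) -> bool:
    # The HTTPS-vs-HTTP check from the ORL criteria list.
    return url.lower().startswith("https://")
```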
I think it does in certain cases, where you have certain areas, say weather, energy, state emergency operation centers, or the federal government, and they've created their own common operating picture, or COP. You can think of the common operating picture as a GIS map where you can bring all sorts of data sets in and look at those data sets integrated on a single map. There's been a lot of investment here in the United States by different agencies and organizations in those common operating pictures. And then you also have state departments of transportation, maybe the people who are in charge of bridges, the federal Department of Transportation, police organizations; they all have their own mapping systems. But here's the problem: none of them talk to each other. None of them share data between them. And this is a big problem, and it's something that we have all probably spent the last decade or two talking about: data standards and interoperability standards, so everybody can access those standards and those data sets on their own systems and start to look at them. So GeoCollaborate kind of sets that aside and says: why don't we use the map and web capabilities that every device has to share data sets in multiple formats so people can use them in real time? In the old way of creating your own common operating picture, you have what I call cylinders of excellence. These are organizations and agencies that have spent a lot of money creating their own mapping environment, their own cylinder of excellence. It's great stuff; the information that they display is fabulous. But they can't share that data with another agency. They can't be on the same map at the same time and change the data sets or the zoom levels of that map and have that change appear on another agency's web map or a private sector organization's web map. That's what GeoCollaborate does.
So if I show you a picture here of what GeoCollaborate looks like: in the upper left hand corner, you have a leader. That leader is in control of accessing and displaying data sets. But at the same time, you have an unlimited number of followers who connect to the same URL and who are following and receiving those data sets that the leader is accessing and loading at the same time. So the followers are following the data sets, they're following the zoom level that the leader shows, and they're able to stay geographically literate. Everyone is on the same map at the same time while the leader is changing the zoom level and the map extent, and everyone is also getting the data sets. You also have dashboards that are associated with GeoCollaborate. Those dashboards can be brought up by anybody that has the URL, and they can be password protected or not. What these dashboards do is enable anybody who knows nothing about GIS and nothing about the data sets to use those data sets to improve their situational awareness and decision making. When they're able to do that, you can now put your data to use, and you don't have to worry about them being confused about what GIS is or where a data set came from. It's got an ORL level attached to it; they can trust it; they can improve their situational awareness and decision making. So I'm going to widen out here just for a second and show you: if you have an organization like NOAA, which is the National Oceanic and Atmospheric Administration here in the US, and they want to share data into the trusted cloud environment, anybody who connects to GeoCollaborate and turns it on will receive those data sets in real time. If any state or organization connects late in a briefing, all they have to do is turn GeoCollaborate on, and then they get the same data sets at the same time and they're caught up.
They can follow through everything that the leader is saying, and they can follow those data sets, getting them from the trusted sources right into their system and their organization, so they can use them later for decision making. And if those data sets are set to update in real time, they will update in real time for everyone you've shared them with, all the folks participating in that collaboration session. Consequently, if you have other people in the field that are collaborating, or, in the previous example of a leader sharing data sets out to followers or collaborators, anyone else with leader credentials, or even a disparate instance of GeoCollaborate, they can collect and share data from their instance into other instances of GeoCollaborate. So you might have firefighters collecting soil information. You might have a high resolution rapid refresh smoke model that you want to share, or air quality information that you want to share. A decision maker can then load those data sets right into their instance of GeoCollaborate so they can accelerate their situational awareness and decision making. So in this particular example of a wildfire, data collection could happen in the field and be sent into a collaboration session. Another agency that has model data can share their data into the session, and another organization with air quality information can share their data. So you have disparate organizations accessing their own information but then sharing it out to go into other common operating pictures that are powered by GeoCollaborate. So it's a very powerful environment, and some key recent developments that we've had include the ability to connect disparate instances of GeoCollaborate and the ability to add any type of information on the fly, outlined by a polygon. We call this geospatial messaging.
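The leader/follower behavior described above, including late joiners being caught up immediately, can be sketched as a small publish/subscribe loop. This is an illustrative model only; the class and method names are invented, and the real GeoCollaborate is a web service, not an in-memory object.

```python
# Minimal in-memory sketch of the leader/follower model: one leader updates
# shared map state, every follower receives each change, and a late joiner
# gets the current state the moment they connect.
class CollaborationSession:
    def __init__(self):
        self.followers = []
        self.state = {"layers": [], "zoom": None, "extent": None}

    def join(self, follower):
        self.followers.append(follower)
        follower.receive(dict(self.state))   # late joiners catch up instantly

    def leader_update(self, **changes):
        self.state.update(changes)
        for f in self.followers:
            f.receive(dict(self.state))      # everyone sees the same map

class Follower:
    def __init__(self, name):
        self.name = name
        self.last_seen = None                # most recent map state received

    def receive(self, state):
        self.last_seen = state
```

For example, if the leader loads an ORL-1 radar layer and zooms in, every connected follower's `last_seen` state reflects that change, and a tablet that joins afterward immediately receives the same layers and zoom level.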
I'll show you an example of that here in the next slide. We continue to evolve the ORL models and the rapid assessment of trusted data, and what decision makers are asking us to do, particularly in the utility field where they have to make rapid decisions, is they want to look at a data set and be able to make a decision in 30 seconds. If they can't make a decision in 30 seconds, they want to send that data set back to the drawing board, because they don't have time to interpret it or ask questions about it. They really need to understand how it's going to help them in their decision making environment. So let me show you an example. This particular example is a map of the Mississippi River here in the U.S., with the states of Missouri and Illinois on either side, and St. Louis is here. What we've done is we've outlined a geospatial polygon by hand and then added text into that polygon. You can see what that text says: it basically says that over the next week the Mississippi River is projected to reach a crest of 44 feet downtown, and the last time they've seen this type of flooding was back in 1993. So if you really want to identify, perhaps, downed power lines, damage from a storm, or vulnerability due to sea level rise and storm surge, you can draw that polygon and add text on the fly, and everybody in the collaboration session will get that polygon and that geospatial message. It's been received very well, and we're testing it now in some operational instances of GeoCollaborate. From a schematic standpoint, this is what GeoCollaborate looks like. You can have any data sets that you want out here on the internet, and you can access them and immediately share them out to anybody who's following. That can be an emergency operation center, it can be a tablet, it can be a mobile phone, and everyone will get that. As long as they have an internet connection, they can get that data.
If they lose their internet connectivity they'll still retain the data, but they won't get updates. So once they connect back they'll get any further updates that the leader has shared. And so I just want to show you some of the areas where this information could be relevant: a wildfire situation where field operations are collecting data, and then you have a command center, and then you have a forward command center where people really need to be connected, all on one map at the same time, putting data to work. So what are these trusted data layers? It could be any data layer that you're thinking about that researchers are collecting that might be valuable: remote sensing sources, drone data, cameras, video or municipality data sets, and even research data that could be put to use. GeoCollaborate could also be used in a research environment where researchers are sharing data and you're not even using it in an operational form or fashion. And so, some of the new features that we've added here: this is an example of a real-time smoke model called the High-Resolution Rapid Refresh that is showing the projection of smoke, and the red levels here are where it's very hard to breathe at the surface. This is a surface model forecast, and then you can also see the perimeter of the fire that's been accessed through the Enterprise Geospatial Portal at the U.S. Forest Service. And so that's something that's been very good. And then we also have a real-time population extraction capability. So if you draw a polygon around a certain area, we have the ability to dive right into NASA's SEDAC, the Socioeconomic Data and Applications Center located at Columbia University, where they're running services and they're experts in population and demographic information.
So we can circle an area to determine how many people are impacted, and then it goes off and makes a real-time call and brings back real-time population information from the census, and even a projection as to how many people are expected to live there in 2020. This is a global database, and I'll just blow that up a little bit here so you can see it more clearly, even by age category: 65 and older, 15 to 64, and 0 to 14. And so it's been quite useful there. Also we have the real-time messaging capability that you see here, and I'll blow that up, just saying, hey, we're expecting a wind shift in this area, so any firefighters need to get out of there. We've had a number of incidents here in the United States where firefighters have been impacted, some killed, because they weren't aware of a wind shift happening and the fire built back on them. This is the example that I showed you previously of the Mississippi River and the flooding that's happening. And I also wanted to show what you get when you have real-time data sharing: you have faster situational awareness, you have faster decision-making, and you have, perhaps, the ability to save lives. And this technology, GeoCollaborate, has recently been adopted by the Department of Homeland Security's Cybersecurity and Infrastructure Security Agency as a potential addition to their information sharing architecture. And so we're seeing great advances in this, and I just wanted to mention as well that GeoCollaborate has been put together under federal government funding, under a program called the Small Business Innovation Research, or SBIR, program.
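The polygon-based population extraction described above can be sketched in miniature: test which cells of a gridded demographic data set (a stand-in for what a service like SEDAC returns) fall inside a drawn polygon, then sum the age-band counts. The ray-casting test and the toy grid below are illustrative, not the actual SEDAC service interface:

```python
def point_in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Does this edge cross the horizontal ray extending right from (x, y)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def population_inside(polygon, grid_cells):
    """Sum per-age-band counts for grid cell centers inside the polygon.

    grid_cells: [(lon, lat, {"0-14": n, "15-64": n, "65+": n}), ...] --
    a stand-in for gridded demographic data from a population service.
    """
    totals = {"0-14": 0, "15-64": 0, "65+": 0}
    for lon, lat, bands in grid_cells:
        if point_in_polygon(lon, lat, polygon):
            for band, count in bands.items():
                totals[band] += count
    return totals

# Toy grid: two cells inside the unit square, one well outside it
cells = [
    (0.2, 0.2, {"0-14": 100, "15-64": 400, "65+": 50}),
    (0.8, 0.8, {"0-14": 60, "15-64": 300, "65+": 40}),
    (2.0, 2.0, {"0-14": 999, "15-64": 999, "65+": 999}),
]
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
totals = population_inside(square, cells)
```

In a real deployment the grid values would come from a remote call to the population service rather than a local list; the aggregation step is the same either way.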
So over the last six to six and a half years we've evolved GeoCollaborate under federal government supervision, research, and contract funding to do something that's never been done before in the United States, and that is to connect disparate common operating pictures together, to put people on the same map at the same time so they can make informed decisions, and we can put more data to work in a real-time decision-making environment. So now, because we've passed through all those gates, GeoCollaborate has become a sole-source product, if you will, for every U.S. federal agency, whether it's in defense or intelligence or in the civil sector, and that just means that it's very easy for agencies to contract to bring GeoCollaborate in to start to put their data sets to work. So we have innovative new functionality, we are able to share a growing number of data formats, and we're able to connect disparate instances of GeoCollaborate. This polygon-based geospatial messaging has been very well received, and the ORL level and the rapid assessment of trusted data sets for 30-second decision-making has been welcomed very quickly. We also tested GeoCollaborate in the ESIP testbed. It got a little bit of funding to test various data sets that members of the ESIP Federation have been putting together. That's something that could be done also within E2SIP within Australia, so it would be great to talk about that a little bit. And then these are just some quick examples of what the dashboard looks like in various examples of wildfires in California. This is a visibility forecast from NOAA. This is a satellite image showing smoke from the polar-orbiting satellite; this is VIIRS from NOAA. We have some funding from the JPSS program to put more NOAA satellite data to work and then to estimate the population of people that are impacted by these wildfires.
We're identifying areas where debris flows and mud flows have a probability of occurring because of post-wildfire burn scars. This is a global database of rain that's being estimated from satellite. So you can see that we have satellite estimates of rainfall where there are no radars, way out in the ocean, which really helps to identify certain things like atmospheric rivers that may be impacting various countries. This is an example of the big wildfires they had in Canada recently, with the smoke being identified with the geospatial messaging capability. So we're leveraging federal government investment in the SBIR program to implement this cross-platform data sharing capability rapidly. We're enabling communication across agencies and organizations. We're placing responders and decision makers on the same map at the same time, which has been desired for the last couple of decades, and we're disseminating trusted information across all platforms and devices. So whether it's NOAA data, NASA data, federal data sources, private sector data sources, or non-governmental organization data sources, they can all be brought in, and even information collected from the field. We're putting together one-page use cases to focus on ways to access and share data services. Federal and state agencies are looking at these to see how they can really drill down into specific needs that they have. And really, cross-platform and cross-sector data sharing is something that we're talking a lot about here. So with that, I think we have about 10 minutes left, so I'd like to end here and maybe open up the webinar to some questions or some conversation. And here's my contact information. I'd just like to mention again the ESIP disaster lifecycle cluster; Karen Moe is the co-chair, as well as myself, and the ESIP Federation meeting is happening next week.
You can connect if you just Google the ESIP Federation summer meeting in Tacoma, Washington. You'll come up to the agenda, and they are going to be live streaming a lot of the plenary sessions and the breakout sessions. And we're having our disaster lifecycle cluster breakout session on Friday, July 19th; I think it starts at 11:15, Tacoma, Washington time. I'm not sure exactly what that is, Australia time, but it's probably maybe 15 or 16, maybe 14 hours ahead, perhaps. So anyway, we can talk about that. I'd like to open everything up to questions. Thank you. That's a very great talk; the work you presented is very impressive. I have a couple of questions. The ORL levels and the criteria: is that a self-assessment tool? So it is a self-assessment, so the providers of the data can go through the criteria and list their data sets. There's also an opportunity to put those data sets out there in a research, test, or exercise environment, so people can look at those data sets and start to ask questions about them, maybe look at the metadata and things like that. So initially, yes, it's self-assigned, if you will, but there's also a tool that you can use to answer the questions interactively, and that tool will then spit out what the ORL level should be. That's great; that was actually my second question: is there any tooling there? Yeah, and as a matter of fact, excuse me just one second, I'll just mention too that Carrie Hicks, who works with Duke Energy here in the United States, a very large energy company, is a GIS analyst.
She's really jumped in with both feet here and has worked with us in the ESIP Federation and also the All Hazards Consortium, and she's implementing the ORL levels operationally within Duke, and she's the one that has put together the criteria and some of the graphics that I showed in this webinar, showing how you can dive in and answer those kinds of questions for the ORL level governance and assignment of those numbers. So it continues to evolve, but she's been a really great contributor to this process. Here's a question: what are the license arrangements and costs for software and the data? I'm not able to scroll down to see the whole question. I'll read it out for you: what are the licensing arrangements and costs for software and data sets when distributed across potentially many users? Sure, that's a great question. So the GeoCollaborate technology is collaboration as a service, and so once that is set up within an organization, you have leaders that are credentialed. It's not by seats, if you will. So you can get a number of instances, I think the standard is four instances of GeoCollaborate, and once that is set up within your organization, it's all web-based. It all runs right now within the Amazon EC2 cloud; it can run in anybody's cloud. And there's no cost for accessing and distributing data sets if those data sets are freely open. There is the opportunity for data providers that are putting together value-added data sets to create license fees for those data sets, and that can then be put into the license for GeoCollaborate. So a subscription service, if you will, could be added into GeoCollaborate, so the users just have to pay one organization if they want to have that arrangement.
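Returning to the interactive tool mentioned a moment ago, the one that walks a provider through the criteria and spits out what the ORL level should be: a minimal sketch of that kind of questionnaire might look like the following. The criteria listed, the scoring, and the function names here are invented placeholders for illustration, not the actual ORL governance criteria that Duke or the consortium use:

```python
# Hypothetical self-assessment checklist; every criterion below is a
# placeholder invented for this sketch, not the consortium's questionnaire.
CRITERIA = [
    "Source and provider of the data set are clearly identified",
    "Update frequency and latency are documented",
    "Metadata is published and has been reviewed",
    "Data set has been evaluated in a test or exercise environment",
    "Data set has been used operationally in a real event",
]

def self_assess(answers: dict) -> tuple:
    """Given {criterion: True/False}, return (criteria met, unmet criteria).

    The unmet list tells the provider exactly what to fix before the data
    set can move up in the trust assessment.
    """
    unmet = [c for c in CRITERIA if not answers.get(c, False)]
    return len(CRITERIA) - len(unmet), unmet

# A provider that has documented the data set but never exercised it:
met, gaps = self_assess({
    "Source and provider of the data set are clearly identified": True,
    "Update frequency and latency are documented": True,
    "Metadata is published and has been reviewed": True,
    "Data set has been evaluated in a test or exercise environment": False,
})
```

The real value of a tool like this is the gap list: providers see exactly which criteria keep a data set from being trusted for rapid, 30-second decision-making.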
If they want to pay the organization they're licensing the data from, they can do that as well, but we've just had a lot of questions here in the U.S. from companies saying, look, we just want to pay one organization: we want to pay the GeoCollaborate license fee, give us a menu of the value-added data sets provided by private sector organizations that want to charge for them, and then GeoCollaborate will turn around and just pay them whatever their license fees are for that data. Thank you. Another question, from Kieran: is the ORL model being used outside of GeoCollaborate for classifying data sets? For classifying data sets? Okay, so the ORL model itself is being adopted by the Department of Homeland Security; that has nothing to do with GeoCollaborate. They really like that approach, so they can trust data sets quickly when disasters happen and they have to get this information out to decision makers within the federal government. So within DHS we have FEMA, and their logistics operations are very interested in pursuing ORL levels, because this is really the first time we've been able to put a governance structure around data sets and around, excuse me, the trust level of those data sets. So it's going to continue to evolve, and we're going to be able to put more meat on those bones, if you will, so it can really become a highly trusted ORL standard, not just for federal agencies but also for the private sector, so they can really get commerce moving after disasters, and even in mitigation and preparedness activities as well. Okay, I have a question. The consortium has members across many sectors and from many levels of government: federal, state, private, et cetera. How do you get so many members to agree on the same standard?
This is the great thing. The All Hazards Consortium gets things done, and the way that they get things done is that it's driven by the private sector. The government is invited to have a seat in these meetings, but they don't determine what is or isn't the standard, right? They just want to create the trust level. So we've been able to accomplish things within the All Hazards Consortium in a matter of weeks, when some federal officials have said to us, it would have taken us a year to determine this and you guys have put this together. So it enables business to really take the lead, because they need information; they need to make decisions based on trusted data, because if they can't make decisions, they don't make money and the economy crashes. So they're really driven by products and services that they need to get out there, and then the federal agencies are saying, wow, this is really working, and they want to build up more of their private sector partnerships. And so it seems to be coming together pretty quickly; it's a great acceleration, as long as the government understands that they're not dictating what the standard is here. They're participating in all the meetings, they have a voice and they have input, but they also listen to the private sector organizations and what they want as well. Great, thank you. A couple more questions: are front-line responders able to contribute data from their devices back to a lead collaborator? Yes, that's a great question as well. If there are first responders out there, or even folks with social media, yes, there are ways that you can collect data. You can even take a picture, text it back to a number, and it would pop up in GeoCollaborate in real time. So you can get responders in the field that are taking images and pictures, and that would go right into the collaboration session, and then they could see it pop up even on their own device as well, if they're connected to that URL and they're
collaborating or looking at the dashboard. So there are ways to get information back very quickly and to get that information into different operations centers in real time, so they don't have to wait. Okay, and a follow-up question: how are ORLs assigned to such data, meaning data from the front-line responders? So, for front-line responders, sure, let me answer that. I'll give a short answer in case I misheard the question. ORL levels are put together before these disasters happen, right? We identify data sets and data sources, we may use them in exercises, we may put them through their paces, and then they're given ORL numbers. If there are data sets that pop up in the middle of a disaster and they come from a trusted source, let's say a firefighter that's fighting a wildfire and they want to put a picture of the soil conditions into the collaboration session, then that ORL level would probably be a one or a two, if it's discussed ahead of time that, hey, these firefighters have the ability to take a picture and they have the number to text this information back to, so they're trusted and we've tested it and taken a look at it. If there are social media types of images coming in, there are ways to harvest those social media images and posts, but those probably wouldn't have a very high ORL level, if they're rated or ranked at all, because of the uncertainties in social media trust factors and whether or not you can believe the information that's being tweeted or posted. So that social media side will evolve, and the ORL levels for social media and crowdsourcing will evolve as a community. The same thing with drones, right, as drones are taking images and pictures; so that's something that we look forward to testing, even here. Great, does that answer your question? Good, I think. Okay, great, thank you. Okay, so it's getting very late for
Dave, so probably we'll end the webinar here today. Thank you very much, Dave, for giving this excellent talk; I'm sure we will have a chance to collaborate in the future. Well, that sounds great. Thank you very much, and I appreciate the invitation to, at least virtually, visit Australia. I also have thank-yous from the audience: that was a very insightful talk, thank you. And thanks, everyone, for attending the webinar today. We will keep in touch about this work. Okay, thank you very much. Bye for now.