So Suzanne Pierce is a research scientist at the University of Texas at Austin and works at the Texas Advanced Computing Center. She really embraces the challenge of integrated modeling and uses models as well as participatory processes for decision support, which seems like a very complex thing to do and pulls in a lot of experts; just from seeing her connections and her projects, you can see that she usually works with really large, very interdisciplinary teams. Suzanne will talk to us today about decision support systems in a wicked and wild world.

Thank you, can you hear me? Yes, we can. Oh good. Okay, great. Thank you so much for the introduction, and yes, the title shifted a bit for my presentation today, so I'll start by explaining why I chose this particular title. DSS stands for decision support systems, which are typically thought of as being used in a business environment, and certainly there are environmental decision support systems and spatial decision support systems, but we're really moving toward the application of decision support to more wicked and complex problems. Decision support is needed now more than ever, because the decisions we are making today are taking place under unprecedented conditions. We're looking at complexity and uncertainty that we've not seen before and trying to adapt so that we can combine our knowledge in ways that help us move toward action, action that can assist society as we transition into new equilibrium points and new ways of being. That requires new ways of knowing. I think of decision support systems as a combination of social processes and computational tools that aid us in analyzing and making sense of challenges and in identifying possible solutions or actions we can move forward with. So that's the DSS in the title.

The wicked part refers to the kinds of problems. Wicked problems are complex, ill-structured, dynamic, ambiguous, and uncertain; there's rarely a clear solution, and there is no right or wrong answer. They can't easily be studied through a trial-and-error approach, because they're very complex, they're in the real world, and there's no clear and definitive formulation for how we define the problem: the problem framing shifts depending on who you're speaking to, which I think is one of the most important aspects we need to think about as scientists and researchers in this arena. And most of all, these problems are out in that wild world. I've been seeing people in text messages and on social media use IRL, "in real life," and that's where it's happening: all of these problems are out there, where we are facing hazards like we've never seen before, such as pandemics, but also hurricanes, flooding, and fires. There is a multitude of them, and scientists have to transition as the world around us transitions.
So I want to pick a wicked problem that we can actually use to start moving forward, learning how to adapt, and building decision support systems and tools. I've been really fortunate, because the University of Texas at Austin launched a program called Planet Texas 2050, and in fact Michael Young, who spoke before me, is a member of this team, which is working to find ways to address some of the problems being faced in Texas in particular. One of the motivating forces is that today's population is projected to double by 2050. That means that, while we're currently at much lower numbers, our communities and societal institutions are going to have to scale up rapidly, and climate change will be one of the driving forces pushing that forward. Our goal as a project and as a program is to work across the university campus, and outside and beyond the university walls, to take theory and knowledge from across all of the disciplines, combine it, and find ways to apply it to help make Texas more resilient. That's our grand challenge.

I'm based at the Texas Advanced Computing Center, one of the nation's high performance computing centers. Luckily for us there has been massive investment in the cyber ecosystem and the high performance computing infrastructure for the nation, but beyond the high performance computing systems themselves, it's people, it's systems and services, and it's also assuring that we have the capacity to actually conduct and complete modeling at large scale. That includes data storage, visualization, networks, cloud computing, and code optimization: all of these different computing services can help us, if we leverage them properly, to make sense of these very complex, in-real-life problems. We can accelerate our ways of analyzing them and, more importantly, we can accelerate our social learning about them if we're able to combine all of these different advanced computing techniques.
In Planet Texas we're actually building a process, a cyber ecosystem, that sits on top of all of the high performance computing services. It ranges from adaptive sensing that allows data to stream in (we're leveraging EarthCube's CHORDS data streaming services for that, for those of you who may be familiar with it), which is now being integrated into a science gateway portal: we've got a science gateway, a data portal, called DataX for Planet Texas. DataX can then feed into a model integration platform that sits on top of everything else. The model integration platform we're currently working with is still in development, but it's called MINT, for model integration, not surprisingly. And then we're thinking about the fact that if you're generating data and outputs that are very large, it's important to maintain them: there's a cost, and they're a resource that we want to have available in the future, and we're not terribly skilled right now at assuring that large data sets are archived and preserved for future generations. So we've got another part of this program, called Solche, which stands for the digital object lifecycle, and it looks at how we take the outputs and results of a complex modeling analysis and push them into a library system so that they can be curated, maintained, and made available for future researchers and future use.

Ultimately the goal is to get things to the people who need to use them. In fact, one of my primary goals throughout my career has been to make these systems more participatory and accessible across stakeholder groups that include subject matter experts, but also people who come with the lived experience that is important for the particular problem being addressed and solved. Those participatory interfaces, to me, are one of the end goals: to take the experts out of the loop, so the experts aren't controlling the dialogue, and to enable people to have these important discussions about the meaning of the data and the meaning of the model, and to have the dialogue and deliberation they need in order to make decisions that are informed by the work that all of us here do.

So today I really want to think about what those building blocks are and what the basic unit of knowledge is, and our units of knowledge are built on data and models. Much of the effort in this Planet Texas work, and I know many other teams and systems are working toward similar goals, is focused on the data, with the data portals and the model integration platform, so I want to speak quickly to those. Basically, we're trying to build a better DSS and enable people to put their combinations of information and models together faster and more efficiently. We do part of that through science gateway data portals; these are similar to things like CUAHSI and HydroShare, or ʻIke Wai from Hawaii, as examples, and I know there are many, many examples that different groups are putting together. The beauty of this is that we're creating mechanisms to put our data into shared spaces where we can access it easily afterwards. In the interest of time, I've got short videos for all of these, but I'm going to try to get to the end point, because I want to make sure we really speak to some of the important final parts of this pipeline. Once you get your data loaded into a shared space, you can then make it accessible to the models.
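Just to make the shape of that pipeline concrete, here is a minimal sketch in Python of how the stages chain together conceptually. Every name in it (Dataset, Portal, stream_observations, run_model, archive) is a hypothetical placeholder, not the actual CHORDS, DataX, or MINT interface; the point is only the flow: observations land in a shared space keyed by standard names, a model reads from that space, and the result is pushed onward for preservation.

```python
# Illustrative sketch of a sensing -> portal -> model -> archive pipeline.
# All names are hypothetical placeholders, not real CHORDS/DataX/MINT calls.
from dataclasses import dataclass, field


@dataclass
class Dataset:
    """A dataset registered in a shared portal, keyed by a standard name."""
    standard_name: str
    values: list[float]
    units: str


@dataclass
class Portal:
    """Stands in for a science gateway / data portal such as DataX."""
    holdings: dict[str, Dataset] = field(default_factory=dict)

    def register(self, ds: Dataset) -> None:
        self.holdings[ds.standard_name] = ds


def stream_observations(portal: Portal) -> None:
    """Stands in for an adaptive-sensing feed (e.g., a CHORDS-style stream)."""
    portal.register(Dataset("stream_water__depth", [0.10, 0.25, 0.40], "m"))


def run_model(portal: Portal) -> Dataset:
    """Stands in for a model launched through an integration platform."""
    depth = portal.holdings["stream_water__depth"]
    flooded_fraction = sum(1 for d in depth.values if d > 0.2) / len(depth.values)
    return Dataset("land_surface__flooded_area_fraction", [flooded_fraction], "1")


def archive(result: Dataset) -> None:
    """Stands in for pushing curated outputs into a library/repository."""
    print(f"archived {result.standard_name} = {result.values} {result.units}")


if __name__ == "__main__":
    portal = Portal()
    stream_observations(portal)   # sensing feeds the shared space
    result = run_model(portal)    # models read from the shared space
    archive(result)               # outputs are preserved for reuse
```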
An important thing there is making those models accessible and connected back to the data. I know that CSDMS and the folks involved in your society and community have looked at things like BMI; we've been using the geoscientific standard names as one of the devices for linking between the data that lives in a portal and the input parameters a model needs to be instantiated, or the output results and response metrics we want to look at after we've run different models, and that helps with connectivity across the modeling platforms and services. We're trying to use artificial intelligence to accelerate that process, which of course means lots of semantics: many different ways of looking at, categorizing, and labeling the data, the information, and the models themselves. One of the most important innovations that all of us need to push forward, I think, is how to create model catalogs that are intuitive and accessible, easy for modelers to package and place their models into these systems and make them runnable. Currently, of course, we can package them up, place them in virtual machines, Dockerize them, or use Singularity, among other approaches, but I hope that in the coming years we'll see it become easier and easier, and that the gap between a computer scientist and a technical or subject-matter-expert modeler becomes easier and easier to bridge.

In the case of model integration we're using this MINT platform, and we're able to load data from anywhere on Earth. We've been using the HAND model, height above nearest drainage, as a good example base model. We set it up so that you can automate the selection of a basin or an area anywhere on Earth; it clips the DEM (a 10-meter DEM is what we're using right now, though we've started working with Paola Passalacqua and her team to try to move toward lidar data at one-meter resolution) and runs a HAND model just out of the box. Going from zero to an actual product that's visualized and interactive took me under 10 minutes for a small basin near the city of Austin, which matters in our urban environment. What's important about this is that it took me four years, 40 people, and 2.5 million dollars that I raised to complete my doctoral work, and my doctoral work was building the decision support system for one groundwater system in Texas. To be able to develop and run groundwater models, or flood models and flood representations, in just a matter of minutes is really transformative, because it means we can now start to move toward how we use those models. This is one example of data fusion: we took the flood models and the HAND map, combined them with infrastructure and the building map for the city of Austin, and completed vulnerability mapping, also within just a matter of minutes.
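For anyone who hasn't met HAND before, here is a rough, self-contained sketch of the core idea: each cell's height above nearest drainage is its elevation minus the elevation of the drainage cell it is associated with. The real workflow traces flow paths down the DEM to find that drainage cell; this toy version simply uses the Euclidean-nearest stream cell, so treat it as an illustration of the elevation-difference step, not the production algorithm, and the tiny DEM and stream mask are made-up example inputs.

```python
# Simplified HAND (height above nearest drainage) sketch.
# Real HAND traces flow directions to find the drainage cell each pixel drains to;
# here we approximate with the Euclidean-nearest stream cell purely for illustration.
import numpy as np
from scipy import ndimage

# Toy 5x5 DEM (elevations in meters) and a stream mask (True where a channel runs).
dem = np.array([
    [12.0, 11.0, 10.0, 11.0, 12.0],
    [11.0,  9.0,  8.0,  9.0, 11.0],
    [10.0,  8.0,  5.0,  8.0, 10.0],
    [11.0,  9.0,  6.0,  9.0, 11.0],
    [12.0, 10.0,  7.0, 10.0, 12.0],
])
streams = np.zeros_like(dem, dtype=bool)
streams[2:, 2] = True  # a channel running down the middle of the lower rows

# For every non-stream cell, find the indices of the nearest stream cell
# (distance_transform_edt measures distance to the zero-valued background).
_, indices = ndimage.distance_transform_edt(~streams, return_indices=True)
near_rows, near_cols = indices

# HAND = cell elevation minus elevation of its nearest drainage cell.
hand = dem - dem[near_rows, near_cols]
print(np.round(hand, 1))
```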
I'll point you quickly to one more example: last August was the first time that we were able to run everything from beginning to end from a data stream. This was at the Intelligent Systems for Geosciences research coordination network meeting that we hosted last year in Boulder, and it is in fact looking at Boulder Canyon. We took sensors and deployed them, locally of course, so that we could just put them in and give them the signal that they were wet or not wet, but we ran this from the sensor that we set up, using CHORDS, into the portal, from the portal into a model, and from the model out into an augmented reality sandbox. It was the first time we were able to do that, and in that case it took us a couple of days, but now we could literally do it in minutes.

Finally, and I just have a few more minutes, I want to speak about what really matters in real life. When we start to use our models in real life, the limitations may not be computational; the limitations may be more at the level of the speed of trust, and that speed of trust comes to us through our human interactions and our human acceptance of the credibility and trustworthiness of the information being put in front of us. As modelers, I think it's really important for all of us to think carefully about what our role is, and I also think it's very important for us to start being explicit about where we are in the phases of moving from model to action, particularly when you see responses to things like this pandemic or to hurricane events, for example. We need to understand what our role is, how our models are being used, and how we put the bumper rails on them. The picture that you see here is Hornsby Bend, which is actually a sewage treatment facility. They are building trust with the community constantly: they've turned it into both a treatment facility and a place where birders come regularly, and they host educational events that are very diverse and, hopefully, inclusive as well. It's building those relationships prior to an event that is going to help us become more prepared, and being prepared includes thinking about how our models are added into catalogs so that they can be rapidly ingested into a decision support process.

What I've observed, as I've watched multiple teams from epidemiology, and now as we're starting to use wastewater itself as one of the data indicators for where clusters of pandemic spread may be, is that a few things keep coming up. Researchers need to think about scale first, before they complete an analysis and present it, and think about how that necessary knowledge can be scaled across. If you're able to complete a model, like an epidemiology model or a flood model, you've got to think about how you make sure, as a researcher, that you can complete it quickly and leverage advanced computing; you've got to make sure that your expertise is properly included and that the information you're sharing is trustworthy; you need to be flexible and accessible in terms of being able to share it with other researchers in your field; and you have to be very aware of, and able to communicate, the uncertainty. It also needs to be recognized that the work you're doing may not be timely: it may be that you're working on a research code that needs to be looking out 15 years, and if you're in a code base and an advancement of knowledge that's not ready for real time, in real life, then it's time to hold that back and make sure you continue to develop it, even as we try to support the ongoing efforts with our knowledge in real life. Over in the other column, and I've gone over a bit, but in real life people need to know what candidate solutions are good enough to support insightful dialogues. They don't have to be perfect.
We can assume that the people we're talking to will understand that our models aren't perfect, and we need to be able to communicate that with them and be ready for it. As modelers we need to make a choice: am I working on the research side during a pandemic or another critical event, or am I working in real life with the people who need to apply the knowledge somewhere? And we need to realize, too, that the framing of that problem is not our scientific framing of the problem; we've got to connect the problem frame to the actual decision support issue. I'm going to stop here because I've run over, and my apologies for that, but in real life knowledge is being integrated in application, and we've got to commit ourselves to the pace at which society needs us to interact with them. So thank you.

Thank you, Suzanne. This is a fascinating topic, and I think it's at this frontier of bringing modeling to more applications and in an interdisciplinary way. We'll have time for one or two questions from the chat, and then maybe just a few short ones. Okay, I'm sorry I ran over a bit. That's all right, we never set times for anything, so we have a little bit of room there. So Claire is asking: how integrated are you with your library and information science and social sciences departments at UT? She says she's in a graduate program designed to train for these wicked problems and effective decision support, that she has identified so much with the things you've mentioned through the talk, but she wanted to hear more about how you're working to make this data more policymaker-applicable.

Well, there wasn't time to show you, but in that model integration platform I'm spending most of my time thinking about how to design the process that leads a user through setting up their model, ingesting their data, and registering it. At the time that they start to register and ingest a model, they actually have to categorize and register the aspects of their problem that are decision relevant: what are the potential performance measures they may use out of their model, or what data may be combined to construct an indicator; they also have to identify the decision variables of their problem up front; and they have to identify which parameters or variables may be used for optimization or objective functions. So we're building upfront identification of the problem frames into the structure of the platform.
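As a purely illustrative sketch, and not the actual MINT or DataX schema, the kind of decision-relevant registration record just described might capture fields like these; every field and value name below is a hypothetical example.

```python
# Hypothetical sketch of a decision-relevant model registration record.
# Field names are illustrative only; they are not the actual MINT/DataX schema.
from dataclasses import dataclass, field


@dataclass
class ModelRegistration:
    model_name: str
    # Inputs/outputs keyed by standard names so a portal can wire data to the model.
    inputs: dict[str, str] = field(default_factory=dict)    # standard name -> units
    outputs: dict[str, str] = field(default_factory=dict)   # standard name -> units
    # Decision-relevant framing captured up front, at registration time.
    performance_measures: list[str] = field(default_factory=list)
    indicators: list[str] = field(default_factory=list)
    decision_variables: list[str] = field(default_factory=list)
    objective_function_parameters: list[str] = field(default_factory=list)


flood_model = ModelRegistration(
    model_name="hand_flood_example",
    inputs={"land_surface__elevation": "m"},
    outputs={"land_surface__height_above_nearest_drainage": "m"},
    performance_measures=["flooded_building_count"],
    indicators=["neighborhood_vulnerability_index"],
    decision_variables=["culvert_capacity", "buyout_budget"],
    objective_function_parameters=["expected_annual_damage"],
)
print(flood_model.model_name, flood_model.decision_variables)
```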
Beyond that, I think it's the social process: I believe we need model-based facilitators, facilitators who are specially trained not to mediate or negotiate but to facilitate knowledge through modeling. The social process, and understanding that there are many problem structuring methodologies that are already well understood and well used, means we've just got to couple those with our models and data themselves. Regarding the library and connecting with the library systems: currently we've got it set up so there's an API, and we track the provenance through all of the model runs, principally using Pegasus right now on our system, but it's all connected into that model platform, so you can go back and find things. The interface needs to be improved, and we're going to have many years of fun fixing that and making it easier to use, but the API goes straight from the HPC systems into the library, and we're working with them on how to maintain large data sets inside spaces like the Texas Digital Repository, which is part of the Texas library system.

In terms of all of the other disciplines: I use human organizational systems research in my own work. I come to the Planet Texas program as a computing scientist, although I'm trained in hydrogeology and worked in industry for many years in environmental management; Michael also comes at it from a scientific perspective; but we have faculty members from the humanities, we've got social scientists, and we've even got English professors who sit on our organizing committee. So we're really wrestling with how you find shared language and how you understand different epistemological stances and lenses of change. Thank you, I'm so happy to talk more about that.

Yeah, and there are a bunch of compliments in the chat, which is always good, but there's also an interesting point by Donica, who brings up the question of how you reconcile academic publishing timelines with rapid response to decision makers, and how you make data available before it has been peer reviewed. So I think you're right at the crux of it. Honestly, this is what I'm looking at: I've been observing five teams going through the pandemic. It turns out we're in a unique position; usually these interactions happen in real life, and personally I'm sad that we have to do everything on Zoom, but from an intellectual perspective we're in an unprecedented opportunity. As hurricane season approaches and other events happen over the year, we have a rare opportunity to watch how teams interact to share knowledge and information. We don't do it well, and I've observed some people get out a bit too early and find themselves trying to conduct their research as a fire drill as they respond to the request for knowledge and information. That's one of the key things I'm trying to understand: how do we help train scientists to realize when they can transition to a decision support role? Because I think it's uncertain, and we need to learn more about how to do it. Thank you for