We're going to go whirlwind through some examples with this PrOACT approach. We're going to talk about three different methods to assess decision trade-offs: cost-benefit analysis, dominance methods, and multi-attribute utility analysis. We won't be able to get into a lot of the details and weeds, but there are a couple of references at the end, and that structured decision-making book is a really great one for looking at some of these approaches. For each one, I'm going to introduce the concept and give you an example from my own research, because it's fun to talk about your own work. I'll also introduce the idea of value of information, though we won't get into it in detail, and at the end we'll talk a little about how to communicate these forecasts effectively. All in an hour, ideally.

So, cost-benefit analysis. This is an approach to compare the relative costs and benefits of a particular solution. You're looking at one alternative and asking: for that particular solution, what are the various costs associated with it, and what are the various benefits that come with selecting that option? It quantifies all of the objectives on the same value metric, usually monetary, and that's why a lot of people like it.
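To make the comparison concrete, here is a minimal sketch of a benefit-cost screen. The alternative names and dollar figures are entirely made up for illustration; a real analysis would use the monetized objectives described below.

```python
# Hypothetical alternatives with total benefits and costs in dollars.
# All numbers are invented for illustration only.
alternatives = {
    "stream_restoration": {"benefit": 2.0e6, "cost": 3.5e6},
    "bmp_retrofit":       {"benefit": 1.8e6, "cost": 1.2e6},
    "do_nothing":         {"benefit": 0.0,   "cost": 0.0},
}

def benefit_cost_ratio(a):
    # Guard against division by zero for zero-cost options.
    return a["benefit"] / a["cost"] if a["cost"] > 0 else float("nan")

for name, a in alternatives.items():
    print(name, benefit_cost_ratio(a))

# Preferred option: highest benefit-to-cost ratio among the costed options.
preferred = max(
    (n for n, a in alternatives.items() if a["cost"] > 0),
    key=lambda n: benefit_cost_ratio(alternatives[n]),
)
print(preferred)  # bmp_retrofit
```

The hard part is never this arithmetic; it's getting defensible dollar values into the table in the first place.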
It puts everything into dollar values. That does mean that some things that are difficult to put into dollar values have to be monetized anyway if you want to use this method. The way you select a preferred alternative, or eliminate less-preferred ones, is that the preferred option is the one that maximizes benefits relative to costs, often described as a higher benefit-to-cost ratio. If you have multiple alternatives whose benefits exceed their costs, you choose the one whose benefits most exceed its costs. Fairly straightforward and intuitive.

What I want to do is go through a quick example where we used this approach to ask: is urban stream restoration worth it? What we saw is that cities like Baltimore were using stream restoration and counting it as a water quality benefit, and it just seemed like a really expensive way to achieve nitrogen, phosphorus, and sediment reductions. So the question was: is it worth it, largely for those water quality benefits?

We used cost-benefit analysis to look at the pros and cons, considering three major objectives. The first was water quality benefits, which we valued using an approach called the least-cost feasible alternative, which I'll discuss in a second. The second was infrastructure benefits, which used the same approach. The third was aesthetic and recreation benefits; this is one of the hard-to-quantify ones, and we used an economic approach called contingent valuation to get willingness to pay. We didn't consider other objectives. You might say, well, what about habitat protection or other things that stream restoration might provide? Absolutely important, but in this context they weren't part of the analysis, and that's something to be aware of with cost-benefit analysis: it's only as good as what you come in with. The thing with these types of models is that they're based on values. The science supports an assessment of how you make these kinds of decisions, but it's all based on your values, so if you're missing values that are important, the model isn't going to tell you which option would be best based on your own values; it gives you a sense of what would be best based on a more limited set. For cost, we were able to get data from urban stream restoration projects in Baltimore.

So, the water quality model. This is how we went from things like the nitrogen and sediment reductions one could expect from streams, expressed as a dollar value per linear foot of restored stream. We had to think about what best management practices (BMPs) would also achieve those nitrogen and phosphorus reductions. We developed a BMP sizing model: we looked at how large those best management practices would have to be to achieve the same reductions, costed them based on that size using standard engineering cost approaches, and then totaled it up. That's basically an assessment of how much it would cost to achieve the same objective: you look at the stream restoration's characteristics, how much nitrogen and phosphorus reduction it would deliver, and then you work out what you would need as a replacement to achieve the same objective. We looked at a range of different best management practices, and we made the calculations both with actual land costs and under a hypothetical land-cost scenario. I won't give away the punchline, but in general it's a lot cheaper to do this than to do some of the stream restoration approaches.

Infrastructure: a lot of times with stream restoration projects, pipelines or other infrastructure run right along the stream, because you have gravity supporting the movement of water. So a number of stream restoration projects have co-benefits related to infrastructure, the protection of bridges or pipelines. Again we used the least-cost alternative: we're not thinking about other benefits, only about the cheapest way to achieve this one benefit. What we said was, well, riprapping the sides of the stream stabilizes the stream banks.
You won't have impacts to the roads, bridges, and pipelines along it, and you can calculate what riprapping that stream costs, so you get a cost for that benefit.

Aesthetic and recreation benefits are a little trickier. This was a multi-phase process, because people don't view stream restoration approaches the same way. We looked at four different scenarios. One we called "high and dry with meadows": high, dry stream banks with no or few trees. Then "high and dry with tree cover": stream banks are higher, but with trees surrounding them. "Low and wet with meadows": think of these as wetland-like, or not-quite-wetland, systems. And "low and wet with tree cover": highly shaded. We had people assess what they liked best, and what we found was that the most favored approach is high and dry with trees, almost a Disneyland scenario, and the least favored, despite its potential biogeochemical benefits, is the low and wet meadow condition.

What we then did was a willingness-to-pay survey. There are economists who study this whole approach; we worked with one, because I'm not an economist and we needed to make sure we were employing the method correctly. It was set up as: how would you vote on a one-time tax to pay for this project? Respondents saw different tax values and could vote yes, vote no, or indicate they wouldn't vote or were indifferent. We held a number of conditions constant: we said the project was unlikely to improve water quality, that it would protect existing infrastructure, and so on, so we could really get a sense of what people would be willing to pay for the most-favored versus the least-favored stream restoration approach. What's clever about that is that it lets you calculate an aesthetic premium: the difference people would be willing to pay for something that offers them greater aesthetic and recreation benefits. Not how much they're willing to pay for the restoration itself, but how much they're willing to pay for the difference between the most-favored and least-favored approach. We calculated that for all of the city of Baltimore and the area close by.

You can then incorporate all three of those approaches and build low and high scenarios. The crux of it is that if you're basing these decisions on infrastructure or water quality improvements, then given the cost of stream restoration in Baltimore, and in a lot of other areas, you can't justify these projects on those objectives alone. You can justify it in some situations, with the low end being the low cost of restoration and the high end being the high cost per linear foot of restored stream, and in other cases you can't, depending on the willingness to pay. So this helps you understand what benefits you might need to consider for an option to be an appropriate solution, and it can also highlight where there might be cheaper alternatives to achieve the same objective. That's cost-benefit analysis. I know none of these are approaches you're going to apply right now, but I think it's useful to see how these kinds of forecasts are being incorporated.

Dominance methods, Pareto optimization. This is in Mike Dietze's forecasting book. A couple of things to know about dominance methods. First, it's an approach to identify alternatives that are dominated by other options. It's really almost a screening approach: if you have tons of options, say a hundred different options.
That's really labor-intensive to go through, so we ask: are there options that are always going to be worse than the others on the objectives we care about? Again, it's still based on the objectives we care about, and we're trying to eliminate the options that would never rise to the top on any dimension. Usually we do this by optimizing across objectives and assessing performance on those objectives; you can also include constraints, regulatory or environmental ones. The key thing is that this does not tell you what to do, even if you've included all of your objectives. What it does is eliminate choices that aren't very good. The reason it doesn't tell you what to do is that the relative importance you put on different objectives depends on your values and what you care about, and because this approach doesn't assess the weight or relative importance of those objectives, it can't tell you that one option is better than another. It narrows the set of options that need a more in-depth analysis.

The way this looks, if you're constructing a Pareto frontier over two objectives, is that you plot all of your options and there's an optimal region along the frontier; anything sitting inside that region is dominated by solutions that would outperform it on both objectives.

Let's look at an example where we did this with a cost analysis of large-scale diversion structures on the Mississippi River Delta. This was done in collaboration with a couple of geoscientists at UT Austin and the University of Illinois Urbana-Champaign.
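A dominance screen like this is easy to sketch. The option names and their cost/land values below are invented; the point is just the pairwise dominance test that keeps only non-dominated options.

```python
# Dominance screen over two objectives: minimize cost, maximize land.
# Option A dominates option B if A costs no more, builds at least as much
# land, and is strictly better on at least one of the two. Data are invented.
options = {
    "A": {"cost": 10, "land": 100},
    "B": {"cost": 12, "land": 90},   # dominated by A: costs more, less land
    "C": {"cost": 15, "land": 160},
    "D": {"cost": 15, "land": 150},  # dominated by C: same cost, less land
    "E": {"cost": 25, "land": 220},
}

def dominates(a, b):
    return (a["cost"] <= b["cost"] and a["land"] >= b["land"]
            and (a["cost"] < b["cost"] or a["land"] > b["land"]))

frontier = [
    name for name, opt in options.items()
    if not any(dominates(other, opt)
               for other_name, other in options.items() if other_name != name)
]
print(frontier)  # ['A', 'C', 'E']
```

Notice the screen says nothing about whether A, C, or E is best; that depends on how much you value land relative to cost, which is exactly the point made above.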
So, the motivation. There's been a tremendous amount of land loss in the Louisiana Delta, and there's been a lot of discussion about how much of this land is going to sink within the next century, and how we might be able to build up land that would restore some of the buffering capacity that's been lost in a number of areas. Just to give you a sense: sediment is being lost to the deep Gulf as it's carried off the shelf edge, barrier islands are degrading, and we're losing wetland habitat and the benefits that come with it, as well as swamps and mangroves. This is one of those questions with a huge number of stakeholders: federal, state, and local governments, NGOs, a whole wealth of different decision-makers, private landowners, outside scientific groups. Lots of people weigh in, and there are a lot of different objectives to consider; it's a perfect multi-stakeholder, multi-objective problem.

One of the questions that arose in the discussion was this: people believed we could build a massive amount of land using smaller-scale diversion structures. Since we now understand better how deltas are built, and how you could build large amounts of land based on our geoscience knowledge, we wanted to ask: for engineered diversion structures, either single projects or portfolios (an alternative can be a single option or a combination of options) driving these kinds of large-scale avulsions, which gives you the biggest bang for your buck: something that reaches deep into the water column and is pretty costly, or something that's shallow and cheap? This was a real conversation that was happening, and sometimes is still happening, around how to restore parts of the Louisiana Delta.

So what we did was look for cost-efficient options, that Pareto frontier idea, by modeling water and sediment diversions, coupling that with a land-building model, and coupling that with a diversion cost model. Let's go through these super fast.

Again, this is an optimization approach, so let's look at our objectives. Cost: we want to minimize it, which sounds reasonable. Land: we want to maximize it. So we minimize cost subject to maximizing land. But when you open a diversion structure, you're not just releasing sediment, you're also releasing water, and the Mississippi River is one of the largest economic engines around because of the amount of goods and services that move up and down it. You still have to maintain navigation during the high-flow period. So we imposed a constraint on the amount of water that could be released, and that ended up being really important: we could not release massive amounts of water just to achieve land-building objectives; the constraint let critical navigation be maintained.

The water and sediment diversion piece matters because there's more sand at depth in the water column. To build land from scratch, you can't build it with clay; ideally you want sand. Silt is okay, but you want as much sand as possible. Since we have a limited amount of water, there was a really important question about the depth and width of a structure and how much sand and water would come out of it. Again, there's more sand the deeper you are in the water column, so building deeper captures a higher fraction of the sand we're trying to divert.

If you couple that with the amount of sand and water coming out of those diversion structures, you can build it into a mechanistic land-building model that forecasts the amount of land that will be built, given an assessment of the local sea-level rise expected over the next 50 years and the subsidence occurring in the area. That was the collaboration with the geoscientists. Part of what this shows is the effect of the bathymetry: in the areas we were looking at it slopes downward, not intensely until you reach a drop-off, so as you get farther from shore it takes longer to create land, because you simply have to build up more over time.

Then the diversion cost model. This was a simple empirical model: we took existing diversion structures in the area, got their depth, width, and cost, and developed a regression model to understand the relationship. It confirmed the hypothesis we had that there are diseconomies of scale in cost: it costs more to build bigger and deeper, and it's more expensive to build deeper than it is to build wider. That makes sense if you think about it: going deep into a river like the Mississippi is really complex and expensive; going wider is easier. What this allowed us to do was explore a range of different options, again single diversion structures and combinations, to optimize the objective we were trying to achieve. So let's see what that looks like.
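The empirical cost model can be sketched as a simple regression of cost on depth and width. The numbers below are fabricated to lie exactly on a plane (cost = 5 + 8·depth + 0.3·width) so the fit is easy to check; they are not real project data, and the real model may well have used a different functional form.

```python
import numpy as np

# Invented (depth_m, width_m, cost_$M) triples for existing structures;
# constructed so cost = 5 + 8*depth + 0.3*width exactly.
depth = np.array([ 3.0,  5.0,  8.0, 12.0, 15.0])
width = np.array([50.0, 80.0, 60.0, 40.0, 30.0])
cost  = np.array([44.0, 69.0, 87.0, 113.0, 134.0])

# Linear model: cost ~ b0 + b1*depth + b2*width, fit by least squares.
X = np.column_stack([np.ones_like(depth), depth, width])
coef, *_ = np.linalg.lstsq(X, cost, rcond=None)

def predict_cost(d, w):
    return coef[0] + coef[1] * d + coef[2] * w

# Marginal cost of going deeper vs. wider (the diseconomy noted above):
print("per metre of depth:", coef[1])  # ~8.0
print("per metre of width:", coef[2])  # ~0.3
```

In this toy fit, each extra metre of depth costs far more than each extra metre of width, mirroring the qualitative finding that deep is more expensive than wide.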
Okay, this is a little hard to see. Circles indicate deep diversion structures. Squares indicate medium diversion structures, about halfway down the water column. Shallow ones are more like freshwater diversion structures, breaches in levees or other shallow openings. Additionally, if a symbol is white, a small amount of water is being diverted; if it's black, a lot more water gets diverted for that particular option.

What you see is that if you're trying to maximize land relative to cost, then in the region where you're not trying to build a massive amount of land, shallow diversion structures are cheaper to build than deep ones. They release a lot of water, but they're cheaper. As you move up the Pareto frontier, you really can't achieve larger-scale land building until you start building medium to deep structures, and the more land you want to build, the more that water constraint pushes you into deep structures.

What we did here is, for each hundred square kilometers of land, look at the absolute best option on all of those dimensions relative to the water constraint. In the 100 to 200 km² range, shallow wins: you can build wider because you're not running into water limitations. But you quickly hit the water constraint, and that's one of the drivers of which approaches ended up being optimal, largely because you have to maintain the navigation channel. It meant that to build larger amounts of land you needed a deep diversion structure sooner than I might otherwise have predicted, simply because to capture the amount of sand you need, you quickly max out the amount of water you're allowed to use.

This was a generic approach, so it doesn't consider land rights, preferred locations, or exactly where a structure would be sited; if you were to do this for real, you would want to consider all of those factors. But it narrows things down, establishes from a scientific perspective what is actually feasible, and puts some science into this conversation. Where people are saying you can build hundreds of square kilometers of land with shallow diversions, we can simply say: that's not feasible if you want to maintain your navigation channel. So dominance methods are a really great way of constraining a problem and focusing on the particular alternatives that matter.

So, multi-attribute utility analysis. Economics is great when you can collapse things into a single value, but for a lot of the environmental problems we deal with, there are multiple objectives that are just really difficult to monetize or quantify into anything that allows that kind of comparison. Multi-attribute utility analysis is one of the approaches that helps solve that problem. It's a consistent, rigorous method for making decisions given multiple competing objectives, lots of different stakeholder perspectives, and uncertain scientific information that you want to include in the process. This links really nicely to the PrOACT framework: forecasts and information that predict the consequences of alternatives are explicitly incorporated into these models. When you do that, similar to how we built that consequence table, you can identify options that might be dominated. But those consequences, just like in the dominance study I showed you, don't tell you which option is best; they don't incorporate the values explicitly.
What they do is limit the choices. So these types of forecasts and consequences, coupled with dominance methods, can be really useful, but they have limitations if you want to support a more rigorous trade-off analysis for selecting among decisions.

The benefit of multi-attribute utility analysis is that it doesn't require collapsing those performance measures into the same unit. Instead, we normalize the indicators through a utility function; that's the "utility" in utility analysis. Utility is basically an index of desirability. We talked a couple of days ago about expert elicitation: those are probability distributions based on what we believe as scientists, and by belief I don't mean religion, I mean the scientific knowledge we're incorporating. Preferences are different. A number of the same elicitation approaches apply, but a preference is not a probability distribution. You will sometimes see utilities on a zero-to-one scale; I prefer to represent them as a unitless value between zero and one hundred, simply because it's really confusing when you have forecasts carrying probabilistic information and also a preference assessment on a zero-to-one scale. Since utility is unitless, putting it on a zero-to-one-hundred scale helps keep the two mentally separate.

So now you can normalize, and you have a basis for comparing between objectives. But wait: this isn't money, so it doesn't give you a direct comparison. Because you've translated things through an index of desirability, you have to weight the relative importance of the objectives. Weighting is relative to the other objectives, and there are rigorous methods for it, such as swing weighting. We won't get into those; just know there's a lot of science on how to do this. Call a decision scientist. Then the way you calculate the result is through a linear additive model.
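The linear additive form is just U = Σᵢ kᵢ·uᵢ, a weighted sum of normalized utilities. A minimal sketch, with illustrative (not elicited) weights and utilities on a zero-to-one scale:

```python
# Linear additive multi-attribute utility: U = sum_i k_i * u_i.
# Weights k_i sum to 1; each u_i is a normalized utility (0 worst, 1 best).
# The weights and utilities here are invented placeholders, not elicited.
weights = {"primary_recreation": 0.4, "secondary_recreation": 0.3, "cost": 0.3}

def overall_utility(utils, weights):
    # Weights must be normalized for the sum to stay on the utility scale.
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * utils[k] for k in weights)

# One hypothetical option's scores on each objective:
option = {"primary_recreation": 0.8, "secondary_recreation": 0.9, "cost": 0.5}
print(overall_utility(option, weights))  # 0.4*0.8 + 0.3*0.9 + 0.3*0.5 = 0.74
```

As the text says, the math is trivial; the work lies in structuring the objectives, eliciting the uᵢ curves, and eliciting defensible weights.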
The math is super simple, so usually the limitation is not doing the math or constructing the model; again, it's structuring the problem and making sure you're including all the things that are important. If you're using this to help select the best option or portfolio of options, it will only give you a meaningful result if you've included everything that matters: the values that are important for decision-making, and consequences that are linked in meaningful ways to assessing those values. It's a decision model, so it's fundamentally different from a purely scientific model.

Let's go through an example: water quality standards. The Clean Water Act sets up a water quality standard as a designated use (think of this as a goal, your objectives), narrative or quantitative criteria (performance measures), and an antidegradation clause, which is a constraint: just don't make it worse. A number of years ago, fifteen-ish, EPA was really pushing to move narrative criteria for nutrient or eutrophication problems to something quantitative, and eutrophication is a little tricky.
The reason it's tricky is that eutrophication manifests itself differently in different systems; it's not a clean relationship across ecosystems. The recommended approach was to map a distribution, which of course looks roughly normal: if you're creating summary statistics based on all lakes, you set your criterion at the lower 25th percentile of the distribution, and if you have selected out reference lakes, you set it at the upper 25th percentile of that distribution. What this doesn't recognize, and it's inherent in these kinds of water quality policy problems, is that this is both a science question and a values question. By that I mean: when we think about setting water quality standards from a prescriptive standpoint, we want something easily measurable, something we can actually collect that serves as a good proxy, a surrogate, for the water quality goal we care about, the designated use, and it should serve as an accurate predictor of whether or not a water body is impaired. From a pragmatic perspective, you're going to get some non-attainment: there's just variability in these systems, and very few will be in compliance all the time. So you want to think about this in terms of a risk of non-attainment, partially in terms of a probability of non-attainment, within the context of these problems. That raises the question of how you should set these criteria.

So we're going to go through a mitigation-treatment decision; we'll build out a decision tree, and I'll show you how we combined these models.
So From a lake, we're gonna observe some water quality value that I'll call why from that Measurement what we're gonna do is we're gonna make a decision about whether or not a lake is classified as unimpaired So it requires no additional treatment or it's impaired So it requires some level of treatment if it requires treatment. That's gonna incur costs It's gonna cost something to improve it after that. We observe a lakes eutrophication status conditional on the treatment Given the eutrophication status to the designated use and how much did it cost? So to be able to do this what we did was we developed a simulation model to help us choose the optimal criteria for this water quality Value so we chose a candidate criterion level and it was up across a distribution of values. That's what This indicates and then we randomly selected a water quality variable to represent a chosen lake in in the study And then we have this mitigation treatment, you know based off of you know, what we Observed and the criterion that was set is it classified as unimpaired or impaired and then you know How does that then map to eutrophication status and then what can we say about the? Value or utility related to the designated use and cost so what this shows you is we have a water quality model component of this and a Multi-attribute utility model that we've like linked together in order to sort of solve this problem This does not involve forecasting But one could see sort of similarly how how you could think about it within a forecasting paradigm So within water quality we had a range of different data. We've basis Given what states collect what we did was we use existing water quality these data to predict the future water quality state if it was classified as impaired so we Looked at some treatments and said okay if we were to improve this what would it look like with with improvement? 
And we used my dissertation advisor's models to do that. To predict eutrophication status, this is where expert elicitation came in, and the reason is that measured water quality variables don't translate one-to-one onto a constructed scale of eutrophication status. The experts helped us understand the direct and indirect effects and classify probabilistically. Sometimes when you do this you ask for the most likely category; what we did instead was have them assess a probability over the categories, because they're looking at a region, and since eutrophication manifests itself differently across systems it's often multiple categories with a probability distribution. That gave us our expert-elicited category. Then we used a regression model to predict the eutrophication category given the water quality data, because we don't collect data on eutrophication status directly, so we needed to map from the data to status. We used that model, magic came out, and then we were able to link it to the multi-attribute model.

I'll spend more time on this part. The two designated uses we looked at for North Carolina are primary contact recreation, which is when you go swimming, like you all are going to do later today, with a small possibility of consuming water; and secondary contact recreation: boating, fishing, and other uses where you might get dermal contact but you're not likely to put your head underwater and consume water. And then cost. Remember, utility functions are a way of mapping from the things we measure, or the goals we have, to how much we care about them: how desirable is a particular state?
So let's look at cost first. Say you're going to spend money on a restoration project and the range is zero to five million dollars. What's the best option? Zero: we want to spend no money. That's why, with our utility functions, zero cost gets the top utility (one, or 100 on the zero-to-one-hundred scale), and the worst case, five million, gets zero. So those are our bounds. The elicitation is similar in spirit to expert elicitation, but again, these are preferences now. Cost is linear, because a large government agency should not care about the difference between zero and one dollar, or between five million and five million minus a dollar, so utility declines linearly with cost.

Assessing utilities for things like eutrophication status is a little trickier. What we did was work with four different decision-makers in North Carolina, assess their utility functions individually, and then have them meet together and develop a consensus, so we could compare aggregated versus consensus versus individual results.
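The linear cost utility described above, best at zero dollars and worst at five million, can be written in a couple of lines. The bounds come from the example; the clamping behavior outside the elicited range is an added assumption:

```python
# Linear utility for cost: spending nothing is best (utility 1.0),
# $5M is worst (utility 0.0). Clamping outside the elicited range is an
# assumption for robustness, not part of the elicitation itself.
COST_BEST, COST_WORST = 0.0, 5.0e6

def cost_utility(cost_dollars):
    c = min(max(cost_dollars, COST_BEST), COST_WORST)  # clamp to bounds
    return 1.0 - (c - COST_BEST) / (COST_WORST - COST_BEST)

print(cost_utility(0.0))    # 1.0
print(cost_utility(5.0e6))  # 0.0
print(cost_utility(2.5e6))  # 0.5
```

A nonlinear objective like eutrophication status would replace the straight line with an elicited curve, which is exactly why those utilities took more work to assess.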
It's kind of awesome and so when they did this a Utrification status where you have a legal traffic lake was viewed as most desirable for swimming and those kinds of activities a Hyper utrophic is least desirable, but it's not linear So it really drops off after you get to sort of utrophic good into sort of utrophic bad So that's that's where you see it drop off and it drops off a little sooner for primary contact recreation Then it does from boat for boating and fishing in part because with fishing a little bit more productivity gives you bigger fish in Lakes and it's also one where some of the clarity issues may not matter as much to a particular person In the elicitation we did swing waiting where what we did was we Were able to quantify their weights for each of these different three objectives do it as a group and Then we looked at the average So weights are indicated as K and so our multi attribute model is The utility given the eutrophication status and cost it's a simple linear additive model you know Given I and then you know weights which in this case we call K And utility You know, it's it's a simple linear additive model because what you're doing is you've normalized for what you care about in This case we care about designate use but we measure it through eutrophication status Which then maps to the water quality variables and then we weight it based off of what how much we care about these different things so you use this approach you've combined them together and then what you're able to do is in Calculate in this case the expected value or the expected Utility of different levels of a water quality standard in our case what we found was total phosphorus was a Neur-perfect correlate like 95 percent correlated with eutrophication status for reservoirs in North Carolina So we could use as a near-perfect proxy and then we could map levels of total phosphorus onto expected Value and that then means that it provides an approach for how you would think 
about developing a standard that recognizes the importance of science but also recognizes that values, and the risk of non-attainment, come into play when setting those standards. Needless to say, this approach was not adopted, but it gives an example of how one might do it. There are lots of different approaches. One thing I wanted to touch on briefly is value of information. Different groups think about this differently; some like to monetize it, and if you're using multi-attribute methods there are approaches to do it quantitatively. You could look at the books and pick it up quickly, but to give you a high-level overview: from a decision-analytic perspective, what we're asking is whether better information would potentially change a decision. Not whether it improves our understanding of fundamental processes, improves our model, or improves our predictive capability, but whether it changes a decision. You can look at this as the value of perfect information, where we would know exactly what will happen, and the value of imperfect information, like forecasts. The calculation for imperfect information is often a bit more tedious, but if you calculate the value of perfect information and it tells you that better information will not improve your decision, you don't have to calculate the value of imperfect information, because it won't improve your decision either. So you can use both approaches, and you can use that as a simplification, because the value of perfect information is the maximum value of information you could obtain, and one of the bounding conditions is that the value of information can never be less than zero. If someone tells you, "I can't make a decision until I know for sure whether or not something is going to happen," the correct response is that we make decisions based off imperfect or uncertain
information all the time, and do you want me to calculate your value of information? One last thing I wanted to go over is presentation of forecasts. A lot of us are producing forecasts, and folks have mentioned Shiny and the development of decision support tools. What I want to do is give an example of how we've worked with the NOAA Climate Prediction Center to revisit the way they visualize their temperature and precipitation outlooks, to make them more useful for people who could potentially use them in decision-making and to improve their interpretation. This is all based on the idea that, a lot of the time with decision support products, we improve them based on feedback or intuition, and intuition doesn't always match best practices very well. That's in part because we have to understand who our users and audiences are, and understand how they comprehend the information. The way you comprehend information amongst yourselves is fundamentally different from the way people outside this room, outside ecological forecasting and ecology, think about it. So, a couple of quick tips. It's easier to distinguish color than shape, so if you're showing something that's fundamentally different, color is pretty powerful. But precisely because color is so powerful, you want to be very careful about how you use it: if you're presenting different things, you can use different colors; if you're presenting the same thing, you do not want to use different colors, because that signals you are representing different things. This then comes into play when we think about maps.
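Stepping back to the value-of-information point for a moment, the perfect-information bound can be sketched concretely. Everything below is hypothetical, a made-up two-state, two-action decision table with payoffs built from an illustrative additive utility; none of the numbers are the North Carolina values.

```python
def additive_utility(utilities, weights):
    """Linear additive multi-attribute utility: sum of k_i * u_i,
    with each single-attribute utility u_i on [0, 1] and swing
    weights k_i summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "swing weights must sum to 1"
    return sum(k * u for k, u in zip(weights, utilities))

def value_of_perfect_information(payoffs, probs):
    """Expected value of perfect information for a discrete decision.
    payoffs[s][a] = utility of action a if state s occurs; probs[s] = P(state s)."""
    n_actions = len(payoffs[0])
    # Without new information: commit now to the action with the best expected utility.
    ev_without = max(
        sum(p * row[a] for p, row in zip(probs, payoffs)) for a in range(n_actions)
    )
    # With perfect information: learn the state first, then pick the best action for it.
    ev_with = sum(p * max(row) for p, row in zip(probs, payoffs))
    return ev_with - ev_without  # bounded below by zero

# Hypothetical problem: two equally likely states, two actions; each payoff
# is an additive utility over (eutrophication status, cost), weights 0.7/0.3.
payoffs = [
    [additive_utility([0.9, 0.2], [0.7, 0.3]), additive_utility([0.4, 0.9], [0.7, 0.3])],
    [additive_utility([0.1, 0.2], [0.7, 0.3]), additive_utility([0.5, 0.9], [0.7, 0.3])],
]
print(round(value_of_perfect_information(payoffs, [0.5, 0.5]), 3))  # ~0.07 utility units
```

Note the behavior the talk describes: if one action has the best payoff in every state, learning the state never changes the choice and the function returns zero, so there is no need to go on and compute the (more tedious) value of imperfect information.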
I don't know why scientists love rainbow maps, but they end up being really problematic, because we inherently read changes in hue as implying more difference than changes in intensity. Changing color means something different than changing the grayscale, the intensity or saturation, of what we're showing, and this ends up being really important. If you look at a red-to-blue map, you automatically jump to "hot, cold, I like the middle," whereas a single-hue map you read in a different way. So think about your users when presenting. The inverted pyramid gets used all the time to talk about how scientists think about and present our work. Think about a standard scientific paper: you present all the information, the context, and all the details, and then the results and conclusions tell you exactly what happened. The public, think of a newspaper article, or of you reading scientific information outside your field, wants the bottom line, the "so what," first. You don't want all the details; you want to understand what you really need to care about. We would argue the same is true for scientific visuals, the exact same principles, because if you design a visual that includes absolutely everything up front, it is very likely the reader will not come to the same conclusion you intend. Being able to distill that simple message ends up being really important when designing visuals. So, my example: temperature and precipitation outlooks. How many of you know these?
These are the seasonal forecasts. NOAA is asking for comments on them, and if you have comments about subseasonal-to-seasonal forecasts, I highly recommend that you submit them. What these are is forecasts on a seasonal basis: extended range (six to ten days, eight to fourteen days), mid-range (three to four weeks), and long-range forecasts, which go from one to three months out to a year in advance. They call them one- and three-month forecasts, but they're predicting out to a year in advance, and they do it for both temperature and precipitation. What you're presenting here is geospatial uncertainty, and that's super hard, because people don't really get probabilities and uncertainty, and now you're presenting them on a map. That's really tricky, and it's an open question how to present it best. So what my team did was use decision science and visualization science to really think through the users, and how we might design these graphics with them in mind. We built off the existing literature on how you classify different visualization problems. We coupled that with interviews and surveys, which gave us the self-reported side: what people said they had problems understanding, what they didn't like, or what they said other people had problems understanding and needed translated for them (we worked with people who take the forecasts and contextualize them for different decisions). And we measured revealed understandability; think of it as a test: can you correctly answer questions that are important for understanding those graphics?
That gave us the ability to identify the challenges, trade-offs, and major concerns, and to really diagnose the problems with those particular graphics. The diagnoses were these. First, you can't use white space both inside and outside the US if the two mean different things, because people don't get it; they were using white space inside the US for normal or near-normal conditions, but even what it means to be "near normal" in terms of temperature and precipitation can be confusing. Second, clarity and clutter, which you can see from the graphic itself. Third, probability versus intensity. This is a really key one for understanding these graphics, because they present the probability of being in one of three bins, near-normal conditions, above normal, or below normal; they are not telling you how much above or below, the intensity. Making sure those didn't get mixed up ends up being an important communication challenge. So, starting from the original, we broke the redesign into pieces. For all of the graphics we produced, we collaborated with the NOAA Climate Prediction Center literally every two to four weeks, because to actually test graphics, it doesn't matter whether a design is an academic ideal; what matters is whether they can operationalize it through their system and whether it meets the constraints on what their agency can display. We worked really closely with them, and they redesigned the graphics to make sure they would actually meet those operational requirements. The first redesign was a simple modification. We reasoned that it goes out to multiple users, and to give you a sense, these easily reach hundreds of thousands of people every day. We looked at it with expert users from the agriculture, emergency management, water resources, and energy sectors, and it also goes out to, and is reinterpreted by, the public. It
ended up in my inbox from a heating and cooling company convincing me that I needed a service package for my AC because of the above-average summer conditions; it showed up in a bathing-suit forecast in a fashion magazine; and it has been reinterpreted by the Weather Channel and others. So these get used and reinterpreted, and, similar to your own forecasts, once you put something out into the world, some of it is out of your hands, so how you communicate it matters. We thought, okay, if it's going out to these broad public audiences, maybe as a default (you can always add layers of complexity) it would be better to simplify into bins: "leaning above normal" or "leaning below normal" where the chance is between 33 and 50 percent, and "likely above normal" or "likely below normal" where it's 50 percent and above. I chose the term "leaning" because it had not been tested in the qualitative uncertainty-language literature before, and also because the New York Times election dial used the term "leaning," and when I informally tested it with people it seemed to play better; "likely" was interpreted differently than "leaning." That meant blocking off the coloration and aggregating things, with the different probabilities represented through contour lines. Here, what we did was keep the color gradation.
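The "leaning"/"likely" bins just described reduce to a simple labeling rule. The thresholds (just over 33 percent for "leaning," 50 percent for "likely") follow the description above; the function itself, its name, and its tie-breaking behavior are my own illustrative choices, not CPC's implementation.

```python
def outlook_label(p_above, p_below, lean=1/3, likely=0.5):
    """Map three-category outlook probabilities to qualitative language:
    'leaning' for a 33-50% chance, 'likely' for 50% and above."""
    # Pick the direction (above/below normal) with the larger probability.
    p, direction = max((p_above, "above"), (p_below, "below"))
    if p >= likely:
        return f"likely {direction} normal"
    if p > lean:
        return f"leaning {direction} normal"
    return "near normal / equal chances"

print(outlook_label(0.60, 0.15))  # likely above normal
print(outlook_label(0.40, 0.27))  # leaning above normal
print(outlook_label(0.33, 0.33))  # near normal / equal chances
```

The design point is that a handful of named bins replaces a continuous probability gradient, which is what lets the map use a small, discrete set of colors as the default view.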
We simplified the near-normal category, because the higher probability values are never represented; it's near impossible for these forecasts to produce high probabilities of near-normal conditions. And we made sure that near-normal conditions and "equal chances" were represented differently. We tested that, which gave us some information, and then we worked with NOAA, because this next step was a lot harder and required more time on their end, to develop a combined redesign. We got rid of places where you don't have data; if you don't have data, don't show it in your graphic. We moved the legends up and provided information, and we were clear about the probability distributions. (This is an older version.) We also included the qualitative language, but because color is really powerful, we found that people don't look at the contour lines for probability; you really need the color differentiation for them to get those probabilities. Looking at the results: with this setup you can do control-versus-treatment testing if you set up your experimental design right, and we did. We could look at how people interpreted the white-color mapping and whether they got it right, and at how they interpreted near normal. With our combined graphic they were more likely to get that right, so as we made design improvements, we were able to improve understanding.
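The control-versus-treatment comparisons described above come down to comparing correct-response rates between two groups of respondents. A minimal sketch, assuming independent samples and using a standard two-proportion z-test; the counts are invented, not the study's data.

```python
from math import sqrt, erf

def two_proportion_ztest(correct_a, n_a, correct_b, n_b):
    """Two-sided z-test for a difference in correct-response rates
    between a control graphic (a) and a redesigned graphic (b)."""
    p_a, p_b = correct_a / n_a, correct_b / n_b
    pooled = (correct_a + correct_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2))) / 2.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 60/100 correct on the original, 75/100 on the redesign.
z, p = two_proportion_ztest(60, 100, 75, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # z ~ 2.26, p ~ 0.024: significant at the 5% level
```

A negative z with a small p-value is the "significant dip" pattern reported for some expert responses below.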
This again is the simplified version versus the discrete legend. Color is really powerful, and we specifically designed the state-specific question around a state where respondents had to look at the contour lines. People don't look at the contour lines, and that informed our decision about how we needed to develop the combined version. Similarly, we were able to show, and this is what you find in general with the results, though some of the phenomena were a little unexpected, that in total the combined version ended up being better. Experts in general got things right more often than not, so they usually outperformed the public, and you saw the largest gains in improvement from the public. In a few cases, because the graphics were completely redesigned, you saw slight dips in how the experts interpreted them that were statistically significant. We aren't able to formally know why this happened, but there is a literature-based hypothesis: when experts have seen the same graphics for the past 25 years and know exactly what those graphics are, it's like websites. When a website is redesigned, even if it's exactly what you want, you can't find things for a little while; it takes a little longer.
I think that's what's happening here: if you significantly redesign the graphics, it's a little harder to find the information you're expecting, so it's not unanticipated to find a dip. If we ran this as a time series, I would expect those scores to go back up after three to six months. In general, because we worked so closely with the NOAA Climate Prediction Center, we were able to use a really robust method to understand how to redesign a forecast based on how users interpreted it, and how to improve that understandability. That provides a strong, evidence-based way of improving how you visualize and present these forecasts. If they're going out to lots of different types of decision-makers, and especially if you're producing very high-profile federal government products, it's worth making sure that the people you intend to use them actually understand them correctly. With that, we have completed the PrOACT approach, and we've talked about presentation and design of the consequences and information. I would just say that if we're thinking about these things in terms of integrating forecasts and decision models, these ideas of adaptive monitoring and adaptive management, and how forecasts get used in decision-making, all of this is with the idea that I think we would be successful if, in 10 to 15 years, we figured out a way of operationalizing adaptive management, and if ecological forecasts were part of the solution to achieving that. I hope to be part of that.
I hope this gives you at least one friendly decision scientist who might want to work with you on things related to decision support. If you're interested in more of these things, let me know, because I think this is where collaborations between forecasters and social scientists really come to bear. For some of these bigger things, the dominance methods, or whether stream restoration is worth it (I used to be a water quality modeler, and I can do statistics, so I did those analyses myself), you're not able to solve these problems unless you're collaborating, and that's why we work a lot with ecologists and geoscientists. So, references. If you're interested in personal decision-making (I had a couple of questions related to that), Smart Choices presents simple methods, including the even swaps we talked about in the PrOACT approach. Structured Decision Making is the one if you're interested in environmental decisions; Robin Gregory is one of my academic heroes (I asked him for a postdoc), and he has gone through multiple examples of things he and others have done, showing how you might build rigorous analysis and an understanding of values into the way you help people make really hard decisions. If you're interested in the decision models themselves, Bob Clemen, one of my collaborators, wrote the first textbook on decision analysis, and a lot of it is framed within a modeling paradigm, so I think it's a pretty approachable way of breaking down and understanding these methods. And if you're interested in the value of information and imperfect information, I would use that as your reference. So with that, thank you.