Stanford University. We're now starting our second technical session. We're going to consider challenges and solutions for interconnected systems. I'll serve as the moderator for this session, and I'll welcome the speakers for today. They include Niels Aguiliel de la Bommel and Kamran Tehranchi. Both of them are PhD students in Civil and Environmental Engineering here at Stanford University, working with me on an offshore wind project that folks will learn a little bit more about. This will be followed by Saurabh Amin; he's a professor, also of Civil and Environmental Engineering, at MIT. And finally, Jean-Paul Watson. Jean-Paul is a senior research scientist at Lawrence Livermore National Laboratory. And with that, we'll get started, and I welcome Kamran and Niels on stage. Thank you. Hi, everyone. Thank you, Ines, for the introduction and for the honor of being here. My name is Niels Aguiliel de la Bommel. My colleague here is Kamran Tehranchi; we're both PhD students, as Ines said, in her lab. Our research focuses on understanding the potential role of offshore wind and its associated transmission and storage systems in California. To give you a little bit of context, as most of you probably know, California has goals of 100% renewables and other zero-carbon sources for electricity generation by 2045, and offshore wind, as we've heard today and yesterday, is expected to play an important role in this transition. Assembly Bill 525 required the CEC to come up with targets for offshore wind in California, and these targets have been established at two to five gigawatts by 2030 and 25 gigawatts by 2045, which are very ambitious targets, since we currently have zero turbines in the water. A little more context on the floating offshore wind industry: as some of you might know, the floating part is an emerging technology. It is necessary in coastal areas where the seafloor is very deep, which is the case off the California coast.
And currently, globally, there are 123 megawatts operating, and up to 96 gigawatts planned. The technical potential of California has been estimated at 112 gigawatts by NREL. A little bit over a year ago, BOEM awarded the first five leases for offshore wind farms in California, two of them in the northern region in Humboldt and three of them in Morro Bay, each of approximately 900 megawatts. These farms are located approximately 50 kilometers from shore, and the depth of the seafloor is around 1,000 meters, so it definitely needs floating technologies. And of course, the integration of this resource faces a lot of different challenges, which we'll get into in a little more detail. But just very briefly: the demand hubs are very far away; San Francisco and Los Angeles are far from these locations. There's limited transmission, especially in the northern region, and there's a lot of uncertainty because these technologies are emerging and nothing has been done at this scale. We're going to be focused on answering what the emissions, equity, resilience, and economic impacts are for the offshore wind industry in California, also integrating the impacts of storage and hydrogen in this transition. And we're going to accomplish this by engaging directly with local communities to understand what their preferences are for this new industry and its associated infrastructure that's going to be placed in their backyards. We're going to be conducting techno-economic assessments and capacity expansion planning exercises to understand the economic viability of this industry. And we're going to be integrating new methods, incorporating uncertainty analysis and how system conditions impact these planning decisions in the long run. And so to give you more context, there are two primary regions of initial interest for offshore wind development. In the north is the Humboldt region.
There's relatively little transmission capacity existing in this region, so we'll need to build a lot of transmission lines to connect it to the Bay Area. So far, analysis has been conducted on HVAC lines going to the center of the state and then down towards Sacramento and the Bay Area. HVDC lines as well as HVDC subsea cables have also been analyzed by CAISO as well as the national labs. And in the center of the state, there is a relatively large amount of transmission capacity because of the Diablo Canyon nuclear generation facility. So there, analysis has still been done to explore options for HVDC subsea cables going north and south, as well as HVAC lines towards the center of the state. And there's been a lot of really great work in this field so far looking at generation expansion for individual offshore wind sites within the state, using relatively coarsely resolved capacity expansion models that don't include a lot of transmission resolution. In parallel, there have been very highly resolved transmission planning studies conducted by the ISO and the national labs that look at individual transmission expansion cases to see their deliverability impacts and the economic performance of different transmission lines. And what we're seeking to address in this pilot project is co-optimization of complementary resources like offshore wind, battery energy storage, hydrogen, and transmission by developing a new model that can handle the flexible transmission topology needed to study this question. So to accomplish that, we are continuing the development of PyPSA-USA, a capacity expansion model that performs generation and transmission co-optimization. We bring in data sources from NREL, EIA, the CEC's PLEXOS model, the WECC Anchor Data Set, and also expert elicitation. And the goal here is to design resource portfolios that are robust to uncertain system conditions, meaning weather variability.
And to stress test these designs for their reliability, resilience, and equity implications. So what we're trying to achieve overall is to understand the benefits, costs, and risks of offshore wind in California. And as Kamran mentioned, part of it is going to be developing a techno-economic assessment, where we're going to look at the potential benefits of offshore wind using NREL wind speed data, as well as estimating the cost of all these future projects and the associated transmission. We realized that this emerging technology has very little publicly available data, so we decided that the best way to tackle this issue was to perform an expert elicitation, where we interview experts, asking them a set of questions, to come up with data on the cost and performance of different components in the system, which we will use in our models to account for the uncertainty of this new technology. In terms of stakeholder engagement, we, as I just mentioned, are engaging with experts in the industry. We also plan a community acceptance study in California, where we hope to engage with coastal communities, and we also hope to engage with different tribes. Of course, we would like to engage with public agencies that are doing similar work, and all of this would be done through visits, meetings, and surveys. And so this project is going to be supporting EARNEST's goals by developing a new model for generation, storage, and transmission co-optimization. We'll be assessing community benefits, burdens, and acceptance of this new infrastructure. We'll be stress testing the designs for resilience, and we're going to be developing affordable and robust decarbonization plans to help meet California's decarbonization goals. And our anticipated findings should provide some guidance on the economic viability, the public acceptance, and the engineering pathways to drive forward California's decarbonization goals when it comes to the offshore wind industry.
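[Editor's illustration] The generation-transmission-storage co-optimization described in this talk can be made concrete with a toy example. The following is a minimal Python sketch, assuming invented costs, a single onshore load hub, and just two weather scenarios; the real PyPSA-USA formulation is a full linear program over networks and thousands of hours.

```python
from itertools import product

# Toy co-optimization of wind, storage, and transmission capacity, in the
# spirit of a capacity expansion model. All numbers are invented for
# illustration, not real project data.

CAPEX = {"wind": 3.0, "battery": 1.5, "line": 1.0}  # $M per MW built
SHED_PENALTY = 10.0                                  # $M per MWh of unserved load

# Hourly offshore-wind capacity factors under two weather scenarios.
scenarios = {
    "windy": {"cf": [0.9, 0.8, 0.7, 0.9], "prob": 0.5},
    "calm":  {"cf": [0.2, 0.1, 0.3, 0.2], "prob": 0.5},
}
DEMAND = [100.0, 100.0, 100.0, 100.0]  # MW at the onshore load hub, each hour

def expected_cost(wind, battery, line):
    """Capex plus probability-weighted load-shedding penalty across scenarios."""
    cost = CAPEX["wind"] * wind + CAPEX["battery"] * battery + CAPEX["line"] * line
    for s in scenarios.values():
        soc = float(battery)  # battery at the load hub, starts full (simplistic)
        shed = 0.0
        for cf, d in zip(s["cf"], DEMAND):
            delivered = min(cf * wind, line)  # the line caps deliverability
            gap = d - delivered
            if gap > 0:
                discharge = min(gap, soc)
                soc -= discharge
                shed += gap - discharge
            else:
                soc = min(battery, soc - gap)  # surplus recharges the battery
        cost += s["prob"] * SHED_PENALTY * shed
    return cost

# Co-optimize: search wind, battery, and line sizes jointly (brute force here;
# a real model solves this as one linear program rather than by enumeration).
best = min(product(range(0, 201, 50), repeat=3), key=lambda x: expected_cost(*x))
```

Even at this toy scale, the cheapest design trades wind overbuild against storage and line capacity across both weather scenarios, which is exactly the coupling the talk argues a co-optimized model must capture.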
And just to conclude, we'd like to thank the DOE, of course; the Bits & Watts Initiative that we're part of; the Precourt Institute for Energy; Stanford University as a whole; and of course our advisor, Ines, for all of her work and the work we're doing with her. And we also just wanted to say we're really excited about EARNEST and looking forward to all the work we're going to accomplish. Thank you. Okay, good afternoon. The previous two speakers from Stanford gave a wonderful presentation, and the issues of co-optimization, capacity expansion planning, and incorporating uncertainty are highly relevant to EARNEST and its goals. This particular pilot is about joint planning of electricity and gas infrastructure in the New England context. I'm here together with Priya Donti and Dharik Mallapragada, and also Rahman Khorramfar, a post-doc with us. But also, through this project, we are hoping to collaborate with many of you, including colleagues from Princeton, EPRI, and Stanford. So the interdependence of New England's power and natural gas infrastructure is very well known, right? We have highly seasonal energy demand, particularly natural gas demand. We have well-documented constraints on the reliability of natural gas infrastructure and its impact on transmission and the bulk power supply. And there are extensive decarbonization targets that various New England states have set forth with regard to VRE, storage, and building electrification. There's also a very well-known ISO New England operational analysis from the 2017 renewable energy integration study that we are building on in this project. So the key aspect is that there is a clear need to tackle the interdependencies between these two systems under extreme weather and a changing climate, and also to integrate fine-grained operational models into coordinated investment decisions.
So these are actually the key starting points of our focus: how to also incorporate operational constraints in the planning process. Now, through this project, we of course want to focus, as our PI suggested in the morning, on context-specific insights, for example, how the dependency of the New England bulk energy system on Canada and other neighboring states in the US, in terms of imports of natural gas, has implications for resiliency, not just in the near-term horizon but over the long-term planning horizon. But we also want to understand general-level insights about how poor coordination can create resiliency issues, for example, the famous Texas example that was pointed out. And so, in the context of this particular pilot and how it fits the EARNEST goals, we want to provide an approach for joint planning of interdependent power and natural gas infrastructure from a cost-effectiveness perspective. We want to account for the uncertainty in the evolution of various technology trends, ranging from storage to CCS to low-carbon fuels, in maintaining decarbonization efforts, and explicitly include emissions constraints in the joint planning problem. We also want to tackle the uncertainties about supply-demand variations due to weather events, as well as disruptions caused by extreme event situations. And in this process, we also want to account for equity, which relates to the explicit inclusion of stakeholder preferences on distributional outcomes. And we want to do this from a data-driven and methods perspective. We want to build on multimodal data analysis, optimization models, including stochastic and robust optimization models, and rigorous scenario analysis that is specifically informed by stakeholder engagement and continuous feedback. So let me just summarize the key questions.
So first, we want to develop an approach that will help us understand and evaluate regional plans for electrification of heating demand, how to position and hopefully also decommission some of the natural gas infrastructure, and how to justify those investments and decommissioning plans. And also, in this process, explicitly model the uncertainty due to inter-annual variations and the climate change impact on the key supply-demand parameters, of course including the role of various emerging technologies, as I mentioned: low-carbon fuels, storage, and negative emissions technologies. And here is where we can potentially also collaborate with other pilots, especially with regard to how customers respond to demand-side interventions, and how those response models can be integrated into the demand models taken into account in the planning decisions. And finally, there is this clear interplay of resiliency and equity, which this particular pilot also raises. And how does it do that? Well, we know that we need to account for uncertainty in the demand patterns owing to climate change trends, right? So we know that resiliency investments need to be made, and they need to be made in accordance with the emergence of new technologies. We also know that energy sources and the heterogeneity in building stocks, et cetera, will impact the distribution of energy transition costs. So here is the thing, right? If we invest in resiliency given these technological outcomes, clearly, and there is no surprise to this, we can already predict that there are going to be different distributional outcomes. And since there are going to be different distributional outcomes, clearly there is also an aspect of equity baked in, right? So is there a concept of what we might call equitable resiliency? And can we, through this project and through this collaborative engagement, have a positive feedback loop between resiliency and equity outcomes?
So this is rather a unique part which jumped out at us in this particular pilot project. We've already started to do some preliminary work, building on the fantastic work of Ram Rajagopal and other people at Stanford, to try to integrate various available data sources into our data pipeline. So we are building on weather data, data from NREL, data from ISO New England, and also trying to account for the heterogeneity of building stocks. So from a bottom-up approach, and this phrase "bottom up" came up many times in today's discussion, our goal here is to use a data-driven approach, using analytics and machine learning tools, to create principled ways in which we can include coupled scenarios for natural gas demand and power loads, as well as VRE profile scenarios and disruptions to the grid. And the goal here is to integrate these uncertainty models into the interdependent optimization and network analysis tool that we have been developing, which Rahman calls the joint power-gas planning model under uncertainty, and to drive the kind of assessment and metric evaluation process that is central to EARNEST's goals. And so, through this particular project, we want to make the tools for uncertainty modeling and decision-making under uncertainty available as open-source tools, building on tools from robust and stochastic optimization, ML-enabled uncertainty representations, and ideas from Bayesian optimization. The other thing which I would like to focus on here is that not only should this particular optimization and decision-making tool naturally result in refined metrics of reliability, resilience, and equity, it should also help us understand what it really means to have generalizable performance in terms of planning outcomes.
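[Editor's illustration] The two-stage structure behind that kind of joint power-gas planning model under uncertainty can be sketched in a few lines. This is a hedged illustration with invented costs and one representative hour per weather scenario, not the actual model: stage one sizes gas delivery and heat-pump capacity, and stage two dispatches them to serve heat demand in each scenario.

```python
from itertools import product

# Minimal two-stage stochastic sketch of joint power-gas planning.
# All numbers are invented for illustration.

CAPEX = {"gas": 1.0, "hp": 2.5}    # $M per MW of gas delivery / heat-pump capacity
OPEX = {"gas": 0.05, "hp": 0.02}   # $M per MWh served
SHED = 5.0                         # $M per MWh of unmet heat demand
GAS_EMIS = 0.4                     # tCO2 per MWh of gas heat
EMIS_CAP = 40.0                    # tCO2 allowed per scenario

scenarios = [  # (probability, heat demand in MWh for one representative hour)
    (0.7, 80.0),   # typical winter hour
    (0.3, 140.0),  # deep cold snap
]

def recourse(gas_cap, hp_cap, demand):
    """Second stage: dispatch cheaply. Heat pumps first (cheaper to run, zero
    emissions), then gas up to both its capacity and the emissions cap, then shed."""
    hp = min(hp_cap, demand)
    gas = min(gas_cap, demand - hp, EMIS_CAP / GAS_EMIS)
    return OPEX["hp"] * hp + OPEX["gas"] * gas + SHED * (demand - hp - gas)

def total_cost(gas_cap, hp_cap):
    """First-stage objective: investment plus expected recourse cost."""
    invest = CAPEX["gas"] * gas_cap + CAPEX["hp"] * hp_cap
    return invest + sum(p * recourse(gas_cap, hp_cap, d) for p, d in scenarios)

# Brute-force first-stage search; a real model solves this as a stochastic program.
best = min(product(range(0, 161, 20), repeat=2), key=lambda x: total_cost(*x))
```

The emissions cap is what couples the two sectors here: tightening it forces the plan to carry more electric heating capacity even though gas capacity is cheaper to build, which is the kind of trade-off the joint model is meant to expose.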
So my claim here is that today we do not have a conceptual and practical understanding of what is meant by generalizability and robustness of investment outcomes. In this particular project, we hope to create a framework for that. The preliminary work that we have been doing leading up to this project already makes us somewhat comfortable about being able to address these challenges. So, for example, we can assess, under different electrification and building improvement scenarios, how the natural gas demand may change or progressively decrease, and what kind of electricity demand we need to provision for going into the future under building heating electrification trends. Now, how to integrate uncertainty, how to integrate resiliency into the system, how to bake equity into the system: this is something that we would want to do in this project. Now, just briefly, a few refinements to the tasks, and then I'll end here. We do want to solicit feedback from stakeholders and partners in this pilot and other pilots to help us characterize the data. We want to have a principled understanding of data-driven representations of uncertainty, tractability, and robustness of the planning outcomes. And finally, we want to refine the traditional reliability metrics, SAIDI and SAIFI, in order to push a little bit towards resiliency metrics which can be quantified and assessed, which also account, to a certain degree, for the nature of correlated disruptions in the face of future uncertainties, and also then produce, from a visual viewpoint, spatially differentiated impact assessment tools which can be readily used by decision makers for various outcomes. Thank you very much. Well, thank you. After the previous speakers, I would really like to talk about stochastic optimization and capacity expansion, but I will suppress that desire. All right, so I'm up here to talk about NARM.
How many people who are not at a DOE lab or DOE have ever heard of NARM, the North American Energy Resilience Model? All right, this will be interesting. I'm not going to convey the entire thing in seven minutes, but I will give it a shot. The key thing about NARM, well, I'll get into it. How about that? DOE of course cares about threats to the national energy infrastructure. This is a sampling of threats that we've looked at under NARM. Drivers of course vary depending on how frequently things are happening, what's recently happened, things like that, but you can see everything from intentional threats to temperature rise to wildfires and hurricanes. These are threats that are in our space, and we're doing analytics around these in NARM. So what is NARM? NARM is a DOE OE-funded effort. It's driven primarily out of that, but it also has work with NRECA, GDO, and also NERC. But it's a large-scale project. It's been averaging about 25 million a year with all these labs involved since inception, so since 2018. I was the threats lead at inception, and I work on a variety of parts of NARM at the moment. One critical aspect of NARM is that it's looking at multi-infrastructure impacts. So you see prominently bulk electricity and natural gas. These are regional and national models that we have; I'll talk a little bit more about this later. This is a lot of data, and you might imagine it's very hard to get, and it is hard to get. We got it, and we can't give it to you. But we can actually talk about ways we can use the NARM model, do analytics, and maybe share some of the results for some of the pilot projects. So I think that's an interesting linkage. Comms is something we're starting on; we have the overlays for it. We know where those are nationally in the US, in terms of fiber and everything else in the comms networks, and we're starting to integrate that into co-simulations.
So that's the general, super-high-level view: trying to do high-level regional and national resilience analytics using coupled co-simulations for key infrastructures that we care about. All right, I'll give you a little bit more sense of what's happening. Everything starts with a threat model in NARM. You need to understand your threats. You need to characterize them, ideally with data, and figure out how those threats are going to manifest on the system. Once you have those, which basically form contingencies, then you can do co-simulation of natural gas and bulk electric systems and quantify and roll up the impacts. And that's what this slide shows. I've got another slide in here talking about data infrastructure. Data is a huge focus of NARM, for good reason. Somebody mentioned earlier that it's hard to get good geospatial data. We agree; it's still hard to get good geospatial data, but if you have money and can pay for it, that helps a little bit. And there are other sources in the government that can provide good data as well. As far as co-simulation goes, if people have heard of HELICS, it's a co-simulation framework. The idea behind the modeling and analysis environment in NARM is that it's intended to bring together the best-of-breed tools within the national lab system, couple them, integrate them, and actually get them working in concert with best-of-breed commercial tools, for example, powerful solvers. All right, it's organized around three capabilities. I mentioned multi-infrastructure planning models; we have those. We have tons of data and analytics to support this. And we have a software and computing architecture. NARM is actually hosted on the AWS cloud, but it's protected, of course, by user access controls. So this is a cloud effort. It's a huge software stack. If you know anything about DevOps and things like that, there are a lot of software engineers on the effort.
A lot of programming years have gone into actually deploying this, but we're actually there now, and I think the lower right is eye candy. It's just an eye chart that says, look, it's a complex architecture. The upper right shows that we have two modes of interfacing to this. One is a canned UI, where we think of things that people should want to navigate to, create threats, do simulations, and visualize them. The other is Jupyter notebooks, so the more programming-savvy of us are able to use those. And we actually use both of those. We generally prototype in JupyterHub and then promote up to the NARM UI as we find things that are useful. All right, this is another eye chart, but we have 90, maybe 70, data layers in here. A lot of this is focused on the infrastructure itself, but also things that are basically dependent on infrastructure, like where fire stations are, where security stations are, where hospitals are; that's all important. Geographic routing: one of our biggest issues, frankly, has been getting good geographic or geolocated information for both the bulk electric and the natural gas side and making it consistent. So a lot of our effort is actually trying to get geospatially located information, because when you look at natural threats, the geolocation matters. Whether or not you can actually stop a wildfire at a certain point using roads as barriers depends on knowing where those are, and if you don't know where those are, you can't do very good projections. All right, so what are we using some of this for? I didn't say I wasn't going to talk about deterministic optimization, so I will talk about that right now. Within the context of NARM, I mentioned threats are key. So we have a bunch of methods for identifying and characterizing threats. I've led the wildfire effort since 2020, so we have a lot of work going on in trying to characterize high-risk wildfire regions in the Western US.
This is both under climate change and just seasonal projections, things like that. We have methods that are able to rigorously identify high-impact N-minus-K contingencies given those high-risk regions. So if you've got a bunch of different regions that could have impacts, and I'll show you an example here in a minute, maybe there are 200 components in those regions. You can't enumerate all of those possibilities when you're trying to look for high-impact events, but you can use optimization intelligently to identify those and then simulate them in the context of NARM. So one thing we can definitely help most of the pilot projects on is, given particular threats of interest, we can identify high-impact N-minus-K events that you might actually want to be resilient to and do analytics around that. So again, we're getting past squirrels and N-minus-1. We're trying to push into N-minus-K. And I should say, for K, we've run things up to K equals 100. So we're trying to get our simulations to run through major impactful events, not just small one-here, one-there types of events. Wildfire analytics demonstration: if you live in the Bay, which probably half of the people in this area do, in the graphic on the left, the red regions are high-risk regions from last summer, or maybe two summers ago, but a recent summer, that were projected for wildfire burn risk in the Bay. You can see the mess of gray is the bulk electric infrastructure superimposed on that. And then the lines in blue and the buses in blue are basically those components that intersect the wildfire risk regions. So potentially any of those components are likely to go out. If you think about all those components, it's a lot. It's like, in this case, 97, I think.
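[Editor's illustration] The combinatorics behind that last point can be shown directly. Here is a minimal sketch with an invented five-line network, where a greedy heuristic stands in for the rigorous optimization methods the talk describes (NARM's actual methods are far more sophisticated); note how fast the exhaustive N-minus-K search space grows.

```python
from itertools import combinations
from math import comb

# Why high-impact N-minus-K screening needs optimization: enumerating outage
# combinations explodes combinatorially, so critical sets must be found directly.
print(comb(97, 5))  # 64,446,024 five-component combos among 97 at-risk lines

# Load buses stranded if each line is outaged (overlaps model shared corridors).
strands = {"L1": {10, 20}, "L2": {20, 30}, "L3": {40}, "L4": {10, 50}, "L5": {60, 70}}
load_mw = {10: 30.0, 20: 25.0, 30: 20.0, 40: 15.0, 50: 40.0, 60: 10.0, 70: 5.0}

def shed(outage):
    """MW of load lost when the given set of lines is taken out."""
    stranded = set().union(*(strands[c] for c in outage)) if outage else set()
    return sum(load_mw[b] for b in stranded)

# Exhaustive search is fine at toy scale (5 choose 2 = 10 subsets)...
worst_exact = max(combinations(strands, 2), key=shed)

# ...but at real scale a greedy heuristic (or, better, a MILP) is used instead.
chosen = []
for _ in range(2):
    nxt = max(set(strands) - set(chosen), key=lambda c: shed(chosen + [c]))
    chosen.append(nxt)
```

On this toy network the greedy pick matches the exhaustive worst case (lines L4 and L2, 115 MW shed), but in general greedy is only a cheap screen; the point is that some optimization, not enumeration, has to find the critical sets.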
You can't analyze all of those, and you're not going to lose them all, but what's interesting is that if you use these optimization methods I talked about to identify high-impact N-minus-K events, you can find a couple of components that, if you take those out, basically take out as much load as if you had taken out every component that's in blue. So that's super critical: you need to understand what the critical components are in your system. Once you have those, they can be the basis for actually hardening your system against those events. All right, a couple more things to discuss here. Interdependency analysis: we would love, in NARM, to get past impact as loss of load. A megawatt hour is not an impact. People care about whether the lights are on in the right places, and whether power is actually providing certain functions. That's something we're starting to get better at in NARM. Again, that's a big data issue. There's also a modeling issue, but it's definitely a data issue. But we have a lot of interdependency analysis layers, thanks to Oak Ridge, that allow us to understand that in this particular region you've got wildfire risk of losing power to firefighting stations, which of course is a bad thing, because they're the ones that are going to put out the fire. On the right side, I mentioned in the previous slides that we're using optimization to identify sets of vulnerable components that, if they're outaged, yield high impacts. You can't just harden all of those. You can think of it like a portfolio optimization: you need those as input to an even higher-level optimization process that allows you to optimally invest in your mitigation strategies. So in this particular case, in this Bay Area example that I showed you in the previous slide, if you do nothing, that was about 210-ish megawatts impacted.
And if you actually optimally invest in some number of hardened components, basically undergrounding lines, you can reduce things down to about 100, maybe 110 megawatts, so you can basically halve the impact. But after a certain point that kind of tails off, so basically you would have to harden the whole system to actually mitigate the entire impact. So we have these tri-level optimization algorithms, in addition to the critical component identification, which are based on optimization, to basically tell you what to do if you have a limited budget, which of course everybody does. All right, potential leverage points for NARM resilience analytics. This is just my bias, but there are probably many more things we could use NARM for in any of the pilot studies that I've seen so far. One is modeling threats. We do wildfires, but there are also cold waves, flooding, earthquakes, you name it. So cold waves, if you're interested in that; intentional threats, we do that too. Co-simulation of contingencies: we can actually do co-simulations in our model for natural gas and electric, for whatever contingencies you come up with. So again, you don't necessarily have to have access to NARM, but we would have to map your case to NARM models, and we could probably do a simulation and do some studies. Scalable worst-case or high-impact N-minus-K identification methods: these identify credible high-impact contingencies for analysis. And then we have these advanced invest-disrupt-operate methods. I've trained myself slowly to stop saying defender-attacker-defender, because it doesn't really cover natural threats, but these optimization methods actually tell you, given a fixed budget, where do you mitigate things, and what is the best way to do that? And with that, I will conclude. Wow, I'd like to thank our speakers for this session. We now have a few minutes for questions. So I'll open it up for questions from the room. I had some questions on the MIT work.
Actually, though, I think you had a slide about the goals of the work. I think they were, like, spot on. I just want to understand how you're approaching that within the New England ISO, because one of the challenges with New England especially is that as we go through electrification, the winter peaks come, and I just want to get a little more understanding of how you're approaching that in terms of planning and resource adequacy in New England for building and transportation electrification. Yeah, so currently what we have done is to look at the residential and commercial, primarily residential, sector, specifically focusing on building heating electrification, and to try to understand, in a bottom-up way, how the shift in energy demand from natural gas to electricity is going to shape up under various technology scenarios and also envelope improvements, right? Now, once this particular understanding of the change in supply and demand is done, we can in principle try to evaluate it under various weather realizations and future uncertainties, either past realizations or projections of weather data, and then try to utilize these uncertainty representations in this joint planning problem, right? Which actually co-optimizes not just the investment cost in one sector or another, but also takes into account the interdependency between the two. And then we try to understand what kind of decommissioning decisions, if any, or strategic positioning of firm power resources in the presence of future investment in renewables is going to look like, right? So that is the kind of analysis that we have currently done. We would love to collaborate on how the scenarios of failures can be integrated in a more principled manner, including the attacker-defender kind of setup.
But basically, these kinds of optimization models are supposed to provide guidance on strategic investments, both spatially and temporally, and on how the nature of uncertainty is going to shape these investments. So it does tie to some of the priorities of the New England ISO, both in terms of resource adequacy and resiliency needs. And we would love to bridge that gap between the two, yeah. There was another question, but there you go. Hi, everyone, thanks so much for your presentations. I was wondering if you could share a little bit more about the equity frameworks, metrics, and data that you plan on using in your projects, or, in the case of Lawrence Livermore, what NARM uses. Thank you. Do you want to start? Maybe you can start somewhere? So I do not have a full understanding of this equity picture, but coming from a resiliency and investment-cost perspective, we can clearly see that our current models, in terms of investment planning, we can almost say very clearly, are not equitable, right? In terms of evaluating, for example, where the investments are going to be, right? Relative to the demand centers, right? So one claim, or one open question, is: what's the difference between a resiliency-improving investment outcome and an equitable investment outcome? And are there some commonalities between the two which we can utilize to look at the equity definition in a more nuanced manner? You see, it has to be looked at in a nuanced manner, and my claim here is that the resiliency aspect is actually not disjoint from equity, as it pertains to investment decisions in the face of future uncertainties. I don't have a more nuanced answer than this, but maybe others would add to that, yeah. I can try to answer a little bit for our project.
This is not something I'm an expert in yet, but the way we're thinking about it for our community acceptance study is that we would engage with communities and understand their willingness to accept different kinds of transmission systems and different locations for the floating turbines, and how those would impact them, engaging with all kinds of communities along the route of the transmission system as well as the coastal ones. Do you have something to add?

Maybe one thing to add is that when we conduct these planning exercises, we're really looking at the affordability of different decarbonization pathways, so that plays a big role in the equity of this energy transition, because we really can't accept astronomically higher rates, yeah.

Just to add, coming back to quantifying environmental justice a little bit, we've also looked at which groups would be the most impacted by the different systems, and who's going to get the benefits and who's going to carry the burdens of offshore wind in California. Within NARM, we're starting to integrate some equity layers to understand what is impacted at particular buses in terms of zip-code statistics and things like that. I'd say that's a very new effort over the last six months, but that's where we are with that.

Hi, this is not meant to be a gotcha if it's not included, but I saw one map in the NARM slides that had Alaska on it. Is Alaska part of it, or is it only the lower 48? What are we talking about? It's really CONUS at the moment, to be honest. And getting the data for the other areas has been the problematic aspect. Same thing, and I'll get out in front of it, no one asked about Canada and Mexico, for similar reasons. So, as we gather a couple more questions, I do have a question for you, Jean-Paul, about this phenomenal effort with NARM.
How can we in the university ecosystem reach out to interact with NARM in a way that enables successful products and use for EARNEST? Can you provide some examples of how we should reach out to you and structure the ask?

So, one good thing about NARM: as I said, we can't give the data out, but most of the tool sets are either simulators that everyone in the room knows how to use, or knows somebody who can run. And on top of that, most of the lab tools are open source, so that's good in the sense that we can point you to the tool we're using for a given kind of analysis. So if you have your own data, that's probably very easy to do. As far as engaging with people, reach out to me about any pilot programs. If you want to do X, Y, or Z, I know enough of the NARM ecosystem, and the management too, that I can point you to the right people, and then we can go from there. I think it would be interesting in particular cases to actually do studies, like you might be doing with open-source data for the New England case; we've done a lot of analytics for the upper Northeast too. So we could do comparisons of NARM models relative to academic models to check their fidelity. So I think that's the way to start, those three aspects. Another thing I want to point out is that most lab folks are academics at heart too. We do practice, but we also like to publish and do research. So if you come to us with a research question and say, hey, we found this cool tool, can we extend it in this way, you're going to get a lot more traction than if you just say, hey, can we use your tool for free? That sounds great. And there is a question there, so yes, go ahead.

Hi, just to follow up on the wildfire thing. We've seen a lot about the consequences of wildfires: rates go up, insurance goes away, right?
Are you doing any analysis in collaboration with private companies or funders to do a cost-benefit analysis? If we harden things up, what's the benefit, and can we monetize it? And more importantly, for projects that aren't undertaken by the government, can these things get funded? Is there a value you can attach to reducing the severity and length of wildfires?

Yeah, so we've done a couple of things. A lot of this is still in the research phase, but the tools are at the point where we can do studies for real customers. So we're now engaging with utilities to try to figure out what we can actually do to help them. The utilities in California are spending a lot of money trying to underground lines, and we believe we have better ways to prioritize that. You can only go so deep into the undergrounding stack before, after a while, it's not going to matter. So we believe we can provide analytics that will tell you what it's worth spending. I'm making these numbers up, but say it's worth two billion: if you spend ten billion, your marginal improvement is only 5% over the two billion, so why bother with that? There are other things you might be able to do besides undergrounding.

And is the major stakeholder there the utilities, who are going to decide where to bury lines? Well, right now DOE is a stakeholder; they're worried about national and regional resilience questions. We have a recent effort within our ECA, just starting, for wildfire analytics, and we'll hopefully be able to engage strongly with some of the co-ops to do analytics studies in their context, actually tell them where to invest, and make our tool more industry friendly. And that's where we are with that process. So with that, I would like to move the final question to the two graduate students.
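The prioritization and diminishing-returns argument above can be sketched very simply: rank candidate line segments by wildfire-risk reduction per dollar of undergrounding spend, then watch the marginal benefit flatten as the budget grows. All segment names, costs, and risk-reduction values below are hypothetical, not utility figures.

```python
# Toy greedy prioritization of undergrounding spend (illustrative only).
# Each tuple: (segment name, undergrounding cost $B, expected avoided
# wildfire loss $B) -- all values are assumed for illustration.
SEGMENTS = [
    ("canyon-feeder", 0.5, 2.0),
    ("ridge-line",    0.8, 2.4),
    ("foothill-tap",  0.7, 1.1),
    ("valley-loop",   1.0, 0.6),
    ("urban-spur",    1.5, 0.3),
]

# Greedy: spend on the best benefit-per-dollar ratio first.
ranked = sorted(SEGMENTS, key=lambda s: s[2] / s[1], reverse=True)

spent = benefit = 0.0
for name, cost, reduction in ranked:
    spent += cost
    benefit += reduction
    print(f"after {name:13s} spend={spent:4.1f}B  avoided-risk={benefit:4.1f}B")
```

Even in this toy, the first two billion or so of spend captures most of the avoided risk, and the last segments add almost nothing, which is the shape of the argument for not undergrounding everything.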
You'll have the final words, Niels and Cameron, before we close this session. For the graduate students who are going to be participating in EARNEST and are considering engaging in this sort of research, what are your words of wisdom? Go ahead.

I think what's exciting about this consortium is that all these problems are very timely. Just today we had President Reynolds from the CPUC talking about offshore wind, and our project plays directly into that conversation happening on a wider regional basis. So I guess my advice would be to stay involved with the conversations happening right outside your research lab, in the broader planning work going on.

Yeah, I completely agree. And I would also add that one of the main things I've really enjoyed about my research so far is how interdisciplinary it is: how many different projects there are, and how you can participate in a lot of them at different levels. Of course you focus on something, but you can discover a lot of other things. And I think EARNEST is going to be quite incredible in that aspect and a great opportunity for a lot of graduate students to be interdisciplinary scholars.

And with that, we'll conclude this session. Please join me in thanking all these speakers.