Hi everyone. Welcome to the last seminar of this quarter. So today the speaker is Dr. David Chassin from SLAC. He's going to talk about a brand new version of his tool, HiPAS GridLAB-D. So HiPAS GridLAB-D is a high-performance release of GridLAB-D. And just a quick reminder that these presentations are being recorded. You can go to the seminar website and find the link, or you can just simply search on YouTube. The presentation for today should be available in a few days. So our speaker today is Dr. David Chassin. He manages the GISMo group at SLAC and he leads its research and development activities in renewable energy and grid integration systems. Before joining SLAC he was a staff scientist at PNNL, the national laboratory in Washington State. Over there he led the development of building energy monitoring and control systems. He designed the Olympic and Columbus transactive energy price-discovery systems in demonstrations run by the U.S. Department of Energy. His research focuses on the control and dispatch of distributed energy resources, retail real-time pricing strategies, and transactive control. Thanks, Jim. So thanks everybody. I know that actually yesterday was the last day, so you're not even supposed to be here. And yeah, sorry for last week. I forgot that I had a vacation planned, and I was in Montana through basically the end of last week. It was a great vacation, so I'm all recovered and I'm ready to go for the whole summer, right? Okay. So yeah, what I'll talk about today is this tool that we developed for the California Energy Commission. I'm going to spend a little time on the history of the tool. I think it's instructive to know where it came from, why the California Energy Commission was interested in using this tool, and in seeing it developed further and made available to utilities, the CPUC, and the CEC in California.
And then I'll talk about, you know, sort of some use cases, things that we focused our attention on in the development of this particular version of GridLAB-D, and some of the features that we added to it. There are quite a few changes from the original version that was developed for DOE. And then I'll just spend a few minutes on the things we're looking at doing going forward, you know, emerging ideas and directions that we'd like to take the tool in, and hopefully we'll be successful in raising the funding necessary to do that research. So let me just start off with the origin of the tool. The National Laboratories have what's called the Laboratory Directed Research and Development (LDRD) program that they've been running since the early 90s, where a laboratory can take a portion of the money that it makes from conducting research for the Department of Energy and invest it in early concept development projects. And so GridLAB-D had its origin in one of these LDRD projects 20 years ago. I led a very small team, very small, it was me and one other person, and we basically focused on the problem of simulating buildings and power grids at the same time. And if you think about it, that problem is not trivial. Power grids behave on the order of microseconds; typically that's where we really think about problems on the power system. And buildings behave on the order of minutes to hours. So if you think about how you slow down the simulation of a power grid to incorporate behavior that happens on the order of minutes to hours, and speed up buildings so that you can start to simulate them at the same time, that's not an easy thing to do. At the time, most building energy simulations were energy focused. They looked at how much energy a building consumed hour to hour, but they didn't really think about what happened sub-hourly.
You know, if I have a demand response program and I want to change the thermostat and look at how the building's behavior changes within the hour, you couldn't do that with those tools. It wasn't possible. At the same time, the power system tools that existed either did dynamics, which is, you know, microsecond to millisecond time scales, or they did steady state. So there was no time scale in between that we really thought about. So bringing these two together so that we could study how demand response, power grids, distributed energy resources, and smart grid technology would work together, that was a huge problem and hadn't been solved. The approach that we took eventually became this tool that we built for the Department of Energy, and it became a whole program at DOE. It's based on an agent-based approach. Instead of trying to take these two different solvers that were fundamentally incompatible, you know, there's just no way to connect them together, they dealt with different state spaces, and the dynamics were on time scales that are completely incompatible and irreconcilable, we built an agent-based system where different objects exhibited behaviors, and then the tool essentially created the solver needed to make these objects work together on the fly. And so it would generate the solution for the power system, and it would generate the solution for the buildings, and then it would try to solve them on time scales that were relatively compatible with each other by doing things like asking a building how long before you have a state change, or asking the power system how long before a fuse blows, things like that. And by creating an agent-based system we had an environment in which we could study the kinds of things that DOE was really interested in looking at at the time, and a lot of these things have become sort of canonical problems for agent-based simulators, and answered some important questions.
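The event-driven synchronization described here, asking each object when its next state change occurs and advancing the clock to the earliest one, can be sketched as follows. This is a minimal illustration of the idea, not GridLAB-D's actual solver API; the class and method names are invented for the example.

```python
class Agent:
    """A simulated object that reports when its next state change occurs."""
    def next_event(self, t):
        raise NotImplementedError
    def update(self, t):
        pass  # advance internal state to time t

class Thermostat(Agent):
    """Building-side object: state changes on the order of minutes."""
    def __init__(self, cycle_s=600):
        self.cycle_s = cycle_s
    def next_event(self, t):
        return t + self.cycle_s   # e.g. next HVAC on/off switch

class FuseCheck(Agent):
    """Grid-side object: reports how long before a fuse could blow."""
    def __init__(self, horizon_s=450):
        self.horizon_s = horizon_s
    def next_event(self, t):
        return t + self.horizon_s

def run(agents, t_stop):
    """Advance the clock to the earliest pending event, event by event."""
    t, steps = 0, 0
    while t < t_stop:
        t = min(a.next_event(t) for a in agents)  # earliest state change wins
        for a in agents:
            a.update(t)                           # everyone syncs to that time
        steps += 1
    return t, steps
```

The key design point is that neither solver imposes its native time step on the other; the simulation advances only as far as the nearest state change reported by any object.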
And I'll go over some of these. You know, there were a lot of questions around whether conservation voltage reduction actually was effective. For those of you who don't know, conservation voltage reduction is the idea that if I have an impedance load and I reduce the voltage, then presumably I reduce the power. And it can be very effective. The question DOE was asking is, does that always work, right? Does reducing the voltage necessarily result in less power, given the fact that buildings have thermostatically controlled loads? If I reduce the power, I may actually increase the runtime, and I gain nothing. So these were the kinds of questions that we were being asked and didn't really have the tools necessary to answer. They were also asking questions about whether retail real-time pricing was feasible using buildings. And there was a lot of work around distributed energy resources that was beginning to emerge at the time; obviously, now it's a central focus of what we do. Transactive control, for those of you who know about transactive energy, is the idea that we use prices as a signal for controls. If I tell you that your electricity is more expensive for the next 15 minutes, you might change your behavior in the next 15 minutes. And then a lot of work related to what came to be known as fault-induced delayed voltage recovery: the idea that if there's a fault on the system and we have loads that have inertia, motor loads with varying inertia, they may actually cause a voltage sag that would cause a cascade of motor stalls. And these motor stalls would drag the whole system down. And then of course, microgrids. So I'll go through some of these. I don't have time to cover them all, but I'll go through some of the results that came out of the early work on this tool. So for conservation voltage reduction, the key idea is that not all feeders are created equal.
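The voltage-to-power relationship behind CVR can be made concrete with the standard ZIP load model, a common way (though not necessarily the exact model used in these studies) to represent composite loads: the constant-impedance fraction scales with voltage squared, the constant-current fraction scales linearly, and the constant-power fraction doesn't respond at all.

```python
def zip_power(v_pu, z_frac, i_frac, p_frac, p_nominal=1.0):
    """Power drawn by a ZIP composite load at per-unit voltage v_pu.
    z_frac/i_frac/p_frac: constant-impedance, constant-current, and
    constant-power fractions of the load (they should sum to 1)."""
    return p_nominal * (z_frac * v_pu ** 2 + i_frac * v_pu + p_frac)
```

A pure impedance load at 0.95 pu voltage draws about 90% power, while a 40/30/30 ZIP mix draws about 94.6%, so the benefit depends heavily on the load composition. And as noted in the talk, even that savings can be eroded if thermostatically controlled loads simply run longer to deliver the same heat or cooling.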
And so what this study did is look at a set of taxonomy feeders, 25 feeders that were drawn from across the United States. And we knew, based on that taxonomy, how many feeders each particular design represented across the US. We had a weighting model that allowed us, based on the simulation of those 25 feeders, to project across the US on how many feeders it would be productive to implement conservation voltage reduction. And so the idea is that on some feeders it's very effective and on others not so much, and so we would focus CVR strategies on the ones where it was very effective. This gave you a cumulative distribution of the impact of implementing CVR across the US. And what we found is that if we did it on about 35% of feeders, we might get 80% of the impacts. And so obviously that was an important insight for utilities, and utilities typically now use this kind of modeling to determine whether it's effective to implement CVR. We've got a good load model, we've got a good network model, we put the two together, and we can determine whether voltage reduction is actually going to result in energy savings. Another area of work was this idea of using retail real-time pricing. So if we use price discovery mechanisms on the retail side, we ask people what their willingness to pay is. Rather, we ask the devices that they use what their willingness to pay is. And through automation, we can then discover the price at which supply is equal to demand, given a constraint on the distribution feeder. Does that result in performance improvements? In particular, does it mean that we manage our constraint better? And so this was a test using a simulation of a double auction. And what you see is that the impact on the thermostat settings was not significant. You don't see a major difference in the comfort. But through this price discovery mechanism, you can see the price spiking up a little bit during the peak time of the day.
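The double auction mechanism mentioned here can be sketched with a simple clearing routine. This is an illustrative one-shot clearing with a midpoint pricing rule, an assumption for the example; it is not the market module GridLAB-D actually implements.

```python
def clear_double_auction(buys, sells):
    """Clear a one-shot double auction.
    buys, sells: lists of (price, quantity) bids from buyers and sellers.
    Returns (clearing_price, cleared_quantity); price is the midpoint of
    the marginal buy and sell bids, or None if no trade clears."""
    if not buys or not sells:
        return None, 0.0
    buys = sorted(buys, key=lambda b: b[0], reverse=True)   # highest bid first
    sells = sorted(sells, key=lambda s: s[0])               # cheapest offer first
    bi = si = 0
    b_price, b_qty = buys[0]
    s_price, s_qty = sells[0]
    price, cleared = None, 0.0
    while b_price >= s_price:
        q = min(b_qty, s_qty)              # quantity the marginal pair trades
        cleared += q
        price = (b_price + s_price) / 2.0  # midpoint pricing rule
        b_qty -= q
        s_qty -= q
        if b_qty == 0:
            bi += 1
            if bi == len(buys):
                break
            b_price, b_qty = buys[bi]
        if s_qty == 0:
            si += 1
            if si == len(sells):
                break
            s_price, s_qty = sells[si]
    return price, cleared
```

For example, with one buyer bidding 50 $/MWh for 5 units and another 30 for 5, against offers of 20 and 40 for 5 each, only the first pair clears: 5 units at a 35 $/MWh midpoint. In a feeder-constrained auction, the supply curve would also encode the feeder limit, which is how the constraint gets managed through price.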
We can see the system does a better job managing a constraint here of about 5.2 megawatts on this particular distribution feeder. So it's this kind of tool that lets us do this. The most recent study that came out of using GridLAB-D came from PNNL; it was this study of ERCOT, the Texas interconnection. It was called DSO+T, basically looking at whether, if we were using a transactive energy strategy on the distribution systems across Texas, there would be cost benefits to the consumers in Texas. They started this before things went haywire, and I actually don't know what their conclusions were based on what they saw during the winter storm last year, in early 2021. But the idea was to look at a simulation of the entire system. So basically they simulated the entire Texas system using GridLAB-D and a number of other tools in a co-simulation environment, the idea being to see whether employing a sort of retail-level price discovery mechanism would result in savings. This is just the baseline cost flow through the system, identifying where consumers' expenditures end up in the supply chain and distribution at ERCOT. And this is pretty close to what the national results are for a moderate renewables system. And the result was that they found there's a pretty significant advantage to consumers if they use battery storage and dispatch it using a transactive mechanism. And probably the largest benefits are the capacity payments, which seems sort of obvious. I think the main purpose of this was not just to validate what was obvious, but to try to get a sense of the magnitude of the savings and the benefits that we would see in a system like ERCOT.
So as a result of this work, the California Energy Commission was keen (and actually I have to credit one person in particular, Jamie Patterson, who I think was the strongest advocate for this at the CEC) to see whether we could upgrade the tool to meet some of the challenges that they anticipated coming in California, particularly in response to climate change policies and, more recently, extreme weather events and the problems and challenges of managing distribution systems under these conditions. And so it was this early thinking at the CEC that led to the program that I'm going to talk about, and this is the work that we've been doing over the past five years. So the CEC was motivated to basically support the DRP process in California, which is guided by these basic principles: ensuring customer choice and engagement; recognizing the role of all the distribution assets that are on the system in California's carbon management strategy; of course, addressing safety, reliability, and resilience of the system, which is an ongoing and growing challenge, not just in California but pretty much everywhere now; ensuring that we have an affordable system and that we equitably allocate the costs of managing the system and meeting all of these goals; and then of course, employing a competitive process when we select DERs, so that we don't unnecessarily incur costs that inevitably get passed on to the consumers. So it was these guiding principles that led to the work that we proposed. When we proposed the work, we recognized that there were a whole bunch of challenges that we had to address in the tool; essentially, the tool had to simulate all the conditions and circumstances that would be a concern to utilities. So early on, of course, hosting capacity and locational net benefit analysis were considered extremely important.
In fact, those were the only two use cases that were identified at the outset of the project in 2016. Locational net benefits analysis kind of fell by the wayside. What happened was that the work that was done generating the LNBA maps was regarded as pretty much comprehensive, and there was no need to revisit it anytime soon. And so everybody basically gave up on working on that early on in the project. But hosting capacity, or what was called integration capacity analysis, stayed, and a couple of other use cases were added, which I'll get to in a second. And of course, the issues that utilities have with DRP in general remained. So the importance of being able to demonstrate technologies in the field and at scale has always been something that utilities have had to deal with. But there are also emerging problems. We've been having issues now with data access. Who owns the data? That seems to be a question that doesn't ever get resolved. You know, in principle consumers own their data. But I don't know if you've ever tried to get your data; it's not as easy as you might imagine. And then there's using that data productively in analysis: the utilities have the data but lack the tools, and many of the people who have the tools lack access to the data. So this has been a real issue, and it's an ongoing issue. And then tariff design. A lot of these new technologies, whether we're talking about transactive or, you know, emerging technologies like batteries, challenge the tariffs that we are accustomed to seeing utilities use. And so they need tools to help them analyze the tariffs that they're using in the context of these new technologies. And so a lot of the design work that we've been doing on the tool has been focused on answering that question as well. And then of course, life safety, especially with the PSPS work being done, with utilities developing methodologies to address public safety power shutoffs.
And, you know, the methodologies for that, I think, still need work. There's really no standard way of addressing that question, and it's something that we've been working on. So there's been this evolution in the tools. In fact, I would argue it's less of an evolution and more just emerging new tools that are ultimately going to displace the existing tools. Historically, utilities focused on load growth and the reliability of the assets, and on a system that was pretty static; it didn't change very quickly. You think about the difference between the system in the late 60s and early 70s and the system before the 2000s: that was pretty much the same system. It didn't really change materially. But then a lot of changes started to happen. And so, you know, we have to really rethink the tools in some ways. There are certain tools that are just going to disappear and other tools that are going to have to be developed to meet these new needs. And one of the biggest challenges we've run into is that historical data just doesn't seem to work as well. And so we rely far more on models of what we think is going to happen rather than data from the way the system worked in the past. And if we do rely on data, it's not going to be a lot of data, and it's going to be data about a system that's fast changing. So we have to augment the data with models of physics and deal with uncertainty in a much more comprehensive way than we have in the past. That's been a big challenge. So here are the resources that we've been using or dealing with in our modeling to try to address these issues. Demand response obviously remains a very important component and has been, basically, for the last 20 years. And of course, that involves tariff design questions. You know, when you introduce a time-of-use tariff, how do you know that it's going to be effective? You rely on simulations in part to examine that.
And so we've introduced methodologies to include tariff calculations and tariff modeling in the tool. Weatherization and efficiency programs, all of these have been around in the DRP process for a long time. But solar and energy storage, and of course now decarbonization and electrification of our energy infrastructure, are all emerging as important dimensions of the problem, and the tools have to address this. So just briefly, and you're probably familiar with all of these: demand response is the idea that you can change the shape of the load, you can shift the load, you can reduce the load by shedding some of it, or you can wiggle the load back and forth ("shimmy") in a way that just happens to meet some particular need. They named them that way just because they like, what's the word for things that sound alike, alliteration. And so for demand response, obviously thermostat setbacks and time-of-use rates have a very important role. But there are also, you know, a lot of programs that use water heaters to reduce load. That tends to be effective, but obviously only where there are electric water heaters. If you have gas water heaters, that's obviously not a program that's of any relevance. But as we transition from resistance water heaters to heat pump water heaters, the magnitude of that program's contribution to demand response might prove pretty significant, and its behavior might change. And then of course, you know, the energy efficiency of the buildings and the loads in the buildings can all have a significant impact on the effectiveness of DR programs. And so we have the ability to model these devices, the appliances and the equipment in the buildings, and evaluate whether DR programs are affected by this in any way. Solar resource impacts: obviously, with more solar showing up in the system, this has become a very important element of what we look at when we run simulations. You know, it introduces a lot of uncertainty.
That uncertainty turns out, for obvious reasons, to be highly correlated with thermal load. You know, your highest cooling load is also when you have the highest solar capability, but it's also when you have the highest solar uncertainty. Solar is a lot less uncertain when it's not there, at night. So that correlation is something that's difficult to capture with a simple power system modeling tool; when you have a joint thermal building load and power system model, you can do a much better job of seeing how those two interact with each other. So, you know, the power system side of the simulation does a really good job of capturing the effect of the variability of the load and the solar on the voltage in the system, and whether that increases the amount of operation of regulators, load tap changers, and the devices that have to manage the voltage on the grid. And so those are all results that we have to capture, and utilities are very keen to know the degree to which adding solar affects their assets and whether they need to upgrade the capabilities of their systems. And of course, from a price perspective, solar has some pretty significant impacts. Energy storage is also an important component of the resource mix that we are talking about here, and, you know, that's also modeled in these systems. In some jurisdictions, energy storage requires solar. So if you're going to put a battery in your house, you must also put in a solar unit. That's not true everywhere, but I believe it's true everywhere in California. I'm not positive about that, but I think it is. And so energy storage has significant impacts, and we capture those effects in simulations.
And then decarbonization: the transition of end uses which have traditionally, at least in part, been met with natural gas (heating, cooking, clothes drying, water heating, and things like that), as well as the electrification of transportation, the addition of electric vehicle chargers on the distribution system. Those all have a pretty significant impact on the ability of the system to meet the demand. And in some cases, you know, we see the impact of the load doubling or more, depending on how much was natural gas versus how much was already electric. The interesting part about this is that it's not just a problem for the power system, the grid itself. It's also a problem for homes. If you have a lot of gas loads in your home, your electric panel very likely doesn't have the capacity to support changing all of that over to electricity without an upgrade. And those upgrades can be fairly expensive. And so the idea that we're going to decarbonize, let's say, our entire residential system and shift all the residences from gas to electric is not as easy as it might seem, because so many homes might require significant electric upgrades, and there just isn't enough, you know, labor capacity out there to do that in such a short amount of time. So it's a real challenge that I think has to be understood as utilities think carefully about how they support the transition away from natural gas in both commercial and residential buildings. And then, GridLAB-D has a lot of the capabilities already in place in the DOE version to do the quasi-steady-state time series simulations that are required to do these kinds of studies. So a lot of the tools were already in place, but there were certain things that were missing. So for example, when we were looking at things related to tariffs, there was no tariff model.
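The panel-capacity point can be illustrated with a back-of-the-envelope screen. All the numbers below are hypothetical, and this deliberately ignores the demand-factor calculations an electrician would actually apply under the NEC; it's only a rough sketch of why electrification can force a service upgrade.

```python
def panel_headroom_amps(panel_amps, existing_load_amps, new_loads_amps):
    """Rough screen of whether a panel can absorb new electrified loads.
    All values are in amps at the panel's service voltage. A negative
    result suggests a service upgrade would be needed. This skips NEC
    demand-factor rules, so it is pessimistic by design."""
    needed = existing_load_amps + sum(new_loads_amps)
    return panel_amps - needed

# Hypothetical example: a 100 A panel with 60 A of existing electric load
# picking up a heat pump (20 A), a heat pump water heater (19 A),
# an EV charger (32 A), and an electric range (21 A).
headroom = panel_headroom_amps(100, 60, [20, 19, 32, 21])
```

In this made-up case the simple sum comes out 52 A over the panel rating, which is the kind of result that drives the upgrade costs and labor-capacity concerns mentioned above.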
In fact, there isn't really a tariff model at all anywhere. So what we had to do is go to an NREL database where tariffs are represented, the Utility Rate Database, and try to construct a model that would allow us to study the effect of tariffs on revenue for the utility and of course costs for consumers, and how these new technologies would have to be incorporated into tariffs, and how you might structure tariffs in a way to deal with some of these new and emerging DERs. The other thing that's missing in GridLAB-D, or at least wasn't in the DOE version: it only supported what are called the IEEE 1366 metrics. These are reliability metrics; you've probably heard of terms like SAIDI and SAIFI. These are metrics that measure the impact of outages on the loads, but they don't measure them in ways that are particularly useful if you're talking about things like resilience or the effect of solar and battery technology on a consumer. If a consumer has a battery and a solar unit and they can run autonomously for some number of days, then the impact of disconnecting them from the system is very different than it would be for a consumer that doesn't have those technologies. And the metrics that exist right now essentially ignore that difference. And so part of the work that we've been doing is trying to understand what kind of metrics we need to really understand the difference in impact between a consumer that has these kinds of distributed resources versus a customer that does not. And so we've been working on some of those questions. So this is what the CEC has been funding the enhancements for us to do: the hosting capacity analysis was in there originally, electrification impacts analysis, which they're very keen to see us support, tariff design, and then the resilience impacts analysis. And I'll show you some examples of all of these here in a couple of minutes.
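For reference, the two IEEE 1366 metrics mentioned are straightforward to compute from outage records, which also makes clear what they leave out: nothing in them knows whether a disconnected customer could ride through on solar and storage. A minimal sketch:

```python
def ieee1366_metrics(outages, customers_served):
    """SAIFI and SAIDI per IEEE Std 1366.
    outages: list of (customers_interrupted, duration_minutes) events.
    SAIFI = total customer interruptions / customers served.
    SAIDI = total customer-minutes of interruption / customers served."""
    ci = sum(c for c, _ in outages)
    cmi = sum(c * minutes for c, minutes in outages)
    return ci / customers_served, cmi / customers_served
```

For example, two events (100 customers out for 60 minutes, 50 customers out for 120 minutes) on a 1,000-customer feeder give SAIFI = 0.15 interruptions and SAIDI = 12 minutes per customer served. A resilience-aware metric of the kind discussed in the talk would have to discount, per customer, the outage time covered by local DER autonomy, which these system averages cannot express.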
And just so you have context, this is not the only project that the CEC funded. There were actually several other projects. GLOW is actually still underway and being led by Hitachi; it's basically a user interface for GridLAB-D. The work that we've been doing is basically on the engine and the simulation tool itself. Hitachi has been working on the user interface that makes the tool much more user friendly than it currently is. I would call it extremely user unfriendly in its original form. And then OpenFIDO is a tool, a framework essentially, for interoperability and data exchange between various tools, particularly between GridLAB-D and the simulators and tools that the utilities in California, the California IOUs, use, specifically CYME and Synergi and these other tools that take care of a lot of the data and traditional simulation activities that utilities do with their tool suites. So you can think of GridLAB-D as sort of complementary to these tools. It provides a range of new capabilities; it's not really a substitute for the existing capabilities that you find in CYME or Synergi or other tools like that. So OpenFIDO provides a data exchange framework. And then another project was PowerNet with Markets, which some of you may be familiar with. PowerNet was a project led out of Stanford for ARPA-E by Ram Rajagopal. And then there was an extension to that project, which looked at using transactive technology in that framework, and that was funded by the CEC. It used GridLAB-D as well and contributed some pretty significant enhancements to GridLAB-D as part of that work. And those enhancements fed into the tariff design work, because transactive can be thought of as a tariff. So let's just dive into some examples of the results that we're getting so far. This work is not complete, so you're seeing work in progress.
And so I'll present to you sort of different dimensions of each of these. I don't have time to cover everything, but it'll give you a sense of how we're studying the tool and its behavior and its application. So, hosting capacity analysis. Think of it as an extension to ICA, meant to address storage and electrified transportation as well. So the question is, how much of a particular resource can I add to a distribution system before I hit some kind of limit somewhere? And there are a lot of places in the distribution system where you might encounter a limit. You might encounter it at the feeder itself and overload the cable that comes out of the feeder, because it happens to be in a duct bank and there's a thermal limit on the duct bank, and you can't move more than, you know, a thousand amps through the wires in that duct bank. Or you might find that there's a limit on a load tap changer or a voltage regulator; it just can't tap up or down any further than a certain point, and you can't add more resources because the voltage deviates too much. Another kind of limit that you might encounter is that as the resource turns on and off, the voltage fluctuates too much. And so these kinds of limitations, the thermal limits, the extreme voltage limits, and the voltage fluctuations, are all issues that the ICA methodology was designed to identify. And we extended that to cover not just solar, but also batteries and electric vehicles. And so those capabilities were added to the simulator. So as the solver runs, as part of the solution process it also explores the limits of the system while it's solving the power flow solution. And so we ran tests of this methodology on feeders that were provided to us by a utility, a partner in the project. They provided us with about 2,000 feeders for their entire system. And so we ran these tests on a number of feeders just to see how well the simulator performs.
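The core hosting capacity question, how much resource can be added before some limit binds, amounts to a search over injection levels against a violation check. The sketch below is a simplified stand-in: in the real methodology the violation test is the power flow solution with all the thermal, voltage, and fluctuation limits described above, not the toy threshold used here.

```python
def hosting_capacity(violates, step_kw=10.0, max_kw=10000.0):
    """Largest DG injection (kW) at a study node with no violation.
    violates(kw) should return True when injecting kw causes a thermal,
    voltage, or fluctuation limit to be exceeded. In a real study this
    would wrap a power flow solution, not a toy test."""
    kw = 0.0
    while kw + step_kw <= max_kw and not violates(kw + step_kw):
        kw += step_kw
    return kw

# Toy stand-in for the power flow check: suppose violations appear
# above an assumed 5 MW thermal limit at the feeder head.
limit_check = lambda kw: kw > 5000.0
```

With this stand-in, `hosting_capacity(limit_check)` walks up in 10 kW steps and stops at 5,000 kW. Folding the limit exploration into the solver itself, as described above, avoids rerunning a full power flow per step like this naive loop would.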
And so there are a number of things that we're obviously concerned with. One is, you know, is the model sufficiently complex that our test is meaningful? And as you can see, a typical model had somewhere in the low thousands of loads on it, although one of them had over 7,000 loads. And those loads were allocated to nodes, and the rough allocation is 1.35 nodes per load. If you think about it, that kind of makes sense, because you have a tree of nodes, so you would expect to see more nodes than there are loads on the system. But that gives you a sense of the complexity of the models we're dealing with. And then if you look at the runtime performance, it's roughly quadratic as a function of the number of branches on the system, and, you know, that's not unexpected. So for example, if we have a system with 1,000 branches, it would take about a minute to do the ICA analysis. If it had 2,000 branches, it would take about four minutes, roughly. So it's not bad. More importantly, if you compare running an ICA kind of analysis to running a simple solution without doing the ICA, you would expect it to take longer. How much longer it takes is highly dependent on what's going on in the system, so it's not a very simple relationship. You can't just say, oh well, if we've got a system with 1,000 nodes and I run ICA, it's going to take twice as long or four times as long. But what we do see is that there's a fairly consistent distribution in the performance. More than 30%, about a third, of the models take less than twice as long when ICA is included in the solver, and then it sort of goes down from there in a kind of exponential decay. And none of them took more than 250 times longer. Although I've got to tell you, 250 times longer is a lot. That must have been one nasty model. I don't know which one it is, by the way; I haven't looked.
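The quadratic scaling quoted here (about a minute at 1,000 branches, about four minutes at 2,000) can be turned into a quick planning estimate. The reference point is taken from the talk; treating the exponent as exactly 2 is of course an approximation.

```python
def ica_runtime_minutes(branches, t_ref_min=1.0, n_ref=1000):
    """Rough ICA runtime estimate assuming t scales as branches squared,
    anchored at t_ref_min minutes for an n_ref-branch feeder."""
    return t_ref_min * (branches / n_ref) ** 2
```

So a 4,000-branch feeder would be expected to take on the order of 16 minutes under this scaling, which is the kind of estimate you'd use to budget a 2,000-feeder batch run.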
But what we do find overall (and we also ran these performance tests on the load forecasting exercise) is that compared to the DOE version, the CEC version that we created is way, way faster. We just found all of the little tricks and things that we needed to make the simulation run a lot quicker and also use a lot less storage. Again, you know, the DOE version is a research tool, and speed, and certainly cost and storage, were not a factor. But for a production tool, those are extremely important considerations. And you can see that we reduced the runtime dramatically for a 2,000-feeder annual load forecast, and of course reduced the cost also very dramatically, and reduced the storage requirements pretty significantly, although not nearly as dramatically. Now I've got to say, going from $100,000 to as little as $25 on AWS, that was something. I didn't expect that; I thought it was going to cost like $2,500. But you know, choosing just the right configuration on AWS and setting up the server in just the right way took a little trial and error. But once we got it, it was really effective. So hats off to the team that did that. Alona, if you're on here, nice job. Okay. So moving on, end-use electrification. This is a pretty straightforward idea: as we switch gas over to electric, what's the impact on the system? What's the impact on consumers? And so this just gives you a quick idea. Here's a test that we did just for a single home. Obviously, if you were to extrapolate this to a large number of homes, we would expect to see slightly different, or perhaps significantly different, results. But this gives us an idea of how the tool is performing, because we can inspect that one home and look very precisely at what's going on to see whether it's working properly. So what we looked for is whether there's a significant shift in the peak time. And what we can see is that it depends where you are.
And it wasn't a major shift, although we might see a different response depending on the type of home we're looking at. This is a pretty standard California house with moderately good thermal insulation. And so we extrapolated: if we were to switch 60% of the homes, how much increase in the peak load would we see? We can see it's not exactly linear, but it's pretty close to it. And how much increase in energy would you see, in electric energy, not total energy, to be clear about that. There was a question: is the peak time in L.A. in the afternoon because it's a commercial area? Nope. That's a good question. The peak time is one o'clock. I'm assuming the house had air conditioning, or maybe that's one o'clock in L.A.; I don't know. The question about the peak time is whether, because you're farther south, the solar effects are different. Good question. I don't know the answer to that. But what you can see is that it's very sensitive to location and pre-existing conditions. So, for example, China Lake, I believe, had almost no gas, and so switching to electric had almost no effect. The other thing we found that was kind of interesting is that when homes that did not have air conditioning and had gas heating switched to heat pumps, that introduced air conditioning where there wasn't previously air conditioning. We would see an excess summer load that wasn't present before, because the switchover of the heating system added a cooling resource that was now present on the system. And so it's an interesting question. I think those kinds of unexpected results are not unusual when you use GridLAB-D, because as an agent-based system, the devices just do what they do. You don't come to it with a presupposition about what they're going to do and tell it; you use the models, and the models then exhibit whatever behaviors they happen to embody.
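As a back-of-the-envelope illustration of the extrapolation just described, treating the nearly linear relationship as exactly linear. All numbers here are placeholders for illustration, not results from the tool:

```python
def peak_increase_kw(per_home_peak_kw, n_homes, adoption_fraction):
    """Linear extrapolation of the added electric peak load when a
    fraction of homes electrifies. The talk notes the real relationship
    is close to, but not exactly, linear."""
    return per_home_peak_kw * n_homes * adoption_fraction

# e.g. 2 kW of added peak per home, 10,000 homes, 60% switched over
print(peak_increase_kw(2.0, 10_000, 0.6))
```

The real curve bends away from this line, which is exactly the kind of effect a per-home simulation can capture and a spreadsheet extrapolation cannot.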
And so that was one interesting outcome. For tariff design, it's a pretty straightforward idea: as you electrify, as you introduce demand response and other measures, what's the impact on the utility's revenue? What's the impact on customers' bills? Who's paying for it, and how much are they paying? And of course, we can introduce new tariffs. We have this nice tariff database from NREL that they maintain, which we draw most of our tariff data from, but you can add your own as well. And then the idea is to output all those data sheets that the utilities can use as part of their tariff development and design process. So here's an example. Oops, too far. Here's an example. Just to give you a sense of the sensitivity of the model, this is the electric bill charges as a function of floor area. A higher floor area obviously gives a larger bill: this is 2,000 square feet, this is 1,500 square feet; this one looks like it's about $90, this one about $125 per month. The second one is thermal integrity: a house with pretty good thermal integrity costs less than one that's not so good. And the bottom one is the heating set point: obviously, if you reduce the heating set point, you reduce the cost. Nothing surprising there. The main point of this is to compare the tariffs to each other for different regions within a utility's territory. So here's a comparison of three PG&E tariffs in Region R, and don't ask me where that is; I don't remember, somewhere in PG&E territory. But you can see that as you try different tariffs with the same house, you get a different cost for the annual electricity bill. And of course, if you were to change how the house behaves, change set points, change how it responds to the TOU, maybe align the set point schedule to the TOU, you might see even larger differences. Finally, the resilience use case.
So this is work that we've been doing for the Department of Energy, and it turned into a use case. Southern California Edison has been supporting us in this work and helping us test out these resilience analysis tools. Here, the idea originally was focused on extreme weather. If we have high winds, how does that affect the power system itself? If a pole is down, or a pole fails, or we have vegetation contacting power lines, how does that affect the system? The current version focuses on pole failure due to high winds. We're currently working on vegetation contact with the lines and vegetation fall-in, and hopefully we'll fairly soon get to interactions with wildfire and so on. And here's just a simple example. We tested it out on the 230-kilovolt line that supplies SLAC, because we have all the detailed information about the line itself. This shows everywhere there's a nonzero probability of contact with vegetation, given the proximity of vegetation to the line. And we set it to be hypersensitive so that we could tell whether it was working; I think in reality, the right-of-way is maintained so well that the probability is zero everywhere on that line. But for our testing purposes, we did this, and it gives you an idea of how the results are generated by the tool and how they can be viewed. So in this case, looking at a single line, we have information about the elevation, the height of the trees, the cover (that's the percentage of the area in that region that's covered by vegetation), the height of the vegetation, and the line sag. And if these two numbers get too close to each other, that would be considered probable contact. You can see these are pretty far apart; we made the clearance criterion five meters so that it would flag something. So overall, that's what we've been working on. In the coming months and years, I hope to focus more on some of these issues.
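A minimal sketch of the clearance check described above, assuming the criterion is simply the vertical gap between the sagged conductor and the top of the vegetation. The five-meter threshold is the one mentioned for testing; the function name and inputs are hypothetical, not the tool's actual interface:

```python
def probable_contact(attach_height_m, sag_m, vegetation_height_m, clearance_m=5.0):
    """Flag probable vegetation contact when the gap between the sagged
    line and the vegetation top falls below the clearance criterion."""
    gap = (attach_height_m - sag_m) - vegetation_height_m
    return gap < clearance_m

print(probable_contact(30.0, 2.0, 20.0))  # 8 m gap, above the criterion
print(probable_contact(30.0, 4.0, 22.0))  # 4 m gap, below the criterion
```

Tightening `clearance_m` is the "hypersensitive" setting mentioned in the talk: it makes the check fire even where real right-of-way maintenance would keep the probability at zero.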
So first, we've already submitted a number of proposals in this area of climate change impacts. If you think about it, as the climate in California changes and we have more heat waves, more effects from wildfires, more interaction between the power system and vegetation, we should expect the way the system behaves to change. And the question is, how do we capture the impact of those changes on the utilities, on consumers, on the environment? So we've got a number of proposals that we'll bring together to try to enhance the tool, to make it possible to do those studies and really get at some of these questions about the interaction between the environment, the system, and the climate. Another one that has recently emerged, something that we're interested in, is the energy-water nexus. You can think of this in a couple of different ways. The traditional energy-water nexus in the utility world is: do I have enough cooling water for my thermal plants? That might not be what's important going forward; obviously, if we don't have a lot of coal and natural gas plants, thermal is not the issue. What is the issue is that utilities provide the electricity, the energy, needed for water delivery and waste treatment. And the question is, can we do things to make waste treatment and water delivery more efficient? And can we make them less sensitive to the availability of electricity going forward? If we have to exercise PSPS more often, we obviously have to consider other infrastructures and how they're affected. So this is an area that I think is going to emerge as important, and the tool currently really doesn't support that kind of analysis; I think it will need to. Another area that we've been working on a fair amount is real-time simulation and developing tools to help with training and testing technologies.
Utility operations are becoming much more complex. It's just not the same kind of system; there's a lot more going on, and it's much more dynamic. So control room operations at the distribution level are starting to look a lot more like control room operations in transmission. There are many dimensions that need to be considered, particularly under emergency conditions; in bad weather, it can get very chaotic. And when you think about human operators whose fast decision-making skills become essential to the functioning of the system, that starts to bring up issues about training and skill and fatigue and experience that haven't typically come up for distribution system operations. It's something that we've been looking at. The tool, as it exists now, does support real-time simulation, but it doesn't really have a good framework for supporting modeling of operators, and that's something we need to spend some time thinking about. I think it's a really interesting research area that's emerging for us. Finally, there's all this transactive energy work. We've got a major project in New England for which we have to upgrade the simulation. The simulation, as it exists today, only models retail energy prices; it does not model any other kind of retail price. And the project is actually intended to study pricing storage separately from energy, and that's something that we need to work out. So that's an area of current work, some of you are working on it, and it will probably continue to get a lot of focus, particularly with questions emerging about whether transactive energy meets the administration's, and I think generally people's, expectations for what would be considered the energy equity and environmental justice goals that we have.
Does pricing energy with a sort of real-time price mechanism really get us what we're looking for in terms of transitioning the system off of carbon-intensive resources towards electricity? If electricity turns out to be a lot more expensive and requires technologies that are very expensive to implement, well, then we may be leaving a significant fraction of the consumer community behind, because they simply can't afford it. That's a major concern, and it's something that needs to be addressed. So a lot of the work that we're doing is trying to focus on that question: how can we ensure that our systems are more fair and equitable to consumers? So that's about it. I'm trying to go quickly because I want to leave time for questions. If you want to find out more about it, the resources are on the internet. We are working with Linux Foundation Energy to deploy and support it as an open-source tool. And of course, if you're interested in working with us on this as part of your research, or as part of a class project that you might be doing, that's my email; you can contact me anytime. There's a bunch of us who do a bunch of different things, and if I can't help you, I'm sure I can find somebody who can. And just credit where it's due: this is the result of a huge number of people working on it. I hope I've got everybody; I was trying to update the list this morning and realized that I was probably still missing a lot of people. But this has been an enormous amount of work over the last 20 years, and I just want to recognize all the people who worked on it, recently especially. So with that, I'm ready for your questions. There's a question: what are the top three considerations for tariff design? Well, that is not my area of expertise, but I will take my best shot at it. As a general rule, what utilities try to do when they design a tariff is ensure that the tariff does not shift costs inappropriately across customers.
There's always going to be some sort of what they call cross-subsidy, where certain customers are essentially paying for other customers' usage. But the difficulty you have in tariff design is that utilities' costs do not map very well onto how consumers pay for electricity. If you look at your bill, your bill essentially says: how many kilowatt-hours did you use, and we're going to charge you this much for each kilowatt-hour, and that's essentially going to be your bill. There's also a bunch of fees and a little fixed-cost component. But if you look at how a utility actually pays for what they're selling you, it doesn't look anything like that. It's not really a function of how much energy you use. It's a function of how many people they have, how old their equipment is, where they source their energy, how much that source costs. There are all these other things that have to somehow be mapped into this nice, clean little variable-cost function that we present to consumers. And that mapping is not perfect; it's actually profoundly imperfect. So tariff design is really the art of making that mapping so that it's not particularly unfair to consumers who can't afford energy to begin with. We don't want to hit low-income consumers particularly hard, but we also don't want to shift the costs too much onto people who aren't actually responsible for them. Another important dimension of tariff design is that you don't want to present people with purely fixed costs regardless of how much energy they use or what their peak is. Because if you do, then there's no incentive for them to behave in a way that would be conducive to good energy delivery, let's just put it that way. Everybody would just ignore how much they use, leave the lights on, set thermostats to whatever is convenient rather than what's appropriate. So you do need to expose consumers to some costs, and they need to see the effect of their behavior. The trick is striking that balance.
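The consumer-facing side of that mapping is simple enough to write down. A stripped-down bill with a volumetric charge plus a small fixed component might look like this; the rate and fee are illustrative, not any utility's actual tariff:

```python
def monthly_bill(kwh_used, rate_per_kwh=0.25, fixed_charge=10.0):
    """Simplified residential bill: a volumetric energy charge plus a
    small fixed component, hiding all of the utility's underlying cost
    structure (staffing, equipment age, sourcing, and so on)."""
    return kwh_used * rate_per_kwh + fixed_charge

print(monthly_bill(500))  # 500 kWh at $0.25/kWh plus a $10 fixed charge
```

Everything the talk lists on the utility's cost side has to be folded into those two constants, which is exactly why the mapping is "profoundly imperfect."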
Make it so that they're exposed enough that they respond, but not so much that it becomes deeply unfair and difficult for them to survive. So to me, that's how I see the challenge of tariff design. The way it works is, utilities have to design a tariff and then submit that tariff for approval. Well, I should say this is for investor-owned utilities; this is not true for public utilities. But they have to design a tariff that's submitted for review and approval by the commission, the public utilities commission. That process of designing a tariff is tricky. You obviously can't just take all the customers that you have and run a simulation of that, so they pick, say, a hundred customers that are representative of their customer base, and they use that as the basis for doing the analysis. And yeah, it's an art. It's really not as much of a science as you might expect, or hope. Hopefully that answers your question. Yeah, sorry, I've got a follow-up question to that as well, on your point around, I guess, energy poverty. The way it normally works is, where you have solar panels, or people are incentivized to buy solar panels, these are typically your higher-paying customers, who are subsidized by the poorer customers, and you end up with a situation where fewer and fewer of them are paying for the grid. So given that you're in the mix of figuring out this tariff situation, what are the current responses as to how you address that tension? I'm assuming everyone heard the question. Okay. So the fundamental problem is that the way we've incentivized solar in the past was essentially net metering: the idea that if I produce excess energy at my home, I run my meter backwards, and I get paid the same amount that I would have paid if it had run forward, basically. And I don't have to worry about any of the effects on the system of doing that; that's the utility's problem.
That's the oversimplified way of thinking about it, but fundamentally that's what's going on. Obviously, the problem with that is that everybody who can afford to buy solar does so and installs solar, particularly if the business model for installing solar is that you share the benefits of that meter running backwards with the installer; then there's an enormous incentive for overdesign of the solar production, which the utility has to resist. It has no choice, because if the utility has to buy all of its energy at the same price at which it sells it, the utility goes out of business, so it's not sustainable. So you're right: as we transition, and more and more consumers are producing electricity, generating electricity at the same price at which they buy electricity, that means that the remaining costs of the utility are shifted onto those who don't do that, and you have this equity problem that emerges. Okay, so what's the hypothesis for how we get out of this? Well, one way is to simply stop doing it, and there are policy measures that I think are being considered to do that; in some places it's already underway. I actually think there's perhaps another approach to this, and at this point it's a hypothesis, but if we were to use a transactive approach, we could get these high-end customers to switch from a net metering tariff to a transactive tariff, get them off of the net metering, so that they are willing to sell power for what it is truly worth, what people are willing to pay, and not at the retail rate for the reverse flow. It's our hypothesis; it's subject to test. I'm not super confident that it will work, but I don't have a better idea at this point for a way of designing a tariff that would get us off of net metering in the long run. I think the problem is that people who have net metering right now feel they're entitled to it, and I don't see why they shouldn't feel that way.
I don't think it's good policy in the long run, but it was definitely good policy in the short term. Did it incentivize a lot of solar? Look at the evidence; I think it did. Should we keep doing it now? I mean, the follow-up to that is, you mentioned that there are some tariff simulations that you're doing. I'm just curious what kind of simulations those are. Is it more sensitivity analysis, to figure out, say, if you change the time-of-use windows, how the load shifts? We're not too sure how people are going to use it. I suspect that's how they are going to use it, but that's not how we've been testing. We do sensitivity analysis, but not for that kind of purpose. At this point, we're testing as we run it: we look at the numbers and see if they make sense for the buildings that we put in the model. So we're still in the testing phase of the capability and not looking at how people might use it. But I think you're right; I think that's how people will want to use it: to try to understand how well the tariff responds to a change in the technological mix that you see in the distributed energy resources. As I add batteries, what happens to the revenue flow? What happens to the customer cost? Is there another one of these cost shifts occurring because of arbitrage opportunities? I think those are the kinds of questions people will want to ask. There was an earlier question asking whether the recording will be available; it will be. I guess, following up on that, I'm curious what your thoughts are on the new net energy metering proposal.
I know right now NEM 2.0 is what's currently in effect, and as I understand it, the third iteration of net energy metering would be more of this transactive setup that you described, but I think it's been tabled indefinitely because it was so politically unpopular. What are your thoughts on how to balance what would be technically, or almost economically, optimal against political feasibility, and how to make people be okay with having fewer benefits? All right, let me see. I'm not comfortable enough yet with all the ins and outs of this. I'm going to wait until we can run simulations, and we haven't done that yet, so I think I'm going to hold off on answering until I can answer with some insight. I will say I don't believe the transactive solution will be a comprehensive solution. I think it will attract some people, particularly people who have a mix of resources for which it is very favorable. But if you already have a solar system and you don't have a battery, I don't see why you would want to change, honestly. So, for example, I believe the current rule is that if you have a solar system and you want to put a battery in, that's fine; if you want to put a battery in without solar, that's not okay. But I could imagine that those rules could be more flexible if you go onto a transactive tariff, because the issues those rules are designed to resolve can be addressed by the tariff: the price signals would ensure that none of those unfavorable conditions on the system emerge; they just couldn't. And so you might be able to attract people to switch over by relaxing some of the rules that were designed to protect the system from that behavior. So it's not a great answer to your question, but that's what I think we want to spend some time on. Yeah, two quick questions.
So I guess the first is, have you done any tests of this with regard to microgrid systems? We have; at PNNL they've done a lot of microgrid systems. Oh, awesome. And then the second question is, does it consider reactive power flow? Yes, the power system model is three-phase unbalanced. It does all the interactions between the lines as well as all the interactions within the devices, and that obviously includes reactive power. Okay. And are you able to... so, a quick comment: I did a project where we were basically looking at how you can use solar inverters to inject reactive power to optimize losses. Would you be able to do something like that? Absolutely. You can change the relationship between real and reactive power depending on the device, right? Or you could hypothesize a device that does something that hasn't yet been done and implement that in the simulation to see what effect it has. So you're not even restricted to something that's physically feasible at the device level. You could just say, okay, this is how my load behaves, this is how my inverter behaves, even though there's no such thing in the real world today. So, for example, there were models of inverters that had control strategies which were not actually available at the time, and which now probably aren't very viable. But because the controller design for the inverter was parametric, you could make it do whatever you wanted, and people tried all sorts of wild things, and some of them were interesting. That's particularly true when you're looking at microgrids, where you have to start to think about grid-forming behavior and things that you would never really deal with on a normal distribution system.
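For the reactive-power question, a toy calculation of the reactive power implied by a given real power and power factor. This is just the standard power-triangle relationship, not a GridLAB-D API:

```python
import math

def reactive_power_kvar(real_power_kw, power_factor):
    """Reactive power from the power triangle: Q = P * tan(acos(pf)).

    Assumes a lagging power factor between 0 and 1; inputs and the
    function name are illustrative."""
    return real_power_kw * math.tan(math.acos(power_factor))

# 100 kW at 0.9 power factor -> roughly 48.4 kVAr
print(round(reactive_power_kvar(100.0, 0.9), 1))
```

An inverter control strategy like the one the questioner describes would effectively choose this Q (within the inverter's rating) at each node to reduce losses, which is exactly the kind of parametric controller behavior the simulation lets you experiment with.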
A good example is that the load model includes what's called the grid friendly appliance, which is a frequency-based load-shedding strategy. It turns out you really don't want to do that at scale, because the gain of that control strategy is enormous, and dangerous; it's too much. But it's in there in the simulations, and you can turn it on and see what it does. So yeah, there's all sorts of stuff like that. Any more questions from the students? Okay, thank you very much.