 My name is Ann Jekyll, I'm the Associate Director of New Mexico EPSCoR, and this is part of our New Mexico EPSCoR Smart Grid Center webinar series. We're so pleased today to have Seth Blumsack, a professor at Penn State University, whom I'll introduce in more detail later, giving a really interesting talk today. I wanted to let you all know that these webinars are recorded and archived on our website, and we will also post a copy of the slides there, so afterwards, if you want to revisit any of it or pass it along to somebody else, it will be available. The chat is enabled for today's webinar, but if you have any questions, please type them in the question-and-answer box that is part of the webinar interface. I'll be monitoring that and the chat, so we can interrupt Seth as he's talking (he said that's okay), or we can ask questions at the end, but please keep them coming. We'd love for this to be an interactive presentation. Hopefully it advanced to the next slide; I'm never totally sure what you all are seeing, but I wanted to alert you to two upcoming webinars that we have scheduled. The first is Making Effective Academic Posters: The Better Poster Approach. Mike Morrison, a student who has created the Better Poster movement, will be presenting, and this is great timing before our New Mexico Research Symposium and any other conferences coming up this fall and spring. This is open to all, not just our Smart Grid Center project, so please spread the word. It should be very interesting to see this new approach to better communicating science through better academic posters. 
We're also very excited to announce the launch of our Entrepreneurship Certificate Program, which will start with a webinar on October 28th at noon. There will be a number of presenters: Andre Cuscaden from the New Mexico EPSCoR State Office, but also representatives from the University of New Mexico, New Mexico Tech, and New Mexico State University, to tell you all about the exciting entrepreneurship programs that we have in our state. That kicks off a program with a series of lectures and quizzes through which you can earn a certificate in entrepreneurship, so stay tuned for more information; we'll be sending it out soon. And with that, it is my distinct pleasure to introduce Seth Blumsack. He is a professor of energy policy and economics and international affairs at Penn State University, and also director of the Center for Energy Law and Policy in the Department of Energy and Mineral Engineering. In addition to other appointments, he's external faculty at the Santa Fe Institute and an adjunct research professor at Carnegie Mellon's Electricity Industry Center. And I don't know why this doesn't make your official bio, Seth, but he's a member of the New Mexico Smart Grid Center External Advisory Board and has been a very important part of shaping our project and looking for the interdisciplinary linkages among our project team. He also brings interesting experience because he has degrees in both math and economics, an advanced degree in economics, and a PhD in engineering. So he is one of those rare minds that can see the linkages between disciplines and bring them together, and you'll see that through his very interdisciplinary and problem-driven research. Thank you so much for being here with us today, Seth. You can go ahead and share your screen and get started with your presentation. Okay, hold on just one second. Okay, did I do that right? 
I had to ask the obligatory can-you-see-my-screen question. Okay, mine's still in presentation mode; sorry, PowerPoint took it over. Yes, I can see it. Okay, cool. All right. Thanks, Ann. Thanks to New Mexico EPSCoR for the invitation to give this talk. I don't know if people can see me or just my slides, but this is a truly momentous occasion in my life: this is the first time in seven months that I have been in my office on campus. We're all living life on Zoom, and we're used to getting these little glimpses into other people's lives, but unfortunately today you don't get to see my kids' art room, which is where I usually work. This is my normal, drab university office. So what I'm going to talk about today is an effort that was funded by DOE, but done jointly with a number of different utilities in the state of Vermont. This is a project that we actually did several years ago, so in a way it's old. But I think I'm giving this talk at a fortuitous time, in the sense that everything old is new again, because the kinds of things I'm going to talk about are not just relevant to how we go about designing smarter energy systems. They're also very relevant to how we make the systems that we have more resilient. 
In light of all of the crazy stuff that happened with rolling blackouts in California a few weeks ago, the kinds of strategies I'm going to talk about today, which we tested in these experimental projects with Vermont utilities several years ago, are especially relevant for allowing grid operators, whether they be large-scale or small-scale, to really harness the power of the demand side: to allow the grid to ride through turbulent events without having to resort to drastic measures like instituting rolling blackouts. So I think it's a fortuitous time to be giving a talk like this. This is also, indirectly, a very good plug for being broad-minded and open about how you do research. This was a project where we designed experiments with utilities and did some statistical work, but a lot of the most interesting insights, about the value of the kinds of experimental programs these utilities were running, and about the way consumers form mental models in their heads of advanced grid technologies and grid modernization technologies (the things we used to call the smart grid), came out of the surveys we did of the customers who were involved in these programs. So in a way, this is not just a talk about this particular project. It's also, indirectly, a plug for people who normally work in a very focused way with quantitative data or quantitative modeling to take a look at qualitative data every once in a while: to look at the stories people tell about how they see their systems and how they think about interacting with them. 
There's a very famous quote, and I don't remember who said it, but it was in my first-semester econometrics book: "the plural of anecdote is data." And what I think is the most interesting part of this talk is, in a way, a plug to understand how consumers and customers see the world, versus how we as domain experts and academics see it. So anyway, that's my plug, and now I'll actually get into it. This was a project I worked on with funding from DOE and with a number of different Vermont utilities. I also have to acknowledge the folks who did most of the hard work on this project: Paul Hines, Jason Cluffio, Suman Gautam, and Roger. The first of those is a longtime collaborator of mine who's at the University of Vermont and actually has a startup doing grid modernization work; I'll also suggest to Ann that you should really try to get him to give one of these talks. The other three were Penn State students who worked on various parts of this project for their theses and dissertations. Okay. What I'm going to talk about is one of a number of what were called consumer behavior studies funded by DOE in conjunction with the Smart Grid Investment Grant program, an initiative that DOE undertook as part of the American Recovery and Reinvestment Act after the recession in 2008. One of the things the SGIG program did was essentially throw money at utilities that wanted to deploy hardware that, at the time, we were calling smart grid stuff. 
So these are phasors, the sensors on the high-voltage transmission grid; this is advanced metering and distribution infrastructure; all this kind of stuff. Along with that, DOE dangled carrots in front of potential applicants and said, essentially, wink wink: we're much more likely to fund you if, in addition to taking our money to deploy hardware, you're also willing to conduct some kind of behavioral experiment to help us better understand how all of these technologies can be used to promote smarter energy consumption. And this consortium of utilities in Vermont, called the eEnergy Vermont consortium, bit at that carrot. I was connected with them and helped them design and implement their consumer behavior study, or one of them. So even though all of this was done several years ago, the reports and everything still live online. I did send the main project report to Ann, and she can circulate it, but if you want to see other material related to our consumer behavior study or the others, it all still lives on the web. Okay, so the idea behind these consumer behavior studies, not just ours but all of them, was to use information generated by the smart grid technologies that were being deployed at the time, in conjunction with various other kinds of incentives, to try to promote smarter energy consumption. And even at the time we were doing this, we were entering a very, very crowded marketplace of studies, pilot projects, and experiments that were trying to do the same thing. Each of the bars in this picture represents one study. 
This picture is now seven or eight years old, and there have been a gazillion more studies since then, but even at the time we did this, there were a lot of different attempts, by utilities in particular, to reduce peak-time electricity consumption. And there were a lot of different economic mechanisms in use. Some of the more popular were time-of-use rates, which set specific times when electricity prices are low and high; peak-time rebates, which give consumers money back for reducing electricity usage during peak periods; and critical peak pricing, which is a mechanism to set a very, very high rate during peak periods. And then there were a few experimenting with what we call real-time or hourly pricing, where your price of electricity varies every hour based on the outcomes in some kind of wholesale market. So there have been a lot of these, and their effectiveness at reducing peak demand has been all over the map: some have been very effective, some horribly ineffective. But part of what comes out of this picture is that, in general, the studies or experiments that tried to leverage some kind of technology, to give consumers information to help them make better decisions in the face of a peak-time rate or some other pricing incentive, have generally been more successful at reducing peak demand. And the study that we worked on sits squarely in this space. Okay. 
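As a rough illustration of how the mechanisms just listed differ, here's a minimal sketch in Python. The rates and peak window are invented for illustration (they are not the tariffs from any of these studies), and real-time pricing is omitted because it would need an hourly wholesale price feed:

```python
# Illustrative sketch of three of the rate structures described above.
# All rates and the peak window are made-up numbers, not actual tariffs.

def flat_bill(usage_kwh, rate=0.12):
    """Flat rate: every hour costs the same."""
    return sum(usage_kwh) * rate

def tou_bill(usage_kwh, off_peak=0.08, on_peak=0.20, peak_hours=range(13, 19)):
    """Time-of-use: fixed low/high prices at pre-set times, every day."""
    return sum(u * (on_peak if h in peak_hours else off_peak)
               for h, u in enumerate(usage_kwh))

def cpp_bill(usage_kwh, base=0.12, critical=0.60,
             peak_hours=range(13, 19), event_day=False):
    """Critical peak pricing: very high price, but only on declared event days."""
    return sum(u * (critical if event_day and h in peak_hours else base)
               for h, u in enumerate(usage_kwh))

usage = [1.0] * 24  # a flat 1 kWh every hour of the day
print(round(flat_bill(usage), 2))                  # 2.88
print(round(tou_bill(usage), 2))                   # 2.64
print(round(cpp_bill(usage, event_day=True), 2))   # 5.76
```

The point is structural: time-of-use moves prices on a fixed schedule every day, while critical peak pricing only bites on declared event days.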
The part of the eEnergy Vermont study that I'm going to talk about involved working with a company that at the time was called Central Vermont Public Service and is now called Green Mountain Power. This was a study conducted in Rutland, Vermont, which, I don't know if you can see my mouse, is basically right around where I'm circling. It's the second-largest city in Vermont after Burlington. Interestingly enough, it's also one of the lower-income areas in Vermont; there's a lot of poverty in the area around Rutland. This was an experiment that was run over the course of two years. It officially started in 2011, when the advanced metering and distribution infrastructure was deployed, but the part of the study I'm really going to talk about ran in 2012 and 2013. Our consumer behavior study was structured as a randomized control trial, which is basically the gold standard for doing these kinds of behavioral studies. Customers opted in to participate, and then they were assigned randomly to one of a number of different treatment groups that experienced different electric rate structures and got different kinds of technology. There was a control group, which was given some information about when peak demand days were happening, but beyond that they didn't have any change in their rate, and they didn't have any other technology or information or anything like that. And there was also what we call a Hawthorne group. 
Just as an aside: the Hawthorne effect describes a phenomenon originally noticed in behavioral psychology, where people tend to change their behavior without being given any particular incentive or reason to do so, only because they're told that they're being watched as part of an experiment. So we had a Hawthorne group of Central Vermont Public Service customers around Rutland, who were randomly assigned and told that this experiment was going on and that the utility was going to be monitoring their electricity consumption data, and then the utility never spoke to them again about it. So in addition to what we call the treatment groups, which had the technology and the different rates, and the control group, there was this Hawthorne group. The way it worked was that this was a critical peak pricing experiment; it was an event-based study. At certain times during the summer, the utility, Central Vermont Public Service, would declare a peak day, and during that peak day, those in the various pricing treatments would see their electric rate change: they would be charged a different price during those peak days. The rate treatments involved a critical peak price, a critical peak rebate, and what we called the transition group, who in year one of the study were on the rebate and then in year two transitioned to the critical peak price. And then there were groups that had and didn't have technology, and consumers could choose how they wanted to be notified about a peak day. 
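The opt-in-then-randomize structure just described can be sketched roughly as follows. The group labels and the equal-split assignment are my simplification for illustration, not the study's actual protocol (the real design had unequal group sizes and separate handling for the Hawthorne group):

```python
# Hedged sketch of randomized assignment for an opt-in trial like this one.
# Group labels are illustrative; the study's actual protocol differed.
import random

GROUPS = ["CPP", "CPP+tech", "CPR", "CPR+tech",
          "transition", "transition+tech", "notify-only",
          "control", "hawthorne"]

def assign(customer_ids, groups=GROUPS, seed=42):
    """Shuffle the opted-in customers and deal them into groups round-robin."""
    rng = random.Random(seed)   # fixed seed so the assignment is reproducible
    ids = list(customer_ids)
    rng.shuffle(ids)
    return {cid: groups[i % len(groups)] for i, cid in enumerate(ids)}

assignment = assign(range(4000))   # roughly the study's enrollment size
print(len(assignment))             # 4000
```

Fixing the seed makes the assignment auditable after the fact, which matters when a regulator wants to verify that groups were formed by chance rather than by selection.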
And then, critically, the way enrollment for the project happened was that we were required by the Public Utility Commission in Vermont to use an opt-in enrollment structure. Consumers in Vermont had to explicitly consent to enrolling in the study, as opposed to being enrolled by default and then being given the option to get out of the study at any time. Opt-in enrollment is not the most ideal design, but as I'll talk about in a minute, the way you would design these kinds of things in a perfect-world academic context in a lot of cases really differed from how we had to implement it on the ground. So this is basically a chart of all of the different groups. There were seven treatment groups, and those treatment groups got different electric rates: either the critical peak rebate, which is CPR, or the critical peak price, which is CPP. Some of those groups were given technology to monitor their electricity consumption. At the time, we called this an in-home device; now we would probably just have an app that communicates with the meter, but those weren't really available when we were designing the study. And then there was another treatment group, number seven on this chart, which did not see their rate change and did not have any flashy technology (flashy at the time, anyway), but was told every time the utility declared a peak day. And our control group was the one called C1 on this chart; they did not see any change in their electric rate. 
They weren't notified when there was a peak day, although they may have heard about it from their neighbors, because Rutland's a small town. But we did survey them. And C2 there is the Hawthorne group. This next chart is basically another way of looking at the same information: for each of the rate treatments, the critical peak price, the critical peak rebate, and the transition group, we had subgroups with and without technology, and the groups that did not see any kind of rate change did not get any technology. There were about 4,000 customers involved in this. I mentioned the Hawthorne group before, and I'm not going to say anything more about it, because we actually didn't detect any kind of Hawthorne effect here: for those who were told there was a study going on, that knowledge did not seem to affect their electricity consumption in any statistically significant way as a group. Okay. So these are the rate structures that we used. One was called a critical peak rebate. The idea was that customers would be given a credit for measured reductions in electricity usage. The utility had a kind of blunt algorithm for imputing the electricity usage that they thought a consumer would have used during a peak period, and any difference between that estimate and the actual measured usage qualified for the rebate, which was about 60 cents per kilowatt-hour. One key element of the critical peak rebate: if you do something that the algorithm measures, you get a reward; if you do nothing, you get no penalty. The flip side of that is the critical peak price. 
The way that worked was that on a declared peak day, the rate for everybody in this group went from about 12 cents per kilowatt-hour to 60 cents per kilowatt-hour between 1pm and 7pm, which is basically the time of peak demand in Vermont and New England. So this is really more like the stick: whereas the rebate is the carrot that's dangled in front of you, this is the stick that the utility would beat you with. It works as basically the inverse of the rebate: if you do something, you avoid the really, really high electric rate; if you do nothing, then you have to pay it. Now, these rates were designed to look like the inverse of one another, which was actually very difficult, because the rates had to be approved by the Public Utility Commission in Vermont. And just to make things harder, Vermont actually has two public utility commissions, and we had to have everything approved by both. So we could not just choose these rates out of nowhere; the rates had to be based on some kind of cost causation for peak electricity demand, and all that kind of stuff. How we arrived at 60 cents was that the utility did all of these avoided-cost calculations, and 60 cents was the number that came out. And this is a picture of the in-home device that customers used. At the time, this was a middle-of-the-road electricity usage monitor: there were in-home devices that were even simpler than this one, and this one was pretty simple, and there were some that were really flashy, with all sorts of different screens. 
What drove the utility to this particular device was a combination of budget constraints, because the really flashy ones were really expensive, and things like how easy the vendor was to work with: really arcane considerations like that. Okay. On one of the first slides, I showed that some utilities have run peak-time pricing programs and others have used peak-time rebates. And Green Mountain Power, or Central Vermont, was basically convinced by us to do both ("us" meaning me and DOE). When we went to their state regulators about this, the regulators were both confused and really irritated with us for a long time. It took us almost a year to get the parameters for the study approved by the Public Utility Commission. Part of what made the regulators irritated with us was not just the rate we wanted to pick: we originally proposed a peak-time rate that was not 60 cents but much higher, something that struck us as being interesting, and the regulator was irritated by that, basically saying, we're not going to approve any rate that you can't justify on a cost basis. But the other reason they got really irritated with us was that they said: why not just do the rebate? We like the rebate, because then nobody gets punished. This was not exactly what we wanted to hear, but it was a very interesting observation, and it actually made the decision to do both a lot more interesting than we had at first thought it was going to be. Okay. And, Seth, I was going to just interrupt with one of these questions here to clarify. 
Somebody asked: by rebate, do you mean incentive, like giving incentives to consumers to reduce peak load? Yeah, so the rebate is an incentive. The way the incentive worked was that a credit would show up on your next month's power bill. So if it's now September and I reduce electricity demand during September, then I see a benefit from that in a lower power bill in October. Okay, so that's basically how the incentive worked. So through all these negotiations with the public utility commissions in Vermont, the utility was convinced to do both the critical peak price, the stick, and the critical peak rebate, the carrot. And this actually turned out to be interesting, because if you look at those two rate structures in the carrot-and-stick context, one of the foundational insights of behavioral psychology is that even though the magnitude of the incentive in the carrot is the same as the size of the stick, you would not actually expect people to respond to the carrot and the stick in the same way. And so this was an interesting opportunity to test that in a real-world environment. 
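To make the carrot/stick asymmetry concrete, here's a hedged sketch with illustrative numbers. The 12-cent base rate and 60-cent figures come from the talk; the baseline and usage values are made up, and the utility's real baseline-imputation algorithm was more involved than the single flat number assumed here:

```python
# Sketch of the two event-day rate structures (illustrative, not the
# utility's exact billing logic). Rates per the talk; usage values invented.

BASE_RATE = 0.12      # $/kWh, the normal rate
CRITICAL_RATE = 0.60  # $/kWh, charged 1pm-7pm on a declared peak day (CPP)
REBATE_RATE = 0.60    # $/kWh credited for measured reductions (CPR)

def cpp_event_cost(peak_kwh):
    """Stick: everything consumed in the event window is billed at 60 cents."""
    return peak_kwh * CRITICAL_RATE

def cpr_event_cost(peak_kwh, baseline_kwh):
    """Carrot: billed at the normal rate, minus a credit (applied to next
    month's bill) for any measured reduction below the imputed baseline."""
    credit = max(0.0, baseline_kwh - peak_kwh) * REBATE_RATE
    return peak_kwh * BASE_RATE - credit

baseline = 5.0  # kWh the algorithm imputes the customer "would have" used
for kwh in (5.0, 3.0):  # doing nothing vs. cutting back by 2 kWh
    print(round(cpp_event_cost(kwh), 2), round(cpr_event_cost(kwh, baseline), 2))
# Doing nothing costs the CPP customer 3.00 but the CPR customer only 0.60:
# the same 60-cent magnitude, framed as a loss in one case and a gain in the
# other, which is exactly why behavioral psychology predicts different responses.
```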
The other thing that came out of our discussions with the regulators, who in a way appreciated the potential effectiveness of the stick but were nervous about the optics, was this idea that if you start people on the carrot for a year, they get used to the whole idea of getting notifications from the utility and responding to them. They can, in a sense, be trained to respond to these things, which would make the stick more powerful. Okay, so this is my equation that shows that I know how to do econometrics. What we did was run this experiment for two years in conjunction with the utility. Each summer, they called a number of different peak days, and at the end of it we had customer-level 15-minute meter data for several thousand customers in the city of Rutland, a bunch of socioeconomic data, and the data from the surveys we ran with the customers at various points throughout the study. We statistically estimated the effect of the rates on consumption during peak periods, and on consumption before the peak period began and after it was over on the same day, to detect any kind of shifting. We were also able to study overall monthly energy usage. I'm not going to talk about the full set of outcomes, just the highlights of what we learned. Okay. Since this was a randomized control trial, we had to convince ourselves that the people who were randomly assigned to all of these groups basically looked the same. And no matter which group they were in, they all had very similar daily patterns of electricity demand, so that was good. 
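The estimation itself was a panel regression on the 15-minute meter data with controls; as a much-simplified, stdlib-only sketch of the underlying logic, here's a difference-in-differences on hypothetical peak-window usage (all numbers invented for illustration):

```python
# Deliberately simplified sketch of the estimation idea: compare a treatment
# group's change in peak-window usage on event days against the control
# group's change. The actual study used panel regressions; this just shows
# the logic. All usage numbers are hypothetical.

def mean(xs):
    return sum(xs) / len(xs)

def did_estimate(treat_event, treat_normal, ctrl_event, ctrl_normal):
    """(treated change on event days) minus (control change on event days)."""
    return ((mean(treat_event) - mean(treat_normal))
            - (mean(ctrl_event) - mean(ctrl_normal)))

# Hypothetical peak-window kWh per household:
treat_normal = [5.0, 5.2, 4.8]   # treatment group, ordinary afternoons
treat_event  = [4.0, 4.1, 3.9]   # treatment group, declared peak days
ctrl_normal  = [5.1, 4.9, 5.0]   # control group, ordinary afternoons
ctrl_event   = [5.3, 5.1, 5.2]   # control usage creeps *up* on hot event days

print(round(did_estimate(treat_event, treat_normal, ctrl_event, ctrl_normal), 2))
# A negative value is the estimated peak-period reduction attributable to
# the treatment, net of whatever the weather did to everyone.
```

Subtracting the control group's change is what separates the treatment effect from event-day weather, which is exactly why the randomized control group matters.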
What this set of pictures shows, graphically, is the average behavioral change during peak days in 2012 and 2013, split by those customers who had the technology, the in-home device, and those who didn't. In all of these pictures, the control group is the black line with the little circles. What we observed, more so in 2013 because it was a much warmer year than 2012, was that on average, customers on both the carrot and the stick did reduce electricity consumption during the peak period. Particularly for those on the stick, we observed that they did tend to do some load shifting: while they reduced demand during the declared peak time, they increased demand during the off-peak, or non-peak, time. And that was more pronounced for the customers that did not have the in-home device monitoring technology than for those that did. So that was kind of cool. And these are the average peak-time load reductions that we saw over the two years of this study in Vermont. The one thing about this that's interesting, and not necessarily surprising if you remember one of the first slides I showed, is that customers on the stick, the critical peak price, who also had the technology that allowed them to monitor their electricity consumption, achieved peak-time load reductions that were about twice as large, more or less, as any of the other groups. 
The thing about this picture that I actually think is most interesting is that if you look at the critical peak rebate groups, which are over on the left-hand side, the critical peak price group without any technology (the CPP group), and the group that had zero economic incentive but was told when there was a peak day, all of those load reductions were actually pretty similar. One of the things that came out of this was that on average, over these two years, for electricity customers in Rutland, simply being told there was a peak day and being asked to conserve was basically as effective as most of the economic incentives, either carrots or sticks. It's really only when you combine the critical peak price with the technology that you see a larger peak-time demand reduction. And this table is that same thing in numbers; I'm not going to dwell on it. This next picture shows the monetary savings per event, on average: basically just the demand reductions times the rate that was in effect. Typically, per event, the customers on the critical peak rebate or the critical peak price without any technology saved between 15 and maybe 25 cents per event. The savings per event were about 10 cents higher for the peak price group that also had the monitoring technology. And the savings for the flat-rate group were really, really low, because they didn't see a rate increase: when they reduced consumption, their rate was pretty low already. Going back to this pantheon of studies, the stars here show where the studies in Vermont ended up. 
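The savings-per-event numbers above are simply the measured reduction times the per-kWh rate in effect; a quick hedged sketch with an illustrative reduction size (the 0.3 kWh figure is my assumption, not the study's measured value):

```python
# Back-of-envelope for the "savings per event" figures: savings are just
# the reduction (kWh) times the per-kWh incentive in effect. The 0.3 kWh
# reduction below is illustrative; the rates come from the talk.

def savings_per_event(reduction_kwh, rate_per_kwh):
    """Measured reduction times the rate the customer faced during the event."""
    return reduction_kwh * rate_per_kwh

# A ~0.3 kWh reduction at the 60-cent incentive is about 18 cents per event:
print(round(savings_per_event(0.3, 0.60), 2))  # 0.18
# The same reduction only avoids the ~12-cent base rate for the flat-rate group:
print(round(savings_per_event(0.3, 0.12), 2))  # 0.04
```

This is why the flat-rate group's per-event savings were so small even when they did conserve: the per-kWh stakes were a fifth the size.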
In the pantheon of studies that have been done, the results we got were consistent with a number of others. But you'll notice that, compared to some of the studies out there, ours didn't do all that well in terms of peak-time demand reductions. Part of this is the nature of Vermont: they don't typically have hot summers, so there's not a lot of air-conditioning demand. And part of it was that this was not a very complex study in terms of the different magnitudes of pricing incentives and other things like that, in part because the PUCs were very averse to things like that. So those are the boring results; now I want to get into the interesting results. One of the things we noticed was that on an event-to-event basis, there was just tremendous randomness in the event-level response. This is a mess of a picture, but all it does is show the average demand reductions by treatment group for each of the events in 2012 and 2013, and they're just all over the place. That is an important insight, because one of the areas of potential for demand-side engagement is to reduce peak demand, to avoid capacity upgrades that are expensive and don't need to be used very often. But when you have all of this randomness, it's not entirely clear what the value of it is to system planning. A picture like this really speaks to something we have learned through these studies over the past number of years: pricing incentives are good and all, and giving people information is helpful. 
But asking people to do things like reduce electricity demand in the middle of the day, which is not always the most convenient time, probably requires a mix of not just information and economic incentives but also technology to automate or program those decisions. One of the other things we noticed, remember, is that customers had the option to freely opt out. Especially around the time when the transition group was told that they were moving to the critical peak price, we saw very, very high attrition: people dropped out of the study in very large numbers. And what is really interesting is who dropped out: those who elected to receive their peak-time notification by phone instead of by email or text; younger customers, who, oddly enough, dropped out at much higher rates than older customers; and large households, which dropped out at much higher rates than smaller households. So the success of these programs is not just about technology and incentives; there's a strong socioeconomic component as well. There was also some variety in the kinds of actions people reported taking in response to a peak-day notification. Now I want to finish up by talking about what we learned through the surveys, and then I'll hopefully leave about 10 minutes for questions. We conducted surveys of consumers at various points over these two years. One of the things we asked was: did you like being part of this study? Did you think it was a good thing? Overall, it was, surprisingly, not that popular.
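The attrition comparison he describes, dropout rates broken out by notification channel, age group, and household size, amounts to a grouped-rate calculation. A minimal sketch on invented records (the study's actual data are not reproduced here):

```python
# Sketch of a grouped attrition-rate calculation like the one described above.
# The records are invented for illustration; field order is
# (notification_channel, age_group, household_size, dropped_out).
from collections import defaultdict

records = [
    ("phone", "younger", "large", True),
    ("phone", "younger", "small", True),
    ("phone", "older", "large", True),
    ("email", "older", "small", False),
    ("email", "younger", "small", False),
    ("text", "older", "large", False),
]

def attrition_rate_by(field: int) -> dict:
    """Dropout rate for each value of the chosen field (0, 1, or 2)."""
    tallies = defaultdict(lambda: [0, 0])  # value -> [dropped, total]
    for rec in records:
        tallies[rec[field]][0] += int(rec[3])
        tallies[rec[field]][1] += 1
    return {value: dropped / total for value, (dropped, total) in tallies.items()}

print(attrition_rate_by(0))  # dropout rates by notification channel
```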
I don't want to say unpopular, because there were people who really liked it, there were people who really didn't like it, and there were people who were somewhere in the middle. But one of the things that I think is interesting here is that there isn't really a correlation between whether consumers reported viewing the study favorably and how much money they saved. The experiment was most popular with those who were in the critical peak rebate group, even though they didn't save very much money and didn't reduce their electricity demand by very much on average. They viewed it very favorably, which is interesting because it suggests that electricity consumers may have preferences over rate or incentive structures themselves, in addition to other things. So that was interesting. We also asked them where they thought the value in the study was coming from, and the responses were very interesting. A little more than half of the consumers reported that they thought this was a good thing because they saved money, they liked conserving energy, or they liked the little gadget we gave them to monitor their energy use. But 40% of the customers surveyed said that the best thing about being part of the study was that they had fewer outages. That is really interesting, because the frequency of outages had nothing to do with this study. As it turns out, outages in Rutland did go down during the study period, but that had basically nothing to do with the rates or anything like that.
Outages were less frequent and didn't last as long because of the automated distribution infrastructure, and the ability that gave Central Vermont to identify and fix problems on their distribution grid more quickly. So it was interesting that people built up this mental model of how prices and information are valuable to them, and that model does not always jibe with the way we understand the physical system works. You might remember that a couple of slides ago I said the rebate rate structure was more popular; that is, customers on the rebate viewed the pilot program more favorably than those on the critical peak price. But it was also clear that even though critical peak rebate customers were more favorable toward the study, they were also much more confused about how it worked. We asked them questions about their experiences in the pilot project, how they benefited, and so on, and we made word clouds out of the responses. The most frequent word used by customers in the surveys, for both rate types, was "bill." But the second most frequent phrase from those on the rebate, when we asked how being part of the study benefited them, was "I don't know." So there is this very odd tension, things that don't quite add up: the customers on the stick saved more money and had a better understanding of what was going on, but they didn't like the stick. The customers on the carrot didn't save as much money and seemed very confused by the structure of the carrot, but they liked it better than those on the stick did.
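Word clouds like the ones he describes are built from simple word-frequency counts over free-text survey responses. A minimal sketch of that counting step (the responses below are invented, not the study's actual survey text):

```python
# Minimal word-frequency sketch of how a survey word cloud is built.
# The responses are invented examples; a word cloud renderer would then
# scale each word's display size by its count.
from collections import Counter
import re

responses = [
    "My bill went down a little",
    "I don't know how the rebate affected my bill",
    "Watching the bill was the main benefit",
]

# Lowercase, tokenize on letters and apostrophes, and count occurrences.
counts = Counter(
    word
    for text in responses
    for word in re.findall(r"[a-z']+", text.lower())
)

print(counts.most_common(3))
```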
So these are some of the lessons that have come out of this. Not surprisingly, the stick works better than the carrot, even though we found that sticks are unpopular. We found that, on average, customers do respond to economic incentives, but there's a lot of noise and randomness in there that you have to watch out for, depending on what the goals of having these incentives are in the first place. And the last lesson is that, especially when combined with the right economic incentives, even relatively simple amounts of information can be useful, but you don't want to give people so much information that it's overwhelming. The process by which people take information and turn it into action needs to be incredibly easy, because people have enough to do in their lives. So one of the things that I think our results and others speak to is the value of combining various forms of automation or intelligent decision-making with the right economic incentives. And then I want to end on this thought. Do you remember how much money customers saved per event? It was tens of cents. Even though Rutland is not a very high-income area of Vermont, it has really gotten me wondering, and I don't have an answer to this, why anybody in this study cared about the carrot or the stick, because the monetary savings were just really, really low. And I also think back to the way the folks who didn't have any economic incentive responded.
It almost makes me wonder whether these pennies that were being thrown at customers, in exchange for doing all this stuff, made the choice to reduce electricity demand any more or less complicated. I don't really know the answer to that, but I think it's one of these areas that is potentially fruitful for the investigation and design of these sorts of smart energy systems that are supposed to engage consumers in meaningful ways. That is all. I will take questions in the few minutes we have left.

Yeah, thanks, Seth. We just have a few minutes. Usually we end these right at noon, but we may go a few over today, and it's okay if some folks need to drop off. I'm going to start firing these questions at you, because we've had a bunch come in, and I think this really speaks to what you were just saying at the end there. Jason from Idaho National Lab asked: on a monthly basis, do you have an idea of the total savings for the rebate customers? Was it financially worth their while? I think you kind of got at that, but is there anything else you'd add?

Was it worth their while? Well, I think there's a lot of variance. On average, we're talking tens of cents per month. For some customers, the savings were actually pretty substantial, and I would love to know exactly what those customers were doing, or whether it was a quirk of the algorithm that measured consumption reductions for the rebate. But there were some customers that did see pretty substantial declines in their monthly electric bills.
And this follows on from that: what is your prediction for the results if you'd been able to offer a larger carrot, such as $10 per event?

My expectation is that, yes, if the carrot had been larger, it probably would have been more effective. One thing about the carrot that I thought was particularly ineffective was the rebate structure of it, in part because the customers really just didn't understand how it worked or how it was calculated. It didn't show up on their following bill in a very transparent way, and they also had to wait a month to get it. So in some ways, I think the potential for the carrot is this: if you could make it bigger, that would be more effective, but if you can make it more transparent and more immediate, then I think those structured incentives could go further than the ones that we had.

Okay. And then a couple of questions here on study design: how were the communications with the Hawthorne and non-Hawthorne treatment groups done?

This was 10 years ago when we were designing this, so everybody was contacted by phone and/or by email. The Hawthorne customers were contacted once, basically just to tell them that this study was going on and that their electricity usage was being tracked as part of it. They were never contacted again. The non-Hawthorne control group, those that were basically just on the flat rate, were contacted several times. They were contacted to approve their enrollment, so to speak, in the study; they were contacted during each of the summers while the experiment was going on; and then they were contacted at the end of the study, at the end of August 2013.
Yeah. I'm trying to get through a lot of these, so I'm going to jump in. Really quickly: did you monitor the participants who dropped out afterwards, to see if their consumption patterns changed after they dropped out?

We did not. I think we could, because I think we still have the data.

Okay. And there are two similar questions here. Are you aware of any studies where the carrot was environmental benefits instead of money? And Jesse, one of our PhD students in the New Mexico Smart Grid Center, asked: was there any measuring of institutional or environmental preferences for the customers? From his recent work, he was able to show that most participation in demand response programs was due to those, as opposed to monetary incentives.

Yeah, I think that's a very good point. There were customers who told us the reason they participated was because they believed in energy conservation, as a driver of greater environmental quality, so I think that is potentially a factor. There probably have been studies that have used environmental attributes as drivers for behavioral change. One thing about a lot of these studies is that many of them are not in the public domain, because they're conducted by utilities and their consultants. Most of those that I know about in the public domain are driven by either economic or reliability considerations on the utility's side: the utility wants to reduce costs, or it wants to reduce stress on the grid during peak times.
So part of the reason I can't think of one off the top of my head, although I'm sure there are some, is that a lot of these studies are driven by utilities and their regulators, who historically have not necessarily prioritized environmental attributes over cost or reliability.

Okay. I'm mindful that some people may have a one p.m. commitment, so we'll keep going, and I'll remind folks that we have this recorded, so they can come back and hear the answers to their questions if they don't get answered while you're here. One person has asked if it's possible to share a reference that can be cited on this subject. I think that's the report you mentioned, which we will post on the webinar portion of our EPSCoR website; I'll post that in the chat so everybody here can go retrieve it. Their other question was: some studies have taken the approach of directly controlling residential appliances for peak shaving. How realistic is this, and which residential appliances can be controlled directly?

So direct load control is part of this suite of automation technologies that I think have a very promising place in harnessing the demand side. The history of direct load control in the utility business has focused largely on hot water heaters, which has been very, very successful: those are easy to control, and customers never notice. There are utilities that have been controlling their customers' hot water heaters for decades, and nobody knows. The other one is air conditioners. Direct load control for air conditioners is technologically possible, and it can do good things. The downside is that consumers tend to notice when you turn their air conditioner off.
And there have been several, I guess we can now call them humorous, circumstances where consumers have gone and figured out where the direct load control device on their air conditioner was and smashed it with a hammer. What that really speaks to is the role of empowering the consumer to make those automation decisions, so it isn't just the utility coming in and taking over your air conditioner. You have to put users in the driver's seat, but you have to do it in a way that is really easy for them to manage.

Okay, thanks. This question is from Yuting Yang, who's one of our most recent New Mexico Smart Grid hires in economics, so we hope you'll get to meet her, Seth. She asks: if you control for the day of the week of the events, would the average responses in each group exhibit a stronger pattern? She's referring to slide 24. Would the response be driven by flexibility of consumption that may differ within the week?

Slide 24? So this slide is about attrition; it shows the number of people that dropped out. One thing about the events is that all of them were called on weekdays, so there wasn't a weekend-versus-weekday effect to control for. Off the top of my head, I don't think we controlled for day of the week, and it's a good question whether that matters. Actually, looking at this one in particular, I don't remember which days of the week these all were, but that's an interesting thought; I could try to add it to this slide.

Okay, thanks. And from Janie. Oh, she said the previous slide; sorry, folks can't chime in, we've got you all muted. But let's get to Janie's question.
How would you think about making the reward more immediate?

So you could make the reward more immediate. I guess I'm thinking about various kinds of payment mechanisms, things like PayPal or Venmo, or you could deliver somebody an Amazon gift card, something like that. There may be more clever ways to do it. But when I think about immediacy, I think about this: today I did something to reduce electricity demand, and overnight or tomorrow I get something in my email saying, hey, thanks for doing that, here's a $5 or $10 Amazon gift card. Something like that.

Yeah. And Jesse chimed in and said it would be interesting if the in-home devices told you how much you were saving or losing in real time.

I'll just say that the devices we used were not able to do that. They could tell you the price, and they could tell you how much you were using, but they didn't provide any real-time estimate of the reduction, which I think would have been helpful as well.

Okay, great. Well, there was one more question here, but I think you answered it right after Jesse typed it in, about how you determined the 60 cents per kilowatt-hour figure. He typed it and then you answered it, so I think that's good. I would encourage you all to reach out to Seth with additional questions. We should probably wrap up our webinar here. Seth, do you mind sharing your email address for folks if they want to get in touch with you?

It's right at the very end of the slides. It's just my last name, Bloomsack, at PSU.edu.

Okay, thank you. So if you have additional questions that weren't covered here today, you can reach out to Seth.
I wrote down a whole bunch of questions that I guess I'll just have to ask you another time, but my big takeaway from this is how messy people are and how difficult that is for planning in these nice engineering models, which is something we're grappling with at the New Mexico Smart Grid Center. There's no option for little clapping emojis in the webinar feature of Zoom, but we all give you those. Thank you so much for your time today and your excellent presentation; this work is really fascinating.

All right. Take care.

And we'll see you all at our next webinar. Bye.