graduate school of business. I am the host and moderator for this next installment of the Digital Grid Summer Webinar Series, and today we're gonna explore innovations; we have an exciting lineup for you. Here we are, just to situate us in our learning journey: today, July 8th, we have our innovation panel, and I will introduce each of the panelists in a moment. Looking forward, next week we have a university panel, and following that, a government panel. So over the next three weeks we really will explore the triple helix of innovation and development. Some housekeeping items: everyone is auto-muted, and there are two ways to ask questions during the audience Q&A, which will be the last half hour of this hour-and-a-half session. You can submit via chat, or you can raise your hand and we will unmute you. We are recording, your participation is your consent, and recordings and presentations will be posted to both the EPRI and Stanford sites. So what are the objectives of the Summer Webinar Series? There are four: we want to convene experts across disciplines to present visions of a shared, integrated digital grid; identify gaps to achieving that vision, principally enabling data platforms; understand industry requirements; and discuss technologies to bridge the gaps. And truly, since the hosts of this, EPRI and Stanford, are both research institutions, we want to inform a research roadmap and collaborative initiatives. We have a shared vision for an integrated grid, and really what that is, is to integrate a number of systems, to create a system of systems across electricity, telecommunications and customer local energy networks. And really the end goal, the gold medal, is to enable a grid where local energy optimization can be part of a global energy optimization achievement.
Now, we're talking about innovation here, and there's one quote that stuck out in my mind during preparation. This is a paraphrase from Andrew Van de Ven's "Running in Packs to Develop Knowledge-Intensive Technologies," which came out in MIS Quarterly about 15 years ago. He states that digital innovation is a collective achievement by many actors and stakeholders from different fields and diverse knowledge bases. Since we are going to be focusing on digital innovation here, and we are at the intersection of many systems, it's my pleasure to welcome a diverse set of knowledge bases. Here we have three chiefs: two chief executive officers and one chief product officer, and what they bring is a novel set of backgrounds in data, innovation and information systems that they've honed in other industries and are now bringing to bear in the energy industry. So in order, we have Astrid Atkinson, Chief Executive Officer of Camus Energy; Bill Burke, the Chief Executive Officer of Virtual Peaker; and Tim Schaaff, the Chief Product Officer of InterTrust Technologies. These are all newer companies, or companies that have existed in other industries and are now finding opportunity and interesting innovations within the energy space. For the next 30 minutes, each of them in sequence will give a little bit of perspective on who they are, what their companies do, how they see the energy transition, and the role data and innovation play in it. So with that, I'm gonna hand it over to the first speaker, Astrid. Thanks so much. Thank you. All right, so my company is Camus Energy, and we're a software-focused startup providing what I describe as grid management as a service. For the purposes of this conversation, I wanted to give a little bit of background about how we think about the grid space and the changes that are happening within the broader grid and energy landscape.
And this is really rooted in our background in very large-scale systems engineering. A number of the folks on our team have a background in the internet space, coming in from companies like Google, where I spent 15 years myself along with my co-founder Cody, and also companies like Uber and SpaceX. Our technical team and investor team is primarily composed of folks who led the development of large-scale systems engineering approaches for those companies and developed a number of technology areas associated with very large-scale monitoring, very large-scale telemetry, real-time load balancing, real-time capacity management, fault tolerance in large systems, approaches for reliability engineering, and so forth. Personally, during my time at Google I was really fortunate to work on almost all aspects of Google's global computing platform, from internal systems to cloud. And a number of the approaches that were developed within that environment are things that we believe have relevance in the grid landscape. There are a couple of examples on this slide, but broadly it's this question facing folks in the grid environment: how do we manage a very dynamic, interconnected environment of many actors, potentially up to millions of actors, who need to act in concert in order to achieve a goal? We believe technologies developed in other industries have some relevance here, everything from how you gather data about very large numbers of distributed participants and make sense of it in real time, to how you make decisions about who does work and what the value of that work might be. So we're looking across the landscape from telemetry to orchestration to control, and then to market integration and so forth. I'll talk a little bit more about specifically how we see that evolving, but it's worth noting that we've already seen that these technology approaches do have relevance and can really pay off in other areas.
Many of my former colleagues went on to work in government, in healthcare tech, in aerospace. In particular, one of our initial investors and board members went on to head up flight software at SpaceX and brought many of these approaches into the aerospace industry as part of SpaceX's innovative approach to space flight. So we know that it works in the internet space, and we know that it can work elsewhere. When we founded the company, we were really interested in bringing these sorts of technologies into the grid landscape. Looking at this in a grid context particularly, we really like the DSO model. The work that we do as a company is focused on the distribution environment and bringing these advanced technology approaches into the distribution environment, and there are a couple of reasons for that. The biggest one is that the distribution environment has the largest set of unsolved problems from our perspective. It's the place where new resources need to connect in a way that is really not yet well established within the industry. It also typically is less instrumented than the transmission environment, and the software tools that are available for managing the distribution environment typically are not as real-time and not as operationally sophisticated as technologies for managing large-scale transmission environments. So we were really interested in bringing these advanced technology approaches particularly into the increased complexity of the distribution landscape. One of the things that I personally learned over and over within other large-scale systems environments is that if you want system-level behavior, there needs to be some component, some system, some set of practices which can provide system-level coordination.
And so at its heart, when we think about the transition that needs to happen within the distribution environment, it's this idea of empowering distribution operators, or some combination of retail energy providers, gentailers, et cetera, the folks who are responsible for managing this landscape, with tools to take on proactive management of the assets within their territory, to reason about them, and to help them work together effectively. That's what we see as the heart of the future distribution operator role. And the concept of a DSO is really about creating a distribution operator that has the capabilities, and some of the responsibilities, that we typically see with a TSO or an ISO today; that's the heart of how people talk about that model. Concretely, there are a couple of different things that we mean when we talk about the DSO, and it's probably worth going through those briefly. This view of the DSO is really derived from other work within the industry. As we were thinking about how to engage effectively with the system transformation in this space, we looked to existing research and thought leadership, everything from the New York REV explorations of the DSO model to work done by Lorenzo Kristov on distributed hierarchical control, and to market transformation in the California context. We've synthesized a view of what we believe the DSO to actually be, which underlies a lot of our approach in the space. We really see the DSO as having three primary responsibilities. One of them is systems operation and real-time network operation: being responsible for the health of the network and for integrating capabilities in a way that is network-supporting and can respect the constraints of the environment it's operating within. The second is resource adequacy: procurement, scheduling coordination and capacity planning.
So, managing supply and demand both in a forward-looking way and in real time. And the third is integration with markets and pricing structures. This is probably not terribly earth-shattering, but I feel it's helpful to set a baseline for what, concretely, we mean when we talk about the DSO as an entity. What does this mean in practice? It's fine to talk about big-picture views of how technology can engage with and transform the grid landscape, but one of the big issues the industry as a whole has faced is that it's an industry where change is slow, and it's slow for good reasons. Providing electricity reliably is a really critical service; it's the core responsibility of utilities, and it underlies every aspect of modern life. Utilities tend to be conservative in how they manage their grids, and that's because they have a really core responsibility to their communities and to the entire fabric of modern society. It's not something that you want to take lightly, mess around with, or change without a deep awareness of what the implications of that change might be. So as we were thinking about how you put into practice some of these ideas that have been floating around about the smart grid, about managing grids differently, about bringing a more real-time operational model into the distribution environment: that's something that has been a goal within the industry for a while, and it's core to conversations about digitalization, modernization and so forth. But it's also a big thing to bite off. So when we started this conversation, we were really looking around to understand: who has problems today that they need to solve, and for whom are those problems most pressing?
And so, a segment that we got really interested in was the smaller public utilities sector, where, due to a combination of customer and stakeholder pressures and some of the dynamics and economics of renewable energy development, this question of how you manage grids in a forward-looking way is actually surprisingly pressing to many small communities today. Many of you may be familiar with the trends in renewable development all across rural America, but it's actually pretty remarkable. I have a slide, which I don't think is in this deck, that shows the growth rate of renewable development in rural territories relative to what it looks like in cities and more developed areas, and it's a pretty remarkable change that's happening in many rural communities across America, particularly in the West. We actually work with several utilities at this point, but the case I want to give here is Kit Carson, the utility which serves the area around Taos in New Mexico. This is our first anchor customer, a utility that we've been working with for over a year now to help them engage with the problems of managing renewable energy within their territory. Kit Carson is kind of remarkable. They have a really aggressive set of goals around local solar development: their stated goal is to get to 100% of daytime energy supplied by local solar by 2022. They're going to beat that, probably by about a year; they'll likely be at around 85, maybe 95% by the end of this year, and by the time they go into next summer, that goal will be exceeded. What they've done is gone ahead and commissioned a set of mid-scale, small utility-scale solar arrays that are distribution-connected within their environment. The most recent of those will include storage as well as solar. And they have a few motivations for this.
One of them is that their community has decarbonization goals and would like to see them decarbonize as rapidly as possible. They have a contingent of folks who are very environmentally minded within the area and are really looking to see what they can do as a community to pioneer this transition. They also have a large number of members of the community who are pretty poor, who don't necessarily have access to renewable energy today, who don't necessarily have a lot of money to pay for utility services, and to whom they feel a strong obligation to provide affordable service. In a lot of ways this is really a distilled example of some of the affordability and equity-of-access challenges facing utilities across the country: how do you balance the need to decarbonize, and the desire for large-scale system change, with the need to keep energy affordable for folks who can't really afford to pay that much for it? How do you keep rates down while also transforming the system? What they found was that the economics of solar really helped, particularly in the Southwest, where there's a lot of sun. Their costs for transitioning to solar are substantially less than half what they were paying for non-solar power sources, most of which were coal in their area. So there was both a really powerful economic driver and a really powerful environmental driver for making this change. They went ahead and commissioned these solar arrays, enough to get them to 85 to 95%, eventually over 100%, of local energy need during the day, without taking the normal ten-year evaluation and rollout period that many utilities do. So, this is a really interesting example, but to cut a long story short on this one: what does it mean to become a DSO?
What are the capabilities required to take on these responsibilities? Putting those into practice with them has really helped to inform our approach more generally: thinking about what these capabilities mean, how they fit into a legacy grid environment, and how you begin to engage with the data sources and the physical equipment that's out there, but then also look at pulling DERs, dynamic asset management, flexibility management and renewable management into that picture as well. There are really three parts to this transition. This is something we're also doing with a couple of other customers, a co-op and a CCA, which are more recent projects. The first part, and this comes from my own operational background, is that it's really critical to have good, high-quality insight into any environment if you're thinking about changing it. Many distribution environments don't have fine-grained real-time insight today, and the big reason for that is that it actually wasn't that helpful in a past context: in the past things didn't change very fast, and now they do. So we really view the foundation of being able to effectively engage distributed and renewable assets as visibility: beginning with being able to see what's out there, pulling that together into a picture of what's happening in real time, and then using that to inform additional decisions around orchestration, balancing and eventually market integration. Our projects with Kit Carson and with the other co-op that we're working with currently both have all three components; they're just staged a little differently. The other co-op we're working with has a lot of flexibility assets, which we're engaging with starting today.
Whereas for Kit Carson it's really all about being able to effectively incorporate their large amounts of solar into a dynamic balancing model, the other co-op is really looking to begin by incorporating flexibility into peak shaving and cost management mechanisms as a precursor to doing large amounts of local renewable development themselves. And I guess the one last thing I would say about that, and I'll move on really quickly because I don't want to eat up everybody else's time, is that because they're co-ops, they have a particularly strong motivation towards engaging their communities. So when they think about community engagement, it isn't strictly a financial proposition around whether this is the cheapest way to procure services. It's more about how you effectively bring the community into a renewable energy transition model and make use of community-procured services as part of that transition, both as a way to provide additional services around flexibility and storage and so forth, and also as a mechanism to give folks within the community avenues of engaging in addition to existing net metering structures. The last thing I want to touch on really quickly is that we believe this is a transition that goes well beyond any individual vendor or any individual utility. We've seen in other industries that taking an open-source approach, and being able to bring an ecosystem of contributors and collaborators to bear on these kinds of large-scale problems, is a really powerful accelerator for making change happen faster. And so we are looking to take an open-source approach ourselves. We already have a couple of collaborations going with external partners, and we're intending to open source a good portion of our platform, beginning with a release later this year.
And so we're just looking to put our money where our mouth is in terms of helping to drive this collaboration, providing additional transparency and a basis for collaboration with vendors as well as customers and, critically, researchers, which is why I'll close with that as part of the ask here: how to drive collaboration with the research community. So I'll stop there. Thank you, and I look forward to talking more with the other panelists. I'll just skip past this and pass it on to Tim. Bill, thanks, after me. Oh, Bill, you're up. All right, it looks like I'm now the presenter. Let me know if you can't hear me. Good, yep, all right. So I'm Bill Burke. I'm the founder and CEO of Virtual Peaker. The mission of Virtual Peaker is to help drive the adoption of renewable energy and large-scale electrification by helping utilities with the problems that those things inherently bring about. So what sort of problems are on the horizon for the next, let's say, ten years for utilities? The first thing, I think we all know, is that renewables, especially distributed renewables like rooftop solar, really eat away at the utility's core business, which is selling kilowatt-hours. That means that revenue is flat or going down. The second thing is that it makes their job a lot harder, because renewables create intermittency that wasn't present in the previous generation of the grid, so their job of keeping it in balance gets a lot harder. The next thing is that EVs are going to be a huge part of the future in America and across the world, and those bring their own operational problems: when the solar's going down at six p.m. in the winter, everybody's coming home to plug their EV in. How do you deal with effectively a double ramp, a drop-off in generation and an increase in demand at the same time? The ramp rates get amazing.
But the good thing is that's going to drive their revenue back up, because they're selling kilowatt-hours again. So again, operational issues, but they don't have a great way to monetize that yet. Selling kilowatt-hours is not the best way for utilities to move forward in the future, and they're starting to think about other new ways, new business models, if you will. And the third is really a bunch of companies coming for the utilities' customers. What I'm talking about there is companies like Arcadia Power, companies like Sunrun, companies like Google, companies like Amazon, that really want to be the front end of the customer interface. So if you look at a utility, its job is getting harder because of the operational issues, its revenue is potentially going down because it's not selling as many kilowatt-hours anymore, and it's potentially losing its customers. That's a potentially dark world for utilities. But we believe that the utility is the most efficient way for us to electrify America and to bring on large-scale renewables. So what does Virtual Peaker do? I'll get to some of the details in a minute, but really what we're doing is helping utilities with all types of internet-connected products, DERs generally: water heaters, batteries, thermostats, all those devices. And we're really helping them along three core areas. The first is reducing operational costs: how do they control their grid in a way that they never have before, so that they can really dial in those costs? The second is how do they really engage those customers in a way that they currently are not? If they don't do it, then some other company is going to, and that's going to be bad news for the utility. And the third is executing on new business models with the effective glut of data and control capabilities out there on internet-connected devices.
There are a lot of new business models that utilities can explore, and that's one of the areas we're also helping them with; I'll get into some details there in just a second. But why am I here? That's a great question. So who am I? Well, for one, I'm a seasonal beard wearer. But I also have a PhD from UC Berkeley. I was at Berkeley starting in 2005, and in 2006 the team I was on wrote the first programmable communicating thermostat software. This predates Nest, predates Ecobee, and it sounds absurd today to think that thermostat manufacturers were telling the California Energy Commission that they couldn't create a communicating thermostat. And the California Energy Commission said: oh yeah? I'm sure we can. Let's give some researchers some money and have them make one. So we did. From there, I focused my dissertation on how you control massive quantities of thermostats for the good of both the utility and the homeowner. Fast forward a few years to 2010: I'm finishing up my PhD, I'm absolutely exhausted, and I serendipitously get this great offer back in my hometown of Louisville, Kentucky, doing my dissertation work with a great company, GE Appliances. I took it and had a great ride there, where I worked on energy management across the whole home: water heaters, refrigerators, basically everything. How do you control all these guys from the grid? Then I jumped into internet-connected products, where I was effectively the product owner on the APIs that they use today, which gave me a great background in how these devices actually connect. One little tidbit that is sort of a thread throughout my career: in 2006, when I was at Berkeley, we proposed a common port module for thermostats. That was one of the innovations and recommendations we came away with. Fast forward a few years, and I'm at GE and they're using Zigbee, which was supposed to be the new standard for communications across utilities.
Then we started using a module called a U-SNAP module, which was also supposed to be the standard, and which is sort of the offspring of that common-port idea that a lot of people have had and that we also put out at Berkeley. The U-SNAP module was a common port module; that was 2010 when we did that. This is a thread across my career: standards come and go. I was at GE, left GE, and started Virtual Peaker in 2014; GE was our first customer, incidentally. Thinking back on dealing with standards versus dealing with proprietary interfaces, there's a lot of nuance there, and hopefully we can get into that discussion. I've been on both sides of it and have pretty strong feelings that I'll share. So that's basically my background. I started Virtual Peaker in 2014; we're still a startup, we're growing rapidly, and I'll talk a little bit about that in a second too. So what do we do? We're effectively a DERMS, a distributed energy resource management system. I hate the term DERMS, and I like to think more about what we're doing than about the name. We have all the energy demands on the left-hand side of the graph: water heaters, EV chargers, batteries, thermostats. We connect with those devices through our cloud, bring all that data back into our system, normalize it, and operate a real-time control system on top of all those devices. That allows us to also normalize the control going out. Our customers are utilities; we're providing this platform for utilities, and we have a homeowner application to engage their customers as well. Today we have integrations with about 20 different manufacturers, and what that allows us to do is effectively read data from and control devices from about 20 different manufacturers.
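The normalization step Bill describes, many manufacturer APIs mapped onto one common data and control schema, might be sketched roughly like this. The vendor names, payload fields and the `Telemetry` record are illustrative assumptions, not Virtual Peaker's actual API:

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    """One common record the control layer sees, regardless of vendor."""
    device_id: str
    kind: str        # e.g. "thermostat", "water_heater", "battery"
    power_kw: float  # instantaneous power in kilowatts

def from_vendor_a(raw: dict) -> Telemetry:
    # Hypothetical vendor A reports watts under the key "w".
    return Telemetry(raw["id"], raw["type"], raw["w"] / 1000.0)

def from_vendor_b(raw: dict) -> Telemetry:
    # Hypothetical vendor B reports kilowatts directly.
    return Telemetry(raw["deviceId"], raw["category"], raw["kw"])

# One adapter per manufacturer integration; ~20 entries in practice.
ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

def normalize(vendor: str, raw: dict) -> Telemetry:
    """Translate a vendor-specific payload into the common schema."""
    return ADAPTERS[vendor](raw)
```

The same pattern would run in reverse for control: one common command type, translated per vendor on the way out.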
And the manufacturers are ones you've heard of: Nest, Ecobee, Honeywell, Sonnen, Tesla, Sunverge, the big battery providers, the big water heater manufacturers, the EV chargers; we're connected with all those different companies, and we're streaming all that data back into our system. In addition to the basic control that utilities offer, our ability to do cloud-based real-time control opens up possibilities in the types of applications we can have. From the cloud we can do manufacturer-agnostic time-of-use optimization, done at the individual device level: we understand the behavior of each device and can optimize it in real time to basically save the customer money. We can also do things like real-time energy arbitrage. These are sort of the two ends of the spectrum of what real-time control really opens up for us. Real-time energy arbitrage is about reading wholesale electricity prices and then charging and discharging your devices in order to optimize the cost basis; think of a battery with a buy-low, sell-high sort of mentality. With real-time control, we can actually do that in real time on each individual device. So just to reiterate, in case this wasn't completely clear: we don't install any additional hardware in the home. All of this is done at the cloud layer. And because it's done at the cloud layer and we've taken a purely software-as-a-service approach, each of our utility customers is effectively running the same platform, and we can stand up that platform very quickly and offer these techniques to them quickly and scalably. In other words, we're not standing up servers for utilities; we're doing it all from our cloud. We're working with utilities big and small because of that software-as-a-service approach.
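The buy-low, sell-high arbitrage idea above can be illustrated with a toy scheduler. This is a sketch under simplifying assumptions (the function name and the greedy hour-picking are illustrative; a real dispatcher would also model power limits, round-trip efficiency and price forecasts):

```python
def arbitrage_schedule(prices, capacity_kwh, rate_kw=1.0):
    """Return per-hour actions: +rate_kw = charge, -rate_kw = discharge, 0 = idle.

    Greedy sketch: charge the battery during the cheapest hours and
    discharge during the most expensive ones, limited by the energy
    the battery can hold.
    """
    hours = len(prices)
    # Number of charge/discharge hours the battery can support.
    n = min(int(capacity_kwh / rate_kw), hours // 2)
    by_price = sorted(range(hours), key=lambda h: prices[h])
    charge_hours = set(by_price[:n])      # cheapest hours: buy low
    discharge_hours = set(by_price[-n:])  # priciest hours: sell high
    return [rate_kw if h in charge_hours
            else -rate_kw if h in discharge_hours
            else 0.0
            for h in range(hours)]
```

For example, with hourly prices `[10, 12, 50, 40]` and a 1 kWh battery, the sketch charges in the cheapest hour and discharges in the most expensive one.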
We're working with some of the biggest utilities in the world, like Pacific Gas and Electric, smaller utilities like Portland General Electric and Green Mountain Power, and also co-ops and munis. Our biggest customer is PG&E, the biggest utility in the US, and our smallest customer is Belmont Light, a 14,000-meter municipal utility in Massachusetts. Again, because of that software-as-a-service approach, we can scale up and scale down to meet the needs of all our customers. So to finish up: we're growing, and we're growing quite rapidly. We've effectively doubled in team size and doubled in revenue since the first year, and that's during the COVID time. So I'm going to take this opportunity to do a little pitch: we're looking for software engineers, algorithm R&D engineers, marketing people, and business development; those are all openings we have today, and we're hiring fast and furious. If you're interested, give me a shout. With that, I'm done. Great, thanks, Bill. Really helpful. Tim, you're up. Thank you. Thanks very much. I'm Tim Schaaff. I run the product development group at InterTrust, and I would like to start by giving you a little bit of background about the company. We've been around for quite a while: the year that InterTrust was founded was the same year that Tim Berners-Lee developed the first web browser for the World Wide Web. So, quite a while ago. When we were founded, our founder, Victor Shear, realized that the network would transform the way that we all interact. He also realized that the exchange of digital assets over the network would reshape the way the economy operates. Today we know that banks provide a trusted intermediary function that facilitates the exchange of traditional currency between multiple parties. At the time, there was no infrastructure in the internet to enable that same sort of trusted exchange of value in the online world.
And so we began pursuing a mission to create a substrate for digital trust and security that could operate across this network, and we've been pursuing that mission ever since. We offer products today that serve the needs of the media industry, consumer electronics, healthcare companies, automotive companies, and, more recently, the energy domain. In fact, E.ON, Germany's largest energy company, and Origin Energy, Australia's largest energy retailer, are investors in the company. Now, here's a simple statement of our mission. Our basic concept is that data will be the fuel that powers the next stage of the internet's development, and that managing the dynamics around the collection, use and exchange of data, something we call data rights management, will be a foundation for building the sort of online trust that's going to be needed in order to innovate and manage the transformation that's taking place across all the different industries, including the energy industry, of course. There are a number of significant challenges that we will face as we try to work with data. I think everybody recognizes that data is an incredibly valuable raw material that can be mined to understand and optimize all aspects of our business and technical operations. It gives us the foundational tool that allows us to manage in this increasingly dynamic energy environment. But at the same time, data tends to be radioactive. We are all very familiar with the challenges that have arisen around the digital advertising industry and consumer data rights, and the tensions between the two. And we can all see every day the problems that come from throwing personal data around without adequate regard for privacy and security considerations. In the context of the energy industry, we've got to come to grips with this challenge.
Obviously, businesses that are using and sharing data run the risk of exposing their trade secrets to other companies, intentionally or unintentionally, and the risk of data leakage through their supply chains. Of course, we are all looking at increasingly complicated regimes of consumer data legislation and regulatory compliance requirements related to the use and protection of data. In many countries we're seeing data residency requirements and data sovereignty obligations, meaning data can be used only if it remains in the country in which it originates. All kinds of interesting challenges for enterprises that want to be able to take advantage of their data. And then lastly, there are all kinds of technical challenges associated with using data at large scale that have to do with the reality that large enterprises have diverse data sets all over the place, stored in all different kinds of systems, different locations, different formats. Even these very basic elements can create tremendous barriers to the effective use and management of your data assets. So we have created a platform, we call it the InterTrust Platform, that brings together three core capabilities that we believe are foundational for this data economy. We put it under the umbrella of something we call data rights management, and it consists of three basic areas. The first one is data governance. This basically says that all data assets have an owner, and that owner should have the ability to control who has access to the data and how the data is used, and that access control should be implemented at a finer-grained level than simply saying, well, here's a database, I either give you access to the database or you don't have access to the database. It needs to be a much finer-grained level of control.
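The finer-grained, owner-controlled access Tim describes can be sketched in a few lines. This is a generic illustration only, not the InterTrust Platform's actual interface; the teams, field names, and policy shape are all hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical illustration of field-level, policy-based access control:
# instead of granting all-or-nothing database access, the data owner
# attaches a policy saying which principals may read which fields.

@dataclass
class Policy:
    owner: str
    # principal -> set of field names that principal may read
    readable_fields: dict = field(default_factory=dict)

    def filter(self, principal: str, record: dict) -> dict:
        """Return only the fields this principal is allowed to see."""
        allowed = self.readable_fields.get(principal, set())
        return {k: v for k, v in record.items() if k in allowed}

meter_policy = Policy(
    owner="Belmont Light",
    readable_fields={
        "planning_team": {"feeder_id", "hourly_kwh"},        # aggregates only
        "billing_team": {"meter_id", "hourly_kwh", "tariff"},
    },
)

record = {"meter_id": "M-1042", "feeder_id": "F-7",
          "hourly_kwh": 1.8, "tariff": "R-1", "address": "12 Oak St"}

print(meter_policy.filter("planning_team", record))
# → {'feeder_id': 'F-7', 'hourly_kwh': 1.8}
# the address field never leaves the owner's control for either team
```

A real deployment would also log every `filter` call to the secure transaction record mentioned next, so access can be audited after the fact.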
And for all of the reasons I mentioned earlier about the risks associated with unintentional data leakage and exfiltration, the platform provides this policy-based control over data rights, but it also provides a secure record of all of the transactions that take place in the platform, so that you're able to comply with legal requirements, so that you can understand what's happening inside of your environment, and so that you can manage the security issues. So that's a very, very fundamental part of the platform. The second piece really addresses some of the technology issues around bringing different data sets together. We have a capability in the platform that supports data virtualization. What this really means is that we can provide a very simple interface that allows you to bring together elements from different data sets and different databases without having to do much more than provide a URL and maybe some access credentials, and the system can automatically begin to blend data from these multiple sources without requiring that the data be copied. And when you think about data security and the data management challenge that enterprises face, one of the biggest risks is the risk of unmanaged copies of the data flying around inside the company, and potentially outside of the company. Once somebody makes a copy of a sensitive data set, you have lost control of that data set, and that exposes the company to all manner of risks. So the idea behind virtualization is, number one, to make it easier for data workers to get access to the data wherever it lies, and number two, to minimize the amount of data copying that's required, especially in order to carry out these earlier-stage explorations that are seeking out the places where the data presents the most value. And of course, we try to do this in a way that provides a simple interface for everybody downstream who's working with the data.
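The query-in-place idea behind data virtualization can be illustrated with SQLite's `ATTACH`, which lets one connection join tables from a second database without copying it. This is a toy stand-in for the federated approach Tim describes, not the InterTrust interface; the feeder and rooftop tables are invented for the example:

```python
import sqlite3

# Generic sketch of data virtualization: blend two independent data
# sources in one query, with no copy of either data set being made.

# Source 1: utility grid data (in-memory for the example).
grid = sqlite3.connect(":memory:")
grid.execute("CREATE TABLE feeders (feeder_id TEXT, peak_kw REAL)")
grid.executemany("INSERT INTO feeders VALUES (?, ?)",
                 [("F-7", 480.0), ("F-9", 610.0)])
grid.commit()

# Source 2: municipal real-estate data. In a real deployment this would
# be a remote source reached via a URL plus credentials; here it's a
# second SQLite file standing in for that.
muni = sqlite3.connect("muni.db")
muni.execute("DROP TABLE IF EXISTS rooftops")
muni.execute("CREATE TABLE rooftops (feeder_id TEXT, solar_ready INTEGER)")
muni.executemany("INSERT INTO rooftops VALUES (?, ?)",
                 [("F-7", 120), ("F-9", 35)])
muni.commit()
muni.close()

# ATTACH makes the second source queryable in place, no copy made.
grid.execute("ATTACH DATABASE 'muni.db' AS muni")
rows = grid.execute("""
    SELECT f.feeder_id, f.peak_kw, m.solar_ready
    FROM feeders f JOIN muni.rooftops m USING (feeder_id)
    ORDER BY f.feeder_id
""").fetchall()
print(rows)
# → [('F-7', 480.0, 120), ('F-9', 610.0, 35)]
```

The point is the access pattern: analysts query a blended view while each owner's data stays where it lives, under its own controls.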
The last element of the system provides a capability we call secure execution. We don't just want to look at the data, we wanna process the data, we wanna extract insights out of the data. We've built a system that allows you to bring your analytics to the data, and allows you to engage the capabilities of even third-party analytics companies, bring that capability to the data, but ensure that the operation of those algorithms, those analytical tools, doesn't result in leakage of the data. So the idea is that the software basically operates inside of a protective sandbox that minimizes the risk of data leakage, which we think, again, is very fundamental. If you can't control where the data is going, if you don't understand who has access and what they've done with it, your company will be constrained from using data at the scale that it could. So lastly, I just wanna tell you a little bit about one of the projects that we've been working on that is relevant in the energy space. This is a collaboration that we began three or so years ago with Germany's largest grid operator, called Westnetz. They're now a part of the E.ON company. And we worked with a team inside of the Westnetz group called the Digital Cooperative. The basic challenge that these folks were interested in trying to solve was: how can we create a planning tool for municipalities and grid operators that will enable them to manage the substantial risks and challenges that all of them are facing today? Germany has over 12,000 municipalities, and there are more than 850 individual, distinct DSOs operating around the country. 850-plus grid operators is kind of a crazy example of diversity across one country. After the earthquake and tsunami that caused so much trouble in Fukushima, Japan in 2011, Germany mandated the elimination of nuclear power from its energy portfolio. Nobody knows how that's going to play out.
Obviously that's a very complicated operational challenge, but that's been mandated by government. They've also mandated a fairly aggressive timetable for the elimination of the country's reliance on coal. As a counterweight to the loss of these energy sources, the country has obviously been investing heavily in renewable energy. And as Bill mentioned, the dynamism that comes with renewable energy creates tremendous challenges for companies that are focused on managing wires in the ground and making sure that substations are not overloaded by energy demands that cause all kinds of infrastructure problems. In parallel with the growth of renewable energy, we are also, of course, seeing the growth of electric vehicles, which present not only new sources of demand, which, as Bill pointed out, can help to add new types of customers for energy products, but also a new kind of dynamism, in that the energy demand is moving. Every time you look, it's in a different place. These are tremendous challenges. And of course, municipalities and politicians are facing increasing pressure to address the carbon footprint of their societies. These are massive transformations. The DSOs and the municipalities are facing tremendous challenges, and there's billions, literally billions of dollars being spent each year trying to upgrade the grid. How do you know where to spend the money? So we worked with the Westnetz team, E.ON's Digital Cooperative, to bring together grid and municipal data, real estate data, data about the homes, the property ages, the sizes and shapes of the roofs that might relate to the ability to use solar energy, configurations of garages. In Germany, lots of people buy electric vehicles and then just plug them into the outlet in their garage. That creates incredible challenges for the grid, because these are demands that just pop up from one day to the next. There's no registration required to be able to make those changes.
We brought together demographic data that speaks to income, household size, and other considerations like that. And the basic concept was to create a platform that could be offered to all of the municipalities. Some of these municipalities and some of the DSOs, most of them actually, are quite small. They're not in a position where they could make an investment at the scale needed to bring this kind of information system together all by themselves. And so, working with Germany's largest energy company, we've created a platform that allows them to safely bring their data to the platform and combine it with data from these other sources, commercial sources and public sources, to be able to see what's happening, to be able to do planning around their own EV charger investments, to analyze the potential for solar, to engage in better conversations with consumers about how their energy use is affecting the grid, how they can manage their costs, and how they may be able to participate more actively in these new energy opportunities. Today, InterTrust is governing and securing about 80% of the German medium-voltage grid data and about 60% of the German low-voltage grid data. So we've gotten a lot of experience managing these kinds of data sets. These first applications are very focused on the longer-term grid planning operations where those billions of dollars are being spent. But of course, as we move these application concepts closer and closer to real time, we can begin to leverage most of the same assets for demand response, home and industrial IoT applications, even insurance. There are all kinds of opportunities around the insurance industry for bringing more linearity, more stability into the economics of all of these new energy domains. So this is the area where we're very focused right now. I'm excited to hear more from Steve about the questions and to hear your questions. So thank you very much. Great. Thanks, Tim.
Thanks, everybody, for those comprehensive introductions. We are now going to move into the panel discussion, and looking at the time, I have only two questions, and we're gonna tackle the first one now. And that is: you each paint a slightly different vision for the future. You tackle different challenges in specific ways. The common thread, obviously, is data, the use of data to make better decisions, higher-quality decisions. If we're thinking about innovation, innovation really is a complex combination of organizations, people, and technology. So with that in mind, I wonder what needs to be true, both at the technological layer and at the organizational layer, for the value propositions that you have articulated to actually happen? What existing systems need to be there? What mindsets do you need from your customers? All your customers in this case are most likely utilities. What needs to be there for you to really start engaging meaningfully, to roll out some of the greater visions that you've painted? And I'll start with Astrid on that question. Yeah, I mean, I think from our perspective, motivation is really the biggest question. There are a lot of practical barriers to innovation in an existing environment that can make things really tough, everything from data provenance to confidence in and comfort with cloud-based models, these kinds of things, each of which can be really difficult to work through individually. And so what we've found is that it matters whether an organization is highly motivated by a problem that they wanna solve, something that's really important to them from a business perspective, whether that's the business model for a utility that's facing a drop-off in rates through existing business model structures, or really deep concern about system stability amid the growth of rooftop solar. There's a lot of different motivations that will come in.
But it tends to make all other conversations flow more smoothly when there's actually a problem to be solved, as opposed to maybe a forward-looking awareness that change will eventually need to happen, but maybe not today. It tends to be a motivating function. And just to go back, is there something specific at the technological layer? Because your solution plugs in at so many different levels. Is there essentially a baseline set of receptors, technological receptors, data receptors, that need to be there for you to gain traction? Our approach is really just integrative, right? So it's all about asking what's out there today and then pulling that together and beginning to make use of it. There is a minimum set of data that is really helpful. If you wanna do real-time management, some form of real-time data, whether that's from the system, from smart metering, or from the devices themselves, is kind of required. But we're pretty agnostic as to data source. And I think that, from a philosophy perspective, if you're asking the question of what is the most effective mechanism for driving change within a very large-scale existing industry with a lot of heterogeneity from organization to organization and technology to technology, one of the really important attributes for anyone who's looking to innovate in that space is being able to work with what's already there. If you're requiring a utility to go in and rip out all their instrumentation and replace it before you can do any work with them, that's gonna take a while. Being able to engage with data sources as we find them can really speed that conversation along. Bill, what's your perspective? Yeah, so I think I'm gonna echo a lot of what Astrid said, but I'll reframe it slightly. I think an innovation mindset with a bias toward action is something that we need to see more of in utilities.
All the utilities have groups that understand DR pretty clearly, and all the utilities know that these problems are on the horizon, and so many of them are taking an approach of analyzing everything to death and not moving forward with anything. Just keep analyzing, keep analyzing, working on standards committees in order to drive standards that are basically obsolete before they come out. What really needs to happen, and I think the utilities that we find we're most successful with are utilities that take this action-biased approach, is that they're not worried if they're not buying from the biggest company out there, first off. Second off, they're ready to just jump in and do something. One of our very early customers, and that's probably one of the reasons they were one of our early customers, a utility, by the way, and I won't name who it is, said utilities are nuts when they just pick the biggest company to do this. What they really need is a small, lightweight company who can innovate quickly with them, because, frankly, we have no idea what this is gonna look like in the long term. Of all the solutions out there, the only one that's mature from the perspective of DER control is really demand response. And that's a technology, or a technique, that's been around for ages. Ages, that's maybe a little bit of hyperbole, but it's been around for a long time. But how we're gonna control devices for this new dynamic grid is something that we basically need to experiment with in real life and find out. Right, and essentially that is a good conceptualization of what a startup, or a company moving into a new sector, allows: that experimentation. Tim, what do you think? Well, yeah, I think these are good points. The reality for the energy companies is that transformation is happening.
And I think, as Bill said, having that action-oriented mindset is fundamental. You're either going to be a part of the transformation, or the transformation will bypass you and create tremendous damage in your business. So what we bring to the table is that we can help to articulate what the opportunities might be. In most cases, as Bill points out, companies do have a sense of what the issues are, but a lot of times there is a kind of friction associated with getting going. And so I think one of the other areas that's very important is to look at how you can start with action versus more and more in-depth conversations. Because the reality, in all of my experience, is that transformation doesn't start as a giant wave. Transformation starts with a ripple. And you want to engage, because we all have to learn, and the learning process takes time. You won't learn by reading. You have to learn by doing. And so we try to create a platform where people can begin the process. And we don't know where it's gonna go, but by beginning, you have a chance to start to find what's meaningful and important to your company, your division, your group, whatever it may be. Again, grand strategy is interesting at a presentation level, but those kinds of high-level initiatives often crash on the rocks of reality when you get down to the operational levels. So it's important to have a balance between the vision and the day-to-day, down-in-the-details working teams. Got it. We're gonna transition to some of the audience questions that are now starting to stack up. But before we do, understanding that we are a research organization and a university, how might the research community help you? The way that I'm thinking about this is more on the technology side, or your solution development.
It could be something else, but at least thinking about what a university or a research organization can help you with, what might that be, and why? We'll go in the same order, so Astrid, what do you think? So we're really excited about collaborations with research organizations, and we see a couple of opportunities and challenges in that. One really big issue that many research organizations face, and this is notably not true of EPRI, but it is for many universities and DOE labs, is limited access to real-world utility data and utility systems to work in and with. And so there's a lot of really great work happening on the research side, but there's this big barrier to getting it out into the field, and in particular, I think successful innovation really needs to be part of an iterative loop. Bill mentioned a bias to action and willingness to try things out and see if they work or not. Research organizations really need the opportunity to do that in order to really effectively advance the work that they're trying to do. And so one of the things that we were thinking about when we were thinking about open-source strategy and engagement with the broader community was this idea of being able to more effectively engage the research community, potentially draw some of that really promising work more quickly and more iteratively into the field, and potentially allow folks to begin iterating in more real-world, more utility-facing environments. So, concretely, what we do with that today is look to pursue joint partnerships around grant applications and things like that. But we're hoping to provide a more fluid collaboration environment as we move forward, partly through an open-source approach. But I think it's really under-leveraged today. It is very hard to get stuff from research into the field. Got it.
Actually, you bring up something I wanna explore. Based on, you know, Astrid, your experience at Google, and Tim, your experience, say, at Apple and Sony, and Bill, your experience at GE, are there any models that come to mind that could help tighten the feedback loop, to create this greater fluidity that Astrid's talking about? Can I say something? I think all of us are aware that the phenomenon of Silicon Valley was, in fact, that sort of corporate-academic partnership kind of model. And I think that has historically been less common in other areas around the world, although that's obviously changing. And I think a lot of it is, again, a bit of a mindset issue. We have data scientists, we have the ability to develop applications, but that's actually not our focus. And we would love the opportunity to engage with domain experts who want to bring to life the data science and the application capabilities against the data sets that our customers are bringing to the table. And I think part of it is this question of how do you facilitate that conversation. I know in this example that I shared about the grid planning project with the German energy company, they're very engaged with universities around their home area. This is the Essen, Düsseldorf, Cologne area of North Rhine-Westphalia, in northwest Germany. They engage with the universities specifically because those universities bring to the table kinds of expertise that simply don't exist in an energy company that is primarily focused on the operations of the business. And I think that in addition to an action bias, as Bill calls it, you need to have that kind of curiosity that asks what would be possible. And I think we're at a great place these days, because there is tremendous hunger in the academic community for access to the data. The energy companies would love to provide that, but they're terrified about what it means to share data. There are all kinds of struggles there.
And so I think there is some interesting opportunity at the intersection, if we can demonstrate that there are ways to collaborate around data without losing control of the situation. That's sort of our focus. And Bill, what are your thoughts? So I'm gonna back up maybe half a question and talk about how we think collaboration with research can go. We have a current partnership with EPRI on a project. Our solution is commercially available, it's not really in the research phase, but like I indicated, the possibilities are, to use a trope, endless for what we could do with distributed energy resources with the utility. And nobody has all the use cases worked out. So we like to partner with organizations like EPRI in order to figure out those use cases a little bit better, and partnering with a research institution on potentially applying some control strategies specifically is something that we'd be particularly interested in. And I have to say, the number one thing we need from universities is to keep turning out great people who are interested in this field. That's probably the number one thing we need from you guys. How about, in the interest of time, we move on to the audience Q&A? I've got a few, and this one looks pretty interesting, at least in my mind. I'll read it, and whoever wants to jump in, jump in. The question is: as we move to a more distributed and digitized grid, there are many things that could be monitored or controlled. What should be? Everything. I think one really important distinction to draw to begin with is that the question of what you want telemetry on, and what you want to incorporate into an operational data environment and be able to use to make decisions, is a different question from what you want to control directly. I think all data sources are interesting from an insight perspective, if you really want to understand what's going on.
One thing that is extremely helpful is to have multiple data sources which you can cross-compare against one another. This is an approach that we use really widely in the large-scale telemetry space across industries. This idea of measuring from multiple points and multiple subsystems, correlating them, and drawing conclusions from gaps and differences is a really powerful tool for developing more robustness in your monitoring and telemetry views. You can usually tell almost as much from places where data sources disagree as from places where they agree, and that in and of itself is very helpful. There's a number of sources of data and control points available in the wild. We typically distinguish between ones that are basically utility-owned or utility-controlled versus third-party-owned and third-party-controlled. And I think there's maybe a lot more to be said about control strategies between the two of those, and which span the two of those. But what I would say is that pulling in data from as many as possible provides a more accurate baseline picture from which to draw control conclusions. In terms of control, being able to affect both grid-facing devices as well as third-party devices gives more flexibility in terms of the control strategies that you can put in place. But rather than continuing to talk, I'll maybe let one of the other panelists chat about that. Bill, you wanna jump in? Yeah, sure. So I'll answer sort of tongue-in-cheek. I believe privacy and data security are very, very important, and I take them personally very seriously. And I know there's a big question in the community about what sort of data could be shared and what can be learned about people from device-level data. That's a question we get a lot: is somebody gonna give you access to their thermostat, or is somebody gonna give you access to their water heater, in order for us to run our learning, et cetera, et cetera?
And sort of tongue-in-cheek, I say, people share the most intimate details of their lives on Facebook, and what your thermostat is doing is far, far, far less important than the pictures of your children. There's not very much to glean from a thermostat's operation versus a picture of you on vacation, for instance. So I think that we need to be protective of our data, and I think all the companies that interface with our data need to be extremely protective of it. And we take data security very seriously, implementing all the safeguards. But the reality is that, from a device-control standpoint, that data is not that important. And if you look at what a utility already has access to with smart meters, it becomes probably even less important. I think one of the challenges right now is that energy companies, power companies, tend not to have the most engaged relationships with their customers, and that's a hard dynamic to change. But I think the starting point is to look for things that can be very motivating for an energy company. And when we consider which sorts of devices would matter, I think it's very logical to say, well, the high-energy devices are the ones that have the potential to create an impact on the network. And so I would guess that prioritizing communication and control around those kinds of devices should be the priority, because we're still in the very early days, and we need to demonstrate that we can manage a network of these devices securely, that it won't be compromised by some hacker in another country. These are all theoretically possible, but they are not really proven yet. And I think that until we get some examples of those kinds of networks at scale, well, it's always an opportunity for the innovators across the industry, but I think that's gonna be very important.
So I think a more narrow focus on high-energy devices allows the energy companies to see a benefit, allows everyone to get a little bit more confident about the security issues, and creates a motivation at the energy company to find ways to engage with the consumer a little bit more directly. And they've got to create some incentives. Consumers generally don't pay much attention to this stuff, and saying that you're gonna save a few dollars every month may not be enough of a motivation. Bill, Astrid, do you wanna build on top of that, or do you wanna go to another question? One thing that I would just add to the conversation about privacy, in terms of looking at data from end users in particular and consumers in particular, is that there's a really powerful set of tools that have been developed in other data-adjacent domains, such as healthcare, around differential privacy models. So just take this idea that you don't necessarily need every piece of fine-grained data about a person or a customer in order to draw useful system-level conclusions or train a machine learning model or something like that. It's a really powerful one, because it gives you some additional ability to do processing on data, and to use data in enterprise environments, without necessarily compromising the privacy of the individual or the data profile of the individual. So I do think that as we look to incorporate data, particularly about end users, making sure that a differential privacy approach, which takes only the data you need for the operation at hand, is a really critical component of successfully integrating those sources, because, to the other panelists' point, trust is a really important part of this interaction. Utilities often don't have great trust with their customers today, or at least don't have a very close relationship.
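The differential-privacy idea Astrid mentions can be sketched with the classic Laplace mechanism: release an aggregate with calibrated noise so that no single household's behavior can be confidently inferred from the published number. This is a minimal textbook sketch, not any panelist's product; the scenario and numbers are hypothetical:

```python
import math
import random

# Laplace mechanism from differential privacy, applied to a toy
# utility aggregate: how many homes ran their AC during a peak hour.
# A count query has sensitivity 1 (one household changes the count by
# at most 1), so noise with scale 1/epsilon gives epsilon-DP.

def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1 - 2 * abs(u))

def dp_count(values, epsilon: float) -> float:
    """Epsilon-differentially-private count of truthy values."""
    true_count = sum(1 for v in values if v)
    return true_count + laplace_noise(1.0 / epsilon)

# 1,000 hypothetical homes, 620 running their AC.
ac_running = [True] * 620 + [False] * 380
released = dp_count(ac_running, epsilon=0.5)
print(f"released count: {released:.1f} (true count is 620)")
```

A smaller epsilon means more noise and stronger privacy; the utility still gets a usable system-level number while no individual meter's state is exposed.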
And so I think that makes it doubly and triply critical that, to the extent that they do interact with people's individual data, that happens in a high-trust environment, that it is well protected from compromise, and that using a structural approach which makes the potential for privacy invasion or security compromise inherently less risky can really help increase confidence in some of these interactions. Go ahead, go ahead. Yeah, so I agree with all of that, Astrid, absolutely, but I do want to challenge one perspective, potentially. While utilities don't have a great ongoing relationship with their customers, in the sense that they don't interface with their customers a lot, I think they by and large are relatively trusted, maybe with the exception of certain utilities, obviously. But by and large, people think of utilities almost like a government entity, because they don't really have a choice of who their energy provider is. And because of that, I think they have a sort of trusted role in the world, and what that means is that we as a service provider don't ever want to cause them to break that trust. And so we take the approach that effectively all the data is the utility's, and we don't use the data for anything else. By doing that, and protecting the data very closely, we're hoping that they can maintain that idea of a trusted relationship and then grow that engagement over time. I mean, I agree with that. I think the challenge that the companies are facing is a function of a move toward a more deregulated environment, where instead of generation, distribution, and retail operating inside the box of a regulated monopoly, governments around the world are breaking down the walls between these things and saying, hey, let's unbundle these different operations in order to create more competition, more innovation.
Presumably this is ultimately good for the consumers and affords a much more adaptable environment. And yeah, I agree with you, Bill, that the energy companies are trusted. They're trusted to deliver energy reliably, with a few exceptions. What consumers aren't conscious of is the energy company doing anything else. It's kind of like, I trust my doctor with my healthcare information, but I don't trust him to tell me what to do with my car, and vice versa: I don't want to talk about my healthcare issues with my car repair guy, even though I'll tell him everything he needs to know about my car. The energy company hasn't yet positioned itself as an advocate or an expert or a trusted authority for some of these other dynamics. And that's something they have to change, because the forces of deregulation, or re-regulation, of unbundling the different aspects of the energy business, are going to introduce a lot of competition into the marketplace that they're gonna have to respond to. This is great, but I wanna switch gears just a little bit and address another question that has popped up, which is: as systems become more complex, existing commercial tools, such as planning tools, have struggled to perform accurately and robustly. So are there challenges associated with grid system modeling as it relates to your businesses and goals? Yeah, yeah. So a lot of times utilities don't even have really great maps of their system, especially if you're talking about smaller munis or co-ops. Where things actually are is potentially not in a GIS system, and the map might be there, but potentially in some form other than what you would really want. So I think that is definitely a concern. We've found the same. It varies from utility to utility, but there are often gaps in the existing system models.
They often don't incorporate real-time switching decisions either, and there are real-time system characteristics which can vary greatly. This is one of the places where being able to pull in real-time system information can help with understanding what's happening. Being able to both model and also measure can give you some additional confidence. But I would say that in general, building approaches to system modeling which are very robust to missing data is something that could be really helpful for the industry. It's definitely been important to us as we engage, but that's also something that goes all the way out to the research side. Most tools are fairly brittle to missing or inaccurate data, and that makes it hard to do any additional work on top of them. One other thing I would say about the performance challenges here is that I do think this is a place where the utility and power sector can benefit from cloud computing approaches. The cloud isn't just using someone else's computer to solve your problem. It's also inherently the ability to bring potentially hundreds or thousands of computers working in parallel to very complex problems, and that's really accelerated the ability to solve very, very complex problems in many fields. Bringing hundreds or thousands of machines to bear in parallel on these grid modeling and planning problems could, I honestly think, be really transformative, and so that's something where there's probably some really good short-term work to be done in making that more widely available to people. Any high-level tips that you can offer, since obviously this is the world that you come from? Are there specific avenues that need immediate or midterm attention, at a greater level of detail? Modeling tools that are highly parallelizable will be super helpful.
We use OpenDSS today, which comes out of EPRI, or at least primarily out of EPRI, and it's a great tool. It's very helpful, but it's difficult to run multiple parts of the grid in parallel. Being able to do that across multiple grid components, or potentially multiple circuits, or certain parts of a circuit in parallel, would be very helpful. Some of these are unsolved problems from a mathematical perspective; it's kind of hard to parallelize a linear algebra problem. However, there are some really interesting advances in the application of GPUs to these classes of problems in the graphics space, and in the application of things like graph math to reasoning about complex systems, which I think are very applicable to grid-type systems in particular and can be really helpful here. So if you're looking for tips for researchers on where to potentially engage, I think those two are both really interesting ones. Got it, helpful. I can tell you about an example. Talking about this application and tooling question, I think the reality is that most of the established applications are platforms that were developed in the 90s, and they very much look like you're running Excel on a Windows 3.1 kind of system. They have that kind of quality about them, or worse. And we worked on a project several years ago that involved taking a fresh look at how the Operations Center for offshore wind assets could be managed. And we built a system that basically replaced a collection of, I think, 12 or 13 different independent PCs that were each monitoring, in the Operations Center near the offshore wind farm, various aspects of the operation of the wind turbines and other aspects of the farm.
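The "run multiple circuits in parallel" idea described above can be sketched very simply when the circuits are independent: each feeder solve is farmed out to its own worker process. This is a minimal illustration, not how any of the panelists' products actually work; `solve_circuit` is a hypothetical stand-in for a real per-feeder power-flow solve (for example, one OpenDSS case per feeder).

```python
from concurrent.futures import ProcessPoolExecutor

def solve_circuit(circuit_id: str) -> dict:
    # Hypothetical placeholder for one per-circuit power-flow solve.
    # Each call is independent of the others, which is what makes
    # the workload easy to spread across cores or cloud machines.
    return {"circuit": circuit_id, "converged": True}

def solve_all(circuit_ids):
    # Distribute the independent solves across worker processes.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(solve_circuit, circuit_ids))

if __name__ == "__main__":
    results = solve_all([f"feeder-{i}" for i in range(8)])
    print(sum(1 for r in results if r["converged"]), "circuits converged")
```

The harder, genuinely unsolved part mentioned above is parallelizing *within* one tightly coupled linear-algebra solve; the sketch only covers the embarrassingly parallel case of many separate circuits or scenarios.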
And we basically brought the data streams together in an integrated form and then built an application using modern mobile application development tools. Not only did it allow them to bring the relevant data together into a single display, which just makes life easier for the person on the frontline in the Operations Center, but it allowed you to begin offering views into the operations to people all over the company who heretofore had not even been able to engage in the conversation, because if they saw the data at all, they saw it in a monthly paper report that was 60 pages long, and nobody's got time to scroll through that sort of thing. And I think that sort of mindset, applying the lessons that we are all living around mobile technology to some of these problems, creates some really interesting opportunities for companies that are trying to figure out how to get started. So we use modeling internally. One of the big problems that we have is that we're integrated with 20 different devices, and how do you operate a real-time control system on 20 different devices? If you make an update, do you run it in real time, where 24 hours of testing takes 24 hours? That's pretty untenable from a software development standpoint in a lot of respects. So one of the modeling techniques we're using at Virtual Peaker, and one I used when I was at Berkeley before, is a technique called software-in-the-loop simulation, where we basically take the software that we're running in real time, wrap it in a digital simulation of the world, and then accelerate time so that you can make 24 hours happen in 30 seconds. And that's a great modeling technique we use internally to handle a lot of this data. So we're closing out on our time here. And I guess the question is, what will you be working on in six months, Astrid?
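The software-in-the-loop technique described above can be sketched as follows. This is a toy illustration under stated assumptions, not Virtual Peaker's implementation: the control logic takes its clock as a dependency, and the harness advances a simulated clock instead of sleeping, so 24 simulated hours run in a fraction of a second. The `ThermostatController` and the one-line thermal model are hypothetical.

```python
class SimClock:
    # A controllable clock: the test harness advances simulated
    # time explicitly instead of waiting through wall-clock time.
    def __init__(self):
        self.now = 0.0  # seconds since start of simulation

    def advance(self, seconds: float):
        self.now += seconds

class ThermostatController:
    # Hypothetical control logic under test: holds temperature in a
    # deadband around a setpoint. In a real SIL setup this would be
    # the unmodified production code, injected with the sim clock.
    def __init__(self, clock, setpoint=20.0, deadband=1.0):
        self.clock = clock
        self.setpoint = setpoint
        self.deadband = deadband
        self.heating = False

    def step(self, temperature):
        if temperature < self.setpoint - self.deadband:
            self.heating = True
        elif temperature > self.setpoint + self.deadband:
            self.heating = False
        return self.heating

def simulate_day(step_seconds=60.0):
    # Wrap the controller in a toy thermal model and run 24
    # simulated hours as fast as the loop can spin.
    clock = SimClock()
    ctrl = ThermostatController(clock)
    temp = 15.0
    while clock.now < 24 * 3600:
        heating = ctrl.step(temp)
        # Toy dynamics: heat at 0.05 deg/min, cool at 0.02 deg/min.
        temp += (0.05 if heating else -0.02) * (step_seconds / 60.0)
        clock.advance(step_seconds)  # one simulated minute per loop
    return temp

if __name__ == "__main__":
    print(f"temperature after 24 simulated hours: {simulate_day():.2f}")
```

The key design choice is that the production code never calls the real clock directly; swapping `SimClock` for a wall-clock implementation is all it takes to move between simulation and deployment.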
So for us, we've got projects underway right now which are non-pilot, basically full field deployments, in a couple of locations, for both distributed telemetry and also distributed control. So in the next six months or so, we're really making sure that we can bring effective large-scale analytics techniques into those environments, based on the data foundation that we already have, and maybe looking to enhance some of those research partnerships to bring in innovative algorithmic approaches in addition to the systems approaches which we are pretty comfortable with. One of our short-term goals is really building out that community. And then I think on the business side of this, there are a lot of really big open questions around the future utility business model that many utilities engaged in the space today are looking to address. Really beginning to comprehensively evaluate and feed into a conversation about what that might look like is a big goal for us. And so again, that's partly work that we do with customers, and partly also engagement with a broader community, kind of driving the conversation forward around: with additional technological capabilities, what should this business model look like? What should the ESO model look like? Are there lessons that we want to draw from existing deployments as this is starting to roll out in the UK and Australia? Are there things that we want to pull into a standardized model? That's a broader conversation that is much larger than just us, and I think it's really timely. And so in the next six months, really starting to get much deeper into that conversation is going to be a goal for us. Yeah, I'll be brief. We're really excited about a project that we're just standing up today at Portland General Electric, doing a bring-your-own-battery program with a bunch of different battery manufacturers.
It's our second deployment of this type of battery program, where it's basically fully controlled by Virtual Peaker. All the messaging and everything else is deployed within Virtual Peaker. And we're really excited about having that up and running in six months and adding new batteries every day. Got it. And in the final 30 seconds? Yeah, sure. So for us, I think what is really interesting is that, building on the foundation that I was describing earlier, we are now starting to reach cross-industry. It's one thing to look from the perspective of a grid operation company at what the expansion of electric vehicle charging means. But what if you could begin relating with the automobile manufacturers and the data sets that they're gathering? How could that data, combined with the charging service operator data, be used to create a much more effective model for demand and supply management? And so we're very excited about those kinds of opportunities to start to reach cross-industry, because in the energy world, it's all connected, and the data is gonna be the currency that allows us to see what's coming and understand how to optimize. So I think that's the focus area for us. Fantastic. Well, I wanna thank each of you for taking the time. I found this to be a very illuminating panel, and I hope the audience appreciated it as well. So thanks for the time. We're gonna share all your contact information with the audience, so if they want to follow up with you individually, they can. And I will just end with a thank you, and tune in next week, where we will have another panel in our Digital Grid Summer Webinar Series. So thanks very much and have a good day. Thanks, Steve. Yeah, thanks a lot, Steve, for moderating. Great job. Appreciate it. Thanks everybody.