I only have about 45 minutes to try to give you an overview of what I consider the next generation of what we're investing in, and what we'd like to see evolve in a collective community of analytic tools. Many of these pieces are already being worked on in lots of different areas, and I'm just going to try to pull it all together into a storyline that I hope is as exciting for you as it is for me and the colleagues I work with. NREL, for those of you who are not familiar, at a quick glance on this slide: it's give or take 2,000 employees and runs at $400-plus million a year. It's one of the youngest national laboratories in the DOE complex, of which there are 17 or so, and we're located outside of Denver, in Golden, Colorado, where this morning it was a chilly 30 degrees or so; we've already had a few snowstorms, so enjoy the weather here. The landscape for NREL, very quickly on this slide, covers everything from basic materials through devices and systems and integrated systems of systems, and I use that language because that's the storyline I'm going to walk through in terms of where the interplay between data, computing, and analytics is moving. I think it's really important to understand the landscape of opportunity, let's put it that way, across that whole landscape of science and markets and policy and regulation.
We formed a new directorate last year, putting together our scientific computing directorate, with a supercomputer that's just now being upgraded to a new one, and our energy analysis directorate. That really brought to front of mind how we more mindfully think about using scalable analytic capability, scientific computing, and the data we are now inundated with in order to improve the fidelity of, and the insights from, our energy analysis. You're going to see a broadening of my definition, or the definition, of energy analysis, which four or five years ago, when I came and gave a talk, was pretty focused on energy-economic models and models that would help inform policy, regulatory, and finance decisions. What's interesting today is that with the scaling capability of computing we can now think of actually linking physics-based models, i.e., doing the computational work on materials, all the way up through our energy planning and economic models, and pulling them together in a much more comprehensive way. That allows us to inform the science by asking questions about the markets, or to inform the markets and the policy and regulation by understanding the science better and doing the computational, or thought, experiments on the science side. This is not a very sophisticated graph, but I actually kind of like it. It was the outcome of a conversation between myself and one of our graphics team, and it tries to capture the complexities between data, technology, markets, molecules, devices, systems, and more importantly systems of systems, and how all of that plays together, and of course people, right? We had, and continue to have, whole programs on what I call human-centered innovation, which is thinking about energy not as an electron or a molecule but really as an enabler of services: what we all care about, heat, cooling, food, clothing, transport, mobility, connectivity, et cetera.
None of that really operates without energy, and particularly in our world, without modern energy. So I'm going to give you a bit of a landscape, an overview across that whole span from molecules to markets and from simple pieces all the way to integrated systems, and hopefully lay out a roadmap of what we're working on and where we're going. A lot of this is going to be very high level; if we want to dive into details, I can send you to papers, or we can have a side conversation afterwards. Here you can see what we're already doing with computational analytics. This is just one example, on perovskite solar cells; I think there's also some very complementary work here on this as well. We're doing the computation on ligand structures and molecular structures and then using that to advise the physical experiments: where to go, and what type of chemicals to put into the perovskite mixture, which of course has to be compatible with the roll-to-roll processing and the painting processes people are working on to bring down the cost. This particular example is about new hole-transport materials, but all the different layers go through the same approach. Again: computational chemistry advising the science on where to actually do experiments. And then what we also do is pull the new potential performance and cost structure of that cell all the way into an energy-economic model, and I'll show you examples of a couple of those. Here are two more examples, again computational chemistry on the physics and molecular side: candidate alternatives to fiberglass and that kind of structure, and, maybe more importantly, ligands, molecules extending off silicon nanocrystals, in order to control the properties of the silicon nanocrystals.
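The "computation advising the experiments" loop described above can be sketched generically: score a library of candidate materials with a cheap computed predictor, then shortlist the best few for physical synthesis. The candidate names, property values, and the weighting in the score are entirely made up for illustration; this is not NREL's screening code.

```python
# Illustrative sketch of computational screening ahead of lab experiments:
# rank candidate hole-transport materials by a cheap predicted score and
# shortlist the most promising few for synthesis. All values are invented.

def shortlist(candidates, predict, k=2):
    """Return the k candidates with the best (lowest) predicted score."""
    return sorted(candidates, key=predict)[:k]

# Hypothetical candidates: (predicted energy-level offset in eV, cost in $/g)
library = {
    "HTM-A": (0.35, 12.0),
    "HTM-B": (0.10, 45.0),
    "HTM-C": (0.15, 8.0),
}

def score(name):
    """Toy objective: small level offset, lightly penalized by material cost."""
    offset, cost = library[name]
    return offset + 0.01 * cost

print(shortlist(library, score))
```

The point is the workflow, not the physics: the expensive step (synthesis) only happens for candidates the cheap computation already favors.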
All of that is with an idea of how you do computational design of materials for the properties you care about, with the economics to get into the marketplace. So again, a more expansive definition of energy analytics on this front. Here are a couple of slightly more system-oriented examples, and I'm going to talk more about the wind piece as we go through the talk as well. On the left is computational analytics of E25, a 25% ethanol blend in gasoline, in a co-optimization routine where you look at sub-millisecond computational fluid dynamics of the flame inside the chamber, inside a cylinder, and ask how you maximize the energy out of that blend. On the right-hand side, again, very high-powered analytics behind wind farm optimization. I'm going to spend a little more time on that at the end, because it's a nicer, and frankly easier, link to understanding the market potential and the policy and regulatory side of the equation. That's all a very high-end, exascale-class computing opportunity; it's not quite at exascale yet, it runs on petaflop computers, which we have, and it's really about new insights into wind farms. I think that one is actually pretty tangible for many of us to understand. We continue to produce resource information at much, much better fidelity, both temporally and spatially. This is about taking basic fundamental data that comes off of satellites, downscaling it, or processing it down, to sub-kilometer levels, and being able to provide that data at five-minute, fifteen-minute, hourly, et cetera, whatever temporal scale people need. In this particular case we map this for a given region, and we create supply curves that then go into a combined analysis of what technologies those resources might support, i.e.
photovoltaics, concentrating solar power (solar thermal), and then what that actually looks like economically in a, quote, localized and temporal supply curve. This is an example of doing that for wind. Those who work in the detail here will be familiar with WRF modeling: global climate models worked all the way down, for a given year, to a multi-kilometer basis, which is not as good as what we'd really like. We'd really like the mathematics to downscale to hundred-meter spatial resolution, particularly for wind. Those mathematics aren't there yet; that's what we're going to work on with a bunch of others, and the same goes for the temporal side. So wind is a prototypical example of why we continue to look at advanced analytics, and why we need to understand the resource base in combination with what's going on technologically. Most of you are hopefully familiar with this upper-left graph, which shows the scaling of wind over the last several decades. I didn't put a cost structure on here, but wind has decreased in cost by 90% over the last few decades at least. And now you see the wind blades, and one of the biggest problems with wind turbines right now, particularly onshore, is that the ideal size of a wind turbine can't actually be transported: you can't get it around a corner, you can't get it over a bridge or underneath a bridge. So now we have to think about co-optimized manufacturing of wind at a given wind plant, which means new materials and new processes for manufacturing at different heights. Why different heights? Because ten years ago, when we started the real energy-economic dialogue about the potential for clean energy, particularly in the United States but in most other countries too, we put up a map like this. 80 meters was the hub height that we thought wind was really good at, and it was a great map.
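The supply-curve step mentioned above can be sketched in a few lines: given candidate sites (or grid cells) with estimated capacity factors, price each one and sort from cheapest to most expensive energy. The cost assumptions and the simplified LCOE formula below are illustrative placeholders, not NREL's actual methodology or data.

```python
# Hedged sketch: building a localized supply curve from gridded resource data.
# Capex, fixed charge rate, and O&M numbers are illustrative, not NREL values.

def lcoe(capex_per_kw, fcr, fom_per_kw_yr, capacity_factor):
    """Simple levelized cost of energy in $/MWh (no fuel or variable O&M)."""
    annual_mwh_per_kw = capacity_factor * 8760 / 1000.0
    return (capex_per_kw * fcr + fom_per_kw_yr) / annual_mwh_per_kw

def supply_curve(sites, capex_per_kw=1500.0, fcr=0.07, fom=40.0):
    """Sort candidate sites by LCOE and accumulate capacity.

    Each site is (capacity_mw, capacity_factor); returns a list of
    (cumulative_capacity_mw, lcoe_usd_per_mwh) points, cheapest first.
    """
    priced = sorted((lcoe(capex_per_kw, fcr, fom, cf), mw) for mw, cf in sites)
    curve, cum = [], 0.0
    for cost, mw in priced:
        cum += mw
        curve.append((cum, round(cost, 2)))
    return curve

# Three hypothetical grid cells with different wind capacity factors
sites = [(100.0, 0.35), (250.0, 0.45), (50.0, 0.25)]
print(supply_curve(sites))
```

The same structure is what makes the hub-height story below so consequential: raise the hub height and the capacity factors across the grid cells shift, and the whole curve moves.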
There's a huge resource potential in the United States, thousands of times more than our average energy consumption, ignoring the temporal aspects of all of that. But that map had a really interesting implication when you showed it in Washington, D.C., or in any given state, which was: oh, I don't have wind in my state, why should I care about that? Well, fast forward to 2014, and if you look at the appropriate hub height, 110 meters, you see a very different map, you have a very different conversation, and in fact you have very different economics as well. And that's 2014; if you now look at 2018, at more modern or near-commercial turbines at 150 meters hub height, it's incredibly different again. In fact, the resource base quadrupled from 2008 to today's estimate: it doubled going from 80 meters to 110 meters, and doubled again from 110 to 150 meters, which completely changes the economic calculus, and it changes the geopolitical and political-economy calculus and lots of other pieces. There are lots of other things happening on the wind side to help support grid integration as well, and this is one of them. I don't know if I trust this thing to work; let's see if it does. Okay, so this will not loop, but let me just describe what it is. This is a blade-resolved calculation of the wake of a wind turbine, a calculation over six billion points at sub-second resolution, so this is basically exascale-class computing. I can go backwards, let's see, maybe I can go backwards and show it one more time while I describe it. Doing this is actually seen as one of the grand challenges of using very, very scalable computing, because this takes many thousands of node-hours on a petaflop computer, and it's one turbine. What we really want to do is model not just one turbine but full wind farms. And why do we do that? Because if you remember, as a kid, you used to put
your hand out the car window, or if you've been in an airplane when it crosses another plane's path: wake effects are really important. And wake effects in a wind farm, you would have thought we'd have thought about this 15 or 20 years ago, right? Rather than put wind turbines back-to-back, directly in line with the wind direction, and at exactly the same height, which, as you think about it now, was pretty obviously going to mean the first turbine affects the second, and the third, and the fourth, and the fifth. In fact, by doing the calculations we know a turbine affects other wind turbines about three kilometers downstream, and you can see some of these pictures; there have been pictures on the web of the first row of wind turbines affecting multiple turbines behind them, visible in the clouds off some offshore wind farms. The opportunity here is to optimally design wind plants in 3D. So now we take our 2018 wind resource, and in fact we can completely rethink what the wind resource might be, because we might be able to get four, five, six, or seven percent more energy out of a wind farm than we thought. That doesn't dramatically, but pretty significantly, changes what those economics look like, particularly for developers and others; more importantly, it really changes the engineering paradigm of how you design wind farms. And that's not the only question. The other question we have to ask is: if we're designing these plants to last for 25 or 30 or 40 years, what do the patterns of energy and weather, and climate-induced weather changes, look like in 25 or 30 or 40 or 50 years? So we started a whole series of work looking at projected resource information. Projected resource information, rather than recent or historical measured and validated information, is a much more challenging task to do with depth and fidelity and rigor, because you now have to go look at an ensemble of
climate scenarios and an ensemble of different global climate models, and extract from there the wind information, solar information, et cetera, again looking just at those resources and not others. This slide shows the different aspects you have to think about: we normally think about heating and cooling degree days and so on, and now we have to start thinking about resource information and how much it really changes, whether it's material or not. You can see the layout we've done here across five or six studies that are either out or in the process of coming out over the next few months, looking across a whole series of GCMs, though actually only out to 2050. The team chose an RCP of 8.5, hopefully people know what that language means, and if not you can ask me afterwards, but out to 2050 the RCP scenarios don't really diverge very much. So the real question is what happens to the generation profiles, either operationally and/or as information for planning. Here you see one quick synopsis of an output. These are values in gigawatt-hours, but let me give you a relative scale: this is less than 3% of the total output of the whole system. This was done for ERCOT; it looks at a planned system in ERCOT with something like 30 or 40 percent variable generation, wind and solar. You can see the variability across the GCMs, because they use different tools and techniques and methods, and then the median, which is the bold line in the middle. So we're trying to understand this with more and more fidelity. Interestingly, a paper that will be coming out relatively soon in Science Advances looks at the long-term climate implications of Indian Ocean circulation for the wind patterns over continental India, which I think is quite interesting to think about. And I think more conversation among the energy analysts
and the climate analyst community is one thing that's squarely on our radar, to really understand both the historical record and the future projections. So, I mentioned very briefly that we take all the molecular pieces we're doing in terms of computational chemistry and understanding materials, and then feed them into energy-economic models. This is one of the classic models out of NREL, called the System Advisor Model. It's free software; you're welcome to download it and have fun playing with it. It's actually quite powerful: it will do hourly or sub-hourly calculations of system configurations under different business models and give you lots of different parameters, and in fact you can play with every parameter in it if you so choose. Here it shows a bunch of different pieces on PV, but most importantly, we're now improving the fidelity of what's in that model, going from the cell to a module to a system, and there are still some engineering-calculation challenges in doing that, so we're improving it as we go forward. I'm not going to read the details of the slides, but you can see the modeling of temperature and interconnection: how the modules are interconnected in a system does affect overall system performance and output, and that temperature work carries forward into the climate modeling as well. So I think it's important to think not just about the next two to five years: if that PV system, which we're striving to take from a 25-year life to a 50-year life, sees an increase in ambient temperature, what might that look like for system performance out that far? So that's an example of where this fits in. The other piece we're doing, which I think most folks here would hopefully find pretty interesting, is adding the whole plethora of battery chemistries. Batteries tend to be one of the sexy topics of the day, or of
the decade, or at least of a few years. There's huge potential in given applications, and certainly in providing some critical services to the grid, particularly as it takes on more and more variable generation, and there's a whole bunch of opportunity to look at the interplay, particularly between batteries and photovoltaics, but also battery hybrid systems. We've produced a number of reports over the last year or so looking at battery hybrid configurations, including batteries coupled with gas turbines. There are a couple of examples here in California where that actually helps with black start and with very short-duration system services, which have been identified as potential market opportunities for real revenues. Those revenues haven't come online as strongly as people had hoped yet, but the potential is there for them to grow further. Now let me stand way back out: from molecules to systems, and now to global system dynamics. This is just one chart out of a whole series of reports we've done on supply chain dynamics, work that started four or five years ago under a program called the Clean Energy Manufacturing Analysis Center. We were asked to start generating metrics that could effectively inform dialogues in the same way a barrel of oil or a million BTU of gas does: how do you think about trade flows, and the dynamics of trade flows, of clean energy supply chains in some equivalent fashion, whether it's photovoltaic materials, cells, or modules; whether it's lithium and cobalt, just the materials; or whether it's battery cells, or battery packs, et cetera. So this is one example: imports and exports defined in value-added terms, which is an interesting economic calculation you can walk through, and we've done this for a number of different pieces, again trying to inform a dialogue about what's happening in those supply chains. So that's stepping way back out.
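The value-added trade metric mentioned above can be made concrete with a toy calculation: the domestic value added in a gross export is the export value minus the imported-intermediate content embodied in it. The supply-chain numbers below are illustrative inventions, not data from the CEMAC reports.

```python
# Hedged sketch of the "value-added" trade metric: the portion of a gross
# export actually created domestically, i.e. gross export value minus the
# imported-intermediate content embodied in it. All numbers are illustrative.

def domestic_value_added(gross_exports, imported_input_share):
    """Value added at home = gross exports * (1 - share of imported inputs)."""
    return gross_exports * (1.0 - imported_input_share)

# Hypothetical PV module supply chain: $100M of module exports, where 60% of
# the input value (cells, glass, encapsulant) was itself imported.
modules = domestic_value_added(100e6, 0.60)
print(f"domestic value added: ${modules / 1e6:.0f}M")
```

This is why gross export figures alone can mislead in clean-energy supply-chain dialogues: two countries with identical gross exports can contribute very different shares of the underlying value.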
Just one more piece of analytics on that: the interesting opportunity, where we're still in early-stage dialogue, is with a whole series of blockchain companies who are thinking about provenance of all of these materials, all the way through to recycling and reuse. It's very interesting, because that's something they could do, and now we're thinking about what that looks like for clean energy materials and supply chains, and eventually for recycling and reuse opportunities as well. Let me spend a few minutes on sustainable transportation. I mentioned an example of this earlier, but let me pull it back up: this is again very high-fidelity computational fluid dynamics, running on a petaflop supercomputer, looking at the co-design of engine components and fuel blends, the fuel blends being either ethanol and gasoline, or biodiesel and biodiesel equivalents and diesel itself, together with the engine designs themselves. It looks both at the combustion side and at the flow of products out. I don't know if you can see it at the top, but those are actually CO2 streams after the combustion, and that's the combustion pattern itself. This runs at very high fidelity, sub-second, over many hundreds of millions of data points. So again, thinking about that pathway for vehicles, that's the materials-and-devices side of the equation. Maybe more importantly, or more traditionally, there's a whole lot of analysis on the potential adoption of alternative-fuel vehicles, including electric and connected autonomous vehicles. I'm going to show you a couple of examples where we're trying to push the limit on how we use data to understand where to optimize. I'll just show you here: where do you place electric vehicle charging infrastructure, and at what scale? There's a lot of debate about whether or not
you should be putting in Level 1, Level 2, or Level 3 charging, and what challenge might come from multiple vehicles charging uncontrolled at a Level 3 station. You could think of, you know, three vehicles just popping up and saying, oh, I'd like a Level 3 charge; that might be a megawatt spike on a distribution line, which is likely not going to be handled very well by the utility company. But interestingly, we're thinking about how you create OpenTable-style reservation systems for vehicles, which might be built right into the connected, electrified vehicle: I need to charge, where am I, let me look for the charging station with an open slot, and that open slot is already in communication with the system operator, so they know they have to provide power at that particular place at that particular time. And then you've got charging, and regulation, and tariff structures that go with all of that. There is a tool online called EVI-Pro Lite; there's another tool, EVI-Pro, which is a bit higher fidelity and is not online. And then of course you're sitting in the middle of the innovation cycle for transportation network companies, the Ubers and Lyfts and others of the world, with lots of different dynamics on that front, and of course looking at their impact on different places. So this is just a couple of graphs of what that looks like for an airport. We're just about to enter a big program with Dallas Fort Worth Airport, looking at millions and millions of passenger data files and mobility files, both inside the airport and in the transport to and from the airport, which is give or take five or so million passenger trips a month, if I remember the number correctly; don't quote me on that. And they're really interested in what
electrified, autonomous vehicle mobility services will mean for them, and how they should think about that. Why? Because guess what one of the biggest revenue streams at an airport is. Anybody got a good guess? Parking. Yeah. Pretty fascinating. So we can blend all of that with architecture and community design and think about optimization of communities; this is just a couple of examples of how to go about doing that. In our world, every single building is modeled for its own energy system, but also as an element of an integrated energy system, energy meaning not just electricity but also mobility, possibly natural gas, possibly water as well. These are wonderfully complex problems with very high computing requirements as well as data requirements, and you can work with communities to collect that data; we've got a couple of ongoing programs, both in Denver and with some colleagues outside of Chicago, on that front. So, Jim, how am I doing on time? Nine minutes. All right, I'm going to quickly run through systems and systems of systems. Most folks are hopefully, well, maybe not, familiar with the work we've done in power systems, and I'm going to run through why it's so important to take a systems view on this. This is just a map of a bunch of our models: everything from agent-based models, to a very large-scale capacity expansion model, which is essentially a least-cost optimization of what your power generation profile looks like in the future, to models that let you look at the operational aspects, how does the thing work: from an electrical engineering side, you need to know how it actually maintains voltage and frequency stability, and also how it serves load. I'll do this really quickly: we've done some really creative mathematics to decompose some of these problems so they can actually run on a computer, and there are lots and lots of studies that you're familiar with, lots and lots of models that
actually lay out different grid infrastructures in the United States, either, I'll call it, traditionally, or now looking at HVDC interconnections, which have been looked at on and off for probably 30 or 40 or 50 years, maybe even longer, and rethinking what a U.S. grid could look like and how that might affect the cost, reliability, resiliency, et cetera, of that grid, with lots of different questions about how those interplay. What we used to do was sit in a room with a whole bunch of deep thinkers and say, wow, how do we explain this to an audience, how do we work through this, and we would come up with, I don't know, a dozen, maybe 15 or 16 scenarios, like we do in EMF, which can be well structured and articulated. What we've done instead is say: what if we took all the parameters, or as many of the parameters as we think is workable, and just let them float within a given region of what we think is reasonable? That would be cost and performance, natural gas prices, et cetera. We do that because we can then run ensembles of scenarios, and that's what this is: tens of thousands of scenarios from our model called ReEDS, a national electric-sector capacity expansion model, across a huge open space of technology cost, natural gas prices, storage capabilities, et cetera. Then you can start to evaluate what that all looks like, and what insights you might get by looking at some of the extremes, or certainly at some of the trade-offs. I think this is a direction where we really want to work with the analytic community: moving to ensembles, and robust insights from ensembles, rather than very specific little suites of scenarios, which always get picked apart by reviewers, since many of us are the reviewers of those journal articles. The other thing that's hot on our radar is resiliency, and actually thinking about how do
you economically model resiliency. A very smart young economist intern who's been working with my group and others has come up with a resiliency cost function, and importantly, it's a cost function of time, which matters because if you think about the potential impact of an outage, it is a function of time, right? A refrigerator has a different outage cost function than your cell phone does, or your electric vehicle charging station, or your gas station. Five minutes? Oh boy, I've got to go fast. All right, let me see if I can get this to run. Uh-oh, the movies aren't running. Oh no, there we go, let me see, is it running? Bummer. All right, sorry. What we've been able to do is look, locationally and temporally, at the economic loss as a function of time, by giving different loads different cost functions over time. And the next graph, this one here, I'm sorry it won't run, shows you not only where the loss factor is but, when it gets into this, so this is an outage: the orange elements are outages, the blue is the loss of load, and the green shows you where a generator will turn on if it has foresight into where the economic losses will be, and of course it knows where the outages are. So in this test grid you can actually work with a system operator to optimally minimize the economic losses of an outage. This is just a first start, an early project, but it gets the economics running, and it's actually pretty insightful to think about as we go through and think about other pieces as well. We also do very large grid integration studies. I'm not going to show you a video of this; there's one up on our YouTube channel. It looks at five-minute grid analysis of, well, two different parts of the country, time-synchronized, looking at different scenarios of wind and solar, again looking at reliability and economics on
that front. I put too many in here; this is actually the same thing for India, and I'm not going to show you that video either, because there are too many of them. So we've been able to do that, and, well, I have five minutes, so I want to show you what we're doing with some visualization as well. That 3D optimization of wind farms we actually put into a 3D visualizer that you can use in augmented reality, or you can put on virtual reality goggles, or you can walk into a big cave about a third of the size of this room and gain some physical insight as a scientist. This is actually inside perovskite solar cell materials, again combining computation and visualization. This one is actually 11 dimensions of a power system, voltage and frequency, looking at how unstable they become when you introduce cloud cover over a multi-megawatt solar system on a distribution line. All of those insights, combined with visualization, help the scientists, and they also help the decision makers really work through it; again, it's that combination of analytics, computing, and visualization. There are a bunch more examples, but in light of the time I'm going to stop and entertain questions. I always put this up to be reflective, because while I talked a lot about science and economics and engineering, I think most of us know, and our political economist and social scientist colleagues always remind us, that it's about science in perspective, our solutions in perspective. So thank you, thank you for the invitation, and thank you for allowing me to give you a random walk through everything from molecules to markets, and through where we're going with energy analysis and computing together for the next phase of work. I look forward to working with you. We have about 10 minutes for questions, and we'll start with students first; those of us who are lifetime learners can ask the
questions later. Anybody? Yes, Mark. Yeah, thank you for the great talk, very interesting. I was wondering: as you integrate systems, you integrate different scales, both on the physical dimension and also in the temporal dimension. Can you comment on whether you see any limits on insight in terms of having more temporal resolution and more physical resolution? As an example, if we have a capacity expansion model, does it make sense to have second-to-second information, or is it enough to have an hour or half an hour? How far should we go with computation, basically? And the same on the physical side: should, say, wake models be included, in the long term, in these large-scale expansion models? How far do you see insights coming from the different scales? Yeah, it's a really good question, so I'll just repeat it: how do you think about temporal and spatial resolution for the different types of questions and different types of models you're working on? The answer is that what we're trying to do is create a modeling architecture which is agnostic to both, well, agnostic in the sense that it's scalable in multiple ways: temporally, spatially, and computationally. So if you have the data and the computational power to run, let's say, a minute-level or five-minute-level production cost model for a year, that could run in parallel or in sequence with the capacity expansion; that's a possibility we're working toward. But you don't have to do that all the time. A lot of times what we do is learn from the operational models and then put, I call it, stylized or approximate statistics into the capacity expansion models, and allow them to use representative hours, days, or weeks within their modeling infrastructure. But the orientation going forward is scalable, integrated architectures for the computing, with interfacing layers, so that if you're
running capacity expansion and you get to a certain point in time, what you may do is automatically call out and say: okay, run a production cost model now and tell me whether everything is going to work well under that given scenario. Same thing for the electric and gas interface, et cetera. Good question.

Thank you for the wind resource map example; that was really good. (Please try to speak as loudly as possible, because you don't have a microphone.) Sure. Yeah, thank you for the wind resource map example; I thought that was a pretty good example of how science can inform policy. So what do you see as NREL's role in informing policy moving forward, and do you see it changing in the near future?

Our role has been, and will continue to be, developing world-class data, analytics, technical insights, and the tools and capabilities to help decision makers, whether that's policy or investment or a regulator, answer those questions. So: credible, objective analytics with best-in-class data and tools. That stays pretty consistent; it's been our mantra for the past decade, and it will continue to be. Patricia?
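The interfacing layer described above, in which a capacity expansion solve periodically calls a finer-resolution production cost model and folds the result back in as a planning constraint, might look roughly like the following minimal Python sketch. Every function name, the toy flexibility rule, and the numbers are illustrative assumptions for this sketch only, not actual NREL tool APIs.

```python
# Sketch of a planning/operations interfacing layer: a capacity-expansion
# loop that periodically calls an operational (production-cost) check and
# feeds any shortfall back in as a constraint on the next planning solve.
# All names and numbers are illustrative stand-ins, not NREL tool APIs.

def expand_capacity(year, constraints):
    """Stand-in for one capacity-expansion solve (e.g. a ReEDS-style LP)."""
    return {"wind_gw": 10 + year, "solar_gw": 8 + year,
            "storage_gw": constraints.get("min_storage_gw", 0.0)}

def production_cost_check(build):
    """Stand-in for an operational model run at finer time resolution.
    Returns the flexible-capacity shortfall in GW (0 means feasible)."""
    needed = 0.2 * build["solar_gw"]   # toy flexibility requirement
    return max(0.0, needed - build["storage_gw"])

constraints = {}
for year in range(0, 20, 5):                   # at every fifth model year...
    build = expand_capacity(year, constraints)
    shortfall = production_cost_check(build)   # ...call the operational model
    if shortfall > 0:
        # Operational insight becomes a constraint on the next planning solve.
        constraints["min_storage_gw"] = build["storage_gw"] + shortfall
    print(year, build, f"shortfall={shortfall:.2f} GW")
```

The same pattern generalizes to the gas interface mentioned in the answer: the planning model pauses, hands its candidate system to a detailed operational model, and resumes with whatever constraint that check produces.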
Yeah, so I thought it was really cool that you showed that map of all the scenarios you ran with ReEDS, which is clearly state of the art and much better than running a few haphazard scenarios, but it obviously creates a lot of information that could be hard to digest for a policymaker, a decision maker, or really anyone. So I'm wondering if you could speak to what you at NREL are doing about how to present that information in a usable way.

It turns out I didn't show you the other half of that one. We've actually used all those tens of thousands of runs to train some neural network models on the outputs. You can embed those neural networks, which have a pretty low error relative to the outputs of the model because it's a linear program, in a web page, so you can go and ask what those scenarios look like and actually see the outputs. That's the techie side of the equation. On the how-do-you-communicate side, there are some pieces for informing science, in terms of decision trees and ladders and things like that, that we've done. And for what I call the non-quantitative decision makers, a lot of it is about thinking through the 2D or 3D visualization space and how much information you actually provide: here's a swath of potential outcomes that are more influenced by natural gas, or here's a trade-off between natural gas in a system and storage in a system, because, at least in some tooling, they're providing similar services to the system itself. So a lot of that is a bit of art as well as science. It's a really good question, and more data is more confusion.

Just to follow up: what is the website where this lives, and is there any publication or reference associated with the neural net? There is; I can get it to you afterwards. Okay.

Just curious: are you managing your own
computer resources, or are you using scalable cloud computing from another provider? We do both. We had a computer called Peregrine, a two to two-and-a-half petaflop machine; it's currently being replaced by the next generation, which is about nine petaflops, on site. We also use cloud computing; it's an interesting balance. Storage is mixed as well, on site and off site. Procured data sets that we work on really hard we keep, but then we provide them to the general public and others as well. What we're looking at more, again, is scalable architecture: we're working with some of the bigger hardware labs that are really in the Exascale Computing Project, to think about how we create grand-challenge applications that are scalable to the exascale. So we're probably going to stay at the tens of petaflops at our lab, at least for the near term that we can think of, but we really want to be working with others that have that order of magnitude of scalability on top of it.

How much of the computing needs to be specialized for the simulations you're doing, versus general purpose? It's all multi-purpose: GPUs and graphics accelerators.

Okay, now I'd like to open it up to everybody. So, about 20 years ago I used to design and help sell supercomputers, and I visited your neighbors at NCAR up the hill, and we got into a long discussion about different problems, where the three different challenges might be getting the data, knowing the science, that is, being able to write the equations, and having enough compute power to do it, right? Is that a decent model for the challenges you face, and if so, could you give a few examples of which one of those is the challenge? Clearly you want more compute power; I said that was what I could help them with. It's a good way to frame it; I'm not sure I've thought of it in those three pillars. I would say we are
swimming in data. The question is whether that data is high enough fidelity and whether it can be useful or not, so it's much more about quality and usefulness than about volume. Where we do have a challenge is, again, on the forecasted resource and other information; as I suggested, we don't have the mathematics and the compute power to do that as well as we need to, so that's a place where all three of those play. But otherwise, if we're looking at five- or ten-year infrastructure development and policy analytics, we have more than enough data, and the question is how you really think about using it to gain good insights.

Can we expand on that a little bit? Do you see a fundamental difference among those things between, say, studying human behavior in the system, asking how people are really going to behave, versus a molecular structure, where the molecules, given the same forces, always behave the same? Is one a place where fundamentally getting the equations right is more of a challenge, and the other where getting the math right is more of a challenge, or don't you see a distinction that way?

So I think there are differences; I would maybe describe them slightly differently. I'd say writing the physics and chemistry and the co-optimization algorithms of the physical side, from molecules up through devices and the physical layer of systems, is actually "relatively straightforward," and I use that in quotes; I mean, for those who are so inclined, it's relatively straightforward. On the behavioral aspect, certainly institutional decision making and even the energy services side, we collectively don't do as good a job as we could. We've had 10 or 15 years of pretty decent experience with, you know, Google PowerMeter and thinking about all sorts of behavioral barriers,
shall we call them, and I think we acknowledge them, but we really don't think more deeply about how to characterize them and how we could actually change them. So presumably, then, your research strategy is very different between those types of things? They are, yeah.

Other questions? What type of data analysis or software development is NREL currently doing in the district energy space, on systems such as Stanford's heating and cooling system? So, at the district scale, I showed you one graphic of where we're trying to do co-optimization across buildings, people, mobility, et cetera. We have a pathway where we've invested in, again, pulling together our physics-based building models; our models of the energy infrastructure in those buildings, which tend to be called energy management systems, for lack of a better term; and then the microgrid of the district in combination with the macro grid, including all the mobility aspects as well. The architecture of that sits underneath a program, or a framework, called URBANopt, building off all the work we've done in building optimization as well as on the microgrid and distribution side of the power system. We also have a whole research program that I didn't talk about here, which is the electricity-side equivalent of thinking about connected autonomous vehicles and resiliency in the mobility fleet. We're thinking about how you work toward autonomous energy systems, electricity systems in particular, that could be auto-islandable, highly resilient, and able to reconnect when they need to, et cetera. That has a whole pathway around distributed analytics, distributed computing, and distributed controls that I didn't talk about here.

Let me do one last question, and then I have a special short question with a short answer. It looks like a very powerful set of tools for getting us through the great transition. My question is kind of
a broad policy one: what do you think the prospects are for diffusing this to the people who really need it, people in China and India? Are you connecting well with institutions, commercial institutions, government institutions? Do you think they're going to get these tools in time?

Yeah, so I showed one example; I didn't actually show you the video of us doing this type of work in India with the counterparts there: the Ministry of Power, the grid operators, the states, et cetera. They are getting these tools; they really understand them, and they embrace them, with a fair amount of up-front work to gather the data and create the modeling and tools. I would say the appetite is there. We're certainly working with institutions in India, China, and about 35 other countries pretty intensely. I always take the approach, in fact I was talking to a whole series of African energy ministers and utility executives last summer, and I gave them kind of a similar talk, maybe not quite so techie, that there's no reason for you not to use these best-in-class tools. The data is either available, or you have it, or we can get it; we can work on the resource data. The tools themselves are not pie in the sky, out of reach, these days. The computing power, either you have it or you can access it via the cloud. It's really about building capacity. So I guess my plea back is: please continue to support, you know, students and graduate students, send them our way, and create an international network of folks who really want to work on these systems and bring these tools and capabilities to every country and every region, because they're available. It's really about getting the data collected and online, but it's a human capacity issue.

Okay, and at that point I've got to end with a short editorial. For the last almost dozen years I've been on the external National Advisory Council of NREL; that ends in
December. But what I've been able to see is what's going on there, and a bit of advice to the students here: it's a damn good place to go to work, either as an intern or as a first job, because they're asking really important questions to move us toward the clean energy future, and they get really good people who can use top-notch analysis, empirical techniques, and computational techniques, so you're not limited in what you can do intellectually. So if you get a chance to go there for your next job, go do it. And at that point, thank you, Doug.
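A short technical postscript on the scenario-surrogate idea from the Q&A: one can train a small neural network on the inputs and outputs of many capacity-expansion runs, so results can later be queried interactively (for example from a web page) without re-solving the model. The sketch below uses only NumPy and synthetic stand-in data; the input features, the output quantity, and the tiny hand-rolled network are assumptions for illustration, not the actual NREL pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for scenario inputs: gas price ($/MMBtu),
# PV capital cost ($/kW), annual demand growth. Real training data would
# be the inputs of tens of thousands of capacity-expansion runs.
X = rng.uniform([2.0, 600.0, 0.0], [8.0, 1800.0, 0.03], size=(1000, 3))

# Synthetic stand-in for one model output (e.g. a solar generation share).
# Because the underlying model is a linear program, outputs vary
# (piecewise) linearly with inputs, so a small network fits them well.
y = 0.05 * X[:, 0] - 0.0002 * X[:, 1] + 5.0 * X[:, 2] + 0.4

# Normalize inputs so one learning rate works for all features.
Xn = (X - X.mean(axis=0)) / X.std(axis=0)

# One-hidden-layer network trained by plain batch gradient descent.
H = 16
W1 = rng.normal(0.0, 0.5, (3, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, H);      b2 = 0.0

n, lr = len(y), 0.05
for _ in range(5000):
    h = np.tanh(Xn @ W1 + b1)        # hidden activations
    err = h @ W2 + b2 - y            # prediction error
    # Backpropagated gradients of the mean squared error.
    gW2 = h.T @ err / n; gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h ** 2)
    gW1 = Xn.T @ dh / n; gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

rmse = float(np.sqrt(np.mean((np.tanh(Xn @ W1 + b1) @ W2 + b2 - y) ** 2)))
print(f"surrogate training RMSE: {rmse:.4f}")
```

Once trained, the weights are just small arrays, so a surrogate like this can be exported and evaluated in JavaScript or any web backend, which is the spirit of embedding the trained network in a web page.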