Welcome back everybody, Jeff Frick here with theCUBE. We are live in San Francisco at Pier 48 at the GE Minds and Machines conference, about 3,000 people talking about the industrial internet of things. This is where the digital world meets the physical world, where IT meets OT, and we're really excited to be here. Our next guest is Dimitri Volkman, a digital twin thought leader and evangelist for GE Digital. Welcome. Thank you, how are you? Great. So first off, your impressions of this event? Fantastic, what do you think? It's pretty amazing because it's a sort of industrial warehouse space, and we built the whole conference into it, so it's pretty cool. Exactly. So you're all about the digital twin. For people new to the concept of a digital twin, just give them the basic 101. So the idea is that if you look at the way we've digitized the world in the past decades, it was all about optimizing the back office, and as we did that, we actually dematerialized things. Purchase orders that we had on paper disappeared, money orders disappeared. Now when it comes to things like machines, we still want to digitalize them, but they don't disappear. So the digital twin is really about creating a live digital representation of the machine that is paired to that machine but constantly learns from its operation. It's a sort of new software construct that gives you the ability to interact with the machine. So it's interesting, we talked a little bit off camera before we started, and you said this really was born from the early NASA days: when you shot a rocket into space, if you wanted to trial something, you had a physical twin as opposed to a digital twin back in the day. But that's really the concept, to give you another thing to play with, to test, to trial, and see what's going on. Now it's a digital representation.
Yeah, the original idea was to allow you to model something and see how it's going: having its structure, then its behavior when it operates, and seeing in context what might happen to it. That was the initial idea. Now it's very important to understand that we're not modeling those machines for the sake of modeling. We're modeling them because, if you look at the business of industrial, we start with a commercial idea: I want to create a better jet engine for planes. Then we design it, we build it, we run it, and we service it. All along that cycle, with the digital twin we bundle data and intelligence so we can give you information about the past, the present, and the future operation of that asset. But we do that in order for you to make a business out of it, which is to get outcomes out of operation. So we look at the outcomes you want out of operation, and we bundle the data and intelligence about past, present, and future to help you optimize them. That fundamentally gives you early warnings, predictions, and optimizations. And what I think is interesting about the concept is that you have both ends of the spectrum. On one hand, you physically twin a machine; it's not model ABC, it's that particular serial-numbered unit. At the other end of the spectrum, you have the ability to aggregate across multiple serial numbers of that unit and come up with data that's no longer just an aggregate best practice, but data based on a population of real machines. Exactly, and it's not only data. You touch on two very important points. The digital twin gives you the ability to zoom into the detail of a specific asset and have a very high-fidelity digital representation of that specific asset. Now that digital twin will also learn from the others. So it's a learning process. And again, it's this interactive, constantly learning concept for which you need a different software representation.
It's not just a transaction in a database. It's something that continues to operate. And as we gather data, we can create models on that data and get more intelligence and more and more insights, if you want. So on the execution of that, if it's a digital twin for a particular type of jet engine, say, is that something that GE does? You have people responsible for monitoring the model, monitoring the reality, changing the models. How does the learning actually happen, and how does the learning go back to the market in that instance? Yeah, well, it always starts with the data about the assets. So here you have assets, machines that have models and sensors. We capture that data. And because we've been servicing systems for a long time, we also have a lot of historical data, so-called gold data, as well. Then we have our research on material science, on physics models, and on analytics. So we combine all that. There's an interesting separation of roles, if you want, in the twin. You have what we call the digital twin builders, and those people are actually building classes. So there will be a digital twin class for a specific asset, like the Waukesha engine we have here. They assemble all the data, the history, the knowledge, the intelligence, the models, and then they publish that model once it's been validated. And then you have people that consume that: the applications, the developers. Because as a developer, if I want to build a nice app to optimize the usage of my Waukesha engine in my business, I don't need to be an expert in spark plugs or material science. So we separate out the builders, who are the experts, and then, once the digital twin runs on the platform, they expose their value to a much broader audience. Right. And as you said, you can break it down. We talked about how GE is powering this whole conference with a big generator outside. That's got a digital twin with a little demonstration inside.
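The builder/consumer split he describes, where domain experts publish a twin class and app developers instantiate per-asset twins without that expertise, could be sketched roughly like this. All class names, methods, and the toy health rule are illustrative assumptions, not the actual Predix API:

```python
# Illustrative sketch of the builder/consumer separation: experts build and
# publish a twin *class* (models + historical knowledge); app developers
# instantiate one twin per physical asset. Names are hypothetical.

class DigitalTwinClass:
    """Published by the digital twin builders for one asset type."""
    def __init__(self, asset_type, physics_models, historical_data):
        self.asset_type = asset_type
        self.physics_models = physics_models    # expert knowledge
        self.historical_data = historical_data  # fleet/service history

    def instantiate(self, serial_number):
        return DigitalTwin(self, serial_number)


class DigitalTwin:
    """One live twin, paired to one physical asset by serial number."""
    def __init__(self, twin_class, serial_number):
        self.twin_class = twin_class
        self.serial_number = serial_number
        self.telemetry = []

    def ingest(self, reading):
        self.telemetry.append(reading)  # twin keeps learning from operation

    def health(self):
        # The app developer just calls this; the expert models run underneath.
        # Toy rule: any reading at or above 1.0 flags a warning.
        if self.telemetry and max(self.telemetry) >= 1.0:
            return "warning"
        return "ok"


# A developer consumes the published class without being a spark-plug expert:
engine_class = DigitalTwinClass("Waukesha engine",
                                physics_models=["combustion"],
                                historical_data=[])
my_engine = engine_class.instantiate("SN-0042")
my_engine.ingest(0.4)
print(my_engine.health())  # -> ok
```

The point of the split is that validated expertise lives in `DigitalTwinClass`, while each `DigitalTwin` instance stays paired to a specific serial number, matching the class/instance distinction in the interview.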
And as you said before, you can monitor the system or you can monitor the part. So you can change your point of view, your focus, the depth of field, if you will. Exactly. You can zoom in and zoom out, from a very high level down to the part, the asset, or the product or system itself. And to the point of the Waukesha engine here, what is interesting is that we're showing a digital twin of the overall engine itself. It gives you things like power generated, percentage of load, reliability in the past months, and a projection of what's going to happen in the next 60 days. Now in that engine, the spark plugs are very important elements, because if you have a spark plug that doesn't work, the engine will usually stop functioning well. So for the spark plugs themselves, we have digital twins for each individual spark plug. It's actually like your car engine: when you do your maintenance at 60,000 miles, they're probably going to change your four or six spark plugs. On this kind of engine it's much more expensive, because each spark plug is around $700. So the digital twin gives you a life model of every specific spark plug. The way we do that is we measure the voltage it requires to light the spark plug; over time this voltage changes, we compare that with historical data, we use models, and we give you a prediction for each individual spark plug. So we tell you: spark plug number six is going to stop functioning in two weeks, do something, but the others are all fine. That's interesting, and the piece that I don't think anyone else who's been on the show has talked about is how outside developers can build applications that take advantage of the digital twin to provide new kinds of algorithms to actually run these machines. Yeah, exactly. And this is coherent with our approach to the digitalization of the industrial world. You know, we started with GE, for GE; we do it for ourselves first.
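The per-spark-plug life model he describes, a voltage trend compared against a threshold, could be sketched in very simplified form as a linear fit on ignition-voltage readings extrapolated to a failure level. The threshold, units, and readings below are invented for illustration; GE's actual model also uses historical fleet data and physics models:

```python
# Illustrative sketch of a per-spark-plug life model: fit a linear trend to
# weekly ignition-voltage readings and extrapolate to a failure threshold.
# The 35 kV threshold and all readings are hypothetical.

def weeks_until_failure(voltages, failure_voltage=35.0):
    """voltages: one reading per week, in kV. Returns estimated weeks until
    the required ignition voltage crosses failure_voltage, or None if the
    trend is flat or improving."""
    n = len(voltages)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(voltages) / n
    # Least-squares slope (kV per week).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, voltages))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    if slope <= 0:
        return None  # voltage not rising: no predicted failure
    intercept = mean_y - slope * mean_x
    failure_week = (failure_voltage - intercept) / slope
    return max(0.0, failure_week - (n - 1))

# Each spark plug has its own twin; here plug 6 degrades fastest.
fleet = {
    1: [30.0, 30.1, 30.0, 30.2],
    6: [30.0, 31.0, 32.0, 33.0],
}
for plug, readings in fleet.items():
    eta = weeks_until_failure(readings)
    status = "healthy" if eta is None or eta > 8 else f"~{eta:.0f} weeks left"
    print(f"spark plug {plug}: {status}")
# -> spark plug 1: healthy
# -> spark plug 6: ~2 weeks left
```

This mirrors the "plug six fails in two weeks, the others are fine" prediction: one trend model per serial-numbered part, not one per part type.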
Then we bring it to our customers, then we bring it to the world. So we're in the process of releasing a software developer toolkit, if you want, so that anybody can build digital twins on our platform, and they don't necessarily need to be GE assets. We actually already do this: it's not completely exposed to developers right now, but in our APM solution, a number of the assets that we manage use digital twin technologies, and they are not necessarily GE assets. Right, and correct me if I'm wrong, because that's really the genesis of GE Digital as an entity. Originally Bill and the team were building the software links between the various divisions so that the gear could talk, and then that evolved into actually building the software for those units, and now it's evolved into the Predix cloud, not only for GE assets but for other people's assets as well as other people's applications. Exactly. So the history is, we started as a center of excellence, and we started with the big blocks. Then we realized there was a need for a platform, so we built the platform for ourselves, and now we've brought the platform to the market. We launched it in February, and we're extending it now to the whole world. But we keep on thinking: for us, for our customers, for the world. So, now that people have the ability to operate and experiment and trial things with the digital twin, how have you seen that change behavior? Because before, I'm sure it was kind of set it and forget it: wait for the maintenance manual to say it's maintenance time and then go in and do it. Now that I have the capability to look in, to compare against models, to compare operating thresholds based on optimizing for longevity or output or whatever's going on, how has that actually changed the way people operate these machines? Yeah, it's usually a progression. To your point, the first thing you do with data analytics technology is check your equipment health: is it running well now?
Is it running within parameters? Then you start putting in more sophisticated analytics, and you do predictions, which help you with your maintenance. Once you have your maintenance, you can start asking: how does that impact my overall operations? So you can start combining digital twins. Let me give you an example in aviation. We have very sophisticated models for engine blades in the turbine. They use a specific thermal coating to make the blades last longer; the coating is like painting the blade. Now, we realized that when you operate those engines in the Middle East, where there is sand and hot air, the coating wears much more quickly than when you operate them in the Pacific Northwest, where the air is cold and moist. So the first thing was that with those models, we can detect failures earlier. Now that we know this, it actually has an impact on the routes of our planes: if for one of our planes we know the engine needs to be replaced soon, we're going to change its route, so we fly it in a less harsh environment. To extend the window. And maybe closer to a maintenance center. So the knowledge about how the engine wears in its context allows us to optimize the routes on which we use the engine. Again, we start by looking at one specific object, do some predictions, then do some optimization of the whole operation. And the beauty of the twin, and this is what I evangelize most about the platform, is I tell people: imagine, you have that digital representation, you have all the data, all the intelligence, past, present, future. Find new models. I'm giving you basic equipment health. I'm obviously giving you the ability to do maintenance. But now you have the knowledge. Reinvent your business. So it's a very enabling technology approach. And in that aviation example, who's the one who says, hey, maybe we should reroute plane ABC and let it run through the Pacific Northwest for a while because of this reason?
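The route-optimization step he describes, using the twin's context-aware wear model to decide where to fly an engine, could be sketched like this. The wear rates, remaining-life figure, and route names are invented for illustration, not GE's actual numbers:

```python
# Illustrative sketch of context-aware route optimization: the twin knows
# how fast the blade coating wears in each environment, so operations can
# pick routes that stretch an engine's remaining life until overhaul.
# All figures are hypothetical.

WEAR_PER_FLIGHT = {  # % of coating life consumed per flight, by environment
    "Middle East (hot, sandy)": 0.5,
    "Pacific Northwest (cold, moist)": 0.125,
}

def flights_remaining(coating_life_pct, route):
    """Flights the twin predicts before overhaul if flown on this route."""
    return int(coating_life_pct / WEAR_PER_FLIGHT[route])

life = 4.0  # % coating life left, per this specific engine's twin
for route in WEAR_PER_FLIGHT:
    print(f"{route}: ~{flights_remaining(life, route)} flights left")
# -> Middle East (hot, sandy): ~8 flights left
# -> Pacific Northwest (cold, moist): ~32 flights left
```

Reassigning the aircraft to the milder route stretches 8 flights into 32, buying time to schedule the engine swap near a maintenance center, which is the "extend the window" move from the interview.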
Who are the people that are now in a position to act on that data? Well, and we've done this to ourselves again, we've seen the emergence of the role we call the chief digital officer. These are people who come from the IT world, who understand the possibilities of the technology but are also very close to the business. These are the leaders who will sit with the operations manager and say, hey, where are we losing money? Oh, we're losing money where we have to service an engine in a place where the parts aren't there. And by the way, we service it too late because we didn't know it would wear that fast. And then it's: hey, I have a solution for that, because now I have the technology to get early warnings and predictions, and we can optimize our operations with that. So we see a lot of the IT role evolving from just managing the basic systems to being the one who understands the possibilities, and who also understands the business and engages that discussion with the business: hey, we can change now, we can do this, we can do that. It's fascinating to me that a big part of the IT discussion is using cloud and other technologies to sweep away undifferentiated heavy lifting and let everybody concentrate on higher-value activities. The same thing is happening in the physical world, where that maintenance person is no longer just worried about changing the spark plugs on time but is starting to think more strategically about how to better deploy those assets, change the timing, get off the straight-up manual. So where does it go? As you sit back and look down the road, you're the evangelist, you're out in front of the curve. What do you see in the short term, and how far can we go with this concept? Yeah, I think today there are still a lot of use cases that are really geared toward the first steps.
So equipment health, you know, maintenance optimization, reliability management, a first level of optimization of my operations. And the reason why is that, to some extent, the digital twin concept existed, but the twins were sort of imprisoned in the apps. By putting them into the platform, we still serve the app, but we now give the chief digital officer the ability to say, oh, I have all this information now; I can invent new things. So again, there's no longer that limit. It's not, oh, I have to use that app; I not only can use that app, but I have the underlying technology where I can go talk directly to the digital twin, as we've seen in the show this morning, and invent new things. So we are at the very, very beginning of the adoption of digital twins, and most people have only used them inside applications. We are in the process of educating them on how they can fully benefit from it. It's like the beginning of databases. When we switched transactional systems from TP monitors to SQL, it was like, oh, it's the same thing. And then people said, oh, I can do queries. Oh, I can do data warehousing. Oh, I can do analytics. So we are in that transition mode, if you want, now. Exciting times. The last thing I would add is, we've talked a lot about capturing data and having insights. But the next step is: how do you act? So, using the digital twin to send commands to the asset itself. You're going to see more and more of those feedback loops now, where we get data, we get insight, a human makes the final decision, but then the execution of the operation is piloted by the twin as well. Because the twin works two ways: it gives us data, but it can also receive information and act on the machine. So that's the next big wave. And the twin can monitor, do prediction, and do scenario analysis all at the same time. That's great. All right, Dimitri, thanks for stopping by. Exciting times.
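The two-way feedback loop he closes with, data in, insight out, human approval, then a command back to the asset, could be sketched like this. The function names, the vibration rule, and the command format are all hypothetical, meant only to show the human-in-the-loop shape of the loop:

```python
# Illustrative sketch of the two-way loop: the twin turns telemetry into a
# recommendation, a human makes the final decision, and only then does the
# twin pilot the actuation back on the machine. All names hypothetical.

def recommend(telemetry):
    """Twin side: data -> insight -> recommended action (or None)."""
    if telemetry["vibration"] > 0.7:  # toy rule standing in for real models
        return {"command": "reduce_load", "target_pct": 80}
    return None

def run_loop(telemetry, approve):
    """approve: human decision callback; the twin never acts without it."""
    action = recommend(telemetry)
    if action is None:
        return "no action"
    if not approve(action):
        return "rejected by operator"
    # In a real system this would be a command sent down to the asset.
    return f"sent {action['command']} ({action['target_pct']}%) to asset"

print(run_loop({"vibration": 0.9}, approve=lambda action: True))
# -> sent reduce_load (80%) to asset
```

Keeping the `approve` callback between insight and actuation matches his point that a human makes the final decision while the twin handles both directions of the data flow.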
A lot of efficiencies to be wrung out of the OT world, like we've done in IT, and congrats for being right at the head end of it. Thank you, it was great. Absolutely, Dimitri Volkman. I'm Jeff Frick, you're watching theCUBE. We're at GE Minds and Machines in San Francisco. We'll be right back after this short break. Thanks for watching.