Live from San Francisco, California, it's theCUBE, covering the IBM Chief Data Officer Summit. Brought to you by IBM. Welcome back to San Francisco, everybody. You're watching theCUBE, the leader in live tech coverage. We go out to the events, we extract the signal from the noise, and we're here at the IBM Chief Data Officer Summit, its tenth anniversary. Seth Dobrin is here. He's the Vice President and Chief Data Officer of the IBM Analytics Group. Seth, always a pleasure to have you on. Good to see you again. Thanks for having me back, Dave. You're very welcome. So I love these events. You get a chance to interact with chief data officers, guys like yourself. We've been talking a lot today about IBM's internal transformation, how IBM itself is operationalizing AI. And maybe we can talk about that, but I'm most interested in how you're pointing that at customers. What have you learned from your internal experiences, and what do you bring to customers? Yeah, so I was hired at IBM to lead part of our internal transformation, so I spent a lot of time doing that. When I came over to IBM, I had just left Monsanto, where I led part of their transformation. So I spent the better part of the first year or so at IBM not only focusing on our internal efforts, but helping our clients transform. And out of that, I found that many of our clients needed help and guidance on how to do this. So I started a team we call the Data Science and AI Elite Team. Really what we do is sit down with clients and share not only our experience, but the methodology that we use internally at IBM, leveraging things like design thinking, DevOps, and Agile, and how you implement that in the context of data science and AI. So a question: Monsanto, obviously, is a completely different business than IBM. But when we talk about digital transformation, and the difference between a business and a digital business, it comes down to the data.
And you've seen a lot of examples where companies traverse industries, which never used to happen before. You know, Apple getting into music, and there are many, many examples. And the theory is, it's because of the data. So when you think about your experiences in a completely different industry, bringing that expertise to IBM, were there similarities you were able to draw upon, or was it a completely different experience? No, I think there are tons of similarities, which is part of why I was excited about this, and I think IBM was excited to have me. Because the chances for success were quite high, in your mind, right? Yeah, because the chances for success were quite high. And also, if you think about it, on the how-you-implement, how-you-execute side, the differences are really cultural, more than anything to do with the business. The whole role of a chief data officer, or chief digital officer, or chief analytics officer, is to drive fundamental change in the business. So it's, how do you manage that cultural change? How do you build bridges? How do you make people a little uncomfortable, but at the same time get them excited about how to leverage things like data, and analytics, and AI, to change how they do business? And really, this concept of a digital transformation is about moving away from traditional products and services, more towards outcome-based services: not selling things, but selling as a service. And it's the same whether it's IBM moving away from fully transactional to cloud and subscription-based offerings, or a bank reimagining how it interacts with its customers, or an oil and gas company, or a company like Monsanto really thinking about, how do we provide outcomes?
But how do you make sure that every as-a-service is not a snowflake, and that it can scale, so that you can actually make it a business? So, underneath the as-a-service are a few things. One is data, one is machine learning and AI, and the other is really understanding your customer, because truly digital companies do everything through the eyes of their customer. Every company has many, many versions of their customer until they go through an exercise of creating a single version, a customer or client 360, if you will. We went through that exercise at IBM. Those are all very consistent things; they're all pieces that happen the same way in every company, regardless of the industry. And then you get into understanding what the desires of your customer are to do business with you differently. So, you were talking before about the chief digital officer, chief data officer, chief analytics officer as a change agent, making people feel a little bit uncomfortable. Let's explore that a little bit: asking them questions that, intuitively, they know they need the answer to, but don't have through data. What did you mean by that? Yeah, so here's the conversation that usually happens. You go and talk to your peers in the organization, and you start having conversations with them about what decisions they're trying to make, because you're the chief data officer, you're responsible for that. And inevitably the conversation goes something like this, and I'm going to paraphrase: give me the data I need to support my preconceived notions, right? And that's what they wanted to get. There's the answer, give me the data for it. That's right. So, I want a dashboard that helps me support this. And the uncomfortableness comes in a couple of places there. It's getting them to let go of that, and allowing the data to provide some inkling of things they didn't know were going on. That's one piece.
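The client-360 exercise described above — collapsing many versions of a customer into one view — can be sketched minimally in Python. This is a toy illustration, not IBM's method: the field names and the naive email-based match key are hypothetical (real entity resolution uses probabilistic matching across many attributes).

```python
# Toy "customer 360" merge: collapse duplicate customer records into one
# unified profile per customer. All field names here are hypothetical.
from collections import defaultdict

def build_customer_360(records):
    """Merge records sharing a match key into a single profile each."""
    profiles = defaultdict(dict)
    for rec in records:
        key = rec["email"].strip().lower()  # naive match key for the sketch
        for field, value in rec.items():
            if value:                       # last non-empty value wins
                profiles[key][field] = value
    return dict(profiles)

# Two "versions" of the same customer, e.g. from CRM and billing systems.
crm = {"email": "Ada@Example.com", "name": "Ada Lovelace", "phone": ""}
billing = {"email": "ada@example.com", "name": "A. Lovelace", "phone": "555-0100"}
view = build_customer_360([crm, billing])
print(len(view))  # 1 -- one unified profile for the two source records
```

The point of the exercise is exactly what the merge shows: fields scattered across systems end up on a single record keyed to a single identity.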
The other is that you then start leveraging machine learning or AI to actually help drive decisions: limiting the scope from infinity down to two or three things, surfacing those two or three things, and telling people in your business, your choices are one of these three things, right? That starts to make people feel uncomfortable, and it really is a challenge for that cultural change, getting people used to trusting the machine, or in some instances even trusting the machine to make the decision, or part of the decision, for you. That's got to be one of the biggest cultural challenges, because you've got somebody who, let's say, runs a big business, a profitable business, the engine of cash flow at the company, and you're saying, well, that's not what the data says. You're saying, okay, here's a future path for success, but it's going to be disruptive, there's going to be change, and I can see people not wanting to go there. Yeah, and to the point about even businesses, or parts of a business, that are making the most money: if you look at what the business journals say, when you start leveraging data and AI, you get double-digit increases in your productivity and in differentiation from your competitors. That happens inside of businesses too. So the conversation, even with the most profitable parts of the business, or the ones contributing the most revenue, is really: well, we could do better, right? You could get better margins on the revenue you're driving. That's the whole point, getting better, leveraging data and AI to increase your margins and increase your revenue. And then things like moving to as-a-service from single points of transaction, that's a whole different business model, and it takes you from getting revenue once every two or three or five years to getting revenue every month, right?
That's highly profitable for companies, because you don't have to send your sales force in every time to sell something; customers buy once and continue to pay as long as you keep them happy. But I can see that scaring people, if the incentives don't shift away from getting paid all up front, right? There are so many parts of the organization that have to align with that in order for that culture change to actually occur. So can you give some examples? Obviously you ran through this at IBM, you saw a lot, got a lot of learnings, and then took that to clients. Maybe some examples of client successes you've had, or even not-so-successes that you've learned from? Yeah, so in terms of client success, I think many of our clients are just beginning this journey, certainly the ones I work with, so it's hard for me to say client X has successfully done this. But I can certainly talk about how we've gone in, and some of the use cases we've done with certain clients to think about how they transform their business. Maybe the biggest bang-for-the-buck one is in the oil and gas industry. ExxonMobil was on stage with me at Think, talking about some of the work we've done with them in their upstream business, right? Every time they drop a well, it costs them not thousands of dollars, but hundreds of millions of dollars, and in the oil and gas industry, you're talking massive data, right?
Tens or hundreds of petabytes of data that constantly change, and no one in that industry really had a data platform that could handle this dynamically. It takes them months to even start to be able to make a decision. So they really wanted us to help them figure out, how do we build a data platform at this massive scale that enables us to make decisions more rapidly? The aim was really to cut this down from 90 days to less than a month, and through leveraging some of our tools as well as some open source technology, and teaching them new ways of working, we were able to lay down this foundation. And this is before we've even started thinking about helping them with AI. The oil and gas industry has been doing this type of thing for decades, but they really were struggling with this platform. So that's a basic success where, at least for the pilot, which was a small subset of their fields, we were able to help them reduce that time frame by a lot, to be able to start making a decision. So an example of a decision might be where to drill next? That's exactly the decision they're trying to make. Because for years in that industry, it was, boop, oh, no oil; boop, oh, no oil. Now they've gotten more sophisticated, they started to use data, but I think what you're saying is that the time the analysis took was quite long. So the time it took to even overlay things like seismic data, topography data, and what's happened in wells and cores they've drilled around that was really protracted, just to pull the data together. And then once they got the data together, there were some really, really smart people looking at it going, well, my experience says here. It was driven by the data, but it was not driven by an algorithm. A little bit of art, too. A lot of art, right, and it still is. So now they want some AI or some machine learning to help guide those geophysicists, to help determine where, based on the data, they should be dropping wells.
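The data-overlay step described above — pulling seismic, topography, and well-history data together before anyone can reason about a site — can be sketched as a simple join on a shared site key. All of the data, keys, and field names below are hypothetical; real platforms do this spatially at petabyte scale.

```python
# Hypothetical sketch of overlaying three data sources on a shared site id,
# so a geophysicist (or a model) can look at candidate drill sites in one pass.
seismic = {"site-1": 0.8, "site-2": 0.3}           # e.g. reflection strength
topography = {"site-1": "flat", "site-2": "steep"} # terrain classification
wells = {"site-2": "dry"}                          # outcomes of nearby wells

def overlay(site_ids):
    """Combine the three sources into one record per candidate site."""
    return {
        s: {
            "seismic": seismic.get(s),
            "terrain": topography.get(s),
            "nearby_well": wells.get(s, "none drilled"),
        }
        for s in site_ids
    }

combined = overlay(["site-1", "site-2"])
print(combined["site-1"])  # one unified record instead of three lookups
```

The months-long part of the real problem is assembling and refreshing these sources at scale; the join itself is the easy part, which is why the platform work came before any AI.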
And these are hundred-million, billion-dollar decisions they're making, so it's really about, how do we help them? And that's just one example; every industry has its own use cases. So that's on the front end, about the data foundation. Then if you go to a company that was really advanced in leveraging analytics or machine learning, take JPMorgan Chase. They have a division, and they were also on stage with me at Think, where basically everything is driven by models. They give traders a series of models, and the traders make decisions. And now they need to monitor those models, those hundreds of models, for misuse, right? So they needed to build a series of models to monitor their models. This was a tremendous deep learning use case, and they had just bought a PowerAI box from us, so they wanted to start leveraging GPUs. And we really helped them figure out how to navigate that: what's the difference between building a model leveraging GPUs compared to CPUs? How do you use them to accelerate the output? And again, this was really a cost-avoidance play, because if people misuse these models, they can get in a lot of trouble. But they also need to make these decisions very quickly, because when a trader goes to make a trade, they need to decide whether the model was used properly before that trade is kicked off, and milliseconds make a difference in the stock market. So they needed a model. And one of the things about leveraging GPUs and deep learning is that sometimes you need the GPUs to do training, and sometimes you need them to do training and scoring. This was a case where you also need to build a pipeline that can leverage the GPUs for scoring, which is actually quite complicated, and not as straightforward as you might think. In near real time, in real time. Pretty close to real time, yeah. You can't get much more real time.
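The pre-trade check described above is, in effect, a scoring call with a hard latency budget: the model must answer before the trade fires, or the safe action is to hold. Here is a toy sketch of that shape; the budget, the trade fields, and the stand-in model are all hypothetical, not JPMorgan Chase's system.

```python
# Toy sketch of latency-budgeted pre-trade scoring: if the misuse check
# cannot answer within the budget, fail safe and hold the trade.
import time

LATENCY_BUDGET_S = 0.005  # hypothetical 5 ms budget

def score_trade(trade, model):
    """Run the pre-trade check; release only if the model approves in time."""
    start = time.perf_counter()
    approved = model(trade)                 # stand-in for a GPU scoring call
    elapsed = time.perf_counter() - start
    if elapsed > LATENCY_BUDGET_S:
        return "hold"                       # too slow: hold for review
    return "release" if approved else "hold"

# A trivially fast stand-in "model": flag oversized orders.
fast_model = lambda t: t["size"] < 1_000_000
print(score_trade({"size": 500}, fast_model))        # release
print(score_trade({"size": 2_000_000}, fast_model))  # hold
```

The complication Dobrin alludes to is that the real scoring path runs on GPUs, so the serving pipeline, not just training, has to be engineered around that budget.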
Within a fraction of a second, potentially, to stop a trade before it occurs and protect the firm, right? Yeah, and I think that's right. I think they actually don't do trades until it's confirmed, or at least that's the desire. Well, and now you're in a competitive situation where, you know. Yeah, people put these trading floors as close to the stock exchanges as they can. Physically. Physically, to minimize latency, right? So every millisecond counts. Yeah, read Flash Boys, that'd be interesting. So what's the biggest challenge you're finding, both at IBM and in your clients, in terms of operationalizing AI? Is it technology? Is it culture? Is it process? Yeah, so culture is always hard. But I think, as we start to really think about integrating AI and data into our operations: look at what software development did with this whole concept of DevOps, right? Rapidly iterating, but getting things into a production-ready pipeline, looking at continuous integration, continuous delivery. What does that mean for data and AI? It's this concept of DataOps and AIOps. And I think DataOps is very similar to DevOps, in that things don't change that rapidly. You build your data pipeline, you build your data assets, you integrate them; they may change on a weeks or months timeframe, but they're not changing on an hours or days timeframe. As you get into some of these AI models, though, some of them need to be retrained within a day, because the data changes, they fall out of parameters, or the parameters are very narrow and you need to keep them in there. What does that mean? How do you integrate this into your CI/CD pipeline? How do you know when you need to do regression testing on the whole thing again? Does your data science and AI pipeline even allow you to integrate into your current CI/CD pipeline?
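The "falls out of parameters" trigger described above can be sketched as a simple drift check: compare a statistic of live data against the training baseline, and flag the model for retraining when it drifts past a tolerance. This is a deliberately minimal illustration, not any IBM product's mechanism; the tolerance and data are hypothetical.

```python
# Minimal drift signal for "retrain when the model falls out of parameters":
# flag retraining when the live-data mean drifts too far from training.
from statistics import mean

def needs_retraining(train_values, live_values, tolerance=0.1):
    """True when the live mean drifts more than `tolerance` (relative)
    from the training baseline -- a deliberately simple drift signal."""
    baseline = mean(train_values)
    drift = abs(mean(live_values) - baseline) / abs(baseline)
    return drift > tolerance

train = [10.0, 11.0, 9.5, 10.5]   # feature values the model was trained on
stable = [10.2, 10.1, 9.9]        # live data still within tolerance
shifted = [14.0, 15.0, 13.5]      # live data that has drifted

print(needs_retraining(train, stable))   # False: keep serving
print(needs_retraining(train, shifted))  # True: schedule a retrain
```

In a CI/CD setting, a check like this would run on a schedule against production traffic, and a `True` result would kick off the retrain-and-regression-test stage of the pipeline.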
So this is actually an IBM-wide effort that my team is leading, to start thinking about how we incorporate what we're doing into people's CI/CD pipelines so we can enable AIOps, if you will, or MLOps. And really, IBM is the only company positioned to do that, for a few reasons. One is that we're the only one with an end-to-end tool chain: we do everything from data, to feature development and engineering, to generating and selecting models, whether it's AutoAI, hand coding, or visual modeling, all the way into things like trust and transparency. So we're the only one with that entire tool chain. Secondly, we've got IBM Research, we've got decades of industry experience, and we've got our IBM services organization. All of us have been tackling this with large enterprises, so we're uniquely positioned to address it in a very enterprise-grade manner. Well, and there's the leverage you can get within IBM and for your customers. And leveraging our clients, right? We have six clients, our most advanced clients, working with us on this, so it's not just us in a box; it's us with our clients working on it. So what are you hoping to have happen today? We're just about to get started with the keynotes. We're going to take a break and then come back after the keynotes, and we've got some great guests. But what are you hoping to get out of today? Yeah, so I've been with IBM for two and a half years, and this is my eighth CDO Summit, so I've been to many more of these than my time at IBM accounts for; I went to these religiously before I joined IBM, really for two reasons. One, there's no sales pitch, right? It's not a trade show.
The second is that it's the only place where I get the opportunity to listen to my peers and have really open and candid conversations about the challenges they're facing and how they're addressing them, giving me insights into what other industries are doing and letting me benchmark myself and my organization against the leading edge of what's going on in this space. I love it, and that's why I love coming to these events. It's practitioners talking to practitioners. Seth Dobrin, thanks so much for coming on theCUBE. Yeah, thanks as always, Dave. Always a pleasure. All right, keep it right there, everybody. We'll be back right after this short break. You're watching theCUBE, live from San Francisco. We'll be right back.