Announcer: From Yorktown Heights, New York, it's theCUBE, covering IBM Cloud Innovation Day. Brought to you by IBM.

Peter Burris: Hi, it's Wikibon's Peter Burris again. We're broadcasting on theCUBE from IBM Innovation Day at the Thomas J. Watson Research Center in Yorktown Heights, New York. We're having a number of great conversations, and we've got a great one right now. Rob Thomas, who's the general manager of IBM Analytics, welcome back to theCUBE.

Rob Thomas: Thanks, Peter, great to see you.

Peter Burris: Thanks for coming out here to the woods.

Rob Thomas: Oh, it's not that bad. I actually lived not too far from here.

Peter Burris: It was interesting, Rob. I was driving up the Taconic Parkway and I realized I hadn't been on it in 40 years.

Rob Thomas: Is that right? Very exciting.

Peter Burris: So, Rob, let's talk IBM Analytics and some of the changes that are taking place. Specifically, how are customers thinking about achieving their AI outcomes? What does that ladder look like?

Rob Thomas: Yeah, we call it the AI ladder, which is basically all the steps a client has to take to get to an AI future. That's the best way I would describe it: how you collect data, how you organize your data, how you analyze your data and start to put machine learning into motion, and how you infuse your data, meaning you take any insights and infuse them into other applications. Those are the basic building blocks of the ladder to AI. 81% of clients that start to do some AI realize their first issue is a data issue: they can't find the data, or they don't have the data. The AI ladder is about taking care of the data problem so you can focus on where the value is, the AI pieces.

Peter Burris: So, AI is a pretty broad, hairy topic today. What are customers learning about AI? What kind of experience are they gaining? And how is it sharpening their thoughts, and their pencils, as they think about what kind of outcomes they want to achieve?

Rob Thomas: For some reason it's become a bit of a mystical topic, but to me, AI is actually quite simple. I like to say AI is not magic.
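The rungs Thomas describes, collect, organize, analyze, and infuse, can be pictured as a simple pipeline. The sketch below is purely illustrative: the function names and the toy "majority label" model are invented for this example and are not part of any IBM product.

```python
# Hypothetical sketch of the AI ladder: collect -> organize -> analyze -> infuse.
# None of these names come from IBM's tooling; they just mirror the four rungs.

def collect(sources):
    """Gather raw records from every source into one list."""
    return [record for source in sources for record in source]

def organize(records):
    """Govern the data: deduplicate and drop records missing a label."""
    seen, clean = set(), []
    for rec in records:
        key = (rec["id"], rec["label"])
        if rec["label"] is not None and key not in seen:
            seen.add(key)
            clean.append(rec)
    return clean

def analyze(records):
    """'Train' a trivial model: predict the majority label."""
    labels = [rec["label"] for rec in records]
    return max(set(labels), key=labels.count)

def infuse(model, app_state):
    """Push the insight into another application's state."""
    app_state["prediction"] = model
    return app_state

# Two sources with a duplicate and a missing label, the "data issue"
# Thomas says 81% of clients hit first.
sources = [[{"id": 1, "label": "churn"}, {"id": 2, "label": None}],
           [{"id": 1, "label": "churn"}, {"id": 3, "label": "churn"},
            {"id": 4, "label": "stay"}]]
state = infuse(analyze(organize(collect(sources))), {})
print(state)  # {'prediction': 'churn'}
```

The point of the ordering is that `analyze` only works once `organize` has dealt with duplicates and missing values, which is why the data rungs come before the AI rungs.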
Some people think it's a magical black box: you put a few inputs in, you sit around, and magic happens. It's not that. It's real work, it's real computer science. It's about how you build models and put models into production. Most models, when they first go into production, are not that good, so how do you continually train and retrain those models? And then the AI aspect is about how you bring human features to that: how do you integrate it with natural language, with speech recognition, or with image recognition? So when you get under the covers, it's actually not that mystical. It's about basic building blocks that help you achieve business outcomes. And it has to be very practical; otherwise a business has a hard time ultimately adopting it.

Peter Burris: You mentioned a number of different elements, and I especially liked "add the human features to it," the natural language. It also suggests that the skill set of AI starts to evolve as companies mature up this ladder. How is that starting to change?

Rob Thomas: That's still one of the biggest gaps, I would say: skill sets around the modern languages of data science that lead to AI, Python, R, and Scala, to name a few. That's still a bit of a gap. Our focus has been on making tools that anybody can use. So if you've grown up doing SPSS or SAS, something like that, how do you adapt those skills for the open world of data science? That can make a big difference. On the human features point, we've actually built applications to make that piece easy. A great example is with the Royal Bank of Scotland, where we've created a solution called Watson Assistant, which is basically about arming their call center representatives to be much more intelligent and engaging with clients, predicting what clients may do. Those types of applications package up the human features and the components I talked about, and make it really easy to get AI into production.
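Thomas's point that deployed models are often "not that good" and need continual retraining can be sketched as a monitor-and-retrain loop: score each fresh batch of labeled data, and refit when accuracy falls below a threshold. Everything here is a made-up toy, the synthetic data, the trivial difference-of-means linear "model," and the 0.85 threshold, chosen only to show the loop's shape.

```python
import random

random.seed(0)

def make_batch(n, drift=0.0):
    """Synthetic labeled points; `drift` shifts the true decision rule."""
    batch = []
    for _ in range(n):
        x0, x1 = random.gauss(0, 1), random.gauss(0, 1)
        label = 1 if x0 + drift * x1 > 0 else 0
        batch.append(((x0, x1), label))
    return batch

def train(batch):
    """'Fit' a trivial linear model: the difference of the class sums."""
    w0 = sum(x[0] * (2 * y - 1) for x, y in batch)
    w1 = sum(x[1] * (2 * y - 1) for x, y in batch)
    return (w0, w1)

def accuracy(model, batch):
    w0, w1 = model
    hits = sum(1 for (x0, x1), y in batch
               if (w0 * x0 + w1 * x1 > 0) == (y == 1))
    return hits / len(batch)

THRESHOLD = 0.85  # arbitrary: retrain when monitored accuracy drops below this

model = train(make_batch(500))
history = []
for step in range(5):
    drift = 0.0 if step < 2 else 2.0      # the world changes at step 2
    fresh = make_batch(200, drift)
    acc = accuracy(model, fresh)
    if acc < THRESHOLD:                   # continually retrain on fresh data
        model = train(fresh)
    history.append(round(acc, 2))
print(history)
```

Running this, accuracy is high while the world matches the training data, drops when the decision boundary drifts at step 2, and recovers once the loop retrains, which is the "continually train and retrain" behavior in miniature.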
Peter Burris: Many years ago, Alan Turing proposed what became known as the Turing test: the notion that you couldn't tell the difference between a human and a machine from an engagement standpoint. We're actually starting to see that happen in some important ways. You mentioned the call center. How are technology and agency coming together? By that I mean the rate at which businesses are actually applying AI to act as an agent for them in front of customers.

Rob Thomas: I think it's slow. What I encourage clients to do is run a massive number of experiments. So don't talk to me about the one or two AI projects you're doing; I'm thinking hundreds. I was with a bank last week in Japan, and their comment was that in the last year they've done a hundred different AI projects. These are not year-long projects with hundreds of people; it's a bunch of small experiments. You have to be comfortable that probably half of your experiments are going to fail. That's okay. The goal is to increase your win rate: do you learn from the ones that work and from the ones that don't, so you can apply those lessons? To me, this stage is all about experimentation. Any enterprise right now has to be thinking in terms of hundreds of experiments, not one, not two, not "hey, should we do that project?" Think in terms of hundreds of experiments. You're going to learn a lot when you do that.

Peter Burris: But as you said earlier, AI is not magic, and it's grounded in something. It's increasingly obvious that it's grounded in analytics. So what is the relationship between AI and analytics, and what types of analytics are capable of creating value independent of AI?

Rob Thomas: Think about how I decomposed AI: I talked about human features, and I talked about how it starts with a model. You train the model, and the model is only as good as the data you feed it. So that assumes, first, that your data is not locked into a bunch of different silos.
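Running hundreds of experiments only pays off if the outcomes are recorded so the win rate Thomas mentions can actually be measured. A minimal sketch of such a log, with invented experiment names and a hypothetical pass/fail field:

```python
from dataclasses import dataclass

@dataclass
class Experiment:
    name: str
    hypothesis: str
    succeeded: bool   # did it beat the baseline / meet its stated goal?

def win_rate(experiments):
    """Fraction of experiments that met their goal."""
    if not experiments:
        return 0.0
    return sum(e.succeeded for e in experiments) / len(experiments)

# Hypothetical log entries; roughly half fail, which Thomas says is okay.
log = [
    Experiment("call-center-intent", "NLU routes 30% of calls", True),
    Experiment("churn-score-v1", "model beats rule of thumb", False),
    Experiment("invoice-ocr", "90% field extraction", True),
    Experiment("fraud-graph", "graph features lift recall", False),
]
print(f"win rate: {win_rate(log):.0%}")  # win rate: 50%
```

Keeping the hypothesis alongside the outcome is what turns a failed experiment into a lesson rather than a loss.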
It assumes that your data is actually governed, that you have a data catalog or that type of capability. If you have those basics in place, once you have a single instantiation of your data, it becomes very easy to train models. And you'll find that the more you feed a model, the better it's going to get, and the better your business outcomes are going to get. That's our whole strategy around IBM Cloud Private for Data: basically one environment, a single console for all your data. Build a model here, train it on all your data no matter where it is. It's pretty powerful.

Peter Burris: Let me pick up on that "where it is," because it's becoming increasingly obvious, at least to us and our clients, that the world is not going to move all the data to a central location. The data is going to be increasingly distributed, closer to the sources, closer to where the action is. How are AI and that notion of increasingly distributed data going to work together for clients?

Rob Thomas: We've just released what's called IBM Data Virtualization this month, and it's a leapfrog in terms of virtualization technology. The idea is: leave your data wherever it is. It could be in your data center, it could be in a different data center, it could be on an automobile if you're an automobile manufacturer. We can federate data from anywhere and take advantage of processing power on the edge. So we're breaking down the initial analytics problem, which was "before I do this, I've got to bring all my data to one place." That's not a good use of money; it's a lot of time and a lot of money. So we're saying: leave your data where it is, and we will virtualize your data from wherever it may be.

Peter Burris: That's really cool. What was it called again?

Rob Thomas: IBM Data Virtualization, and it's part of IBM Cloud Private for Data; it's a feature in that.

Peter Burris: Excellent. So, one last question, Rob.
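The federation idea, leave the data in place and fan the query out to each source, can be illustrated with a toy layer over a few SQLite databases standing in for a data center and an edge device. This is a hand-rolled sketch of the concept only; it is not the IBM Data Virtualization API, and the `Virtualizer` class and sample schema are invented.

```python
import sqlite3

class Virtualizer:
    """Toy federation layer: data stays in each source, queries fan out."""
    def __init__(self):
        self.sources = []

    def register(self, conn):
        self.sources.append(conn)

    def query(self, sql, params=()):
        rows = []
        for conn in self.sources:   # data never moves to a central store
            rows.extend(conn.execute(sql, params).fetchall())
        return rows

# Two "remote" sources, in-memory here for the sketch: a data center
# database and an edge device (Thomas's automobile example).
dc = sqlite3.connect(":memory:")
edge = sqlite3.connect(":memory:")
for conn, rows in [(dc, [("sensor-1", 21.5)]), (edge, [("sensor-2", 19.0)])]:
    conn.execute("CREATE TABLE readings (device TEXT, temp REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)

virt = Virtualizer()
virt.register(dc)
virt.register(edge)
print(virt.query("SELECT device, temp FROM readings"))
# [('sensor-1', 21.5), ('sensor-2', 19.0)]
```

A real virtualization engine also pushes filtering and aggregation down to each source so only small result sets travel over the network, which is where the time and money savings Thomas describes come from.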
February's coming up: IBM Think in San Francisco, 30-plus thousand people. What kind of conversations do you anticipate having with your customers and your partners as they try to learn, experiment, and take away actions they can take to achieve their outcomes?

Rob Thomas: I want to have this AI experimentation discussion. I will be encouraging every client: let's talk about hundreds of experiments, not five. Let's talk about what we can get started on now. Technology is incredibly cheap to get started with, and it's all about rate and pace and trying a bunch of things. That's what I'm going to be encouraging. The clients you're going to see on stage there are the ones that have adopted this mentality in the last year, and they've got some great successes to show.

Peter Burris: Rob Thomas, General Manager of IBM Analytics, thanks again for being on theCUBE.

Rob Thomas: Thanks, Peter.

Peter Burris: Once again, this is Peter Burris of Wikibon from IBM Innovation Day at the Thomas J. Watson Research Center. We'll be back in a moment.