Live from the Bill Graham Civic Auditorium in San Francisco, it's theCUBE, covering Pure Storage Accelerate 2018. Brought to you by Pure Storage. Welcome back to theCUBE's continuing coverage of Pure Storage Accelerate 2018. I'm Lisa Martin, sporting the clon. Apparently this symbol actually has a name, the clon. I learned that in the last half hour. Who knew? Really? Yes. Is that a C or a K? Is that a Prince orientation, or what is that? Formerly known as. Nice. Prince played at this venue. Yes. As did Roger Daltrey of The Who. And I might have been staff for one of those shows. I mean, you never know. I think you're performing later. You might not even know this. We have a couple of guests joining us. We've got Matt Burr, the GM of FlashBlade, and Rob Ober, the chief platform architect at NVIDIA. Guys, welcome to theCUBE. Hi. Thank you. So, what's going on? Well, lots of excitement this morning. You guys, Pure and NVIDIA, announced a partnership just a couple of months ago with AIRI. Talk to us about AIRI. What is it? How is it going to help organizations in any industry really democratize AI? Well, AIRI is something that we announced, the AIRI Mini, today here at Accelerate 2018. AIRI was originally announced at GTC, NVIDIA's GPU Technology Conference, back in March. And what it is, essentially, is it brings NVIDIA's DGX servers, connected with either Arista or Cisco switches, down to the Pure Storage FlashBlade. So, this is something that sits in less than half a rack in a data center and replaces what was probably 25 or 50 racks of compute and storage. So, I think Rob and I like to talk about it as kind of a great leap forward in terms of compute potential. Absolutely. Yeah. It's an AI supercomputer in a half rack. So, one of the things we saw this morning during the general session, that Charlie talked about, and I think Matt Kixmoeller did as well, is kind of a really brief history of the last 10, 20 years in storage.
Why is modern external storage essential for AI? Well, Rob, do you want that one? You want me to take it? Coming from the non-storage guy, maybe. No, go ahead. Well, when you look at the structure of GPUs and servers in general, we're talking about massively parallel compute, right? We're now taking not just tens of thousands of cores but even more, and we're actually finding a path for them to communicate with storage that is also massively parallel. Storage has traditionally been something that's kind of serial in nature. Legacy storage has always waited for the next operation to happen. You actually want things that are parallel, so that you have parallel processing both at the compute tier and at the storage tier. And you need big, fat network bandwidth. This is what Charlie was alluding to with one of the legs of his stool, when he said, hey, you know, 10 years ago we were still at 10 gig networks. The emergence of 100 gig networks has really made the data flow possible. So I wonder if we can unpack that; we talked a little bit to Rob Lee about this, the infrastructure for AI, and I wonder if we could go deeper. So you take the three legs of the stool, and you can imagine this massively parallel compute, storage, and networking grid, if you will. One of our guys calls it Unigrid; I'm not crazy about the name, but this idea of alternative processing, which is your business, really spanning the scaled-out architecture, not trying to stuff as much function on a die as possible, really is taking hold. But how does that infrastructure for AI evolve, from an architect's perspective? The overall infrastructure, I mean, it is incredibly data intensive.
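[Editor's note: to put the 10 gig versus 100 gig point in rough numbers, here is a quick back-of-envelope sketch. The 10 TB dataset size is an illustrative assumption, not a figure from the conversation.]

```python
# Back-of-envelope: time to stream a full training data set once over the
# network. The dataset size and link speeds are illustrative assumptions.

def epoch_stream_seconds(dataset_bytes: float, link_gbits: float) -> float:
    """Seconds to move the whole dataset once over a link of the given speed."""
    link_bytes_per_sec = link_gbits * 1e9 / 8  # gigabits/s -> bytes/s
    return dataset_bytes / link_bytes_per_sec

DATASET = 10e12  # a hypothetical 10 TB training set

for gbits in (10, 100):
    hours = epoch_stream_seconds(DATASET, gbits) / 3600
    print(f"{gbits:>3} Gb/s link: {hours:.2f} hours per pass over the data")
```

At 10 Gb/s a single pass over 10 TB takes over two hours; at 100 Gb/s it drops under 15 minutes, which is why the jump in network bandwidth matters so much for training runs that re-read the data set many times.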
I mean, a typical training set is terabytes, in the extreme it's petabytes, for a single run, and you will typically go through that data set again and again and again in a training run, in epochs. And so you have one massive set that needs to go to multiple compute engines. And the reason it's multiple compute engines is that people are discovering that as they scale up the infrastructure, you get pretty much linear improvements, and you get a time-to-solution benefit. Some of the large data centers will run a training run for literally a month. And if you start scaling it out, even on these incredibly powerful things, you can bring that time to solution down. You can have meaningful results much more quickly. Can you give us a sense of a practical application of that? Great. Yeah, there's a large hedge fund based in the UK called Man AHL. They're a systems-based quantitative trading firm. And what that means is humans really aren't doing a lot of the trading; machines are doing the vast majority, if not all, of the trading. What the humans are doing is essentially quantitative analysis. The number of simulations that they can run is directly correlated to the number of trades that their machines can make. And so the more simulations you can run, the more trades you can make. The shorter your simulation time is, the more simulations you can run. So we're talking, in a sort of meta context, about a concept that applies to everything: in retail, if you're a grocery store, understanding what products are not on my shelves at a given time; in healthcare, discovering new forms of pathologies for cancer treatments; financial services we touched on; but it even comes right down into manufacturing. Looking at what are my defect rates on my lines?
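[Editor's note: the scale-out point above can be sketched in rough numbers. The month-long baseline comes from the conversation; the GPU counts and the assumption of near-linear scaling are illustrative, and real jobs lose some efficiency to communication overhead.]

```python
# Sketch of why time-to-solution drops as you scale out: a fixed amount of
# work (epochs x dataset) split across workers. Perfect linear scaling is an
# idealized assumption; pass a lower efficiency to model real-world overhead.

def time_to_solution_hours(single_gpu_hours: float, num_gpus: int,
                           scaling_efficiency: float = 1.0) -> float:
    """Estimated wall-clock hours when the run is split across num_gpus."""
    return single_gpu_hours / (num_gpus * scaling_efficiency)

# A hypothetical month-long (~720 hour) training run on a single engine:
baseline = 720.0
for gpus in (1, 8, 64):
    print(f"{gpus:>2} GPUs: {time_to_solution_hours(baseline, gpus):6.1f} hours")
```

Under these idealized assumptions, a month-long run on 64 engines finishes in well under a day, which is the time-to-solution benefit Rob describes.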
And if it used to take me a week to understand the efficiency of my assembly line, and I can get that down to four hours and make adjustments in real time, that's more than just productivity, it's progress. Okay, so I wonder if we could talk about how you guys see AI emerging in the marketplace. You just gave an example. We were talking earlier, again to Rob Lee, about how it seems today to be applied in narrow use cases, and maybe that's going to be the norm, whether it's autonomous vehicles or facial recognition or natural language processing. How do you guys see that playing out? Will it ever be this kind of ubiquitous horizontal layer? Or do you think the adoption is going to remain along those sort of individual lines, if you will? You know, at the extreme, like when you really look out into the future... let me start by saying my background is processor architecture. I've worked in computer science, and the whole thing is to understand problems and create the platforms for those things. What really excited me and motivated me about AI and deep learning is that it is changing computer science. It's just turning it on its head. Instead of explicitly programming, it's now implicitly programming based on the data you feed it. And this changes everything, and it can be applied to almost any use case. So I think that eventually it's going to be applied in almost any area where we use computing today. So another way of asking that question is, how far can we take machine intelligence, and your answer is? Pretty far. So as a processor architect, obviously this is very memory intensive. You're seeing... I was at the Micron financial analyst meeting earlier this week, listening to what they were saying about these emerging memories. You've got DRAM, and obviously you have flash, and people are excited about 3D XPoint. I heard somebody mention 3D XPoint on the stage today.
What do you see there in terms of memory architectures and how they're evolving, and what do you need as a systems architect? I need it all. No, if I could build a GPU with more than a terabyte per second of bandwidth and more than a terabyte of capacity, I could use it today. I can't build that yet. But it's a different stool: I need teraflops, I need memory bandwidth, and I need memory capacity. And really, we just push to the limit. Different types of neural nets, different types of problems, will stress different things. They'll stress the capacity, the bandwidth, or the actual compute. This makes the data warehousing problem seem trivial. But do you see, you know what I mean? Data warehousing was always chasing the chips, the snake swallowing a basketball, I called it. But do you see a day when these problems are going to be solved architecturally? Everybody talks about Moore's Law moderating. Or is this going to be a perpetual race that we're never going to get to the end of? So let me put things in perspective first. It's easy to forget that the big bang moment for AI and deep learning was the summer of 2012, so slightly less than six years ago. That's when AlexNet hit the scene and people went, wow, this is a whole new approach. You know, this is amazing. So we're a little less than six years in. I mean, it is a very young area. It is in incredible growth. The state of the art is literally changing month by month right now. So it's going to continue on for a while, and we're going to just keep growing and evolving. You know, maybe in five years, maybe 10 years, things will stabilize. But it's an exciting time right now. Very hard to predict, isn't it? I mean, who would have thought that Alexa would be such a dominant factor in voice recognition, or that a bunch of cats on the internet would lead to facial recognition? I wonder if you guys could comment, right?
I mean, strange beginnings. Who doesn't love cats out there? But I wonder if I could ask you guys about the black box challenge. I've heard some companies talk about how we're going to white box everything, make it open. But the black box problem, meaning: if I have to describe, and we may have talked about this, how I know that it's a dog, I struggle to do that, but a machine can do that. I don't know how it does it. It probably can't tell me how it does it, but it knows with a high degree of accuracy. Is that black box phenomenon a problem, or do we just have to get over it? I don't think it's a problem. I know mathematicians who are friends, and it drives them crazy because they can't tell you why it's working, right? So it's an intellectual problem that people just need to get over. But it's the way our brains work, right? And our brains work pretty well. There are certain areas, I think, where for a while there will be certain laws in place where if you can't prove the exact algorithm, you can't use it. But by and large, I think the industry is going to get over it pretty fast. I would totally agree. You guys are optimists about the future. I mean, you're not up there talking about how jobs are going to go away, and that's not something that you guys are worried about. And generally we're not either. However, machine intelligence, AI, whatever you want to call it, is very disruptive. There's no question about it. So I've got to ask you guys a few fun questions. Do you think large retail stores are going to... I mean, nothing goes to the extreme, but do you think they'll generally go away? So I think large retail stores will generally go away. When I think about retail, I think about grocery stores, and among the things that are going to go away, I'd like to see standing in line go away. I would like my customer experience to get better.
I don't believe that 10 years from now we're all going to live inside our houses and communicate over the internet and text, with half of that being with chatbots. I just don't believe that's going to happen. I think the Amazon effect has a long way to go. I just ordered a pool thermometer from Amazon the other day. I'm getting old; I ordered readers from Amazon the other day. So I kind of think it's that spur-of-the-moment item that you're going to buy. Because even in my own personal habits, I'm not buying shoes and returning them and going through five or 10 cycles to get there. I still want that experience of going to the store. Where I think retail will improve is in understanding that I'm on my way to their store and improving the experience once I get there. So I think you'll continue to see the Amazon effect happen. But what you'll also see is technology being employed to reach a place where my end-user experience improves such that I want to continue to go there. Do you think owning your own vehicle, and driving your own vehicle, will be the exception rather than the norm? It pains me to say this, because I love driving. Me too. But I think you're right. I mean, it's going to take a while, it's going to take a long time, but I think it's inevitable; it's just too convenient. Things are too congested, and autonomous cars will free things up, cars that'll go park themselves, whatever. I think it's inevitable. Will machines make better diagnoses than doctors? I mean, that's not even a question. Absolutely. Yeah, they already do. Do you think banks, traditional banks, will lose control of the payment systems? That's a good one. I haven't thought about that one. Yeah, I'm not sure that's an AI-related thing, maybe more of a blockchain thing, but it's possible. Blockchain and AI, kind of cousins. They are, actually.
I fear a world, though, where we actually end up like in the movie WALL-E, and everybody's on these floating chaise lounges, you know, eating and drinking. But I'm just wondering, you talked about, Matt, the number of different types of industries that really come into play here. Do you see the consumer world, with our expectation that we can order anything on Amazon, from a thermometer to a pair of glasses to shoes, as driving other industries to kind of follow what we as consumers have come to expect? Absolutely, no question. I mean, look, consumer drives everything, right? All-flash arrays were driven by... you have your phone there, right? The consumerization of that device was what drove Toshiba and all the other fab manufacturers to build more NAND flash, which is what commoditized NAND flash, which brought us faster systems. These things all build on each other. And, you know, from a consumer perspective, there are so many things that are inefficient in our world today, right? Like, let's just think about your last call center experience. If you're a normal human being... yeah, you said it. You'd prefer not to, right? My next comment was going to be that most people's call center experiences aren't that good. But what if the call center technology had the ability to analyze your voice and understand your intonation and your inflection? And that call center employee was being given information to react to what you were saying on the call, such that they either immediately escalated that call without you asking, or they were sent down a decision path which brought you to a resolution that said: we know that 62% of the time, if we offer this person a free month of this, that person is going to go away a happy customer and rate this call a 10 out of 10. That is the type of thing that's going to improve with voice recognition and voice analysis and all of those things.
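[Editor's note: the escalation scenario Matt describes can be sketched as a toy rule. A real system would use trained speech and sentiment models; the cue words and the threshold below are hypothetical stand-ins, not anything the guests described.]

```python
# Toy illustration of the call-center idea: score a live transcript and
# decide whether to escalate. The cue list and threshold are hypothetical;
# a production system would use a trained sentiment/intonation model.

NEGATIVE_CUES = {"frustrated", "cancel", "unacceptable", "angry"}

def escalation_score(transcript: str) -> float:
    """Fraction of words that are negative cues (a stand-in for a model)."""
    words = transcript.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in NEGATIVE_CUES)
    return hits / len(words)

def should_escalate(transcript: str, threshold: float = 0.10) -> bool:
    """Escalate when the negative-cue density crosses the threshold."""
    return escalation_score(transcript) >= threshold

print(should_escalate("I am frustrated, this is unacceptable, cancel my account"))
# prints True
```

The interesting part of the real scenario is that the decision path (escalate, offer a free month, and so on) is driven by learned statistics like the 62% figure Matt cites, rather than by a hand-written cue list like this one.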
And that really gets into how far we can take machine intelligence, the things that humans can do that machines can't. And that list changes every year. The gap gets narrower and narrower. And that's a great example. And I think one of the things, going back to whether stores will continue to be there or not, is that one of the biggest benefits of AI is recommendation, right? You could consider it intrusive, maybe; on the other hand, it's great service, where something like an Amazon is able to say, you know, I've learned about you, I've learned about what people are looking for, and you're asking for this, but I would suggest something else. And you look at that and you go, yeah, that's exactly what I'm looking for, right? I think that's really where it gets interesting in the sales cycle. Can machines stop fake news? That's what I want to know. Oh! Probably. To be continued. People are working on that. They are, there's a lot, I mean, that's a big use case. It is not a solved problem, but there's a lot of energy going into that. I'll take that before I take the floating WALL-E chaise lounges, right? Deal. But what if it was just for you? What if it was just one floating chaise lounge, not everybody? Then it would be all right, right? Not for me. Matt and Rob, thanks so much for stopping by and sharing some of your insights, and we wish you a great rest of the day at the conference. Great, thank you very much. Thanks for having us. For Dave Vellante, I'm Lisa Martin. We're live at Pure Storage Accelerate 2018 at the Bill Graham Civic Auditorium. Stick around, we'll be right back after a break with our next guest.