Live from Las Vegas, it's theCUBE. Covering IBM Think 2018, brought to you by IBM.

Hello everyone, welcome back to theCUBE. We are here on the floor at IBM Think 2018, theCUBE Studio's live coverage from IBM Think. I'm John Furrier, the host of theCUBE, and we're here with Stephanie Chiras, Vice President of Offering Management at IBM Cognitive Systems. That's Power Systems and a variety of other great stuff. Real technology, performance happening with Power. It's been a good strategic bet for IBM. Stephanie, great to see you again. Thanks for coming back on theCUBE.

Absolutely, I love to be on, John. Thank you for inviting me.

When we had a briefing with Bob Picciano, who was heading up Power in that group, one of the things we learned was that there's a lot of stuff going on that's really going to be impacting the performance of things. So just take a minute to explain what you guys are offering in this area. Where does it fit into the IBM portfolio? What are the customer use cases? Where does that offering fit in?

Yeah, absolutely. So I think here at Think, it's been a great chance for us to see how we have really transformed. You know, we have been known in the market for AIX and IBM i, and we continue to drive value in that space. We just GA'd yesterday our new systems based upon the Power 9 processor chip for AIX, IBM i, and Linux. So that remains a strong strategic push. Enterprise Linux: we transformed in 2014 to embrace Linux wholeheartedly, so we really are going after the Linux space now. SAP HANA has been an incredible workload, where over a thousand customers run SAP HANA on Power. And boy, we are going after this cognitive and AI space with our performance and our acceleration capabilities, particularly around GPUs. So things like the unique differentiation of NVLink are driving our capabilities, with some great announcements here that we've had in the last couple of days.
Jamie Thomas was on earlier, and she and I were talking about some of the things around the software stack and the hardware kind of coming together. Can you just break that out? Because we've been covering Power; Doug Balog's been on many times. A lot of great growth right out of the gate, an ecosystem formed right around it. What else has happened? And separate out where the hardware innovation is on the technology side, and what's happening in software, and how the ecosystem and people are adopting it. Can you just take us through that?

Yeah, absolutely. And actually I think it's an interesting question, because the ecosystem has happened on both sides of the fence, both the hardware side and the software side. So OpenPOWER has grown dramatically on the hardware side. We just released our Power 9 processor chip. So here is our new baby. This is the Power 9.

Hold it up.

So this is our Power 9 here. Eight billion transistors, 14 miles of wiring, and 17 layers of metal. I mean, it's a technology wonder. The features are so small we can't even show them on the camera.

This is the Moore's law piece that Ginni was talking about in her keynote.

That's exactly it. But what we have really done strategically is change what gets delivered from the CPU to more of what gets delivered at a system level. And so on IO capabilities, we're the first chip to market, delivering the first systems to market, with PCIe Gen4, so we're able to connect to other things much faster. We have NVLink 2.0, which provides nearly 10X the bandwidth to transport data between this chip and a GPU. So Jensen was on stage yesterday from NVIDIA; he held up his chip proudly as well. The capabilities that are coming out from being able to transport data between the Power CPU and the GPU are unbelievable.

So what about the relationship with NVIDIA for a second? Because NVIDIA's stock is up a lot, partly thanks to the Bitcoin mining graphics cards. But this is, again, a use case. NVIDIA's been doing very well.
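The "nearly 10X" NVLink figure mentioned above can be sanity-checked with back-of-envelope arithmetic. The numbers below are public spec figures assumed for illustration, not quoted in the interview: NVLink 2.0 links carry 25 GB/s each, IBM's comparison aggregates 150 GB/s of CPU-GPU NVLink bandwidth against roughly 16 GB/s for a PCIe Gen3 x16 connection.

```python
# Back-of-envelope check of the "nearly 10X" CPU-GPU bandwidth claim.
# All figures are assumed spec numbers for illustration, not from the interview.
NVLINK2_LINKS = 6           # NVLink 2.0 links counted in the aggregate comparison
NVLINK2_GBPS_PER_LINK = 25  # GB/s per NVLink 2.0 link
PCIE3_X16_GBPS = 16         # approx. GB/s for a PCIe Gen3 x16 connection

nvlink_bw = NVLINK2_LINKS * NVLINK2_GBPS_PER_LINK  # 150 GB/s aggregate
speedup = nvlink_bw / PCIE3_X16_GBPS
print(f"NVLink 2.0 aggregate: {nvlink_bw} GB/s, ~{speedup:.1f}x PCIe Gen3 x16")
```

That ratio works out to roughly 9.4x, which is where marketing rounds to "nearly 10X"; note the comparison depends on how many links are aggregated per GPU in a given system configuration.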
They're doing really well in IoT, self-driving cars, where data performance is critical. How do you guys play in that? What's the relationship with NVIDIA?

Yeah, so it has been a great partnership with NVIDIA. When we launched OpenPOWER, right at the end of 2013, NVIDIA was one of the five founding members with us, Google, Mellanox, and Tyan. So they clearly wanted to change the game at the systems value level. We launched into that, and we went and jointly bid with NVIDIA and Mellanox for the Department of Energy program code-named CORAL. That came to culmination at the end of last year, when we delivered the Summit and Sierra supercomputers to Oak Ridge and Lawrence Livermore. We did that with innovation from both us and NVIDIA, and that's what's driving things like this capability.

And now we bring in software that exploits it. So for that NVLink connection between the CPU and the GPU, we deliver software called PowerAI. We've optimized the frameworks to take advantage of that data transport between the CPU and GPU. So it makes it consumable with all of these things. It's not just about the technology; it's about, is it easy to consume at the software level? So a great announcement yesterday, right? With the capabilities to do logistic regression. Unbelievable: taking advertising analytics from 70 minutes to one and a half.

I mean, we're going to geek out here, but let's go under the hood for a second. This is really kind of a high-end systems product at those kinds of performance levels. Where does this connect to the go-to-market? Who's the buyer of it? Is it OEMs? Is it integrators? Is it new hardware devices? How do they get involved? Who's the target customer, and what kind of developers are you reaching? Can you just take us through that? Who's buying this product?

So this is no longer relegated to the elite set.
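For context on the logistic-regression result mentioned above: this is a sketch of the underlying workload itself, not of IBM's accelerated library or its API. A minimal batch-gradient-descent logistic-regression trainer in plain NumPy looks like this; GPU-accelerated frameworks run the same math over vastly larger datasets, which is where the 70-minutes-to-1.5 speedup came from.

```python
import numpy as np

# Minimal logistic regression via batch gradient descent, for illustration only.
# GPU-accelerated libraries execute this same math on much larger data.
def train_logreg(X, y, lr=0.1, epochs=200):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
        grad_w = X.T @ (p - y) / len(y)          # gradient of the log loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Tiny synthetic example: the class is determined by the sign of feature 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(float)
w, b = train_logreg(X, y)
preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
print("accuracy:", np.mean(preds == y))
```

On this separable toy problem the learned weight on feature 0 dominates and accuracy lands near 1.0; real advertising-click datasets have billions of rows, which is why data transport between CPU and GPU becomes the bottleneck the interview describes.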
What we did, and I think this is amazing: when we delivered Summit and Sierra, right, a huge cluster of these nodes, we took that same node, pulled it into our product line as the AC922, and delivered a four-GPU air-cooled version to market. On December 22nd of last year we GA'd, and we sold to over 40 independent clients by the end of 2017. So that's a short runway. And most of it, honestly, is driven around AI. The AI adoption, and it's across the enterprise. Our goal is really to make sure that the enterprises who are looking at AI now with their developers are ready to take it into production. We offer support for the frameworks on the system, so they know that when they do development on this infrastructure, they can take it to production later. So it's very much geared to taking AI to the enterprise, and it's all over: it's insurance, it's the financial services sector. It's those kinds of enterprises that are using AI.

So it's IO-sensitive, right? So is IoT not a target, or maybe?

So when we talk out to the edge, it's a little bit different, right? IoT today for us is driving a lot of data that's coming in, and then at different levels.

There's not a lot of processing power needed at the edge.

There is not. There is not. And it kind of scales in. We are seeing, I would say, a progression of that compute moving out closer to the data, whether or not it's on-prem. It doesn't all come home necessarily anymore.

Compute's being pushed to where the data is.

Absolutely.

I mean, that's headroom for you guys. It's not a priority now because it's not compute-intense at the edge; other compute can solve that.

That's right.

All right, so where does the cloud fit in? Are you guys powering IBM's cloud?

So IBM Cloud has been a great announcement this year as well. You've seen the focus here around AI and cloud. So we announced that HANA will come on Power into the cloud, specializing in large memory sets, so 24-terabyte memory sets.
For clients, that's huge to be able to exploit that.

Is IBM Cloud using Power or not?

So that will be in IBM Cloud. So you can go to IBM Cloud and deploy an SAP-certified HANA on Power deployment for large memory installs, which is great. We also announced PowerAI access on Power 9 technology in IBM Cloud. So we are definitely partnering with IBM Cloud, as well as with the analytics pieces, like Data Science Experience available on Power. And I think it's very important what you said earlier, John, about wanting to bring the capabilities to where the data is. So a lot of clients are doing AI on-prem, where we can offer a solution. You can augment that with capabilities like Watson, right, off-prem. You can also now do DevOps with AI in the IBM Cloud. So it really becomes both a deployment model, but the client needs to be able to choose how they want to do it.

And the data could come from multiple sources. There's always going to be latency. So what about blockchain? We'll get to blockchain. Are you guys doing anything in the blockchain ecosystem? Obviously one complaint we've been hearing is that some of these cryptocurrency chains, like Ethereum, have performance issues. They've got projects doing a lot of open source in there. Is Power even putting its toe in the water at blockchain?

We have put our toe in the water. Blockchain runs on Power from an IBM portfolio perspective.

IBM Blockchain runs on Power, or blockchain generally?

Like Hyperledger. Hyperledger will run, so open-source blockchain will run on Power. But if you look at the IBM portfolio, the security capabilities that z14 brings, and pulling that into IBM Cloud, our focus is really to be able to deliver that level of security. So we lead with System z in that space, and Z has been incredible with blockchain.

Z is pretty expensive to purchase, though.

But now you can purchase it in the cloud through IBM Cloud, which is great.

Awesome. This is the benefit of cloud.
Sounds like SoftLayer is moving towards more of a Z mainframe, Power backend.

I think the IBM Cloud is broadening the capabilities that it has, because the workloads demand different things. Blockchain demands security; now you can get that in the cloud through Z, right? AI demands incredible compute strength with GPU acceleration; Power is great for that. And now a client doesn't have to choose. They can use the cloud and get the best infrastructure for the workload they want, and IBM Cloud runs it.

You guys have been busy.

We've been busy.

Bob Picciano has been bunkered in. You guys have been breaking out. So I'd love to do a deeper dive on this, Stephanie, and we'd love to follow up with you guys; we told Bob we would dig into that too. The question I have for you now is: how do you talk about this group that you're building together? The names are all internal IBM names: power, cognitive, design. Is it like a group? Do you guys call yourselves the modern infrastructure group? I mean, what is it called? If you have to explain it outside IBM, AI is easy; I know what an AI team does. You're kind of doing AI, you're enabling AI. Are you modern infrastructure, or what? What pillar are you under?

Yeah, so we sit under IBM Systems, and we are definitely systems proud, right? Everything runs on infrastructure somewhere. And within that there are three spaces: you certainly have Z, storage, and Power. Since we've set our sights on AI and cognitive workloads, internally we're called IBM Cognitive Systems. And I think that's really two things: both the focus on the workloads and differentiation we want to bring to clients, and also the fact that it's not just about the hardware. We're now doing software with things like the PowerAI software, optimized for our hardware. There's magic that happens when the software and the hardware are co-optimized.

Well, if you think about it, systems proud.
I love that conversation, because if you look at the systems revolution that I grew up in, the computer science generation of the 80s, that was the open movement: BSD, pre-Linux. And now if you think about the cloud and what's going on with AI, and what I call the innovation sandwich, with data in the middle and blockchain and AI as the bread, you have all the perfect elements of automation in a cloud. That's all going to be powered by a system. So essentially operating systems skills are super important.

Super important. Super important.

This is the foundational element.

Absolutely. And I think your point on open: that has really come in and changed how quickly this innovation is happening, but I completely agree, right? And we'll see more fit-for-purpose types of things. As you mentioned, more fit for purpose, where the infrastructure and the OS are driving huge value at a workload level. And that's what the client needs.

You know, what DevOps proved with the cloud movement was that you can have programmable infrastructure. And what we're seeing with blockchain, the decentralized web, and AI is that the real valuable intellectual property is the business logic. Okay, that is now going to be dealing with a whole other layer of programmability. It used to be the other way around: the technology determined the core decision, so the risk was the technology purchase. Now the risk is the business model decision. How do you code your business?

And it's very challenging for any business, because the efficiency happens when those decisions get made jointly together. That's when real business efficiency happens. If you make a decision on only one side of the line or the other, you're losing efficiency that could be driven.

And open is big, because you have consensus algorithms, you've got regulatory issues, the more data you're exposed to, and the more horsepower you have.

Absolutely.

This is the future perfect storm.

Perfect storm. Definitely. Thanks for coming on theCUBE.
Great to see you.

My pleasure, John. Great to see you.

You're awesome. Systems proud here on theCUBE; we're sharing all the systems data here at IBM Think. I'm John Furrier. More live coverage after this short break. All right.