The Cube at IBM Impact 2014 is brought to you by headline sponsor IBM. Here are your hosts, John Furrier and Paul Gillin.

Welcome back, everyone. We're live in Las Vegas at IBM Impact. This is The Cube, our flagship program. We go out to the events and extract the signal from the noise. I'm John Furrier, the founder of SiliconANGLE. Our next special guest is Grady Booch, who is a legend in the software development community. And he went to school in Santa Barbara; my son goes there as a freshman, but that's a whole other conversation. Welcome to The Cube.

Thank you.

One of the things we're really excited about, you know, we get all the IBM folks getting the messaging out, the IBM talk, but also the groundbreaking work around computer software, where hardware is now exploding in capability and big data is the instrumentation of everything. It takes us to a conversation around cognitive computing, the future of humanity, and the societal changes that are happening. There's a huge intersection between computer science and social science, which is the tagline for SiliconANGLE and something we are passionate about. So I want to get your take on that and talk about some of the work you're doing at IBM. Where is all this leading? Where is this unlimited compute capacity, the mainframe in the cloud, big data instrumentation, indexing human thought, Fitbits, wearable computers, sensors, the Internet of Things, all taking us? What's your vision?

There are three things that I think are inevitable and irreversible, and they have unintended consequences, consequences that we have to attend to and that will be in our faces eventually. The first of these is the growth of computational power in ways we've only begun to see. The second is the development of systems that never forget, with storage beyond even our expectations now.
And the third is pervasive connectivity, such that we see the foundations for not just millions of devices but billions upon billions of devices. Those three trends appear to be where technology is heading. And yet, if you follow those trends out, one has to ask: what are the implications for us as humans? I think the net of those is an interesting question indeed. To put in a personal plug: my wife and I are developing a documentary with the Computer History Museum for public television on that very topic, looking at how computing intersects with the human experience. We're seeing those changes in every aspect of life. Two that I'll dwell upon here, which I think are germane to this particular conference, are, first, some of the ethical and moral implications, and second, the implications for cognitive systems.

On the first, we saw in the news, I guess it was today or yesterday, an initiative funded by the Gates Foundation that's been collecting data on kids in various schools; a number of states signed up for it. But as people began to realize what the implications of aggregating that information were for the privacy of a child, the parents became cognizant of the fact that, wow, we're disclosing things that allow identification of the kid in ways that maybe we don't want. So I think the explosion of big data and the explosion of computational power have led us as a society to begin asking those questions: what are the limits of ownership and the rights to that kind of information? That's a dialogue that will continue.

In the cognitive space, it kind of follows on, because one of the problems of big data, and it's not just, you know, big big data like you see at CERN and the like, but also these problems of aggregation of data, is that there is such an accumulation of information at such a speed that an individual human cannot begin to reason about it in reasonable ways.
Thus was born what we did with Watson a few years ago, the Watson Jeopardy! experience. I think the most important thing the Watson Jeopardy! experience led us to realize is that there is an architectural framework upon which we can do many interesting reasoning things. And now that Watson has moved from Research into the Watson Group, we're seeing that expand into so many domains. So the journey is really just beginning as we take what we know how to do in reasoning with automated systems and apply it to these large data systems. It's going to be a conversation we're going to have for a few generations.

We're beginning to see, I mean, computing has moved beyond the role of automator, of automating rote manual tasks. I've seen forecasts that most of the jobs that will be automated out of existence in the next 20 years will be knowledge jobs, and even one journalism professor forecasting that 80% of journalism jobs will go away, replaced by computers, over the next couple of decades. Is this something for people to fear?

I'm not certain fear will do us any good; especially if a change like that is inevitable, fear doesn't help. But I think what will help is an understanding of where those kinds of software systems will impact various jobs and how we as individuals should relate to them. We as a society, we as individuals, are in many ways slowly surrendering ourselves to computing technology, and what you describe is one particular domain of that. There's been tremendous debate in the economic and business community as to whether or not computing has impacted the jobs market. I'm not an economist, I'm a computer scientist, but I can certainly say, from my perspective on the inside, that I see that transformational shift, and I see that what we're doing is going to radically change the job market.
If you go back to the Victorian age, people were looking toward a future in which they had more leisure time, because these devices would, you know, free us from the mundane. We're there. And yet the reality is that we have now automated so many things that required our time before that, in a way, there's not enough work to go around. That's a very different shift than I think anyone anticipated back at the beginning of the industrial age, and we're coming to grips with it. Therefore, I'd say this: don't fear it, but begin to understand those areas where we as humans provide unique value that automated systems never will, and then ask ourselves the question: where can we as individuals continue to add that creativity and value? Because then we can view these machines as our companions in that journey.

Great. Humans is a great message; I mean, they're driving the car here. But I want to talk about something around the humanization piece you mentioned. There's a lot of conversation around computer science as a discipline. For the older generation, when I went to computer science school, it was code and architecture; but now computer science is literally mainstream. There's general interest, hence why we built this Cube operation: to share signal from the noise around computer science. There's also been a discussion around women in tech, tolerance for different opinions and views, freedom of speech if you will, and censorship, if everything's measured, and political correctness. All of this is now becoming fully transparent. So let's take the women-in-tech issue, and also kids growing up who have an affinity for computer science but may not know it. With all that as backdrop, I want to ask you: computer science as a discipline, how is it going to evolve? What are some of those things for the future generation?
For my son who's in sixth grade, my son who's a freshman in college, and everyone in between: is it just the traditional sciences? What are some of the things you see beyond just coding and learning Java or Objective-C?

I wish you'd asked me some questions about some really deep topics. I mean, gosh. I'm sorry. Let's talk about the kids. In the early days of the telephone, telephones were a very special thing; not everybody had them. And it was predicted that as the telephone networks grew, we were going to need many, many more telephone operators. What happened is that we all became operators. The very nature of telephony changed so that now I, as an individual, have the power to reach out and make the connection that used to be done by a human. A similar phenomenon, I think, is happening in computing: it has moved itself into the interstitial spaces of our world, such that it's no longer a special thing out there. We used to speak of the programming priesthood in the '60s... I just lost my mic here, hang on. There we go. I think we're good. We're good. I'm a software guy; I don't do hardware, so my body rejects hardware.

So we're moving into a place where computing very much is part of the interstitial spaces of our world. This has led to, I think, you know, the generation after us, because our median age is, let me check, probably above 20. Just guessing here.

32 years older than me. I'm still 7, I think.

You're still 7. We're moving to a stage where the notion of computational thinking becomes an important skill that everyone must have. My wife loves to take pictures of people along the beach: beautiful sunset, whales jumping, and the family's sitting there... and it did it again. My body's rejecting this device. Clearly I have the wrong shape for this. There we go.
She takes pictures of families who, with all these things in front of them, are buried head-down in their iPhones and their tablets, so wedded to that technology. We often see, you know, kids going by in strollers with an iPad in front of them, looking at something. So we have a generation growing up knowing how to swipe and how to use these devices; it's part of their very world. It's difficult for me to relate to that, because I didn't grow up in that kind of environment, but that's the generation after us. So the question I think you're really asking is: what does one need to know to live in that kind of world? And I think it's the notion of computational thinking. It's an idea that's come out of the folks at Carnegie Mellon University, which asks the question: what are some of the basic skills we need to know? We need to know some things about what an algorithm is, and a little bit about what's behind the screen itself. One of the things we're trying to do with the documentary is pull back the curtain, beyond just the windows you see, and ask the question: how do these things actually work? Because some degree of understanding of that will be essential for anyone moving into life.

You talked about women in tech in particular. It is an important question, and I work side by side with many women in the things that I do. Frankly, it saddens me to see the way our educational system, going back to middle school, produces a bias that pushes young women out of this field. So I'm not certain that it's a bias built into computing; it's a bias built into the culture, a bias built into our educational system, and that obviously has to change, because computing knows no gender, religious, or sexual-orientation boundaries. It's just part of our society now, and everyone needs to contribute.
I'm sorry, I do want to ask you about software development, since you've devoted your career to defining architectures and disciplines of software development. We're seeing software development now, as epitomized by Facebook perhaps, moving to much more of a fail-fast mentality: try it, put it out there; if it breaks, it's okay, no lives were lost; pull it back in and try it again. Is there a risk in this new approach to software?

So many things there. First, is it a new approach? No, it's part of the agile process that we've been talking about for well over a decade, if not fifteen years. You must remember that it's dangerous to generalize from a particular development paradigm applied in one space to all others. With Facebook, in general, no one's life depends upon it, and so there are simplifying assumptions one can make. If I apply that same technique to a dialysis machine, or to the avionics of a 777, it simply doesn't apply. So one must be careful about generalizing those kinds of approaches to every place. It depends upon the domain, it depends upon the development culture, and it ultimately depends upon the risk profile, which would lead you to high-ceremony or low-ceremony approaches.

Do you have greater confidence in the software you see being developed for mission-critical applications today than you did 10 years ago?

Absolutely. In fact, I'll tell you a quick story, and I know we need to wind down. I had elective open-heart surgery a few years ago. Elective, because every male in my family died of an aneurysm, an aortic aneurysm. So I went in and got checked, and indeed I had an aneurysm developing as well. So I had my heart opened up and the aneurysm dealt with before it could burst on me. I remember lying there in the CT scan machine, looking up and saying, hmm, this looks familiar. Oh my God, I know the people who wrote the software for this thing, and they used the UML.
And I realized: oh, this is a good thing.

Which is your creation.

Yes, yes. So it's a good thing, because I felt confidence in the software that was there, because I knew it was intentionally engineered.

Great. I want to ask you some societal questions around IT and computing. Obviously green is key, and data centers take up a lot of space, right? So obviously we want to get to a smarter data center environment. How do you see the role of software, obviously with cognitive and all the things you talked about, helping businesses build the physical plant, if you will? And is it a shared plant? There's openness; you're seeing OpenPOWER systems here from IBM, and obviously open source. What does that future look like from your standpoint?

May I borrow that cup of tea or coffee? I want to use it as a visual aid. Let's presume... oh, it's still warm. Let's say this is some tea. The energy cost to boil the water for a cup of tea is roughly equivalent to the energy cost of a single Google search. Now imagine I multiply that by a few billion, and you can begin to see the energy cost of some of this infrastructure, which for many is largely invisible. Some studies suggest that computing has grown to the place where, at least in the United States, it's consuming about 10% of our electrical energy production. So by no means is it something we can sweep under the rug. You raise, I think, a fundamental question, which is the hidden costs of computing, which people are becoming aware of. But then you also ask: where can cognitive systems help us in that regard? We live in Maui, and there's an interesting phenomenon there: so many people are putting solar power into the grid that the electric utilities are losing money, because we're generating so much power.
And yet, you realize, if you begin to instrument the way people actually use power, down to the level of individual homes, then power generation companies can make much more intelligent decisions about day-to-day, almost minute-to-minute, power production. That's something black-box analytics would help with, but cognitive systems are not really black-box analytics systems; they're learning systems, and learning systems can then predict what that might mean for the energy production company. So we're seeing, even in those places, the potential of using cognitive systems to attend to energy costs.

The future holds a lot of possibilities. I know you've got to go; we're getting the hook here big time. We really appreciate it. These are important future decisions that we're on track to help solve. I really appreciate it. Looking forward to the documentary. Any timetable on that?

Sometime before I die.

Great. Thanks for coming on theCUBE; really appreciate it. This is SiliconANGLE's theCUBE. We'll be right back with our next guest after this short break.

Thanks for having me.