Hi, Jeff Frick here. We're on the ground at the Clift Hotel in downtown San Francisco for a really interesting event we got invited to: the Dell 1510 discussion series, a series where they're going out to a number of cities in the United States. This was the second one, and the topic was big data. Dell brought together about 15 journalists, about 10 Dell senior employees, and I think one customer to come together and really have kind of an open discussion on big data and what it really means. So we were psyched to participate, and we wanted to catch up with a few of the participants. We're joined here next by Jay Menon, VP, Chief Research Officer at Dell. Welcome.

Welcome, and thank you. I'm delighted to be here. Thanks for the opportunity to talk.

Yeah, so what did you think of the discussion group today?

Well, I thought it was a great discussion group. We went into all kinds of discussions: where are customers in this journey to big data? What are some of the issues that are stopping them from getting there? What are some of the privacy issues? And some really interesting questions about data-driven computing: what does it do? Can it augment? Does it augment? Does it replace humans in certain professions? So I thought it was a really wide-ranging and thought-provoking kind of discussion.

Yeah, we had a healthcare professional here, and the conversation kept coming back to: will the computer take over for the doctor? Really for diagnosis, maybe not in surgery, but I don't know, maybe in surgery, too. They build cars with computers. So there are pretty fascinating things there. But I want to tease you a little bit. Before we came up today, I did a little research, and through the power of big data, yes, I found an old presentation, a keynote that you gave in 2012 at IBM Edge.
The best part of the presentation wasn't the presentation, but the intro, where you said: this is a great position to be in, to predict the future, when none of you guys are going to remember what I said five years from now. So guess what? Your worst nightmare is true. I'm back in three years. It's really kind of interesting to see what you were talking about in June of 2012 and relate it to where we are now. The three big trends you talked about were huge, massive I/O; fast flash, because disk is just too slow; and massive storage, both massive storage in the short term for access and massive storage for long-term archive. So now it's almost three years later. Did it play out the way you thought? What do you think? Here we are.

Yeah. You know, the fascinating thing about what you just brought up is that one of the things we've been saying is that in the era of big data, nothing stays hidden. Everything is transparent. What can be known will be known. And interestingly, I never thought that somebody would dig up the remarks I made on stage that day. But now, in the era of big data, this is all out there. People know what you said, and you aren't going to get away with the things you said anymore. So that was fascinating to me, especially given that today's discussion was about big data.

Right. Exactly.

But you know, I think some of the things I said there, thankfully, are really panning out. I mean, think about what has happened with flash technology over the last couple of years. We are actually at a point where 2016 is probably going to be the year when flash becomes cheaper than 15K RPM spinning disk. We're actually going to see that crossover happen. And once that happens, we're not going to be using 15,000 RPM spinning drives in systems anymore, because flash is so much faster.
And it's even cheaper at the same time. Right.

So that kind of transition, at the time that I talked, was certainly not true. Flash was faster, but it was still significantly more expensive. Now, by the way, that's only about 15K RPM; 10K and 7.2K RPM drives are still going to be cheaper. But really, you could almost start to make the argument that flash is becoming the new disk. And interestingly, disk is going after tape in a very aggressive way. If you think about what the disk guys are doing, with shingled magnetic recording and all of that kind of stuff, you're going to have enormous capacities on disk drives. And they will go after the other thing I talked about, which is the enormous rate of growth in unstructured data. Tape will have a role there, of course, but increasingly, disk will have a role there as well. And then the third thing that's happening, and I think I talked about this as well, is new forms of memory.

It was like an hour-long talk. So you covered all the topics.

Yeah, and the new forms of memory are new forms of non-volatile memory. I think those are going to happen in the next two years. I think by 2017, our systems will have new forms of non-volatile memory, significantly faster than flash, and much closer to DRAM in speed. So I think I'm on pretty safe ground with what I said. The timing is always hard to predict, but directionally, I think we're there.

Well, it's funny, because I think you're right. And I think it's happened a lot faster than anyone ever predicted. Obviously, Moore's law is everywhere, in all parts of the stack: the compute, the storage, the networking, everything. So then the next question is, what's the lag for people to really take advantage of this capability?
How long does it take them to rethink their applications to take advantage of something like flash, as opposed to just trying to do what they did faster?

Yeah, I think with flash, people have already started to leverage that. I don't think you need to change your applications, because flash really appears like disk. And we already have systems, Dell has this, others have it, where the flash just appears as a cache in front of your SAN device. The system automatically migrates the data: the good stuff stays in flash, and the stuff that you haven't accessed in a long time goes out to slower spinning media. All of that is handled automatically for you, and so applications don't really have to change to take advantage of it. Now, applications will have to change to take advantage of the next generation of non-volatile memory that we talked about, the thing that's faster than flash. And I think that's going to be a journey. But I think the advantages of that next generation of memory are going to be so compelling that people will start to take advantage of it. You know, when you have a Black Friday and it's five million consumers all trying to do transactions, and you're trying to detect whether a credit card transaction is fraudulent, you don't have a lot of time to do it, and you really want to use this next generation of memory. There will be ways to make it somewhat transparent, but to take significant advantage of it, there will be some changes to applications.

And then you combine cloud, right? Because it was the old Mother's Day problem, right? AT&T had everybody calling mom on Sunday on Mother's Day, so they had to have the capacity for basically everybody being on the phone with a dedicated direct network.
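The flash-as-cache behavior described here, hot blocks kept in flash and cold blocks automatically demoted to spinning media, can be sketched as a simple LRU tier. This is a toy illustration only; the class and names (`TieredCache`, `flash_capacity`) are assumptions for the sketch, not any real Dell storage API.

```python
from collections import OrderedDict

class TieredCache:
    """Toy model of flash-as-cache tiering: recently used blocks stay
    in a fixed-size flash tier; the least-recently-used block is
    demoted to the (unbounded) spinning-disk tier when flash fills."""

    def __init__(self, flash_capacity):
        self.flash = OrderedDict()  # block_id -> data, in LRU order
        self.disk = {}              # demoted cold blocks
        self.flash_capacity = flash_capacity

    def read(self, block_id):
        if block_id in self.flash:
            self.flash.move_to_end(block_id)  # refresh recency on hit
            return self.flash[block_id]
        # Miss: promote the block from disk back into flash.
        data = self.disk.pop(block_id)
        self._insert_flash(block_id, data)
        return data

    def write(self, block_id, data):
        if block_id in self.flash:
            self.flash.move_to_end(block_id)
            self.flash[block_id] = data
        else:
            self.disk.pop(block_id, None)   # drop any stale disk copy
            self._insert_flash(block_id, data)

    def _insert_flash(self, block_id, data):
        self.flash[block_id] = data
        if len(self.flash) > self.flash_capacity:
            # Demote the coldest block, invisibly to the application.
            cold_id, cold_data = self.flash.popitem(last=False)
            self.disk[cold_id] = cold_data
```

The point of the sketch is the one Jay makes: the application just reads and writes block IDs, and the migration between tiers happens underneath it.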
Now we have the cloud, so you can expand and compress based on your demand, Black Friday. So what are you working on now, before we let you go and have something to drink or something to eat? What's new? What's exciting? What's waking you up in the morning?

Now I'm leading research at Dell, and we have projects in security and big data, things like the next-generation memory we talked about, and also in end-user computing. One of the things we're working on, if I had my tablet or my mobile phone with me, is something we call continuous authentication. Your tablet device can get stolen from you, and typically we authenticate ourselves once. If you get a hold of it after that, you can start doing things on my tablet, right? So we're doing a thing called continuous authentication, where the way that I tap on the screen is different, the pressure I apply is different than the pressure you apply, and the way I swipe is different than the way you swipe. So what we're trying to say is that when you get a hold of my machine, my machine learning algorithm will say: this doesn't look like Jay anymore. There's a 95% chance this isn't Jay, and I'm going to force you to re-authenticate yourself. And if you're not me, you can't re-authenticate yourself; your authentication isn't going to work anymore.

Right, right. So that's an example: gesture-based, continuous authentication. Awesome. So stuff gets lost, you know.

So that's one. The other cool thing we're working on is something we've been calling the mood-sensing computer, just to give it a nice, catchy phrase. But the idea is the following: the more I know about you, the better job I can do of servicing you, right? I mean, GPS is a good example.
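The continuous-authentication idea described above can be sketched as a statistical profile over tap pressure: learn the owner's typical touch, then force re-authentication when recent taps drift too far from the profile. A real system would train a classifier over many gesture features (pressure, swipe shape, timing); the class name, the single-feature profile, and the mapping of "95% chance this isn't Jay" onto a threshold are all illustrative assumptions.

```python
import statistics

class ContinuousAuthenticator:
    """Toy sketch of gesture-based continuous authentication.
    Enrolls on the owner's tap pressures, then scores new sessions
    against that profile and forces re-auth on a likely impostor."""

    def __init__(self, threshold=0.95):
        # Impostor confidence required before forcing re-auth.
        self.threshold = threshold
        self.mean = None
        self.stdev = None

    def enroll(self, pressures):
        # Build the owner's profile from observed tap pressures.
        self.mean = statistics.mean(pressures)
        self.stdev = statistics.stdev(pressures)

    def owner_score(self, pressures):
        # Crude score: fraction of taps within 2 standard deviations
        # of the owner's mean pressure.
        lo = self.mean - 2 * self.stdev
        hi = self.mean + 2 * self.stdev
        inliers = sum(lo <= p <= hi for p in pressures)
        return inliers / len(pressures)

    def should_reauthenticate(self, pressures):
        # "There's a 95% chance this isn't Jay" -> force re-auth.
        p_impostor = 1.0 - self.owner_score(pressures)
        return p_impostor >= self.threshold
```

In this sketch the check runs continuously in the background, so the device never relies on the one-time login Jay describes as the weakness.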
Because of GPS, I know where you are when you search for restaurants, so I can give you good recommendations right around where you are. Right. Emotional state is another thing: if I can find out more about you, then I can do a better job for you. Say you're playing a game and I sense you're bored; I can ratchet up the challenge level of the game. Or at work, I sense you're really concentrating on something hard; well, phone calls should go straight to voicemail, I don't want to disturb you. Or on a machine floor, on an assembly line, or when you're driving a car, if I can sense that you're not completely in the present, I really need to get you out of there. Or a pilot. So there are some real-world applications. We have technology, some of it based on camera and some of it based on speech, that can actually recognize your emotional state: you're surprised, you're angry, you're sad, and all of those kinds of things. These are demos we can give you if you're visiting us. So that's an example of another project that we're working on.

So we've got things in security, and we've got things in big data. Particularly because Dell has a huge healthcare practice, we're looking at a lot of big data applied to the healthcare industry: images, image analytics. Can I tell automatically that you've had an aneurysm? Not necessarily replacing the radiologist, but supporting the radiologist in doing that. Or reducing the rate of readmission to a hospital, which can save the hospital a lot of money. So all of these kinds of things in the big data field, applied in the healthcare vertical, are some examples of things we're working on.

In the short time, I would love to go on and on, but we've got to go to Austin, because we're getting the hook here, and you're hungry and I'm hungry, so that's good. But thanks for sharing that, Jay. That's amazing. I mean, I would just love to dive into the way that that algorithm is set up.
How much of it is by me? How much of it is by what you set up? How does it learn? And how do I course-correct it along the way, so I'm not constantly having to re-authenticate because I broke my finger and I'm not tapping the same way, I'm doing it left-handed? So thanks a lot for stopping by.

All right. I'm Jeff Frick. We're at the Dell 1510 discussion series on big data at the Clift Hotel in San Francisco, and you're watching theCUBE.