Hi, Jeff Frick here with theCUBE. We are on the ground in Santa Clara, California at the MIT Chief Data Officer Summit, the second annual West Coast edition. We covered the East Coast edition back in Cambridge, and we wanted to come out to the West Coast edition and see what's going on. We watched a great movie about the human face of big data and a terrific panel discussion, and they've got a full-day program tomorrow. So we were able to grab a few minutes with some of the esteemed guests, and we're excited for this segment to be joined by Gartner analyst Joe Bugajski. Welcome.

Thank you very much. I appreciate the opportunity.

Absolutely. Interesting times. And I think you brought up a really interesting point. The movie made everything look like it's happening now, with beautiful graphs and visualizations, and we're saving lives and making things great. But really, it's still early days.

Oh, it's very early days. But the good news is that there's a lot that can be done, and I think that came across loud and clear in the film. What we don't see is just how much more can be done. They talk about it in there, which is great, because there is just enormous opportunity ahead of us, actually beyond anything we've seen so far in the internet revolution.

Yeah, I always go back to Amara's Law. We're right down the street from Intel and Gordon Moore, and we're sitting in an Intel office, and everyone knows Moore's Law, but Amara's Law just doesn't get the credit it deserves: we totally overestimate the short-term impacts and we way underestimate the long-term impacts. And I think this is going to be a classic textbook example; we're in for significant, huge changes.

Yeah, I think the changes will not just be in one area either. They won't just be in business. They're going to be in government and education and throughout society.
There will be good things and there will be bad things, because every major technological revolution, which this one is close to being, though at this point it's still evolutionary, is preceded by some great ideas, some new things that we find to get the excitement going and the investment moving. And when we do that, we have the opportunity to explore and expand. The excitement builds on itself and grows some more. And we're just at the very beginning of that.

And it's funny, because people are very excited and things are moving, but the technology is always outpacing the governance. It's outpacing the regulation. It's outpacing the law. And a really good point that you brought up is the ethics. Someone once said that big data done well is magic; big data done badly is kind of creepy. So talk a little bit about the ethics and some of these ethical dilemmas that we really have to face. When we're all carrying these mobile devices, whether we like it or not, everybody knows exactly where we are all the time, how fast we're moving, whether we're running through stop signs, whether we're jogging every day. Everybody knows, because it's all in our mobile phones.

That's very true. What we don't see, though, is how that data can be used and under what conditions it can be used. And again, there's a good side and a bad side. As you said, one of my analysts, Frank Buytendijk, started the research at Gartner on this with digital ethics, and that's the creepy line. Once you go over that creepy line, all of a sudden you're talking about something that most established organizations would be unlikely to do. However, that doesn't mean there aren't some new explorers out there who take a shot at it and see what happens. And I don't mean nefarious, I don't mean that, but things that some people might find slightly distasteful, or even a lot distasteful, and other people will find exciting. And out of that come new ideas.
And so where should those boundaries be drawn? That has to be done as a society. That has to be done as a culture. That has to be done in our businesses, in our homes. We have to make those decisions. We have to set the parental controls, if you will. That's one of the Chief Data Officer's primary missions: to understand what that data does, what it can do, and how it can contribute to both the well-being and the harm of the organization.

Just to give an example, there was one that we investigated that Amazon has been gracious enough to allow us to talk about. Searches were being done for different kinds of paraphernalia that could actually be used to make methamphetamine, meth. If you wanted to build a meth lab, you could be looking for the same kinds of things. That was certainly not their intent. However, word gets out that, oh my goodness, look at what's going on. Where does that line get drawn? I have no idea, and it's very, very difficult, because that's an individual-use situation. Now, if they detect it, they stop it, but that doesn't mean the algorithm is inherently evil. It's how we use it, how we choose to understand the information, how we choose to govern ourselves, and to both limit and explore.

And it's interesting on the algorithm side, because people write algorithms, but you don't necessarily know what the output of that algorithm is going to be, and that's a perfect case. You're trying to put together a bucket of goods that you see are often purchased together, and maybe you make suggestions: if you buy one, you probably need these other three things, and maybe I'll even give you a deal if you buy them all today. But that algorithm wasn't built to help people build a meth lab. So even if you're trying to be proactively conscious of the algorithms that you're building, you just can't know all the answers or where it's going to go.
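The bucket-of-goods logic described above is essentially frequent-pair mining over purchase baskets. A minimal sketch in Python, with the basket data and function names purely hypothetical for illustration:

```python
from collections import Counter
from itertools import combinations

# Hypothetical transaction data: each basket is a set of purchased items.
baskets = [
    {"cold packs", "drain cleaner", "batteries"},
    {"cold packs", "drain cleaner", "matches"},
    {"batteries", "matches"},
    {"cold packs", "drain cleaner"},
]

def frequent_pairs(baskets, min_support=2):
    """Count how often each pair of items is bought together and
    keep the pairs seen at least `min_support` times."""
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

def suggest(item, pairs):
    """Suggest items frequently co-purchased with `item`,
    most frequent first."""
    out = []
    for (a, b), n in sorted(pairs.items(), key=lambda kv: -kv[1]):
        if item == a:
            out.append(b)
        elif item == b:
            out.append(a)
    return out

pairs = frequent_pairs(baskets)
print(suggest("cold packs", pairs))  # → ['drain cleaner']
```

The point the example makes concrete: the algorithm only sees co-occurrence counts. It has no notion of what the items could be combined to make, which is why unintended suggestions can emerge from perfectly reasonable code.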
No, you can't. But what you can do is react, to understand what happens, and then, yes, you'll have to put some governors on some of these things and say, well, this is not the kind of business we're in, so we're not going to do that. But we will understand that only when it's brought to our attention. This is where sharing, the information-sharing economy, comes in and helps us. So just the opposite is true, right? These things happen; allow us to see that they're happening and allow us to make a good decision. But we won't know to make the decision if it never comes forward. If we suppress it, if we try to hide it, if we sweep it under the rug, that just isn't the right thing to do. We have to debate it in open society and open forums. We have to debate it like this, at the CDO forum.

And it was interesting; in the movie they had a great example of metadata, of really analyzing the searches. It was a CDC example of using Google searches for flu symptoms to predict where the flu was going before the CDC would actually get data from doctors, and to be two weeks ahead of the game. But there are two other levels of ethics. Say we take the Amazon example a little further: we decide, you know, we look at the algorithm, we make the adjustment. But then there's the government, which, potentially with or without permission, is grabbing that data in ways we maybe never really thought about, or a third-party hacker who jumps in, and we hear about hacking attacks all the time. We never necessarily planned for that data to go into those hands, but those are two other levels of people getting access to data in ways that were never intended. Let's face it, the data is there, and it almost feels like you should plan for an attack, and think more about how you respond and how you clean up the mess, rather than really thinking you can build a castle around this stuff.

Oh, very true.
You're not going to build a castle around your data. However, we should also separate out the cases where we're under attack, where truly unethical, immoral, and illegal activities are happening. That kind of attack we need to protect against, and we need to build the algorithms we know how to build to detect the onset of such a thing and prevent it. It was not that long ago that a denial-of-service attack would be a problem for organizations. Most organizations can now stop it; they've got the tools, the technologies, the infrastructure to do it. But what they don't have the infrastructure, tools, and technology to do, and one of the things that was an insight to me as I watched the movie, is detect behavior that's spoofed. Imagine, if you will: once we learn what typical behavior looks like, we can spoof that behavior. I've seen that in my career before, when we're doing analysis of data. I've seen spoofing attacks that are so close to the real behavior of good people that they're very, very difficult to detect.

Those are going to be the challenges that the current generation has to wrap its mind around and try to address, because we must protect ourselves from those attacks. There are nefarious souls out there. They will try to attack us. They will try to take things from us that don't belong to them and use them for ill-gotten gain. Whatever their purpose or methodology is really doesn't matter; what we have to do is protect ourselves. When we talk about ethics, remember, we're talking about the decisions we make about something that is on that border of: is this right for our business? Is this right for our government? Is this right for our citizens? We need to make those decisions as informed citizens. We need to make those decisions as informed executives.

And do the best we can, right?
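The detection gap described above can be made concrete with a toy baseline detector: a crude flood trips a simple statistical threshold, while traffic deliberately spoofed to sit inside the normal band does not. A minimal sketch, with all numbers hypothetical:

```python
import statistics

# Hypothetical baseline: requests per minute observed during normal operation.
baseline = [98, 102, 101, 99, 100, 103, 97, 100]

mean = statistics.mean(baseline)    # 100.0
stdev = statistics.stdev(baseline)  # 2.0

def is_anomalous(rate, k=3.0):
    """Flag a request rate more than k standard deviations
    from the baseline mean."""
    return abs(rate - mean) > k * stdev

# A crude flood stands out immediately...
print(is_anomalous(5000))  # → True
# ...but traffic spoofed to stay inside the normal band slips through.
print(is_anomalous(104))   # → False
```

The sketch illustrates the limitation, not a real defense: any detector built from observed "normal" behavior defines a band that a careful attacker can deliberately stay inside.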
With the information that we have. And nobody's perfect, no amount of information is complete, so we just gotta do what we gotta do.

And we'll never anticipate all the possible outcomes, like the one we talked about: completely reasonable behavior on the part of the algorithm and the people using it, et cetera, and then somebody uses it for ill-gotten gain. We have to understand it, we have to decide what to do about it, and we have to share those insights.

Thanks, Joe, very much for spending a few minutes. I know you've got a busy day ahead tomorrow, and it's been a long day today. So thanks for stopping by.

You're quite welcome. Thank you so much.

I'm Jeff Frick with theCUBE. We're at the MIT Chief Data Officer Summit, the second one on the West Coast. You're watching theCUBE. Thanks for watching.