Live from New York, it's theCUBE, covering theCUBE NYC 2018. Brought to you by SiliconANGLE Media and its ecosystem partners.

Hello everyone, welcome back to CUBE NYC. This is theCUBE's special presentation of something that we've done now for the past couple of years. IBM has sponsored an influencer panel on some of the hottest topics in the industry, and of course there's no hotter topic right now than AI. We've got nine of the top influencers in the AI space, we're in Hell's Kitchen, and it's gonna get hot in here, and these guys are gonna, we're gonna cover the gamut. So first of all folks, thanks so much for joining us today. Really, as John said earlier, we love the collaboration with you all, and we'll definitely see you on social after the fact. I'm Dave Vellante with my co-host for this session, Peter Burris, and again thank you to IBM for sponsoring this and organizing this. IBM has a big event down here in conjunction with Strata called Change the Game: Winning with AI. We run theCUBE NYC; we've been here all week. So here's the format. I'm gonna kick it off and then we'll see where it goes. I'm gonna introduce each of the panelists and then ask you guys to answer a question. Well, sorry, first tell us a little bit about yourself briefly and then answer one of the following questions. Two big themes have come up this week, because this is our ninth year covering what used to be Hadoop World, which has morphed into big data. The first question is: AI and big data, same wine, new bottle? Or is it really substantive and driving real business value? So that's one question to ponder. The other one is, you've heard the phrase "data is the new oil." Is data really the new oil? I wonder what you think about that. Okay, so Chris Penn, let's start with you. Chris is co-founder of Trust Insights, longtime CUBE alum and friend. Thanks for coming on. Tell us a little bit about yourself and then pick one of those questions.
Sure, we're a data science consulting firm, and we're an IBM business partner. When it comes to data as the new oil, I love that expression because it's completely accurate. Crude oil is useless; you have to extract it out of the ground, refine it, and then bring it to distribution. Data is the same way: you have to have developers and data architects get the data out, you need data scientists and tools like Watson Studio to refine it, and then you need to put it into production, and that's where marketing technologists, business analysts, and tools like Watson Machine Learning help bring the data to production and make it useful.

Okay, great, thank you. All right, Tony Flath is a tech and media consultant, focused on cloud and cybersecurity. Welcome. Thank you. Tell us a little bit about yourself and your thoughts on one of those questions.

Sure thing, well thanks so much for having us on the show, really appreciate it. My background is in cloud, cybersecurity, and certainly in emerging tech with artificial intelligence. I've certainly touched it from a cybersecurity play: how you can use machine learning and machine control for better controlling security across the gamut. But I'll touch on your question about wine. Is it a new bottle, new wine? Where does this come from with artificial intelligence? I really see it as a whole new wine that is coming along. When you look at emerging technology and all the deep learning that's happening, it's going beyond just being able to machine-learn and know what's happening; it's making meaning from that data, and things are being done with that data, from robotics, from automation, from all kinds of different things, where we're at a point in society where technology is getting beyond us. Prior to this, it's always been command and control; you control data from a keyboard. Well, this is passing us. So my passion and perspective on this is the humanization of it, of IT.
How do you ensure that people are in that process, right?

Excellent, and we're gonna come back and talk about that a lot. Carla Gentry, @data_nerd on Twitter, great to see you live as opposed to just in the ether on Twitter. Data scientist and owner of Analytical Solution. Welcome, your thoughts.

Well, thank you for having us. Mine is data, the new oil, and I'd like to rephrase that as: data equals human lives. With all the artificial intelligence and everything that's going on, and all the algorithms and models that are being created, we have to think about things being biased, being fair, and understand that this data has impacts on people's lives.

Great. Steve Ardire, my paisan. Paisan. AI startup advisor. Welcome, thanks for coming to theCUBE.

Thanks, Dave. So my first career was geology, and I've said data is the new oil, but AI is the refinery. I've used that many times before. In fact, I've moved from just AI to augmented intelligence. Augmented intelligence is really the new way forward. This was a presentation I gave at IBM Think last spring; it has almost 100,000 impressions right now. And the fundamental reason why is machines can attend to vastly more information than humans, but you still need humans in the loop. We can talk about what they're bringing in terms of common-sense reasoning, because big data does the who, what, when, and where, but not the why, and why is really the holy grail for causal analysis and reasoning.

Excellent. Bob Hayes, Business Over Broadway. Welcome, great to see you again.

Thanks for having me. So my background is in psychology, industrial psychology, and I'm interested in things like customer experience, data science, machine learning, and so forth. I'll answer the question around big data versus AI. I think there are other terms we could talk about: big data, data science, machine learning, AI. And to me, it's kind of all the same. It's always been about analytics and getting value from your data.
Big, small, what have you. And there are subtle differences among those terms. Machine learning is just about making a prediction and knowing if things are classified correctly. Data science is more about understanding why things work, and understanding maybe the ethics behind it, what variables are predicting that outcome. But still, it's all the same thing: it's all about using data in a way that we can get value from it as a society, in the right instance.

Excellent, thank you. Theo Lau, founder of Unconventional Ventures. What's your story?

Yes, so my background is driving technology innovation. Together with my partner, we work with organizations to help them leverage technology to drive systemic financial wellness. We connect startup founders with funders; we help them get money in the ecosystem. We also work with them to look at how we can leverage emerging technology to do something good for society, so very much on point to what Bob was saying. When I look at AI, AI is not new, right? It's been around for quite a while. But what's different is the amount of technological power that we have, which allows us to do so much more than we were able to do before. And so my mantra is: great ideas can come from anywhere in society, but it's our job to leverage technology to shine a spotlight on people who can use it to do something different, to help seniors in our country do better in their financial planning.

Okay, so in your mind, it's not just the same wine in a new bottle; it's more substantive than that. More substantive, and a much better bottle.

Karen Lopez, Senior Project Manager and Architect at InfoAdvisors. Welcome. Thank you.
So I'm @datachick on Twitter, and that kind of tells you my focus. I also call myself a data evangelist, and that means I'm there at organizations helping stand up for the data, because to me that's the proxy for standing up for the people and the places and the events that that data describes. That means I have a focus on security, data privacy, and protection as well. And I'm gonna kind of combine your two questions, the wine and the oil. Oh, see, now I'm talking about alcohol. But anyway, all analogies are imperfect, so whether we say it's new wine or the same wine, or whether it's oil, the analogy is good for both of them. But unlike oil, the amount of data is just growing like crazy, and with oil we know we'll hit a peak at some point. I kind of doubt that we're gonna hit peak data, where we don't have enough data, like we're gonna do with oil. So that says to me: how did we get here with big data, with machine learning and AI? From my point of view, as someone who's been focused on data for 35 years, we have hit this perfect storm of open-source technologies, cloud architectures and cloud services, and data innovation, and if we didn't have those, we wouldn't be talking about large-scale machine learning and deep learning type things. Because we have all these things coming together at the same time, we now have explosions of data, which means we also have to protect it, and protect people from harm done with data. We need to do data for good things, and all of that.

Great, definite differences. We're not running out of data. Data is like the terrible tribbles. Yes. But it's very cuddly data. Yeah, cuddly data.

Mark Lynd, founder of Relevant Track. That's right. I like the name, what's your story?

Well, thank you. It actually plays into what my interest is, which is mainly around AI in enterprise operations and cybersecurity.
These teams in enterprise operations, whether it's sales, marketing, all the way through the organization, as well as cybersecurity, are often under-resourced, and, playing off of what Steve pointed out, they need augmented intelligence. They need to take AI, the big data, all the information they have, and make use of it in a way where, even though they're under-resourced, they can create some value for the organization: better use of the resources they have to grow and support the strategic goals of the organization. Oftentimes, when you get to budgeting, it doesn't really align. You're short people, you're short time, but the data continues to grow, as Karen pointed out. So when you take those together, using AI to provide augmented intelligence, to help them get through that data and make real, tangible decisions based on information versus just raw data, especially around cybersecurity, which is a big hit right now, is really a great place to be, and there's a lot of exciting stuff going on in that area.

Great, thank you. Kevin L. Jackson, author and founder of GovCloud. GovCloud, that's big. Yeah, GovCloud Network.

Thank you very much for having me on the show. I've been working on cloud computing, initially in the federal government with the intelligence community, as they adopted cloud computing for a lot of the nation's major missions. What has happened is that now I'm working a lot with commercial organizations and with the security of that data. And on your questions, I'm going to sort of piggyback on Karen. There was a time when you would get a couple of bottles of wine, and they would come in, and you would savor that wine, sip it, and it would take a few days to get through it, and you would enjoy it. The problem now is that you don't get a couple of bottles of wine into your house; you get two or three tankers of data. So it's not that it's new wine; you're just getting a lot of it.
And the infrastructures that you need: before, you could have a couple of computers and a couple of people; now you need cloud, you need automated infrastructures, you need huge capabilities. And artificial intelligence is what we can use as the tool on top of these huge infrastructures to drink that, you know. Firehose of wine. Firehose of wine. Everybody's having a great time. Everybody's having a great time. Things are booming right now.

Excellent, well thank you all for those intros. Peter, I want to ask you a question. I heard there are some similarities and some definite differences. With regard to data being the new oil, you have a perspective on this, and I wonder if you could inject it into the conversation.

Sure. So the perspective that we take in a lot of conversations with a lot of folks here on theCUBE, what we've learned, and I'll kind of answer both questions a little bit. First, on the question of data as the new oil: we definitely think that data is the new asset that business is going to be built on. In fact, our perspective is that there really is a difference between business and digital business, and that difference is data as an asset. If you want to understand digital transformation, you understand the degree to which a business is re-institutionalizing work, reorganizing its people, and re-establishing its mission around what you can do with data as an asset. The difference between data and oil, though, is that oil still follows the economics of scarcity. Data is one of those things you can copy, you can share, you can easily corrupt, you can mess up; you can do all kinds of awful things with it if you're not careful. And it's that core, fundamental proposition that shapes how we think about cybersecurity: in many respects, cybersecurity is the approach to privatizing data so that we can predict who's actually going to be able to appropriate the returns on it.
So it's a good analogy, but as you said, it's not entirely perfect, and it's imperfect in a really fundamental way: it's not following the laws of scarcity, and that is enormous. In other words, I could put oil in my car or I could put oil in my house, but I can't put the same oil in both. Yeah, you can't put it in both places.

And on the issue of the wine, we think that in fact there is a new wine, and the very simple abstraction or generalization we come up with is the issue of agency. Analytics has historically not taken on agency; it hasn't acted on behalf of the brand. AI is going to act on behalf of the brand. Now, you're going to need both of them; you can't separate them.

A lot of implications there in terms of bias. Absolutely. In terms of privacy. You have a thought here, Chris?

Well, the scarcity is our compute power and our ability to process. I mean, it's the same as oil. There's a ton of oil under the ground, right? We can't get to it as efficiently, or without severe environmental consequences. Yeah, and when you use it, it's transformed. But our scarcity is compute power and our ability to use it intelligently.

But even when you find it: I have data, I can apply it to six different applications; I have oil, I can apply it to one. And that's going to matter when we think about work.

One thing I'd like to add, on data as an asset: the issue we're having right now is that we're trying to learn how to manage that asset. Artificial intelligence is a way of managing that asset, and that's important if you're going to use and leverage big data.

Yeah, but see, everybody's talking about the quantity, the quantity. It's not always the quantity. We can have just oodles and oodles of data, but if it's not clean data, if it's not alphanumeric data, which is what's needed for machine learning, it doesn't help. So having lots of data is great, but you have to think about the signal versus the noise.
So sometimes you get so much data, you're looking at overfitting. Sometimes you get so much data, you're looking at biases within the data. So it's not the amount of data; it's, now that we have all of this data, making sure that we look at relevant data and making sure we look at clean data.

One more thought, and then we have a lot to cover. I want to get inside your big data.

I was just thinking about it from a cybersecurity perspective. One of my customers was looking at the data that just comes from the perimeter — your firewalls, routers, all that — not even looking internally. Just the perimeter alone, the amount of data being pulled off of those, and then trying to correlate that data so it makes some type of business sense, or so they can determine if there are incidents that have happened or may happen and take a predictive action, or threats that might be there because they haven't taken a certain action prior, is overwhelming to them. So having AI now to go through the logs — and there are so many different types of data that come through those logs — being able to pull that information, as well as looking at endpoints and all that, and people's houses, which are an extension of the network oftentimes, it's an amazing amount of data, and they're only looking at a small portion today, because there aren't enough resources, there aren't enough trained people to do all that work. So AI is a wonderful way of doing that, and some of the tools now are starting to mature and be sophisticated enough that they provide that augmented intelligence Steve talked about earlier.

So it's complicated. There's infrastructure, there's security, there's a lot of software, there's skills, and on and on. At IBM Think this year, Ginni Rometty talked about a couple of themes. One was augmented intelligence; that was something that was clear. She also talked a lot about privacy, and you own your own data, et cetera.
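Carla's point about overfitting can be made concrete in a few lines. This is a toy sketch, not anything the panel ran, with invented numbers throughout: a simple model that matches the underlying process ends up closer to the true signal than a wildly flexible one that memorizes the noise.

```python
import numpy as np

rng = np.random.default_rng(42)

# A noisy linear signal: more data doesn't help a model that is
# flexible enough to chase the noise instead of the signal.
x_train = np.linspace(0, 1, 12)
y_train = 2 * x_train + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0, 1, 200)
y_true = 2 * x_test  # the clean underlying signal

def rmse_vs_signal(degree):
    # Least-squares polynomial fit of the given degree to the noisy data.
    coeffs = np.polyfit(x_train, y_train, degree)
    preds = np.polyval(coeffs, x_test)
    return float(np.sqrt(np.mean((preds - y_true) ** 2)))

simple_err = rmse_vs_signal(1)    # matches the true process: small error
overfit_err = rmse_vs_signal(11)  # memorizes the noise: larger error
print(simple_err, overfit_err)
```

The high-degree fit passes through every noisy training point, so its error against the noise-free signal is worse, which is the signal-versus-noise trade-off in miniature.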
One of the things that struck me was her discussion about incumbent disruptors. If you look at the top five companies in terms of market cap in the US — roughly; Facebook, with fake news, has dropped down — they're data companies, right? Apple just hit a trillion; Amazon, Google, et cetera. How do those incumbents close the gap? Is that concept of incumbent disruptors actually something that is being put into practice? I mean, you guys work with a lot of practitioners. How are they gonna close that gap with the data haves — meaning companies with data at the core of their business — versus the data have-nots? It's not that they don't have a lot of data, but it's in silos and it's hard to get to.

Yeah, I've got one more thing. So these companies, and whoever's gonna be big next: you have a digital persona, whether you want it or not. If you live on a farm out in the middle of Oklahoma, you still have a digital persona. People are collecting data on you, they're building profiles of you, and the big companies know about you, and people that first interact with you are gonna know that you have this digital persona. Personal AI — when AI from these companies can be used simply and easily, from a personal standpoint, to fill in those gaps and to have a digital persona that supports your family and your growth, both personal and professional, and those types of things. There are a lot of applications for AI at a personal, enterprise, even small-business level that have not been done yet, but the data is being collected now. So you talk about the oil: the oil is being pooled right now. Lots and lots and lots of it. It's the applications that use it and turn it into something personally, professionally, educationally powerful — that's what's missing. But it's coming.

So I'll add to that, in answer to your question too, right? One example we always use in banking is if you look at the big banks, right?
And then you look at it from a consumer perspective, and there's a lot of talk about Amazon being a bank. But the thing is, Amazon doesn't need to be a bank; they provide banking services. From a consumer perspective, they don't really care if you're a bank or you're not a bank. But what's different between Amazon and some of the banks is that Amazon, like you say, has a lot of data, and they know how to make use of the data to offer something relevant that consumers want. Whereas banks have a lot of data, but it's all silos, right? So it's not just a matter of whether or not you have the data; it's also whether you can actually access it and make something useful out of it, so that you can create something that consumers want. Because otherwise you're just a pipe.

Totally agreed. There are a lot of terms out there; digital transformation is thrown around so much, right? You go to cloud, you migrate to cloud, and you're going to take everything over. But really, when you look at it, and you both touched on it, it's the economics. You have to look at the data from an economics perspective: how do you make this data meaningful to your customers in a way that's going to work effectively for them, that they're going to drive value from? So when you look at the big cloud providers, I think the push that's going to happen in the next few years is a bigger migration to public cloud. Then they have to differentiate themselves from each other. The obvious way is artificial intelligence, in a way that makes it easy to aggregate data across platforms, to aggregate data from multi-cloud effectively, and to use that data in a meaningful way that's going to drive not only better decisions and better outcomes for your business, but opportunities for customers, and opportunities for employees and how they work.
We're at a really interesting point in technology where we get to tell technology what to do, and it's going beyond us. It's no longer just what we're telling it to do; it's going to go beyond us. So how we effectively manage that is going to be where we see that data flow, and where those big five or big four really take it to the next level.

Now, one of the other things that Ginni Rometty said — I forget the exact stat — was that something like 80% of the data is not searchable, kind of implying that it's sitting somewhere behind a firewall, presumably on somebody's premises. So it was kind of interesting. You were talking about a lot of momentum for public cloud, but at the same time, a lot of data is going to stay where it is.

Yeah, we're assuming that a lot of this data is just sitting there, available and ready, but look at the disparate database situation, where you have 29 databases, and two of them have unique identifiers that tie together, and the rest of them don't. There's nothing you can do with that data. So artificial intelligence is just that: it's artificial intelligence. That's machine learning, that's natural language, that's classification. There are a lot of different moving parts, but we also have to have IT, good data infrastructure, master data management, compliance. There are so many moving parts to this that it's not just about the data anymore.

So I want to ask Steve to chime in here. Go ahead.

Yeah, so we also have to change the mentality a bit. It's not just enterprise data; there's data on the web, and the biggest thing is the internet of things. The amount of sensor data will make the current data look like, you know, chump change. So data's moving faster, okay? And this is where the sophistication of machine learning needs to kick in, going from mostly supervised learning today to unsupervised learning.
And as I said, big data and current AI do the who, what, where, when, and how, but not the why. That is really the holy grail to crack, and a new moniker for it is explainable AI, because it moves beyond just correlation to root-cause analysis. Once we have that, then you have the means to tap into augmented intelligence, where humans are working with machines.

Karen, please.

Yeah, so one of the things, to what Carla was saying and what a lot of us have said: I like to think of the advent of ML technologies and AI as helping me, as a data architect, to love my data better, right? That includes protecting it. But also, when you say that 80% of the data is unsearchable, it's not just an access problem. It's that no one knows what it was, what its sovereignty was, what the metadata was, what the quality was, or why there are huge anomalies in it. My favorite story about this is from the 1980s. I forget the exact number, but something like eight million children disappeared out of the US on April 15th. That was when the IRS enacted a rule that in order to claim a deduction for a dependent on your tax return, the dependent had to have a valid Social Security number. And people who had accidentally miscounted their children and over-claimed them over the years stopped doing that. Well, some days it does feel like you have eight children in your house. So when that rule came about, literally millions of children — and they're not all children, because they're dependents — disappeared off the face of the earth in April. But if you're doing analytics or AI and ML, and you don't know that this anomaly happened, I can imagine in a hundred years someone saying some catastrophic event happened in April 1983. And what caused that? Was it healthcare? Was it a meteor? Was it the cloud? It's happening. That's where I was going.
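Karen's IRS story is exactly the kind of level shift you'd want flagged and documented before anyone models the data. A minimal sketch, with invented counts standing in for the real figures: a simple z-score on year-over-year changes surfaces the policy-driven drop so it can be annotated as a rule change rather than a catastrophe.

```python
import numpy as np

# Hypothetical yearly counts (millions of claimed dependents). The sharp
# 1987 drop mimics the rule change in Karen's story -- a policy artifact
# in the data, not a real-world event.
years = np.arange(1980, 1990)
counts = np.array([77.0, 77.3, 77.6, 77.9, 78.1,
                   78.4, 78.6, 71.2, 71.4, 71.6])

# Flag year-over-year changes far outside the typical variation.
diffs = np.diff(counts)
z = (diffs - diffs.mean()) / diffs.std()
anomalies = years[1:][np.abs(z) > 2]
print(anomalies)  # -> [1987]: document this before someone models it
```

Capturing that flag as metadata is the "love my data better" step: the anomaly travels with the dataset to the data scientists and analysts who use it later.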
Right. So those are really important things. I want to use AI and ML to help me not only document and capture that stuff, but to provide that information to the people — the data scientists and the analysts — who are using the data.

Great story. Thank you. Have you got a thought? Get the mic. Go, jump in here.

Yeah, I do have a thought, actually, on what Karen was talking about. I think it's really important not only that we understand AI and machine learning and data science, but that the regular folks in companies understand it at a basic level, because those are the people who will ask the questions, or who know what questions to ask of the data. And if they don't have the tools and the knowledge of how to get access to that data, or even how to pose a question, then that data is going to be less valuable, I think, to companies. And the more that everybody knows about data — even people in Congress; remember when Zuckerberg was asked how Facebook makes money? It's like, we all know this — but we need to educate the masses on just basic data analytics.

We could have an hour-long panel on that. Yeah, absolutely. Peter, you and I were talking about a couple of questions: how far can we take artificial intelligence, and how far should we? That brings in the conversation of ethics and bias. Why don't you pick it up?

Yeah, so one of the crucial things that we're all implying is that at some point in time, AI is going to become a feature of the operations of our homes and our businesses. As these technologies get more powerful and they diffuse, and the knowledge about how to use them diffuses more broadly, and you put more options into the hands of more people, the question slowly starts to turn from "can we do it?" to "should we do it?" And one of the issues I'd introduce is that I think the difference between big data and AI — AI specifically — is the notion of agency. The AI will act on behalf of, perhaps, you, or it'll act on behalf of your business.
And that conversation is not being had today. It's being had in arguments between Elon Musk and Mark Zuckerberg, which pretty quickly get pretty boring. At the end of the day, the real question is: should this machine, whether in concert with others or not, be acting on behalf of me, or on behalf of my business? And when I say on behalf of me, I'm also talking about privacy, because Facebook is acting on behalf of me; it's not just what's going on in my home. So that question of "can it be done?" — a lot of things can be done, and an increasing number of things will be able to be done. We've got to start having a conversation about whether it should be done.

So humans exhibit tribal behavior, they exhibit bias, and the machine's going to pick that up. Go ahead, please.

Yeah, one thing to tag onto the agency of artificial intelligence: every industry, every business is now about identifying information and data sources and their appropriate sinks, and learning how to draw value out of connecting the sources with the sinks. Artificial intelligence enables you to identify those sources and sinks, and when it gets agency, it will be able to make decisions on your behalf about what data is good, what data means, and who it should be delivered to, or what actions are good — and what data was used to make those actions. Absolutely. And was that the right data? And is there bias in the data? And all the way down — turtles all the way down. So the data pedigree will be driven by the agency of artificial intelligence, and this is a big issue.

It's really fundamental to understand and to educate people that there are four fundamental types of bias, right? In machine learning, there's intentional bias: hey, we're gonna make the algorithm generate a certain outcome regardless of what the data says. There's the source of the data itself: historical data that the models are trained on; a model built on flawed data will behave in a flawed way.
There's the target source: for example, we know that if you pull data from a certain social network, that network itself has an inherent bias. No matter how representative you try to make the data, it's still gonna have flaws in it. Or if you pull healthcare data about, for example, African-Americans from the US healthcare system, because of societal biases, that data will always be flawed. And then there's tool bias: there are limitations to what the tools can do, and so we will intentionally exclude some kinds of data, or not use them, because we don't know how to, or our tools are not able to. And if we don't teach people what those biases are, they won't know to look for them.

I know. Yeah, it's like one of the things we were talking about before: artificial intelligence is not gonna just create itself. It's lines of code; it's input, and it spits out output. It learns from these learning sets, and we don't want AI to become another buzzword. We don't want everybody to be an AI guru who has no idea what AI is. It takes months and months for these machines to learn, and these learning sets are so very important, because that input is how this machine learns. Think of it as your child; that's basically the way artificial intelligence is learning, like your child. You're feeding it these learning sets, and then eventually it will make its own decisions. We know, some of us having children, that you teach them the best that you can, but then later on, when they're doing their own thing, they're really like a little myna bird. They've heard everything that you said — not only the things that you said to them directly, but the things that you said indirectly.

Well, there are some very good AI researchers who might disagree with that metaphor exactly.
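At least one of the biases Chris names — flawed source data — admits a mechanical check before training ever starts: is every group even represented in the learning set? A toy sketch, with the dataset, group names, and threshold all invented for illustration:

```python
from collections import Counter

# A toy labeled dataset of (group, outcome) pairs. If one group is badly
# under-represented, any model trained on this inherits the sampling
# bias -- no algorithm can fix data that was never collected.
samples = [("A", 1), ("A", 0), ("A", 1), ("A", 1), ("A", 0),
           ("A", 1), ("A", 0), ("A", 1), ("A", 1), ("B", 0)]

def representation_report(samples, threshold=0.2):
    """Map each group to (share of the data, under-represented flag)."""
    counts = Counter(group for group, _ in samples)
    total = len(samples)
    return {g: (n / total, n / total < threshold) for g, n in counts.items()}

report = representation_report(samples)
print(report)  # group B is only 10% of the data, so it gets flagged
```

This catches only the crudest form of source bias; the intentional, target-source, and tool biases Chris describes need human review, which is his point about teaching people what to look for.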
But having said that, what I think is very interesting about this conversation is this notion of bias. One of the things that fascinates me about where AI goes: are we going to find a situation where tribalism more deeply infects business? Because we know that human beings do not seek out the best information; they seek out information that reinforces their beliefs. And that happens in business today — my line of business versus your line of business, engineering versus sales. That happens today, but it happens at a planning level. When we start talking about AI, we have to put the appropriate dampers in and understand the biases, so that we don't end up with deep tribalism inside of business, because AI could have the deleterious effect that it actually starts ripping businesses apart.

Well, input is data, and then the output could be a lot of things. Could be a lot of things. And that's where I said data equals human lives. If we look at the case in New York, where the penal system was using artificial intelligence to make choices about people being released from prison, they saw that it was a miserable failure, because some people that were released actually reoffended — some committed murder and other things. So it's more than what anybody really thinks. It's not just, oh well, we train the machines, and a couple of weeks later they're good, and we never have to touch them again. These things have to be continuously tweaked. Just because you've built an algorithm or a model doesn't mean you're done; you've got to go back later and continue to tweak these models.

Mark, you've got the mic.

Yeah, no. We've talked a lot about the data that's collected, but what about the data that's not collected? Incomplete profiles, incomplete data sets — that's a form of bias, and sometimes that's the worst, because the models will fill that in, right? And then you can get some bias. But there's also a real issue around that for cybersecurity.
Logs are not always complete, things aren't always captured, and people make assumptions based on what they've collected, not on what they didn't collect. So when they're looking at this and they're using the AI on it, that's only on the data that was collected, not on the data that wasn't. So if something is down for a little while and no data is collected off it, the assumption is, well, it was down, or it was impacted, or there was a breach, or whatever. It could be any of those. There's still this need for humans to look at the data and realize that there is bias in there. We're just looking at what data was collected, and you're gonna have to form your own thoughts and assumptions about how to actually use that data before you go make decisions that can impact lots of people at a human level, and enterprises, profitability, things like that. And too often people treat whatever the AI produces as the final word. Well, it's not the final word. Last question about this. Please. Does that mean that we shouldn't act? It does not. Okay. It does not. So where's the fine line? Yeah, I think, going back to this notion of can we do it or should we do it: I think you should do it, but you should use it for what it is. It's augmenting; it's helping you, assisting you to make a valued or a good decision, and hopefully it's a better decision than you would have made without it. I think that's great. And your answer is right too: you have to iterate faster and faster and discover sources of information, or sources of data, that you're not currently using, and that's why this thing starts getting really important. Right, and I think you touch on a really good point about should you or shouldn't you.
You look at Google, and you look at the data that they've been using, and some of what's out there from a digital twin perspective is not approved or not authorized, and even once they've made changes it's still floating around out there. How do you know where it is? So there's this dilemma: how do you have a digital twin that you want to have, that's gonna work for you and do these mundane tasks to make your life easier, but how do you also control it so it doesn't do things you don't want it to do? Ad-based business models are inherently evil. Well, there are incentives to appropriate our data, and maybe things like blockchain are potentially gonna give users the ability to control their data. We'll see. Well, and I think... No, I'm sorry, but that's actually a really important point. The idea of consensus algorithms, whether it's blockchain or not (blockchain includes game theory along those lines), whether it's Byzantine fault tolerance or whether it's Paxos, consensus-based algorithms are gonna be really, really important parts of this conversation, because data's gonna be more distributed and you're gonna have more elements participating in it. Especially in the machine-to-machine world, which is a lot of what we're talking about right here, you may not need blockchain, because there's no need for the sense of incentive that blockchain can help provide. And there's no middleman. What I think makes blockchain so powerful is that it liberates new classes of applications, but for a lot of the stuff that we're talking about, you can use a very powerful consensus algorithm without that game-theory side and do some really amazing things at scale. So blockchain, that's a great thing to bring up, right?
I think what's inherently wrong with the way we do things today, and with the whole overall design of technology, whether it be on-prem or off-prem, is that both the lock and the key are behind the same wall, whether that wall is in a cloud or behind a firewall. So when there is an audit, or when there is a forensics investigation, it always comes down to a sysadmin or something else, and the system administrator will have the finger pointed at them, because it all resides in one place: you can edit it, you can augment it, or you can do things with it that you can't really trace. Now take, as an example, blockchain, where you've really got a source of truth. Now you can have the lock in one place and the key in another place. So it's certainly gonna be interesting to see how that unfolds. So, it's good that we've hit a lot of buzzwords right now, right? AI, ML, blockchain. We got the buzzword bingo, yeah, yeah. So one of the things, and you also brought up ethics and everything, that I've noticed over the last year or so is that as I attend briefings or demos, everyone is now claiming that their product is AI- or ML- or blockchain-enabled, and when you try to get answers to the questions, what you really find out is that some things are being pushed as AI because they have if-then statements somewhere in their code, and therefore that's artificial intelligence or machine learning. At least they're not gotos. Yeah, yeah. You've seen that as well. So this is part of what you try to do as a practitioner, as an analyst, as an influencer: cut through the hype of it all, right? And recently I attended one where they said they use blockchain, and I couldn't figure it out, and it turns out they use GUIDs to identify things, and that's not blockchain, it's an identifier.
So one of the ethics things that I think we as an enterprise community have to deal with is the over-promising of AI and ML and deep learning and recognition. I don't really consider it a visual recognition service if it just looks for red pixels; that's not quite the same thing. Yet this is also making things much harder for your average CIO, or worse, CFO, to understand whether they're getting any value from these technologies. Old bottle. And I wonder about the data companies you talked about, the top five. I'm more concerned about their nearly or actually one-trillion-dollar valuations having an impact on the ability of other companies to disrupt or enter the field, more so than their data technologies. Again, we're coming to another perfect storm of companies that have data as their asset, even though it's still not on their financial statements (which is another indicator of whether it's really an asset). Do we need to think, in terms of AI, about whose hands it's in? Once one large trillion-dollar company decides that you are not a profitable company, how many other companies are gonna buy that data and make that decision about you? Well, and for the first time in business history, I think this is true: because of digital, because it's data, you're seeing tech companies traverse industries, getting into content or music or publishing or groceries, and that's powerful and that's awfully scary. If you're a manager, one of the things your ownership is asking you to do is to reduce asset specificities so that their capital can be applied to more productive uses. Data reduces asset specificities. It brings into question the whole notion of vertical industry. You're absolutely right. One question I've got for you, playing off of this: again, it goes back to the notion of can we do it and should we do it.
I find it interesting that if you look at those top five, they're all data companies, but they have very different business models; you can classify them into two different business models. Apple is transactional. Microsoft is transactional. Google is ad-based. Facebook is ad-based, before the big news stuff. Amazon's kind of playing on both sides. They're all on a collision course, though. But that's what's gonna be interesting. I think at some point in time, on the can-we-do-it, should-we-do-it question, brands are going to be identified by whether or not they have gone through that process of thinking about should we do it, and saying no. Apple, for example, is clearly incorporating that into the brand. Well, Silicon Valley, broadly defined (I include Seattle, and maybe Armonk not so much, IBM), has a dual disruption agenda. They've always disrupted horizontal tech. Now they're disrupting vertical industries. That's quite amazing. I was actually just going to pick up on what she was talking about. We're talking about buzzwords, right? So one word I haven't heard yet is voice. Voice is another big buzzword right now, when you couple that with IoT and AI. There you go, bingo. Do I get three points? Exactly: voice recognition, voice technology. So think about all of the smart speakers. In the world there are 7,000 languages being spoken, but yet if you look at Google Home, you look at Siri, you look at any of these devices, they would have a lot of trouble understanding my accent in the evening when my British accent creeps out, or they'll have trouble understanding seniors, because the way they talk is very different than a typical 25-year-old person living in Silicon Valley, right? So how do we solve that, especially going forward? Voice technology is going to be so much more prominent in our homes.
We're going to have it in the car, we'll have it in the kitchen; it does everything, it listens to everything that we are talking about or not talking about, and records it. And to your point, is it going to start making decisions on our behalf? But then my question is, how much does it actually understand us? Just one short story: Siri can't translate a word that I ask it to translate into French, because my phone is set to Canadian English and that's not supported. So I live in a bilingual French-English country, and it can't translate. What this is really bringing up is, if you look at society and culture, what's legal and what's ethical changes across the years. What was right 200 years ago is not right now. What was right 50 years ago is not right now. And it changes across countries, it changes across regions. So what does this mean when our AI has agency? How do we make ethical AI if we don't even know how to manage the change of what's right and what's wrong in human society? That's one of the most important questions we have to worry about, right? Absolutely. But one more thing before we go on: it also says that economies of scale in the cloud are going to be strongly impacted, not just by how big you can build your data centers, but by some of those regulatory issues that are going to strongly influence what constitutes good experience, good law, good on my behalf. And one thing that's underappreciated in the marketplace right now is the impact of data sovereignty. Getting back to data, countries are now recognizing the importance of managing that data, and they're implementing data sovereignty rules. Everyone talks about California issuing a new law that's aligned with GDPR, and you know what that meant. There are 30 other states in the United States alone that are modifying their laws to address this issue. Line it up, Steve.
So one comment: we've got a number of years, no matter what Ray Kurzweil says, until we get to artificial general intelligence. So the singularity is not so near. Do you know that he's changed the date over the last 10 years? I didn't know that. Quite a bit. And I won't even prognosticate where it's going to be, but where we're at right now, and what I keep coming back to, is that augmented intelligence is really going to be the new rage: humans working with machines. One of the hot topics, and the reason I chose to speak about it, is the future of work. I don't care if you're a millennial, mid-career, or a baby boomer: people are paranoid. As machines get smarter, if your job is routine cognitive, yes, you have a higher propensity to be automated. So this really shifts a number of things. First, you have to be a lifelong learner; you've got to learn new skill sets, and the dynamics are changing fast. Now, this is also a great equalizer for emerging startups and even SMBs. As the AI improves, they can become more nimble. So back to your point regarding the colossal trillion-dollar companies: wait a second, there's going to be quite a sea change. And regarding demographics, in 2020 millennials take over as the majority of the workforce, and by 2025 they'll be 75% of it. As a baby boomer, I try my damn best to stay relevant. Surround yourself with millennials is the takeaway there. Or retire. One thing, and I think this goes back to what Karen was saying: if you want a basic standard to put around this stuff, look at the old ISO 38500 framework. Business strategy, technology strategy, risk, compliance, change management, operations, and, most importantly, the balance sheet and the financials. AI, and what Tony was saying about digital transformation: if it's of meaning, it belongs on the balance sheet and should factor into how you value your company.
All the cybersecurity and all the compliance and all the regulation, this framework covers all of that. So look it up, and every time you start some kind of new machine learning project or data science project, ask: have we checked the box on each of the standards within this framework? If you haven't, maybe slow down and do your homework. Do you see a day when data is going to be valued on the balance sheet? It is. It's already valued as part of goodwill, certainly as part of market value, as we were just talking about. Well, we're talking about all the companies that have opted in, right? There are tens of thousands of small businesses just in this region alone that are opted out. They're small family businesses, or businesses that really aren't even technology-aware, but data is being collected about them and it's being put on Yelp. They're being rated, they're being reviewed. The success of their business is out of their hands. And I think what's really going to be interesting, as you look at big data, at AI, at things like that (blockchain may even have potential for some of that, because of immutability), is what happens with all of those businesses for whom the technology is a cost, a cost per head now for a lot of them, or who just don't want to do it and are proudly opted out. In fact, we talked about that last night at dinner. But when they opt in, it will be the company that can do that, that can reach out to them in a way that is economically feasible and bring them back in, where they control their data, where they control their information, and do it in such a way that it helps them build their business, maybe a generational business that's been passed on.
Those kinds of things are gonna make a big impact, not only on the cloud but on the data being stored in the cloud, the AI, the applications that you talked about earlier, and that's where this bias and some of these other things are gonna have a tremendous impact if they're not dealt with now, at least ethically. Well, I feel like we just got started. We're out of time, but there's time for a couple more comments and then we'll have to wrap it up. Yeah, I had one thing to say. Henry Ford and the creation of the automobile back in the early 1900s changed everything, because now we were no longer stuck in the country. We could get away from our parents, we could date without grandma and grandpa sitting on the porch with us, we could take long trips. So we sprawled out; we're not all living in the country anymore, and it changed America. AI has those same capabilities. It will automate mundane, routine tasks that nobody wanted to do anyway. So a lot of that will change things, but it's not gonna be any different than the way things changed in the early 1900s. Like you were saying, constant reinvention. I think that's a great point. Let me make one observation on that. Every period of significant industrial change was preceded by a period of formation of new assets that nobody knew what to do with. With industrial manufacturing, it was row houses with long shafts tied to a coal-fired engine that drove a bunch of looms. Same thing with railroads, and with large factories, before Henry Ford figured out how to do an information-based notion of mass production. This is the period of asset formation for the next generation of social structures. Because chip makers are gonna be all over these cars. I mean, you're gonna have augmented reality right there on your windshield. Karen, bring it home. Oh, thank you. So, I think... Give us the drop-the-mic moment. No pressure.
Your AV guys are not happy with that. So, I think that it all comes down to a people problem, a challenge, let's say that. The whole AI/ML thing is a people thing, and it's a legal and compliance thing. Enterprises are gonna struggle with trying to meet five billion different types of compliance rules around data, its uses, and enforcement, because ROI is gonna mean risk of incarceration as well as return on investment, and we'll have to manage both of those. I think businesses are struggling with a lot of this complexity; we just opened a whole bunch of questions that we really didn't have a solid "oh, you can fix it by doing this" answer for. So it's important, as we think of this new world of data-focused, data-driven everything, that the entire IT and business community realizes that focusing on data means we have to change how we do things and how we think about them. But we also have some of the same old challenges there. Well, I have a feeling we're gonna be talking about this for quite some time. What a great way to wrap up CUBE NYC, our third day of activities down here at 37 Pillars or Mercantile 37. Thank you all so much for joining us today. Really wonderful insights, really appreciate it. Now, all this content is gonna be available on thecube.net. We are exposing our video cloud and our video search engine to be able to search our entire corpus of data. I can't wait to start searching and clipping up this session. Again, thank you so much, and thank you for watching. We'll see you next time.