Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at Data Privacy Day at Twitter's world headquarters in downtown San Francisco, and we're really excited to get into it with our next guest, Dr. Andreas Weigend. He's now at the Social Data Lab, used to be at Amazon, and is a recently published author. Welcome. Good to be here. Good morning. Absolutely. So give us a little bit about what the Social Data Lab is, for people that aren't that familiar with it, and what you're doing over at Berkeley. All right, so let's start with what social data is. Social data is the data people create and share, whether they know it or not. And what that means is, Twitter is explicit, but there's also geolocation, or maybe even just photos of you. I was in Russia during election day in the United States, with Putin. And I have to say that people now share on Facebook what the KGB wouldn't have gotten out of them under torture. So did you ever see the Saturday Night Live sketch where they had a congressional hearing, and the CIA guy says Facebook is the most successful project that we've ever launched? People tell us where they are, who they're with, what they're gonna do, share pictures, location. It was a pretty interesting sketch. Only topped by Black Mirror. Some of these episodes are absolutely amazing. Some people can't even watch it. I have not seen it, I have to see it, but they're like, it's just too crazy and too real and too close to home. Yeah. So what was the question? So let's talk about your new book. That was social data. Yeah, social data. So I actually call it the Social Data Revolution, because if you think back 10, 20 years ago, we absolutely, and we doesn't mean just you and me, that means a billion people, think about who we are differently from 20 years ago. Think Facebook, as you mentioned. How we buy things. We buy things based on social data. We buy things based on what other people say, not on what some marketing department says.
And even the way we think about information. I mean, could you do a day without Google? No, no. Could you go an hour without Google? An hour, yes, when I sleep. But some people actually do Google in their sleep. Yes, yeah. Well, and they have their health tracker turned on while they sleep to tell them if they slept well, right? I actually find it super interesting how dependent I am on knowing, in the morning when I wake up, before I can push the smiley face or the okay face or the frowning face, on first seeing how I slept. And if the cycles were nice up and down, then it must have been a good night. So it's interesting, because the concept behind all these kinds of biometric feedback loops is that if you have the data, you can change your behavior based on the data. But on the other hand, there's so much data, and do we really change our behavior based on the data? I think the question is a different one. The question is, all right, we have all this data, but how can we make sure that this data is used for us, not against us? Within a few hundred meters of here, there's a company where employees were asked to wear Fitbits, or tracking devices, we should say more generally. And then one morning, one employee came in after not having had an exactly solid night of sleep, shall we say. And his boss said, I'm sorry, but I just looked at your Fitbit. You know, this is an important meeting. We can't have you in that meeting. I'm sorry about that. True story? Yeah. No, that's interesting. So I think the Fitbit angle is interesting when that becomes a requirement from companies or health insurers, and they see you've been sitting on your couch too much. And how does that then run into the HIPAA regulations? You know, they have dog walkers here. I'm not sure where you live; in San Francisco, in the area where I live, many people have dogs. And I know that a couple of my neighbors, when the dog walker comes to take the dog, they also give their phone to the dog walker.
So now it looks like they are taking regular walks, and they're waiting for the discount from health insurance. Yeah, it's interesting. Works great for the person that does walk, or gives their phone to the dog walker, but what about the person that doesn't? What about the person that doesn't stop at stop signs? What happens in a world of business models based on aggregated risk pooling when you can segment down to the individual? That is the very, very, very best question. It's a question of fairness. So if we knew everything about everybody, what would it mean to be fair? Because as you said, insurance is built on pooling risk, and that means by nature that there are things we don't know about people. So maybe we should propose lobotomy, data lobotomy. So people actually have some part chopped off, part of the data chopped off, so now we can pool again. Of course not. The answer is that we as a society should come up with ways of creating objective functions. How do we weigh the person taking a walk? And then it's easy: agree on the function, then get the data, and rank whatever insurance premium, whatever we're talking about here, rank that accordingly. So I really think it's a very important concept, which actually goes back to my time at Amazon, where we came up with fitness functions, as we called them. And it takes a lot of work. Jeff Bezos probably spent 50 hours on that with me, going through groups and groups and figuring out, what do we want your fitness function to be? And we have to have the buy-in of the groups. If they just think, you know, that is some random management thing imposed on us, it's not going to happen. But if they understand that that's the output they're managing for, not bad. So I want to follow up on the Amazon piece, because we're big fans of James Hamilton and Jeff Bezos.
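The fitness-function idea described here, agree on a weighted objective function first, then get the data and rank accordingly, can be sketched in a few lines. This is purely illustrative: the metric names, weights, and people are hypothetical, not anything from Amazon's actual process.

```python
# Illustrative sketch of a "fitness function": a weighted sum of agreed-upon
# metrics, used to rank options. All names and weights are hypothetical.

def fitness(metrics, weights):
    """Score an option as a weighted sum of its normalized metrics (each 0..1)."""
    return sum(weights[name] * value for name, value in metrics.items())

# The hard part, per the interview, is getting buy-in on these weights.
weights = {"walks_per_week": 0.7, "resting_heart_rate_score": 0.3}

options = {
    "person_a": {"walks_per_week": 0.9, "resting_heart_rate_score": 0.4},
    "person_b": {"walks_per_week": 0.2, "resting_heart_rate_score": 0.8},
}

# Rank by fitness: person_a scores 0.75, person_b scores 0.38.
ranked = sorted(options, key=lambda name: fitness(options[name], weights),
                reverse=True)
print(ranked)
```

The point of the anecdote is that the sorting step is trivial; the expensive, political work is agreeing on the weights with every group affected.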
We go to AWS, and it's interesting when James Hamilton talks about the resources that AWS can bring to bear around privacy and security and networking, and all this massive infrastructure that they've built, in terms of being able to protect privacy once you're in the quote-unquote public cloud, versus people trying to execute that at the individual company level. And RSA is in a couple of weeks, and the amount of scary stuff that's coming in from people that want interviews around some of this security stuff is crazy. When you look at public cloud versus private cloud and privacy, you know, supported by a big, heavy infrastructure like what AWS has, versus Joe Blow Company trying to implement it themselves, how do you see that challenge? I mean, I don't know how the little person can compete with the resources that, again, an aggregated resource pool like James Hamilton's can bring to bear on this problem. So I think we really need to distinguish two things, which is security versus privacy. So for security, there is no question in my mind that Joe Blow with his little PC has not a chance against, you know, our Chinese or Russian friends. There is no question for me that Amazon or Google have way better security teams than anybody else can afford, because it is really their bread and butter. And if there's a breach on that level, I think it would be just terrible for them. Just think about the Sony breach, on a much smaller scale. That's a very different point from the point of privacy, and from the point about companies deliberately giving data about you to other companies, for targeting purposes, for instance. So I think for the cloud there, I trust Google, I trust Amazon, that they are doing hopefully a better job than the Russian hackers. I'm more interested in the discussion of the value of data, in the privacy discussion; after all, today is Data Privacy Day.
And there the question is, what do people understand as the trade-off, what they give in order to get something? And people have talked about Google having this impossibly irresistible value proposition for all those little bits of data you give. For instance, I took Google Maps to get here. Of course, Google needs to know where I am to tell me to turn left after the intersection. And of course, Google needs to know where I want to be going. And Google knows that a bunch of other people are going here today. And it probably will figure out that something interesting is happening here. Right. And so those are the interesting questions for me. What do we do with the data? What's the value of data? Right. But A, I don't think people really understand the amount of data they're giving over. And B, I really don't think they understand the value. I mean, now maybe they're starting to understand the value because of the value of companies like Google and Facebook that have the data. But do you see a shift in, A, the awareness? And I think it's even worse with younger kids, who have lived on their mobile phones since the day they were conscious, practically, these days. No, no. Or will there be a value? They have lived on the mobile before they were even born. Children now come preloaded, because parents take pictures of their children before they are born. That's true, right, with the sonogram and so on. But how has mobile changed this whole conversation? Because when I was on Facebook on my PC at home, that was a very different set of information than when it's connected to all the sensors in my mobile phone. When Facebook's on my mobile phone, it really changes things: where I am, how fast I'm moving, who I'm in proximity to. It completely changed the privacy game. Yes. So geolocation, and the ACLU here, the Northern California chapter, has done some very good work on that. Geolocation is a really extremely powerful variable. Now, what was the question?
How has the social privacy thing changed now with the proliferation of mobile? And the other thing I would say is, when you have kids that grew up with mobile and sharing, the young ones aren't on Facebook anymore, sorry, it's Instagram, Snapchat, there's just a different notion of sharing and privacy relative to folks that wouldn't even give their credit card over the telephone not that long ago, much less type it into a keyboard. Do they really know the value? Do they understand the value? Do they really get the implications, when that's the world in which they've lived, and most of them are just starting to enter the workforce and haven't really felt the implications of that? Yeah, so for me, the value of data is how much that data impacts a decision. So on the side of the individual, if I have data about a restaurant and that makes me decide whether to go there or not to go there, that is having an impact on my decision, thus the data is valuable. For a company, the decision of whether to show me this offer or that offer, that is how data is valued for the company. So that can actually be quantified. The value of that picture of my dog from when I was a child, that is so valuable, but I'm not talking about that. I'm being very rational here: the value of data is the impact that it has on decisions. And do you see companies giving back more of that value to the providers of the data, instead of just simple access to useful applications? Because obviously the value exceeds the value of the application that they're giving you. So you use the term giving back, and before, you talked about kids giving up data. I don't think that is quite the right metaphor. These metaphors come from the physical world. It is sometimes said that data is the new oil, and that indeed is a good metaphor when it comes to the fact that it needs to be refined to have value. But there are other respects in which data is very different from oil. And one is that I don't really give up data when I share.
And the company doesn't really give something back to me, but it is a much more interesting exchange, like a refinery: I put things in and now I get something, not necessarily back. I typically get something which is very different from what I gave, because it has been combined with the data of a billion other people. And that is where the value lies: that my data gets combined with other people's data. In some cases it's impossible to actually take it out. It's like a drop of ink you drop in the ocean; it spreads out, and you can't say, oh, I want my ink back. It's too late for that. It's now spread out, and that is the metaphor I have for data. So people who say, I want to be in control of my data, I often think they haven't thought deeply enough about what they mean by that. I want to change the conversation to people asking, what can I get by giving you the data? How can you help me make better decisions? How can I be empowered by the data which you are grabbing, or which you are listening to as it's produced? That is the conversation which I want to have here at Data Privacy Day. And that's happening with Google Maps, obviously, right? You're exchanging the information, you're walking down the street, you're headed here. They're telling you there's a Starbucks on the corner if you want to pick up a coffee on the way. So that is already kind of happening, right? And that's why, obviously, Google's been so successful, because they're giving you enough that you're going to give them more, and you get into this kind of virtuous cycle in terms of the information flow. But clearly they're getting a lot more value than you are, you know, just based on their market capitalization. There's a very valuable thing in the aggregation. So it's almost like one plus one makes three on their side. Yes, but it's a one-trick pony ultimately. All the money they make is through ads. Right, right, that's true. But it's a good one-trick pony.
I love the one-trick pony. It begs the question, too, of when we no longer ask but are just delivered that information. Yes, I think you have a friend, Gam Dias, and he runs a company called First Retail. And he makes the point that there will be no search anymore a couple of years from now. And I said, what are you talking about? I search every day. But he said, yes, but, you know, you will get the things before you even think about them. And with Google Now a few years ago and other things, I think he's quite right. We're starting to see that, right? Where the cards come to you with a guess as to what you're coming for. And it's not so complicated. Let's say I'm at a symphony. My phone knows that I'm at the symphony. Even if I turn it off, it knows where I was and when I turned it off. And it knows when the symphony ends, because there are like a thousand other people there. So why not get Ubers and Lyfts close to there, and amaze people: wow, your car is there already? You know, that's the opposite of a joke we have in Germany. In Germany, we have a joke saying, hey, go for vacation in Poland. Your car is there already. But maybe I shouldn't tell those jokes. Let's talk about your book. So you've got a new book that came out, just recently released. It's called Data for the People. What's in it? What should people expect? What motivated you to write the book? Well, I'm actually excited. Yesterday I got my first three copies. Not from the publisher and not from Amazon, because they are going by the embargo, which is until next week. But Barnes and Noble, they broke the embargo and sent three copies to me. And it looks good. Breaking news. It looks good. Three years of work. And basically it is about trying to get people to embrace the data they create, and to be empowered by the data they create. Lots of stories from companies I've worked with. Lots of stories also from China. I have a house in China. I spend a month or two there every year, and have for the last 15 years.
And the Chinese ecosystem is quite different from the U.S. ecosystem. And we of course know that the EU regulations are quite different from the U.S. regulations. So I wrote about what I think is interesting. And I'm looking forward to actually rereading it, because they told me I should reread it before I talk to you. Yeah, because when did you submit it? You probably submitted it half a year ago. Yeah. So it's available at Barnes and Noble now, and Amazon, is it available? It is available. I mean, if you order it now, you'll get it by Monday. All right. Well, Dr. Andreas Weigend, thanks for taking a few minutes. We could go on forever and ever, but I think we've got to let you go back to the rest of the sessions. Thank you for having me. All right. A pleasure. Jeff Frick, you're watching theCUBE. See you next time.