Hey, welcome back everybody, Jeff Frick here with theCUBE. We're at Data Privacy Day at Twitter's World Headquarters in downtown San Francisco, a full day event with a lot of seminars and sessions talking about the issue of privacy. And even though Scott McNealy in 1999 said, "Privacy's dead, get over it," everyone here would beg to differ, and it's a really important topic. And we're excited to have Michelle Dennedy. She's the Chief Privacy Officer from Cisco. Welcome, Michelle.

Indeed, thank you. And when Scott said that, I was his Chief Privacy Officer.

Oh, you were. So, well acquainted with my young friend Scott's feelings on the subject. It's pretty interesting, because that was eight years before the iPhone, so a completely different world. Actually, with one of the prior guests we were talking about how privacy was an issue in the Harvard Business Review 125 years ago. So this is not new. So how have things changed? I mean, that's a great perspective that you were there. What was he thinking about at the time, and really, what are the privacy challenges now compared to 1999?

So different. Such a different world. I mean, it's fascinating that when that statement was made, the discussion was a press conference where we were introducing a connectivity technology. It was an offshoot of Java, and it basically allowed you to send a wireless message from your personal computer to your printer so that a document could come out.

That's what it was?

Yeah. Wireless printing.

Wireless printing. And really it was Jini technology, so anything could start wirelessly talking to anything else in an internet of things world. So good news, bad news. The world has exploded from there, obviously, but the base premise is: can I be mobile? Can I live in a world of connectivity and still have control over my story, who I am, where I am, what I'm doing?
And it was really a reframing moment. When you say privacy is dead, if what you mean by that is secrecy and hiding away and not being connected to the world around you, I may agree with you. However, privacy as a functional definition of how we define ourselves, how we live in a culture, and what we can expect in terms of morality, ethics, respect and security? Alive and well, baby, alive and well. No shortage of opportunity to keep you busy.

So we talk to a lot of people, we go to a lot of tech shows, and I have to say, I don't know if we've ever talked to a chief privacy officer. So what--

You're missing out.

I know, this is good. So now you get to define the role, I love it. So what are your priorities as chief privacy officer? What are you keeping an eye on day to day, and what are your more strategic objectives?

Yeah, so it's a great question. The rise of the chief privacy officer, actually, Scott was a big help in that, and he gave me exactly the right amount of rope to hang myself with. The way I look at it, probably the simplest analogy is: should you have a chief financial officer?

Yeah, I would guess, yeah, right?

That didn't exist about 100 years ago. We just kind of loped along, and whoever had the biggest bag of money at the end was deemed to be successful, whereas somebody who had no money left at the end but had bought another store, you would have no way of measuring that. So the chief privacy officer is that person for your digital currency. I look at the pros and the cons, the profit and the loss, of data and the data footprint for our company and for all the people to whom we sell. And we think about what the control mechanisms for that data are. So think of me as your data financial officer.

Right, right. But the data in and of itself is just stagnant, right?
It's really the data in the context of all these other applications, how it's used, where it's used, when it's used, what it's combined with, that really starts to trip into areas of value as well as potential problems.

I feel like we scripted this before, but we didn't. We did not script that. So if I took a rectangle out of my wallet, and it had the number 20 on it, and it was green, what would you say that thing probably is?

Yeah, probably Andrew Jackson on the front.

Yeah, probably Andrew Jackson. What is that?

A $20 bill.

Why is that a $20 bill? Because we agree that you're going to give it to me and it has that much value. And thankfully, the guy at Starbucks will give me a coffee for it.

Yeah, exactly.

Well, which is about what a coffee costs the way we're going. But that's exactly right. So is that $20 bill stagnant? Yes. That $20 bill just sitting on the table between us is nothing. I could burn it up. I could put it in my pocket, lose it, and never see it again. I could flush it down the toilet. That's how we used to treat our data. Recognize instead the story that we share about that piece of currency. We happen to be in a place where it's really easy to alienate that currency: I can go downstairs here and spend it. If I were in Beijing, I would probably have to go and convert it into a different currency, and we tell a story about that conversion because our standards interface is different. Data is exactly the same way. The story that we share together today is a valuable story because we're communicating out. We're here for a purpose. We're making friends. I'm liking you because you're asking me all these great questions that I would have fed you had I been able to feed you questions. But it's only that context, it's only that communicability, that brings it value. We now assume as a populace that paper currency is valuable. It's just paper. It's only as good as the story that enlivens it.
And so now we're looking at smaller and smaller micro data transactions. How am I tweeting out information to people who follow me? How do I share that with your following public? And does that give me a greater opportunity to educate people about security and privacy? Does that allow my company to sell more of my goods and services because we're building ethics and privacy into the fabric of our networks? I would say that's as valuable or more valuable than that Andrew Jackson.

So it's interesting. Could you talk about building privacy into the products? We often hear about building security into the products, because the old way of building a big wall doesn't work anymore, and you really have to bake it in at all steps: the application development, the data layer, the database, et cetera, et cetera. When you look at privacy versus security, and especially in your case, you guys are sitting on the pipes. Everything's running through your machines. How do you separate the two? How do you prioritize? And how do you make sure the privacy discussion is certainly part of, but gets the right amount of relevance within, the context of the security conversation?

So there's a sort of glib answer to something that's much more complicated. But security is really, in many instances, the what. I can secure almost any batch of data. It can be complete gobbledygook, zeros and ones. It could be something really critical. It could be my medical records. The privacy, the data about what that context is, that's the why. So I don't see them as one or the other at all. I see security not as a technology but as a series of verbs, things that you actually physically do: people, process, technologies. That enactment should be addressed to a why. So it's kind of like Peter Drucker's management principle: you manage what you measure. That was incendiary advice when it first came out. Well, I'm going to say that you secure what you treasure.
So if you treasure a digital interaction with your employees, your customers, and your community, you should probably secure that.

But it seems like there's a little bit of a disconnect about what should be treasured and what the value is, with folks that have grown up, and I'll pick on the young kids, not really thinking through, or having the time, or knowing the impact of a negative event in terms of just clicking and accepting the EULA and using that application on their phone. Maybe they just look at it in a different way. Is that valid? How do they change that behavior? How do you look at this new generation, and this sea of data, which is far larger than it used to be, coming off all these devices, the Internet of Things, and people are things too, the mobile devices with all that geolocation data and the sensor data? And oh, by the way, it's all going to be in our cars and everything else shortly. How's that landscape changing and challenging you in new ways, and what are you doing about it?

I mean, the speed and dynamics are astronomical. How do you count the stars, right? And should you? Isn't that kind of a waste of time? It used to be that knowledge, when I was a kid, was knowing the A to Z of the Encyclopaedia Britannica. Now facts are cheap. Facts used to be expensive. You had to take time and commit to them and physically find them and be smart enough to read, on and on and on. The dumbest kid today is smarter than I was with my Encyclopaedia Britannica, because we have search engines. Now their commodity is: how do I critically think? How do I make my brand and make my way? How do I ride and surf on a wave of untold quantities of information to create a quality brand for myself? So the young people are actually in a much better position than, I'll still count us as young, but maybe less young. Less young than we were yesterday. They are digital natives.
But I think, I am hugely optimistic that the kids coming up are really starting to understand the power of brand, personal brand, family brand, cultural brand, and they're feeling very activist about the whole thing.

Yeah, which is interesting, because that was never a factor when there was no personal brand, right? You were part of whatever entity you were in.

Well, you were in a clique, right? You were identified by it: when I was home, I was the third out of four kids. I was a Roman Catholic girl in the Midwest. I was a total dork with a bowl haircut. Now, kids can curate who and what and how they are over the network. Young professionals can connect with people with experience, or they can decide, and I get this all the time on Twitter actually, "How did you become a chief privacy officer? I'm really interested in taking a pivot in my career." And I love talking to those people, because they always educate me, and I hope that I give them a little bit of value too.

Right, right. Michelle, we could go on and on, but unfortunately, I think you've got to go cover a session, so we're going to let you go. Michelle Dennedy, thanks for taking a few minutes of your time.

Thank you, and don't miss another Data Privacy Day.

I will not. We'll be back next year as well. I'm Jeff Frick. You're watching theCUBE. See you next time.