Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at Data Privacy Day 2018. I still can't believe it's 2018. We're in downtown San Francisco at LinkedIn's headquarters, their new headquarters, a beautiful building just down the road from the Salesforce building and the new Moscone. There are a lot of exciting things going on in San Francisco, but that's not what we're here to talk about. We're here to talk about data privacy, and we're excited to have a return visit from last year's CUBE alumni. She's Eva Velasquez, president and CEO, Identity Theft Resource Center. Great to see you again. Thank you for having me back. Absolutely. So, it's been a year. What's been going on in the last year in your world? Well, identity theft hasn't gone away. Shoot. And data... Like you told me it would last time. I know, I wish. And in fact, unfortunately, we just released our data breach information, and there was tremendous growth. It was a little over a thousand the previous year, and over 1,500 data breaches in 2017. We're almost immune. They're like every day. And it used to be big news. Now it's like, not only was Yahoo breached at some level, which we heard about a while ago, but then we hear they were actually breached like 100%. There is some fatigue, but I can tell you that it's not as pervasive as you might think. Our call center had such a tremendous spike in calls during the Equifax breach. It was the largest number of calls we'd had in a month since we've been measuring our call volume. So people were still very, very concerned, but a lot of us who are in this space, I think we may be feeling the fatigue more than your average consumer out there, because for a lot of folks it's really the first exposure to it. We're still having a lot of first exposures to a lot of these issues. So the Equifax one is interesting, because most people don't have a direct relationship with Equifax, I don't think. I'm not a direct paying customer.
I did not choose to do business with them. But as one of the two or three main reporting agencies, right, they've got data on everybody for their customers, who are the banks and financial institutions. So how does that relationship get made? Oh my gosh, there's so much meat there. There's so much meat there. Okay, so while it feels like you don't have a direct relationship with the credit reporting agencies, you actually do. You get a benefit from the services that they're providing to you. And every time you get a loan, I mean, this is a great conversation for Data Privacy Day, because when you get a loan, get a credit card, and you sign those terms and conditions, guess what? You are giving that retailer, that lender, the authority to send that information over to the credit reporting agencies. And let's not forget that the intention of forming the credit reporting agencies was for better lending practices, so that your creditworthiness was not determined by things like your gender, your race, your religion, and those types of really, I won't say arbitrary, but just not pertinent factors. Now your creditworthiness is determined by your past history: do you pay your bills? What is your income? Do you have the ability to pay? So it started with a very good purpose in mind, and we definitely bought into that as a society. And I don't wanna sound like I'm defending the credit reporting agencies and all of their behavior out there, because I do think there are some changes that need to be made, but we do get a benefit from the credit reporting agencies, like instant credit, much faster turnaround when we need those financial tools. I mean, that's just the reality of it. Right, right. So who is the person that's been breached, been penalized? I'm trying to think of the right word for the relationship between the entity that was hacked and the people whose data was taken.
If it's this kind of indirect third-party relationship, an authorization through the credit card company... No, Equifax is absolutely responsible. So who would be the litigant? Maybe that's the word that's coming to me, in terms of feeling the pain. Is it me as the holder of the Bank of America Mastercard? Is it Bank of America as the issuer of the Mastercard? Or is it Mastercard, in terms of retribution back to Equifax? Well, you know, I can't really comment on who actually would have the strongest legal standing, but what I can say is the same thing I say when I talk to banks about identity theft victims. There's some discussion about, well, no, it's the bank that's the victim in existing account identity theft, because they are the ones that are absorbing the financial losses, not the person whose data it is. Yet the person who owns that data, it's their identity credentials that have been compromised. They are dealing with issues as well, above and beyond just the financial compromise. They have to deal with cleaning up other messes and other records, and there's time spent on the phone. So it's not mutually exclusive. They're both victims of this situation. And with data breaches, often the breached entity, again, I hate to sound like an apologist, but I am keeping this real, a breached entity, when they're hacked, they are a victim. A hacker has committed that crime and gone into their systems. Yes, they have a responsibility to make those security systems as robust as possible, but the person whose identity credentials those are, they are the victim. Any entity or institution, if it's payment card data that's compromised and a financial services institution has to replace that data, guess what, they're a victim too. That's what makes this issue and this crime so terrible: it has these tentacles that reach down and touch more than one person for each incident.
Right, and then there's a whole other level, which we talked about before we got started, that we wanted to dig into, and that's children. Recently, a little uproar was raised about these IoT-connected toys, just a big giant privacy hole into your kid's bedroom, with eyes and ears and everything else. So I wonder if you've got some specific thoughts on how that landscape is evolving. Well, we have to think about the data that we're creating that does comprise our identity. And when we start talking about these toys and other internet-connected IoT devices that we're putting in our children's bedrooms, the advocacy part of me kicks in. It makes the hair on the back of my neck stand up, because the more data that we create, the more that it's vulnerable, and the more that it's used to compromise our identity. And we have a big enough problem with child identity theft right now, as it stands, without adding the rest of these challenges. Child and synthetic identity theft are a huge problem, and that's where a specific social security number is submitted and has a credit profile built around it, when the number can either be completely made up or belong to a child. And so you have a four-year-old whose social security number is now having a credit profile built around it. Obviously the thieves are not submitting that this belongs to a four-year-old; they would not be issued credit. So they're saying it's a 23-year-old in a different state. They're grabbing the number, they're using the name. They build this credit profile. And the biggest problem is we really haven't modernized how we're authenticating this information and this data. I think it's interesting and fitting that we're talking about this on Data Privacy Day, because the solution here is actually to share data. It's to share it more. And that's an important part of this whole conversation. We need to be smart about how we share our data.
So yes, please have a thoughtful conversation with yourself and with your family about what are the types of data that you wanna share and what do you wanna keep private. But then culturally we need to look at smart ways to open up some data sharing, particularly for these legitimate uses, for fraud detection and prevention. So you said way too much there, because there's like 87 follow-up questions in my head. So we'll step back a couple. So is that synthetic identity then? Is that what you meant when you said a synthetic identity problem, where it's the social security number of a four-year-old that's then used to construct this? I mean, it's the four-year-old's social security number, but a person that doesn't really exist. Yes, all child identity theft is synthetic identity theft, but not all synthetic identity theft is child identity theft. Sometimes it can just be that the number's been made up. It doesn't actually belong to anyone. Now eventually maybe it will. We are hearing from more and more parents. I'm not going to say this is happening all the time, but I'm starting to hear it a little bit more often, where the social security number is being issued to their child, they go to file their taxes, so this child is less than a year old, and they are finding out that that number has a credit history associated with it that was created years ago, back when that number was just made up. So are we ready to be done with social security numbers? I mean, for God's sake, I've read numerous things saying the nine-digit number that's printed on a little piece of paper is not protectable, period. And then I've even had cases where they say bring your card, your little paper card that they gave you at the hospital, and I won't tell you what year that was, a long time ago. I'm like, I mean, come on, it's 2018. Should that still be the anchor? You read my mind. It was like I was putting that question in your head. It just kills me.
I've actually been talking quite a bit about that, and it's not that we need to, quote unquote, get rid of social security numbers, okay? Social security numbers were developed as an identifier because you can have a John Smith with the same date of birth as another, and how do we know which one of those 50, you know, 1,000 John Smiths is the one we're looking for? So that unique identifier has value, and we should keep that. But it's not a good authenticator. It is not a secret. It's not something that I should pretend only I know. I write it on my check when I send my tax return in. Write your number on the check. Oh, that's brilliant. Right, right, so we shouldn't pretend that this is a secret. If I'm going to a business that doesn't know me and wants to make sure I am me, in this first initial relationship or interaction that we're having, that's not a good authenticator. That's where we need to come up with a better system, and it probably has to do with layers, and more layers, and it means that it won't be as frictionless for consumers. But I'm really challenging, this is one of our big challenges for 2018: we want to flip that security versus convenience conundrum on its ear and say, no, I really want to challenge consumers to say, I'm happier that I had to jump through those hoops. I feel safer, I think you're respecting my data and my privacy and my identity more because you made it a little bit harder. And right now it's, no, I don't want to do that because it's a little too slow. Nine seconds, I can't believe it took me nine seconds to get that done. Yeah, and we have all this technology. We've got fingerprint readers that we're carrying around in our pocket. We've got geolocation: is this person in the place that they generally are? There are so many things beyond a printed piece of paper, right? It's the angle at which you look at your phone when you look at it. It's the tension with which you enter your passcode, not just the passcode itself.
There are all kinds of very non-invasive biometrics, for lack of a better word. We tend to think of them as just our face and our fingerprint, but there are a lot of other biometrics that are non-invasive and not personal. They're not private, they don't feel secret, but we can use them to authenticate ourselves. And that's the big discussion we need to be having: I want to be smart about my privacy. Right, and it's interesting on the sharing, because we hear that a lot at security conferences, where one of the best defenses is that security teams at competing companies share data on breach attempts, right? Because probably the same person who tried it against you is trying it against that person, and that person. And it's really an effort to try to open up the dialogue at that level, as more of a sense of us against them, versus we're competing against each other in the marketplace because we both sell widgets. So are you seeing that? Is that something that people buy into, where there's a mutual benefit to sharing information at a certain level so that we can be more armed? Oh, for sure, especially when you talk to the folks in the risk and fraud and identity theft mitigation and remediation space, they definitely want more data sharing. And I'm simply saying that that's an absolutely legitimate use for sharing data. We also need to have conversations with the people who own that data and who it belongs to, but I think you can make that argument. People get it when I say, do you really feel like the angle at which you hold your phone is that personal? Couldn't that be helpful, combined with 10 other data points about you, to help authenticate you? Do you feel like your personal business and life is being invaded by that piece of information? Compare that to things like your health records and medical conditions that you're being treated for. Well, wow, for sure, that feels super, super personal. And I think we need to get into that nuance.
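[Editor's note: the layered, multi-signal authentication Velasquez describes, combining several non-invasive signals such as device, location, and behavioral biometrics rather than relying on one secret, can be sketched in code. This is a hypothetical illustration only; the signal names, weights, and thresholds below are invented for the example and are not from the interview or any real system.]

```python
# Hypothetical sketch of risk-based, layered authentication:
# combine several weak, non-invasive signals into one decision
# instead of treating a single number (like an SSN) as a secret.
# All signal names, weights, and thresholds are illustrative.

def risk_score(signals: dict) -> float:
    """Return a confidence score in [0, 1]; higher = more likely the genuine user."""
    weights = {
        "known_device": 0.30,          # device fingerprint seen before
        "usual_location": 0.25,        # geolocation matches history
        "typing_pattern_match": 0.25,  # behavioral biometric (e.g. keystroke rhythm)
        "fingerprint_ok": 0.20,        # on-device biometric check
    }
    return sum(w for name, w in weights.items() if signals.get(name, False))

def authenticate(signals: dict, threshold: float = 0.7) -> str:
    """Allow silently, ask for a step-up factor (a little friction), or deny."""
    score = risk_score(signals)
    if score >= threshold:
        return "allow"
    if score >= 0.4:
        return "step_up"  # the "happier I jumped through hoops" case
    return "deny"
```

The design point matches the conversation: no single signal is secret or personal on its own, but together they authenticate, and the occasional step-up is the small convenience cost of the flipped security-versus-convenience trade-off.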
We need to talk about what data falls into which of these buckets, and for the bucket that isn't super personal, that doesn't feel invasive, that I don't feel like I need to protect, how can I leverage that to make myself safer? Right, lots of opportunity. I think it's there. All right, Eva, thanks for taking a few minutes to stop by. It's such a multi-layered and kind of complex problem, and it still feels like pretty early days in trying to solve it. It's complicated, but we'll get there. More of this kind of dialogue gets us just that much closer. All right, well, thanks for taking a few minutes of your day, great to see you again. Thanks. All right, she's Eva, I'm Jeff. You're watching theCUBE from Data Privacy Day in San Francisco.