This is Think Tech Hawaii, Community Matters here. Hey, welcome to the Think Tech Hawaii studios. I'm Andrew, the security guy. This is another episode of Security Matters Hawaii. Today we're going to be talking about security, privacy, identity, consent, kind of where all this stuff intersects in this country and sort of what's driving it. We've got one of the experts here who was a featured speaker last week at the GSX conference, the big security conference we just had in Las Vegas. He not only attended; Sal D'Agostino is with us from Open Consent and IDmachines and a lot of the things that he runs. Sal, welcome. Welcome. Sal's remote. He's not in the studio with me, but he's online with us. Aloha, brother. Welcome. Welcome aboard. Aloha. It's good to be back on Think Tech Hawaii. Thank you so much. This topic in particular is sort of, I don't know if it's elusive, but I think people don't like to spend the hard time thinking about it. So I thought we'd start with a few definitions to kind of clarify where the discussion is going to come from. But the first thing I do with my guests is put them on the spot just a little, and you're a longtime security professional. Tell me what keeps you up at night. Well, a whole range of things. But on this particular topic, it's the fact that there's just been an explosion of sensitive information about people, places, and things that's gotten out there into the world, some by people not knowing they were letting it out, others by people taking it because people weren't careful about where they had it. And on top of that, the thing that really scares me is that people are data hoarders. And it's pretty hard to do security and privacy if people just hoard and don't know how to clean out the closet when it comes to their personal information. I mean, think about it. You keep file cabinets, hard drives, cloud, it's everywhere. And security professionals are prone to those same inclinations.
So as a security professional, the amount of sensitive information that is required to deliver security services, and the fact that it's seldom inventoried or managed, like cards or card readers, that keeps me up at night. It keeps me busy during the day, too. Nice. I'm glad that somebody like yourself is working on it in particular. Let's kick the ball just a little bit over to some of these initiatives that you've been involved with: the Kantara Initiative and Open Consent and the IDESG for years. Give us a little bit of that background, and then I think we'll get into how GDPR is pushing in on us and some of these definitions we've been talking about for privacy and things like that. Yeah, I mean, so Andrew, we met in the early 2000s when I was working on something called the Personal Identity Verification credential and standard that was developed for the U.S. government for trying to do high-assurance stuff. And at that time the company I was involved with helped solve some problems to make all of that work, and we did a lot of work there. And I thought that was a pretty good thing, right? I mean, there you had something which was really vetted identity, protected cryptographically with a hard token, to use the lingo, with multi-factor authentication available including biometrics, which was handled securely. But it was just a hard thing to use with a lot of the machines and technology that's out there. And what won out over that was just the ease of use of getting online and doing mobile and downloading the app and away you go. And so the work from there then progressed into an initiative about six, seven years ago which was called the National Strategy for Trusted Identities in Cyberspace. And obviously I'd been involved in the work with the PIV credential and the digital certificates that are the background of that, and the public key infrastructure behind that, which in other countries and other places drives a lot of national ID programs.
It was clear we had an identity issue in the United States. We still do today. A national ID is not an option here for a number of reasons. So that was a public-private partnership to put together a strategy around that. And that was funded by the Department of Commerce and the National Institute of Standards and Technology. It was a multi-year grant. In the process of doing that, we put together something called an identity ecosystem framework and also a registry for identity services. And underlying the framework is the fact that there were some principles established that define what it means to do online identity right. And when it comes to how to measure that, the way to measure it is how well you're doing on usability, because you've got to start there; privacy, because that was one of the guiding principles; and then, of course, the stuff that more often is recognized: security, and the need to be interoperable so it works across the population. Each of those things is a challenge. The interesting thing was having tackled the usability and privacy stuff there, because personally, as a security professional, a lot of what we do is getting people to do things the way they should be done. You don't have to invent anything, honestly, in terms of how to do access control. I mean, unless you want to get fancy, but that's a whole separate topic. But how to do security where you take into account usability and privacy, that was an interesting thing. So the IDESG and the IDEF Registry still exist. They've now been moved into the Kantara Initiative, which is another trade organization that I've spent some time with and am currently involved in, in a leadership role, as Secretary of its Leadership Council. And I've been involved in a number of things there; as you know, I spend a lot of time over the horizon, Andrew.
So there are initiatives on things like user-managed access, which is a distributed access control protocol, which is a really interesting thing. And it harkens back to the work I was involved with at CoreStreet around PIVMAN. You know that particular body of work, and that was how to do access control without having to phone home, making an endpoint capable of doing it on its own. And when you try to do things at scale, you've got to do things in a distributed way. Anyway, so I found other kindred spirits working on this user-managed access protocol, and it's pretty well established now. It's in version two, there are implementations of it. I mean, that's been about eight years, gosh, I'm not sure how long, in the cooking. But that's been a fun thing. A lot of the work at Open Consent has also found a sympathetic place in Kantara, around some consent management work that's going on there and a consent receipt. And we'll talk more about the idea of receipts for what's going on with your information, as opposed to simply clicking a privacy policy and saying goodbye. Yeah, so let's kind of go there with those definitions a little bit. As soon as you click that "I agree," all the stuff that you just agreed to... like you said, there are probably people hoarding the activity associated with that, whether it's your name information, maybe it's other browser information. There's no telling what's happening when you interact with that. And you've probably just consented for them to do whatever they wish. Signing terms and conditions is not the same as consenting to something. And that's the difference. So privacy policies and end-user licenses have not really been that usable from a privacy perspective, because whatever is going on is obfuscated. I'm going to click it because I've been here a long time, and even after I read it I'm going to have to go figure out what it is I just read. So that's not usable.
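The consent-receipt idea Sal describes is, at bottom, structured data handed back to the person at the moment they consent. As a rough sketch, not the actual Kantara specification, the field names below are assumptions chosen for illustration:

```python
import json
import uuid
from datetime import datetime, timezone

def make_consent_receipt(subject, controller, purposes):
    """Build a minimal consent receipt: who consented, to whom,
    for which purposes, and when. Field names are illustrative."""
    return {
        "receipt_id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "subject": subject,        # the person giving consent
        "controller": controller,  # the organization receiving it
        "purposes": purposes,      # what the data may be used for
        "policy_url": controller.get("policy_url"),
    }

receipt = make_consent_receipt(
    subject={"id": "user-123"},
    controller={"name": "Example Co", "policy_url": "https://example.com/privacy"},
    purposes=["newsletter", "account-management"],
)
print(json.dumps(receipt, indent=2))
```

The point of the structure is the contrast with a click-through: the person keeps a record of exactly what they agreed to, rather than just a memory of having clicked "I agree."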
And nor is it driven with the idea of consent by design. So security by design is a good thing. Privacy by design is a good thing. One of the design principles of privacy by design is consent by design. Because if you use that in how you then structure things, it also makes it harder to hoard. So all these things go back to our what-keeps-us-awake-at-night, and the hoarder, as an organizing principle for this conversation. People aren't going to want you to hoard their stuff. So it's a little different: if you think of the workflow that you go through to get to the point where you're willing to share information about yourself or conduct transactions, there are ways to improve that. And that's a lot of what operational privacy is about. And what you can do in the course of improving it is help people help themselves. That's where the operational aspect comes in. That's where the service-level aspect comes in. As opposed to really getting very little. A good example is simply having a name and a contact with the privacy policy. Look at the privacy policies you read: click here, this is how we take care of you, there's a website to go to, if you have a problem, come and talk to me. If you're providing someone a service, Andrew, in your business, you're not going to forget at some point during the introduction to say, if there's a problem here, you can always talk to me. And very few of us ever get that kind of service when we're dealing with the next app. Yeah, and especially the embedded stuff. I saw that Google just said that in the next release of Chrome there's going to be a whole lot more scrutiny on the other widgets that you can embed into Chrome. Because those have been harvesting all this information, which was just allowed. It was like open privilege. They were just allowed to take information from Chrome and use it. And like, wow, what's up with that? People don't know this.
It's all invisible. Yeah, I mean, if you don't use a privacy plug-in on your browser, like Ghostery or Privacy Badger, I mean, there's a bunch of them, they light up when you get to these places, right? And yeah, it's a good way to understand how your stuff is being harvested. And again, that's the usable privacy thing, right? So the green lock kind of means that you've got a secure connection. I mean, it's not bad. Honestly, it's come a long way. Up until recently, most people didn't require, or even suggest, that the websites you communicate with be HTTPS, again, to be geeky, right? But that's the secure connection as opposed to an insecure one. The same sort of simple presentation of privacy status also needs to be something that is part and parcel of what we think usable privacy is. And in the IDESG, we put together a 15-item privacy checklist. It's pretty darn good. Open Consent has developed a whole schema around that, taking those 15 and going a little deeper. But to the point, though, we don't show our schema, our way of evaluating and measuring privacy or consent, to people; we show them something which has the same kind of graphic representation: a few things that are checked, and you're doing a pretty good job here. And with the idea that what you should be doing is presenting people with something they can click on to establish what their current relationship is, understand what your public privacy profile looks like, be able to get in touch with you, be able to get a status on the type of information they'll be sharing. I mean, those are things that, if you look at the new California laws, and I think there's half a million businesses in California, so it'll impact some people, you're suddenly seeing California companies rush to get the federal government to bail them out.
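The checklist-to-badge idea Sal describes, scoring the items a site satisfies and showing a compact status the way the green lock summarizes a TLS connection, can be sketched like this. The item names here are made up for illustration; they are not the actual IDESG list:

```python
# Hypothetical subset of a privacy checklist; the real IDESG list has 15 items.
CHECKLIST = [
    "named_privacy_contact",
    "plain_language_policy",
    "data_retention_limits",
    "third_party_sharing_disclosed",
    "user_can_delete_data",
]

def privacy_status(satisfied):
    """Summarize which checklist items a site meets as a simple badge,
    rather than exposing the full evaluation schema to the user."""
    met = [item for item in CHECKLIST if item in satisfied]
    score = len(met) / len(CHECKLIST)
    if score >= 0.8:
        badge = "good"
    elif score >= 0.5:
        badge = "fair"
    else:
        badge = "poor"
    return {"badge": badge, "met": met, "score": score}

status = privacy_status({"named_privacy_contact", "plain_language_policy",
                         "data_retention_limits", "user_can_delete_data"})
print(status["badge"])  # 4 of 5 illustrative items met
```

The user sees only the badge and the checked items; the detailed schema stays behind the click, which is exactly the usability trade the green lock makes.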
We want privacy legislation to guide us; suddenly they've got religion, because California put something down with some teeth, where individuals have rights. Yeah, and that's been happening globally, right? I mean, there's the General Data Protection Regulation, which kicked in this year even though it passed two years ago, which covers hundreds of millions of people in Europe and their data wherever it goes. You've got Canada later this year; India just passed legislation. So you're talking about billions of people and millions and millions of companies who are confronting a new set of rights that they have to respect, and who have to provide services which are considerate of those rights. So the idea that you need to figure out how to make this operational matters, because people have the right to ask questions, and again, if your customers are asking you for stuff, it's better to have an answer, better to make it so easy for them that they can answer the question themselves. And that's a lot of it. Yeah, as you know from the previous conversation we had on Think Tech Hawaii, our whole focus at IDmachines was to help cross the knowledge and complexity gap around cybersecurity. At Open Consent, we're trying to do the same thing when it comes to privacy. Awesome. So let's take a break. We'll pay some bills and we'll be right back, and I think we'll get into a little bit of what the implications are for the future. What can we expect to see? We'll be right back. Don't go away.
Hey, Aloha, and welcome back to the Think Tech Hawaii studios. This is Security Matters Hawaii, where we're here with Sal D'Agostino and we're talking about security and privacy. We want to know where it's going, because Sal's got that kind of vision, but Sal, maybe to know where it's going, we've got to talk about where it started. I mean, it's always good to look at things from a historical perspective to be able to understand the thing you're talking about. It probably didn't just show up today all figured out. Never happens for me that way. Yeah, not usually on this end. I mean, break down the word. It's an interesting thing. It goes back to, like, in France, in fact, there were private laws, right? So for the aristocracy, there were different rules than for everyone else. So literally, initially, privacy was very specifically set up as the rights of the rich. Interesting. Back when, right? For example, there was a private law in France that said the aristocracy didn't have to pay taxes. They got the church bonus card there. Wow. So it's interesting to understand that. Then, talking about privacy as we begin to think about it: again, in the United States it also started with protecting the fairly well-to-do, and it comes into play around the time that instantaneous photography begins to have an impact. And I forget which President's fiancée someone took pictures of; I think her last name was Palmer, so I should have written this down before we came on today, knowing we might go there. But that happened. And then the thing everybody will tell you about is that in 1890, in the Harvard Law Review, the right to privacy was written up as a thing.
And so, yeah, that's not that long ago, right? I mean, it was 130 years, roughly, before it became something which, even though at that point it was written up because of this invasion of the living rooms and bedrooms of the elite, got out and began to be thought of as a right of human beings. Yeah, publicly, sure. It makes sense. And then it took a while, and it's gone through a whole bunch of interesting twists and turns since then. I mean, from an identity perspective, the whole Social Security Administration was kind of an interesting phase, in the 1930s. A lot of what we're talking about today is identity theft. Early on, the Social Security Board, as it was called, was extremely protective of people's personal information. So a lot of the practice around how to handle personal information, and a lot of the government regulations on handling information, because government security controls do cover security and privacy, have as their basis how what was called the Social Security Board treated that info. And early on, they would not even give it to other government agencies, even law enforcement. But then at some point that broke down, because there was a pretty life-or-death kind of case. And ever since then, it's been hard to put the genie back in the bottle. We lost our way a little bit. It sounds like they initially understood the power of identity protection and what it meant, and then it got coerced a little bit. They understood the need to protect personally identifiable information. So PII is a term that, in this field, is the thing where security and privacy overlap, right? Yeah, for sure. That is the intersection. Sometimes I like to talk about it as sensitive information. PII is a kind of sensitive information. So in the enterprise, where you're looking at information risk, you handle it that way and kill two birds with one stone.
You don't have to look for a different kind of information; you just sort of think about what's sensitive. And then when you apply the controls, they're either privacy controls or security controls. I mean, that's sort of the how-to. But the Social Security thing was funny. I mean, there were people that had tattoos of their number because they didn't want to lose it, right? Because, you know, at least when I'm 62, or if I die, if I've got my Social Security number, I'm good. And back then that was almost enough to get by. Wow. Okay. Interesting. And then other companies were marketing Social Security rings. You could get a little bling with your number on it. Interesting. I didn't know that. Yeah, I know. So then it kind of flipped at one point, Andrew. Suddenly it was out, and as I said, the genie was out of the bottle. And then all the way to current times, where the thing that keeps me up at night is just the proliferation of information and devices. And as a security professional, the number of bits traveling around the network is certainly growing astronomically. And it doesn't, you know, it doesn't... Yeah. And at some point that information is going to get connected to people, and there's risk around it, right? I was thinking as you talked, this picture came into my mind: with enough machine learning, from individuals' habits you'll be able to derive the individual without knowing anything about them or who they are. And from there, it'll be pretty easy to figure out who they really are. Well, yeah. I mean, the social graph that people can get from the amount of information that's been let out after someone does a little bit online is probably strong. I don't know this for sure, but I'd bet it's a bit stronger than some of the hard security factors... Like your thumbprint? Your user print might be more accurate.
Yeah, that's crazy. And that kind of leads me to an interesting point, which is that in the old days, it used to be authentication and then author... well, I say the old days, but I'm thinking we've already evolved, because this is what I'm working on. Yeah. I built the things where you authenticate and then authorize, right? We all have. That's how we do stuff. I think what's really happening now is we're going to be in a situation where you have consent literally replacing authorization, because it's not going to be a question of who it is, because, as I said, it's hard to hide. Okay. Or, even better, would be things that are designed where you don't even need to identify, right? Sure, because why would you? So for both of those reasons, if authentication is less of a requirement to get to access control, or the authorization of the use of assets, then that's an interesting thing. And a lot of what we're working on now with Open Consent is the concept of tokens and receipts associated with that, or actually, the tokens can be in the receipts even. Okay. To accomplish those sorts of things, right? So, yeah, in some ways what you're doing is sort of supplanting the role with the right. Okay. Right? Yeah. So that, in order for me to do something, I had to have a valid credential, and I had to know how to do swift-water rescue before someone was going to put me in a boat as opposed to helping in the medical tent. Sure. But now you could almost craft that differently to say: I have consented to be able to access the information for those particular roles, right? Sure. Or people have given me their consent. Sure. Or, yeah, I've requested the role and then I've been granted it.
But, you know, I think it's going to be a big difference in terms of how we go about designing and building the user interactions. And I think we can definitely take all this authentication and all this stuff that's happening invisibly, without our knowledge, right? We've got to eliminate that, so that if you ask me for information, I share with you just the information that's required. Maybe it's a token, maybe there's no information you can even read, but you're sure that it's me, you've got my consent to do with it what you said you would, and I've got, as you said, a receipt for that. There's a lot more transparency in us knowing what we're doing with each other than what's going on today, where no one really knows what they're giving away and what's being done with it. Yeah, I mean, these days, what's not maintained is the state of privacy that you would like. Yeah. And a lot of what we're designing are sort of these machines that describe and enforce a certain state. So that's kind of the current engineering challenge, and some of the things we're actually in the process of building. And it's not just us; other people are looking to do that too. Awesome. And yeah, it's fun. It's nice to be onto the next cool thing that kind of combines a bunch of this other stuff, because it doesn't stop, Andrew, right? No, it's not going to stop. So, thank you so much for coming on today and shining some light on this for us. I think it's an odd topic for people, and it's something that's just going to keep driving in, especially with, like you said, the California law; everybody's going to have to become aware. It's kind of the next security battleground, I think. Yeah.
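The consent-in-place-of-authorization idea, share just the information that's required, gated by what was actually consented to, can be sketched as a check against a consent record rather than a role. All of the names and the data model here are hypothetical, chosen only to illustrate minimal disclosure:

```python
# Hypothetical consent records: subject -> purpose -> fields they agreed to share.
CONSENTS = {
    "user-123": {"shipping": {"name", "address"}},
}

def release(subject, purpose, requested_fields, data):
    """Authorize by consent, not role: return only fields that were
    both requested and consented to for this purpose."""
    allowed = CONSENTS.get(subject, {}).get(purpose, set())
    granted = set(requested_fields) & allowed
    return {f: data[f] for f in granted if f in data}

data = {"name": "Kai", "address": "Honolulu", "email": "kai@example.com"}
print(release("user-123", "shipping", ["name", "email"], data))
# only "name" comes back: email was never consented for shipping
```

The design choice is that the requester's identity or role never enters the decision; only the subject's recorded consent for a stated purpose does, which is the "supplanting the role with the right" Sal describes.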
I mean, if you're going to manage risk, you've got to manage privacy risk and if you're going to manage it, there's a way to make it as easy and even turn it into value for you and the people that you interact with. Yeah, imagine that, people. Imagine your information, your identity being valuable to you instead of everybody else. Are you getting some of the value back for it? So, thank you so much. We are out of time today. Thank you for joining us on Security Matters Hawaii. Join us next week, one o'clock Hawaii time. And, you know, we will share more of this great information with you because security matters. Thank you. Mahalo, Andrew. Mahalo.