Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at the Twitter headquarters for a big event, Data Privacy Day. It's been going on for years and years and years. It's our first visit, and we're excited to be here. And our next guest is going to talk about something that is near and dear to all of our hearts. It's Eve Maler. She's the VP of Innovation and Emerging Technology for ForgeRock. Welcome. Thank you so much. Absolutely. So for people that aren't familiar with ForgeRock, give us a little background on the company. Sure. The digital journey for every customer, consumer, patient, and citizen in the world is so important, because trust is important. And what ForgeRock is about is creating that seamless digital identity journey across cloud, mobile, and Internet of Things devices, across all of their experiences, in a trustworthy and secure way. So one of the topics that we had down in getting ready for this was OAuth. As the proliferation of SaaS applications continues to grow, both in our home life and our work life, we have these pesky things called passwords, which no one can remember and which they force you to change all the time. So along comes OAuth. Yes, OAuth is one of those technologies. I'm kind of a standards wonk; I actually had a hand in creating XML, for those people who remember XML. That's right. OAuth took a tack of saying: let's get rid of what's called the password anti-pattern. Let's not give out our passwords to third-party services and applications. Instead, we give each application what's called an access token, meant just for that application. And in fact, Twitter, since we're here at Twitter headquarters, uses that OAuth technology.
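The password anti-pattern fix Eve describes can be sketched in a few lines. This is a minimal in-memory model of the access-token idea, not Twitter's or ForgeRock's actual implementation; the client name, scope strings, and token store are all illustrative assumptions.

```python
import secrets
import time

# In-memory token store standing in for an OAuth authorization server.
TOKENS = {}

def issue_access_token(client_id, scopes, ttl_seconds=3600):
    """Instead of handing a third-party app your password, mint a
    short-lived token bound to that one client and a limited scope set."""
    token = secrets.token_urlsafe(32)
    TOKENS[token] = {
        "client_id": client_id,
        "scopes": set(scopes),
        "expires_at": time.time() + ttl_seconds,
    }
    return token

def authorize(token, required_scope):
    """Check a presented token: known, unexpired, and holding the scope."""
    grant = TOKENS.get(token)
    if grant is None or time.time() >= grant["expires_at"]:
        return False
    return required_scope in grant["scopes"]

# A tweeting app gets a token that can post but not read DMs.
token = issue_access_token("tweet-client", ["tweet.write"])
print(authorize(token, "tweet.write"))  # True
print(authorize(token, "dm.read"))      # False
```

The point of the pattern: revoking the token kills that one app's access without touching your password or any other app's grant.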
And being a standards wonk, I'm involved in a standard that builds on top of OAuth called User-Managed Access, which uses this so that we can share access with applications in the same way, and we can also share access with other people using applications. So for example, in the same way that Alice hits a share button in Google to share access to a document with Bob, we want to allow every application in the world to be able to do that, not just Google Docs, Google Sheets, and so on. So OAuth is powerful, and User-Managed Access is powerful for privacy in the same way. Now, there's OAuth, and I use my Twitter OAuth all the time. That's right. And then there are these other kinds of third-party tools which add another layer. So you might use something like Tweetbot, which I like to use on my phone to tweet. Right, right. Well, there's Tweetbot, but then there are these pure identity, password-manager applications, which, you know, you load everything into. LastPass or something like that. Right, right. To me it just seems like it's adding another layer. And, oh my gosh, if I forget the LastPass password, I'm really in bad shape, not just for one application but for a whole bunch. How do you see the space evolving to where we've gotten to now, and how is it going to change going forward? It just fascinates me that we still have passwords when our phones have fingerprint readers. Why can't it just work off my finger? More and more SaaS services and applications are actually becoming more sensitive to multi-factor authentication, strong authentication, what we at ForgeRock would actually call contextual authentication, and that's a great way to go. So they're leveraging things like Touch ID and device fingerprinting, for example, recognizing that the device kind of represents you and your unique way of using the device.
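The Alice-to-Bob sharing that UMA generalizes can be sketched as a policy table the resource owner controls. This is a toy model of the idea, not the UMA 2.0 protocol itself (which involves authorization servers, permission tickets, and requesting-party tokens); the names and resource IDs are illustrative.

```python
# resource -> {party: set of scopes}; the resource owner edits this table
# the way Alice hits a share button.
POLICIES = {}

def share(resource, party, scopes):
    """Alice grants a named party limited scopes on one resource."""
    POLICIES.setdefault(resource, {})[party] = set(scopes)

def request_access(party, resource, scope):
    """An authorization server would evaluate the owner's policy like this."""
    return scope in POLICIES.get(resource, {}).get(party, set())

share("doc-42", "bob", ["read"])
print(request_access("bob", "doc-42", "read"))     # True
print(request_access("bob", "doc-42", "write"))    # False
print(request_access("carol", "doc-42", "read"))   # False
```

The design point Eve makes is that this share-button capability lives in a standard, so any application can offer it, not just Google's suite.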
And in that way, we can start to do things like what's called a passwordless flow, where most of the time, or all of the time, you don't actually even use a password. I used to be an industry analyst, and 75% of my conversations with folks like you would be about passwords. Now, more frequently, people are more password-savvy, and more of the time people are turning on things like multi-factor authentication, and more of the time it's: oh, it knows the context, I'm using my corporate Wi-Fi, which is safer, or I'm using a familiar device. And that means I don't have to use the password as often. That's contextual authentication, meaning I don't have to use that insecure password so often. So I think the world has gotten a little bit smarter about authentication, I'm hoping. And technologies like OAuth, and things based on OAuth like OpenID Connect, which is a modern, federated identity technology, and things like User-Managed Access, leverage the fact that OAuth gets away from password-based authentication, so you're not flinging the password around the internet, which is the problem. Okay, so that's good, that's getting better. But now we have this new thing, the Internet of Things. And people are things too. Now we've got connected devices that aren't necessarily ones that I purchased, that I authorized, that I'm maybe even aware of. Like a beacon on a wall, just observing you. Like a beacon on a wall, and sensors, and the proliferation is just now really starting to ramp. So from a privacy point of view, how does the IoT that I'm not directly involved with compare to IoT like my Alexa, compared to applications that I'm actively participating in? How do those lines start to blur, and how do the privacy issues spill over into managing this wild world of IoT?
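The contextual authentication Eve describes is essentially a risk decision over login signals. Here is one minimal sketch of that decision; the signals, weights, and step-up tiers are invented for illustration and are not ForgeRock's actual scoring logic.

```python
def authentication_requirement(context):
    """Decide what to demand at login from contextual signals.
    Returns 'none' (passwordless), 'touch_id', or 'password+otp'."""
    risk = 0
    if not context.get("known_device"):       # unfamiliar device
        risk += 2
    if not context.get("corporate_network"):  # off the safer corporate Wi-Fi
        risk += 1
    if context.get("new_geolocation"):        # never seen this location
        risk += 2

    if risk == 0:
        return "none"          # context alone is enough: passwordless flow
    if risk <= 2:
        return "touch_id"      # one strong local factor, still no password
    return "password+otp"      # step up to full multi-factor

# Familiar device on corporate Wi-Fi: no password prompt at all.
print(authentication_requirement(
    {"known_device": True, "corporate_network": True}))  # none
```

This captures the trade-off in the conversation: the insecure password gets used less often, and only when the context looks risky.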
Yeah, there are a couple of threads with the Internet of Things. I'm here today at this Data Privacy Day event to participate on a panel about the IoT tipping point, and there are a couple of threads that are just really important. One is that the security of these devices is in large part a device identity theft problem. Take the Dyn attack: that was, in fact, an identity theft problem for devices. We had poorly authenticated devices. Devices have identities, they have identifiers, and they have secrets, and it was a matter of their own passwords being easily taken over. It was essentially account takeover for devices. That was the problem, and that's something we have to be aware of. Applications and services can have identities just like people; we've always known that, and that's something our platform can handle. We need to authenticate our devices better, and that's something manufacturers have to take responsibility for. And we can see government agencies starting to crack down on that, which is a really good thing. The second thing is there's a saying in the healthcare world among people working on patient privacy rights: no data about me without me. So there's got to be a kind of pressure. We see it whenever there's a front-page news article about the latest password breach, though we don't actually see as many password breaches anymore as multi-factor authentication comes into play. That's industry pressure at work, where passwords become less important because we have multi-factor. Now we're starting to see consumer pressure too: I want to be a part of this. I want you to tell me what you shared. I want more transparency and I want more control. And that's got to be part of the equation now when it comes to these devices. It's got to be not just more transparency, but: what is it you're sharing about me? Last year I actually bought, okay, maybe this is TMI.
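The "account takeover for devices" point can be made concrete: the botnet behind the Dyn attack harvested devices that shipped with shared, well-known default credentials. A sketch of the contrast, with a made-up default-credential list and a hypothetical provisioning step:

```python
import secrets

# A tiny stand-in for the published default-credential lists that
# IoT botnets scan for (real lists are much longer).
DEFAULT_CREDENTIALS = {("admin", "admin"), ("root", "12345")}

def device_is_takeover_prone(username, password):
    """A device whose credentials appear on a default list is botnet fodder."""
    return (username, password) in DEFAULT_CREDENTIALS

def provision_device(device_id):
    """What manufacturers should do instead: mint a unique per-device
    secret at provisioning time, so no two devices share a credential."""
    return {"device_id": device_id, "secret": secrets.token_hex(16)}

print(device_is_takeover_prone("admin", "admin"))   # True
cam = provision_device("cam-001")
print(device_is_takeover_prone(cam["device_id"], cam["secret"]))  # False
```

The per-device secret is exactly the "devices have identities, identifiers, and secrets" framing: when every device's secret is unique, one leaked credential no longer compromises the fleet.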
I always have this habit of sharing too much information. That's okay, we're on theCUBE. We like to go places other shows don't. Be honest here. I bought one of those adjustable beds that actually has an air pump that... What's your number? Your Sleep Number. It is a Sleep Number bed, and it has a feature that connects to an app that tells you how well you slept. You look at the terms and conditions, and it says: we own your biometric data, we are free to do whatever we want with it. Where did you even find the terms and conditions? They're right there in the app; to use the app, you have to say yes. So you actually read it before just clicking the box. Privacy pro, I got it. And of course, I saw this, and to use the feature you have to opt in. This is the way it is; there's no choice. And they probably got some lawyer... This is the risk-management view of privacy. It's no longer tenable to have just a risk-management view, because the most strategic and robust way to see your relationship with your customers is to realize there are two sides to the bargain, because businesses are commoditized now. There's low switching cost to almost anything. I mean, I bought a bed, but I don't have to have that feature. Do you think they'll break it up? So you want the bed, but you're using a Fitbit or something else to tell you whether you got a good night's sleep. Do you see businesses starting to break up the units of information that they're taking, and can they deliver an experience based on a fragmented selection? I do believe so. User-Managed Access and certain technologies and standards like it, such as a standard called Consent Receipts, are based on the premise of being able to deliver convenient control to users. And there are regulations coming, like the General Data Protection Regulation in the EU.
It's bearing down on pretty much every multinational, every global enterprise that monitors or sells to an EU citizen, which is pretty much every enterprise, and it demands that individuals get some measure of ability to withdraw consent in a convenient fashion. So we've got to have consent tech that measures up to the policies these organizations have to have. This is coming whether we like it or not, but we should have a robust and strategic way of exposing to people the kind of control that they want anyway. They all tell us they want it. So in essence, personal data is becoming a joint asset. So that covers your sleep app. But what about the traffic cameras, and public facilities? I mean, they say in London you're basically on camera all the time. I don't know if that's fact or not, but the clue is there are a lot of cameras tracking your movements. You don't get a chance to opt in or out. That is actually true. That's a tough case, even under the laws of the EU. Then there's security, right? In the post-9/11 world, that's usually the justification: we want to make sure something bad doesn't happen again, so we want to keep track. So how does the government's role in that play out? And even within the government, you have all these different agencies, whether it's the traffic agency, or even just a traffic camera that maybe KCBS puts up to keep track of slowdowns between two exits. How does that play into this conversation? Yeah, where you don't have an identified individual, and not even an identifiable individual, and these are actually terms if you look at GDPR, which I've read closely, it is a tougher case. Although one of the members of my User-Managed Access working group is one of the experts on UK CCTV issues, and it is a very big challenge to figure out, and governments do have a special duty of care to figure this out.
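The consent tech Eve calls for, where granting and withdrawing consent are equally convenient, can be sketched as a simple record keyed by user and purpose. The fields are loosely inspired by the Consent Receipts idea mentioned above but are a simplified, illustrative model, not the actual Kantara specification.

```python
import time

CONSENTS = {}  # (user, purpose) -> consent record

def record_consent(user, purpose, data_categories):
    """Store a receipt-style record of what was consented to and when."""
    receipt = {
        "user": user,
        "purpose": purpose,
        "data_categories": list(data_categories),
        "granted_at": time.time(),
        "withdrawn_at": None,
    }
    CONSENTS[(user, purpose)] = receipt
    return receipt

def withdraw_consent(user, purpose):
    """GDPR's demand: withdrawal must be as convenient as granting."""
    CONSENTS[(user, purpose)]["withdrawn_at"] = time.time()

def may_process(user, purpose):
    """Processing is allowed only under a live, unwithdrawn consent."""
    c = CONSENTS.get((user, purpose))
    return c is not None and c["withdrawn_at"] is None

record_consent("eve", "sleep-analytics", ["biometric"])
print(may_process("eve", "sleep-analytics"))   # True
withdraw_consent("eve", "sleep-analytics")
print(may_process("eve", "sleep-analytics"))   # False
```

The receipt itself, retained by both sides, is what makes personal data feel like the "joint asset" described above: each party can see what was shared, for what, and whether consent still stands.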
And so the toughest cases are when you have beacons that just observe passively, especially because, I'll grant you, the incentives are such that: how do they go and identify somebody who's hard to identify, and then go and inform them and be transparent about what they're doing? Right, right. So in those cases, even heuristically identifying somebody is very, very tough. However, there is a case where iBeacons in, say, retail stores have a very high incentive to identify their consumers and retail customers. And in those cases, the incentives flip in the other direction, towards transparency and reaching out to the customer. Yeah, on the tech of these things: someone who I will not name recently got a drive-through red-light ticket, and the clarity of the images that came on that piece of paper was unbelievable. So if you're using any kind of modern equipment, the ability to identify is pretty much there. Now, we have cases, and this just happened, actually. Let's see, do I say it was to me or to my husband? It was in a non-smart car in a non-smart circumstance, where it was simply a red-light camera that takes a picture of an identified car. So you've got a license plate, and that binds it to the registered owner of a car. Now, I have a car that's registered in the name of a trust. They didn't get a picture of the driver; they got a picture of the car. They sent us what amounted to a parking ticket, because they couldn't identify the driver. So now that gives us an opportunity to translate that from a dumb-car circumstance, registered to a trust rather than to an individual, and map it to an IoT circumstance: if you've got a smart device, you've got a person, and you've got a cloud account, what you need is the ability, in a responsible, secure fashion, to bind a smart device to a person and their cloud account, and the ability to unbind.
So now we're back to having an identity-centric architecture for security and privacy. I'll give a concrete example. Let's say you've got a fleet vehicle in a police department. You assign it to whatever cop is on the beat, and at the end of their shift, you assign the car to another cop. What happens on one shift and what happens on another shift are completely different matters. And it's a smart car; maybe the cop has a uniform with some sort of camera, a body cam, which is another smart device, and those body cams also get reassigned. So whatever was recorded in the car, on the body cam, with the cop, and with whatever online account it is, you want the data to go with that cop only when the cop is using the smart devices they've been assigned, and you want the data for somebody else to go with that somebody else. In these cases, the binding of identities and the unbinding of identities is critical to the privacy of that police officer and to the integrity of the data. This is why I think of identity-centric security and privacy as being so important, and it's why we at ForgeRock talk about identity relationship management as being so key. And whether you use it or not really comes after the fact of being able to effectively tie the two together. You have to look at the relationships in order to know whether it's viable to associate the police officer's identity with the car's identity. Did something happen to the car on the shift? Did something happen through the view of the camera on the shift? Right, right. And all this is underlaid by trust, which has come up in a number of these interviews today. And unfortunately, if you read all the surveys, we're in a tough situation, in government particularly. These are kind of the crazier cases, because businesses can choose to do this or not, and they've got a relationship with the customer. But on the government side, there's really no choice, right? They're there.
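The bind/unbind mechanics of the fleet example can be sketched as a time-indexed assignment log: data is attributed to whoever the device was bound to at the moment of recording. This is an illustrative toy model (device IDs, officer names, and integer timestamps are all invented), not a real identity relationship management product.

```python
BINDINGS = []  # append-only log of device-to-person assignments

def bind(device_id, officer_id, at):
    """Start a shift: associate the device with this officer from time `at`."""
    BINDINGS.append({"device": device_id, "officer": officer_id,
                     "start": at, "end": None})

def unbind(device_id, at):
    """End a shift: close the device's currently open binding."""
    for b in BINDINGS:
        if b["device"] == device_id and b["end"] is None:
            b["end"] = at

def owner_of_recording(device_id, recorded_at):
    """Attribute a recording to whoever held the device at that instant."""
    for b in BINDINGS:
        if (b["device"] == device_id and b["start"] <= recorded_at
                and (b["end"] is None or recorded_at < b["end"])):
            return b["officer"]
    return None  # device was unbound: data attaches to no one

bind("bodycam-7", "officer-ann", at=0)
unbind("bodycam-7", at=100)
bind("bodycam-7", "officer-ben", at=100)
print(owner_of_recording("bodycam-7", 50))    # officer-ann
print(owner_of_recording("bodycam-7", 150))   # officer-ben
```

Keeping the log append-only rather than overwriting assignments is what preserves both sides of the concern in the conversation: one officer's privacy on a shift, and the integrity of the data trail for the other.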
Right now, I think we're at a low point on the trust factor. So how does that play out? I mean, if you don't trust, then these things are seen as really bad, as opposed to if you do trust, and then maybe they're just inconvenient or not quite worked out all the way. So as this trust changes, with fake news and all this other stuff going on right now, how is that impacting the implementation of these technologies? Well, ask me if I said yes to the terms and conditions on the sleep app. I said yes. I said yes, and I didn't even ask for the app; my husband signed up for the feature. It just showed up on my phone because I was in proximity to the bed. So this is not news, I'm not breaking news here, but consumers want the features. They want convenience. They want value. So it's unreasonable, I believe, to simply mount an education campaign and thereby change the world. I do think it's good to have general awareness of what to demand, and that's why I say no data about me without me. What people should be demanding is to be let into the loop, because that gives them more convenience and value. They want share buttons. We saw that with the initial introduction of CareKit from Apple, because that enabled what people involved in User-Managed Access, we call ourselves UMAnitarians, like to call Alice-to-Bob sharing. That's the use case, and CareKit enabled Alice-to-Dr.-Bob sharing. That's a real use case. With web and mobile and APIs, I don't think we thought about it so much as a positive use case, but IoT has kind of made that use case real. Although in healthcare it's been a very real thing with EHRs: you can go into your EHR system and see that you can share your allergy record or something with a spouse. It's there. But with IoT, it's a really positive thing. I've talked to folks in my day job about sharing access to a connected car with a remote user.
We've seen the experiments with: I'll let somebody deliver a package into the trunk of my car, but not get access to driving the car. These are real. Saving a little money by having smart light bulbs is one thing, but it's not as compelling as: you've got an Airbnb renter, and you want to share limited access to all your stuff with your renter while you're away, and then shut down access after they leave. That's a UMA use case, actually. That's good stuff; I can make money off of sharing that way. That's convenience and value. I just heard the other day that Airbnb is renting a million rooms a night. There you go. So it's not insignificant. If you have a home bristling with smart stuff, that's when it really makes sense to have a share button on all of it. It's not just data you're sharing. Well, Eve, we could go on and on and on. You're going to be at RSA in a couple of weeks? Absolutely, I'm actually speaking about consent management. Maybe we'll see you there. That would be great. But I want to thank you for stopping by; I really enjoyed the conversation. Me too, thanks. All right, she's Eve, I'm Jeff, you're watching theCUBE. I'll catch you next time. Thanks for watching.