And please interrupt at any time with your own opinions, because they're probably better than mine. OK, topics: wild and reckless youth. So we started doing this shtick... good Lord, how old is CNI at this point? 1990. 1990. So Paul Evan Peters started inviting me to cover this waterfront a long time ago. And so I was there at the start, and we'll talk a little bit about that, and then some of the successes and changes. And then over the last several years there have been a number of challenges in the internet identity space, from new technologies to new laws. We'll cover some of those challenges, and also the fact that the successes really helped. I mean, if this had just fallen over and died, I wouldn't be here, and you'd still have 4,000 passwords instead of one password for 4,000 different sites, and life would be pretty dark. And then the coming of age: the current issues that we're wrestling with in internet identity, we'll cover some of those. And we'll wind up with the usual set of frivolous and foolish predictions. And again, at any particular point, just say, that's not true, and I'll fluster. OK, so we started in this business around 2000. At Internet2, we started with an R&E-driven focus on inter-realm authentication and authorization to support collaboration. At the same time, the feds were doing PKI and handing out certificates, on the theory that of course Berkeley would do exactly the right behavior so that you could make these technologies work. And it wasn't the technologies, it was the policies that were difficult, especially at Berkeley. I'm glad you don't let the Alumni Association know, Jen. And we had a collaboration orientation. Then that became commercial-to-commercial relationships, and then it became a consumer identity and social identity foundation. We're now moving yet again into a new world; I'll talk a little bit about that. So, we mentioned our very hierarchical PKI space. Here's the global policy.
You will enforce that policy locally. You will wrestle all of your difficult people into this. You will take all of your policymakers and convince them to adopt policies that you can take to the global hierarchy. That didn't work. So it was globally scalable but not locally deployable. We knew federation was locally deployable, because we just said, tell us what you're doing, and keep an eye on what you're doing. We didn't know if it was globally scalable. I've got to confess that now, 20 years in, I'm not sure it's going to be globally scalable, because we're meeting lots of challenges: distinct privacy laws across the globe, new forces out there. And frankly, semantics is a problem. The example I use a lot is that in the UK there are only "staff" and "student" as categories of affiliation. We have "faculty." What happens when we send an assertion that says "faculty" to the UK? How do they deal with it? One more moment on this. So we got into federated identity; "federate locally, act globally" was our model. And then we added the attributes, because we really wanted to be privacy preserving. It was this community in particular, the libraries, that made us really emphasize that when in doubt we weren't going to send identity, just attributes: faculty, student, enrolled in class 101, whatever. That turned out to be huge for privacy and access control. There would have been no privacy in access control with PKI. Well, along the way we tried to ratchet up the trust model. We began back in 2001 and 2002 with Club Shib, a place you'd like to visit. Then we moved to InQueue, because we knew we had to get a little more rigorous. And finally we're in InCommon. And InCommon is about to tighten the screws on the members, because if we're really going to work globally, if we're really going to trust each other, then we've got to have a bit more in common in how we operate, from the US to global. And we started REFEDS, which is the R&E federations globally.
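To make that semantics problem concrete, here's a minimal sketch. The mapping table and function are illustrative assumptions of mine, not any federation's actual policy; the point is what a receiver has to decide when a "faculty" assertion arrives in a vocabulary that only has "staff" and "student":

```python
# Illustrative only: real federations settle these mappings in policy
# documents, not code. "faculty" has no UK equivalent, so any
# translation is lossy and needs to be flagged as such.
UK_AFFILIATIONS = {"staff", "student"}

US_TO_UK = {
    "faculty": "staff",    # lossy: the UK vocabulary has no "faculty"
    "staff": "staff",
    "student": "student",
}

def translate_affiliation(value):
    """Map a US affiliation value into the UK vocabulary.

    Returns (translated_value, lossy), where lossy is True when the
    original distinction could not be preserved."""
    if value not in US_TO_UK:
        raise ValueError(f"no mapping for affiliation {value!r}")
    translated = US_TO_UK[value]
    assert translated in UK_AFFILIATIONS
    return translated, translated != value
```

Either side of the wire can apply a mapping like this; the hard part, as the talk says, is agreeing on the table itself.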
And we went from a business of custom work for the big dogs — this got started with Stanford and Berkeley and CMU and MIT — and now it's ubiquity, now it's cloud services, and now very small places, K-12 and the like. They're not going to suck it in and run these technologies themselves; they're going to buy the services in the cloud. And along the way we switched our wire format from SAML's XML to JSON; something else will come along. The languages change, the ideas don't. So what are we facing now? We're too successful. For one thing, the metadata bundle is huge and slow. It takes three gigabytes — three gigabytes — of memory in the cloud to store the metadata that we start with this year. It takes four hours to load. This isn't working for us; this is too much success. It also makes IdP discovery hard. We have people in this room who suffer that pain every day. I can show you pull-down lists of IdPs that are down to two-point font at this point just to get them onto six pages. It's just brutal. It's not the way we're going to do this going forward, and in fact there are some people in the room who are looking at other approaches for that. It's a big and diverse world. Everybody needs identity; it's kind of like the network. And we have to integrate a lot of forms of identity. I'll talk about that in a little bit; I just came from a very interesting meeting about that. There's an increasing spectrum of skills and interests to serve. Again, back when we started, at the cutting edge, I just hung out with 20 really smart people and their whiteboards, and life was cool. And now it's people who don't live and breathe identity — thank God — and we need to accommodate that. Internationally, we've gone from the original big federations to a growing set of new countries. We just got the application — last week, actually — in REFEDS for the Russian Federation to join. Interesting: what do some of these concepts mean there? The Chinese federation has joined. What do these concepts mean then?
And this one is us sweeping up after our own mess: moving from skinny to fat trust profiles. So initially we said, again, we're going to kind of trust what you do there at Berkeley — even though I was a student there, I'm still going to trust it. But that's not going to work going forward, and so we need to flesh out some common approaches. And in general, in all of the standards work — and it probably applies to semantic and syntactic standards as well as the stuff we deal with at the protocol level — you want to make these things very general, which means it's the profiles that make all the difference in the world. We've got 19 knobs on SAML, and it's how you turn the various ones that makes it possible for me to actually understand what you're sending. So we're all moving from skinny profiles that specified very little — easy to meet, but I'm not sure I'm going to trust you, because I don't know a lot about what you're doing — to much fatter profiles that may be a little harder to meet, but in fact, if you meet them, I'm going to have more confidence that what you're saying is trustworthy. So here's what the coming of age looks like. We'll start with dynamic metadata and discovery; there's a pernicious interaction between those two. Then the integration of identities; GDPR; attribute release and where consent fits in that; baseline expectations, which is leveling up, moving to those fatter trust profiles; some of the back-end threats; and then, just because all of this stuff is going to be humbled by what happens when ransomware hits things, I've got to talk a little bit about that. There's a connection between identity and things that we have to flesh out, and as I talk to my colleagues in business, they're in a much better position to do so than we are in academia, because we don't know who has what things on campus. In business, you know what's been acquired.
I don't know what acquisitions are like on your campus, but my God, anybody can acquire a thing, and it gets really tricky. So, dynamic metadata and discovery. As I indicated, the federation metadata files are too big to begin with, and now we're asking people to add more metadata. Those logos you didn't put in there? We need those logos to help Todd's discovery process. Those security contacts you didn't put in there? If there's an incident, I want to know who to contact at a campus to get relief. So we're asking people to add more to the metadata files, making them even bigger. So we're moving from /etc/hosts to DNS, so to speak. Very few people in the room remember /etc/hosts — thank you for the smile, Cliff — but that's the old way we used to distribute the IP tables. The protocols and code for all of this now exist, and some experimentation is starting. Federations will likely construct some common bundles for ease of deployment. So InCommon, with almost 1,000 members now, imports probably 4,000 other entities globally from eduGAIN — an indication of the growth in size. We will probably bundle just the InCommon participants in one bundle, because there are a number of places that are going to have different rules for the international stuff. So, discovery is the process of helping a user find their identity provider. There's active work that Todd is certainly engaged with, in RA21, to solve this discovery problem. It's not glamorous, but it's the first step of the process, and if you can't do that, you're doomed — and yet once you do it, it doesn't have much scale associated with it. There's browser assist. There's a central discovery service. And there are pulldowns. Pilots are starting on all of these, and they all have good aspects and tricky aspects — the tricky aspects mostly having to do with privacy and the disparity in how browsers do things.
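The per-entity lookup that replaces the big aggregate can be sketched quickly. This follows the Metadata Query (MDQ) protocol's SHA-1 identifier transform; the server hostname here is made up, and real deployments layer caching and signature verification on top:

```python
import hashlib
import urllib.parse

def build_mdq_url(base_url, entity_id):
    """Build a per-entity metadata query URL.

    The MDQ SHA-1 transform names an entity as the literal prefix
    "{sha1}" followed by the hex SHA-1 digest of its entityID,
    percent-encoded into the /entities/ path."""
    digest = hashlib.sha1(entity_id.encode("utf-8")).hexdigest()
    identifier = "{sha1}" + digest
    return base_url.rstrip("/") + "/entities/" + urllib.parse.quote(identifier, safe="")
```

A client GETs that URL and receives just the one EntityDescriptor it needs, instead of loading a multi-gigabyte aggregate at startup.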
And all that will probably be compounded by GDPR, because GDPR is going to compound everything. And then there's an interplay between the two. So most sites today — one site probably has 2,000 IdPs on it — how do they do that? By taking this massive bundle of metadata and listing all the IdPs. Well, there isn't going to be a massive bundle of metadata anymore. How are they going to list those IdPs? We're going to need one of those other approaches. So as we move to dynamic metadata, which we have to do, it's going to compound the discovery pain that Todd knows well. Okay, integration of identities. So federated identity has gotten deep traction, and thankfully MFA has really come along. It's gotten very convenient because we're all carrying cell phones, because cell phone service is so good. We've all learned the workarounds: if you're in a hospital room and you can't get a signal, there's a workaround; if you're on an archaeological dig in East Africa, there are workarounds. I think we're mastering MFA, and it's very good. Certainly the number of attacks using passwords has decreased — partially because there are better attack vectors to go after these days, and we'll talk about some of that — but MFA has certainly cut down those phishing attacks. The integration of social identity and federated identity is happening. So social identity, courtesy of Google and Facebook, is certainly moving along. Maybe it's hit a little bit of a bump, hopefully, in the road. But those gateways exist. We in the US use a lot of social-to-SAML gateways, so that parents can check on student accounts and student bills by using a social identity to get to the SAML-based student information and student finance systems. The SAML-to-social gateways also exist: ironically, in Japan there's a lot more SAML-organized identity, and people are using their campus identities and corporate identities to get into the social realm.
There's still tricky management there of self-asserted versus vetted enterprise attributes. If I'm going to accept your social identity into my world, what attributes about your social identity am I going to trust over what I might have in house? Well, you're probably more authoritative on nickname, and maybe a few other things — maybe you're more up to date on address. But I'm not sure I would want to accept citizenship as self-asserted. That's the intriguing trick, especially when we have business processes on our campuses that capture citizenship — Selective Service and other kinds of processes. Self-sovereign identity is the new beast on the block. Last week I was at a meeting at the Computer History Museum that was just teeming with sovereign identity people. These are long-held desires on the part of a lot of people to say, I don't need anybody's badges but my own. They didn't used to have any big technologies to work with, but they still wanted to put their money and their identity under the mattress, and that's what this does. And then blockchains came along. Blockchains make this easily available — for some definition of "this." It's a game changer in that I can construct chains of trust through the distributed ledger; they're really solid technologies, the crypto is good. But there are a couple of challenges here. One is scale. This will not globally scale, it just won't. But those people don't want it to go to scale; they want circles of trust that somehow merge together. Second is privacy: blockchains are public. Every transaction is public, so who signed what is public. I don't know how that's going to fit into different spaces that value privacy. Perhaps the hardest part is that these are people who, by their nature, don't trust institutions, and don't necessarily trust each other. So who's going to be in charge of it all?
So there isn't anyone, really — and this goes back to governance; I'm sure George will say that kind of stuff. So sovrin.org — S-O-V-R-I-N dot org — is kind of the gathering place for this. At the meeting last week, 200 identity geeks, easily a third of them sovereign-identity people. And somebody in the know said something about a relationship between Sovrin and a church — we need a crusading reporter to dive into this and figure it out, but it was a very, very interesting connection. They are redoing the entire identity stack: the way we use OAuth, they have to create something equivalent to that, et cetera. So they're reinventing stuff, and they're not integrated much at all into the existing internet identity space. SAML and OpenID Connect have learned how to leverage each other; Sovrin isn't anything like that. By the way, someone told me recently — I think last week — that if you look at searches at Google, SAML dominates OpenID Connect. And a couple of years ago we were all going on about OpenID Connect, meaning mobile. It has a clear space in our world, but it didn't displace SAML; it added to the internet identity suite. How this looks going forward, where Sovrin plays in all this — I don't know. And then there are roles for special identities as well. So the ORCID identifier remains a curious topic. It's persistent — you know, it never changes, it follows you from institution to institution. Could we add good strong authentication to it, and use it for something beyond its intended purposes? Well, thankfully, the people invested in ORCID have stayed pretty well focused on "this is a disambiguation tool for the scholarly community," and have not been tempted. I'll take Globus as an example. Globus says, we'd love to use the ORCID identifier as your gateway into high-performance computing.
And by the way, would you tweak ORCID a little bit so that it fits better into high-performance computing? That's when you start to wonder about purposes. Any questions so far? Nobody's left, so I guess we'll... Well, I have a dumb question. I'll go back. Explain to me exactly: sovereign identity — is this the idea that I own my own identity, period, and carry it with me wherever I go? Yeah, right. So we know in the federated space that if you go from one campus to another, nothing goes with you; if you go from Gmail to Facebook, nothing goes with you. Anybody else? But again, if you go to sovrin.org, there is a narrative up there that is glowing with the glories of sovereign identity. And, you know, frankly, given the abuses that are happening out there with Facebook and Google, there's a real temptation. We'll get to it later in the talk: are there going to be greater abuses than anything we've seen so far? And so maybe we're not looking at the right part. So, GDPR. Cool stuff. I like it because, first of all, the US has nothing like this in terms of privacy. I like it because it took 20 or 30 different policy frameworks in Europe and made them uniform. The most important word here is the move from directive to regulation. A directive says, interpret as you see fit. A regulation says it gets settled at the EU level, not at the state or national level. It's binding on every member, and more so. When we send our students to study abroad, they're going to be governed by GDPR. When a faculty member sets foot in Europe on a business trip, they're covered by GDPR. And when that faculty member comes back to the US, there's coverage as well. So it was passed in 2016, and it becomes operational in just about a month — in fact, I've been told it's operational already, it's just unenforceable; it becomes enforceable in May 2018. It covers a vast waterfront of issues.
A whole lot of stuff, and I'll cover some of it in a second. It consists of a set of articles, which some of us can quote chapter and verse, and then example interpretations of those rules, the so-called recitals. And now, as the rubber hits the road, a lot of the questions are being referred back to the Article 29 Working Party, which preceded GDPR. They're the ones who are actually going to define such gnarly details as what we're about to get to. The penalties are huge: 4% of global revenue. And the EU has demonstrated they can collect that money. They fined Google — a big fine; I'm seeing some people smile — and Google Mountain View didn't know about it. Well, they knew about it, certainly, but it was Google.fr that paid the money. There are six lawful bases for attribute release, and it gets really tricky. This is one of the things we're working on right now: how to map attribute release onto these bases. A single site — take a wiki — will have several different reasons for releasing different attributes to that site. Your login name gets released because things can't function without it; that's called legitimate interest. Other things get released for legal purposes — the log files become accessible to the wiki owner if there's some kind of malfeasance. But your display name? You're releasing it because you want your posting on that wiki to look pretty. That becomes consent, because it's not required; it's optional. There's a nice little rule of thumb somebody gave — I don't think it's quite true, but it's useful — which is: if you can lie about it and still use the site, it's consent. And it affects, I think, all of us. So Daniel Solove, a George Washington University professor of law, has done a wonderful chart about this, and it's Creative Commons, so I can use it. Some of these things are settled; a lot of them need to be fleshed out. Here's the territorial scope issue.
Here are the reasons for processing data: transparency, access and rectification, the right to erasure — I'm going to come back to that in a second, because what am I going to do about all those network backups? How do I erase you out of backups? Enforcement, we talked about already. Sensitive data, I'll come back to in just a second as well. Now, the gnarly details. Almost everything is PII under this definition. Your IP address is PII: unless you can prove it was dynamically assigned, they assume it's static, and so it becomes PII. Some identifiers are exempt because we engineered them right. And then there's sensitive PII — actually, the term they use for that is "special categories" — and these are going to become very tricky for us to handle. And we just discovered that it's Article 29 that's going to define that. So let me give you a pointed example. If we had thought this sucker was going to fly 20 years ago when we started, we wouldn't have stuck all of your group memberships into one attribute called isMemberOf. But we did, because the sucker wasn't going to fly. So now there are places that have all your group memberships in one attribute, and we have an all-or-nothing release mechanism. Take Brown University, where they turned group management loose and almost everybody belongs to hundreds of groups. Somebody says, "I need group membership," and they get a couple of hundred values. Weren't we supposed to be privacy preserving? So we're learning how to handle this. And one of the inklings we have for the special categories under GDPR is that we're going to have to obfuscate some of these values — not so much that you can't figure out what the value is, but so that if you're looking at it on the screen and somebody else can see your screen, it's obfuscated.
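One inkling of how that obfuscation might work, as a sketch. The function name, the "keep a few characters at each end" policy, and the bullet character are my assumptions, not a specified design:

```python
def mask_group_name(value, keep=8):
    """Obfuscate a long group-membership value for on-screen display:
    show the first and last few characters, hide the middle.
    A click-to-reveal UI would show the full value on demand."""
    if len(value) <= 2 * keep:
        return value  # too short to usefully mask
    hidden = len(value) - 2 * keep
    return value[:keep] + "\u2022" * min(hidden, 12) + value[-keep:]
```

The owner can still recognize the shape of the value from its ends, while a shoulder-surfer can't read the sensitive part in the middle.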
So just the way passwords get blotted out into dots and you can click to see them in the clear, we're going to have to do the same thing. We think — we're engineering the UI right now. And it's going to be doubly problematic, because passwords are fairly short and some of our group names are hundreds of characters: urn, colon, mace, colon, this, colon, that, blah, blah, blah. How do I obfuscate that so that you can still recognize, oh, that's "LGBT" in the middle of it, without displaying it to everyone? We've got some UI issues we're still wrestling with. Research data use: fortunately, there are explicit exemptions in GDPR for research data use. So all the epidemiological work that people want to do with research data — there's some that has to be fleshed out, but at least there are mechanisms in place for that to happen. Right to be forgotten, as I mentioned: cloud-based backups, how do I do that? They also call for data portability. All of my consent decisions need to be portable? So we're going to be back to, what, comma-separated values? How do we make these portable? I don't know. Data breach notification: 72 hours. "This call may be recorded" — can't do that anymore; you have to give people the ability to move forward without being recorded. Data protection officers, and individual data protection training: I know that some of the larger schools are actually starting to do individual data protection training, and Solove has developed a set of courses around that. Okay, attribute release. So we never expected that it was going to be this hard to get the attributes out of the institution. No offense to any registrar that's here, but typically there's a registrar who has just dug in his heels.
And he digs in his heels for students — and graduate students become the critical case in all this, because a researcher somewhere wants them to help do the research, and the release has to happen for them to get on with it. And so I've been working on this. I'll show you a slide or two on explicit consent. We haven't started to promote it widely, because we're still bashing our heads against the wall of Research and Scholarship. We want the university to commit that if a relying party is attested to be R&S, the university will release your login name — the ePPN — and that's basically it. And adoption is growing, but glacially; it's really frustrating. Globus just walked away from this last week; they've given up hope. They did cite consent as an answer to this, and I'm still trying to promote that. Then international: R&S began in the US and went international, and suddenly there were lots of universities going, well, I trust my US colleagues, but internationally? Moldavia is asserting R&S, but do I trust Moldavia? Elbonia is the prime example of the problematic spaces. So there's a lot happening now in consent, as a remedy to this and as a GDPR alternative. And there's a contrarian working group here — this one is really fascinating: the Interactive Advertising Bureau. All those ads that you see on any one site are being coordinated by the IAB. I used to know IAB as the Internet Activities Board; this is in fact the Interactive Advertising Bureau, and Todd and I were just talking about this — it's based upon a business model that becomes illegal as of May 25th. So these people are defining cookies in a particular way — in particular, it's around the purpose for the use of the released information. That is the really gnarly thing. And I have seen taxonomies of purpose of use which would allow some downstream party to say, oh, that's consistent with my purposes, I can use the data given to me. And those purposes of use are heavily marketing oriented.
They're not going to work for our community. Yes, you consented to being on a mailing list and to receiving ads, but you didn't consent to third-party whatever-it-is. And it's fine-grained stuff, being set bit by bit on cookies: all of the cookies in your web browser as of May 25th will have a different format, where the bits are being specified so that people can still do cross-site activities using those cookies, using purpose fields. These people are working daily to try to narrow down thousands of possible business purposes into a leverageable set. Scary stuff. And then there's the CAR work I've been doing; I'll show you a slide or two of that. It's consent-informed attribute release, originally funded by NIST back when NIST, like any federal agency, had enough people to keep the lights on. It's a different space now. It's multi-protocol consent as a service, and the emphasis is really on informed consent — it's not just consent. So we have an architecture that is multi-protocol: there's a relying party; we can serve OAuth, we can serve Shibboleth, we can serve OIDC. We can combine institutional reasons for policy release with user preferences, spit out the records associated with that, and allow the user to make decisions. And then we envision — in fact, we actually have it — another application on your phone, like Settings, which we call "Show Me My Stuff, Dude": basically all of my consent decisions. I can go in there and manage them, I can replicate them, I can say that sites with similar tags associated with them get the same properties. And there's an enterprise management console, which gives me a lot of excitement, because as an IdP operator — let's say I'm Jeremy Rosenberg operating the IdP at Berkeley — if I notice a site has changed its privacy policy and I want all the users to re-consent, I can do that with a single commit.
So there's some nice stuff in our enterprise management. The hardest part is presenting informed consent so that you can make an effective decision, and make it in reasonable time. So we look a lot at how long it takes you to decide; we call that dwell time. If it's less than a second, something went wrong: your eyes glazed and you just clicked through. If it took more than 20 seconds, something went wrong: we didn't make it clear. So there's a narrow window of understanding that we're trying to aim for to get good consent. The screens look like this: nice greens, nice reds for denies. This was designed by Duke, by the way; they're awfully good at it. Consent is going to be painful enough that we'd like you to be able to go through it one time and then suppress it. And so we have these buttons. Save my choices and don't show me the screen again unless necessary — what's necessary? Well, maybe the privacy policy changed at the relying party; maybe the values being released have changed. The system admin can decide what's necessary and trigger a re-consent decision. Save my choices, but show me the screen again. Don't save these choices — that was put in there so that I don't kill demos by saving my choices and then not being able to redo them. These boxes change by context. Interestingly, the Danes have been doing this for quite some time — eight to ten years — because of their architecture, and they have statistics on dwell time and statistics on how often people check each of these choices. And what we're seeing is that there's a considerable percentage that makes the same choice every time but wants to see the screen again; it's a source of comfort. The Danes are different than the US, so your mileage may vary; we may not get miles at all. This is where I go to my "Show Me My Stuff, Dude" consent manager. And if I go there, I get to see all of my release policies for my sites.
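The dwell-time check described a moment ago is simple enough to sketch. The one-second and twenty-second thresholds are the ones just mentioned; the function itself is illustrative, not CAR's actual code:

```python
def classify_dwell(seconds):
    """Flag consent decisions whose dwell time suggests the user
    either clicked through blindly or got stuck on the screen."""
    if seconds < 1.0:
        return "too_fast"    # eyes glazed, clicked through
    if seconds > 20.0:
        return "too_slow"    # we didn't make the choice clear
    return "ok"              # plausibly an informed decision
```

Aggregated over many users, flags like these tell you whether a consent screen is actually informing anyone.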
And I can drill in, see the attributes being released, see the values being released, and clean things up if that's what I want to do. Here are my current choices. And in this case the organization, the enterprise, can recommend settings: you see the standard choices, and then you see "ask me anyway." So, one of the oddities of campuses — and I'm sure this fits the schools here — is that the registrar is guarding the transactional system at your place: oh, FERPA, FERPA, I want consent. But nobody's watching all the batch flows of information, and it's exactly the same stuff moving from one system to another without any consent or any filtering. The FERPA flag you set? The batch flows don't know about that. So the side door and the back door are wide open while you're monitoring the inline stuff. This gives us the ability to take a batch file, run it through the consent engine, and say, oh, you're FERPA protected — I'm not going to release. So there are some very interesting things there. We added this not because we were freaks about FERPA, but because it's intrinsic to the OpenID/OAuth world, where consent persists from moment to moment. Baseline expectations. Let me see how I'm doing on time. Okay. So as I indicated, we began with Club Shib. It was: you're interesting, I'm interesting. And now there are all these other players. So we're moving from ad hoc trust toward a set of common expectations. It makes a baseline for trust. And what does it address? Well, I want to make sure that you're patching your software — good hygiene. I want to make sure that the people operating your identity management system are actually authorized to operate it. I want to make sure you're maintaining your federation metadata. I want to make sure that when you receive attributes, you handle them properly and then dispose of them. And if there's some kind of federated incident handling, you're going to be responsive.
The people at CERN are very concerned about this — you could do real damage. And so they're specifying real time frames: if we see something malicious happening and we contact the University of Southern California and say, one of your users is doing malicious things, we want to know you're going to respond. It's being advanced in stages. This is totally new ground. It involves consensus building on what the policies are, and then deployment. And now we're trying to deal with reporting, and with failure to comply. We have never, in 15 years of federation, kicked anyone out of the federation. And so the steering committee is sitting there going, help — are we really going to do that? Are we really going to disable a university, when all of your services, especially in the cloud, now come in through federated identity? Do we really have the courage to do this? How do we reinstate you? How do we know that you're doing it well? Important stuff to follow; it's taking a while, but it's a sea change. Now, I have a comment or two about current events. The back-end threats that we're seeing now, we can't fix through technology. The back-end threats — Cambridge Analytica, et cetera — that was a failure of Facebook to police a policy. We can't do anything technically about that. But we can manage the front-end threats: managing what data the relying party gets, making some of these consent dialogues a bit more clear — they're utterly opaque. And this one's going to be very hard. So if you have an old Android phone like I do, Google gives me the choice, when I load a new application, of accepting all of the attribute release or none of it. Apple's a bit better; it gives me some controls over it. But nobody's good at it. Nobody tells me what the damage is if I choose not to release a particular attribute. There isn't any normalization around that in the community.
And so data minimization, which is absolutely stipulated in GDPR, is going to be one of the more interesting frontiers over the next year: how we as a community agree on this stuff. I had a fascinating conversation with Adobe last week. They're an IdP through Adobe Connect and a lot of other services. Half a billion accounts. They're putting consent screens in front of everything. As of May 25th, just the way you see those annoying little "this site sets cookies" banners, you'll see annoying little consent dialogues everywhere. But those consent dialogues are going to be challenging, because we don't know, in particular, how to handle the data minimization. Okay, moving on. So I got involved in the Internet of Things, and boy is this huge and really hard. The security vulnerabilities are staggering. I have a list. The Mirai attack of last year, which was basically webcams around the world attacking lots of sites; those are continuing, by the way. It's just that the publicity has died down. The ransoms are being paid. The people commanding these bot attacks are making really good money, which is why the password sniffing has gone away: you can make a lot more money doing this. You can get a lot of money from a Westinghouse or a GE when you say, I'm going to take this site down unless you pay us money. In Bitcoin, of course. So I've seen the passwords. Oh God. The username is admin and the password is admin. Or it's admin and the password is one, two, three. I can show you the slides. It's just stunningly stupid, because the guy manufacturing the device doesn't have security in mind. He wants the pop-up sprinkler to work really well, or the chlorine filter, or whatever it is, to do well. It's not his business to do security. And over-the-air patches? Don't happen.
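The default-credential problem above is exactly what Mirai-class botnets scan for, and it's also easy to audit for defensively. A minimal sketch, assuming you hold an inventory of your own devices' configured credentials; the device list, field names, and credential pairs are illustrative, and a real audit would attempt an authorized login over the device's actual management interface.

```python
# Hedged sketch: auditing an inventory of IoT devices for factory-default
# credential pairs of the kind Mirai-style botnets try. The inventory
# format and the credential list are assumptions for illustration.

DEFAULT_CREDS = [("admin", "admin"), ("admin", "123"), ("root", "password")]

def audit(devices):
    """Return hosts still configured with a known default credential pair."""
    flagged = []
    for dev in devices:
        if (dev["user"], dev["password"]) in DEFAULT_CREDS:
            flagged.append(dev["host"])
    return flagged

inventory = [
    {"host": "sprinkler-1.example.edu", "user": "admin", "password": "admin"},
    {"host": "cam-7.example.edu", "user": "ops", "password": "Xk9!t2q"},
]
print(audit(inventory))  # ['sprinkler-1.example.edu']
```

A campus that can't stop these devices from arriving can at least sweep for the ones a botnet would find first.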
You multiply that by the number of devices, and it's really scary for DDoS attacks. We've gotten better as a networking community at handling DDoS, but we don't handle it well across all of this. And what we haven't seen yet is ransomware on things. But sometime in the next year, you won't be able to start your car unless you pay the ransom on your car's intelligence, and its password is one, two, three. The privacy implications are staggering, when information is being captured by the thing and all of it is uploaded to the cloud. Tad and I were talking about this; he got some Alexa devices and, thank God, he hasn't activated them. I was at a meeting a year ago where some of the folks from, as they say, a small online bookseller were saying, you don't want to know what we're doing. But in fact, they have built a mental map of the things that you have, because you use pronouns, and every time you use a pronoun they have to disambiguate it and try to figure out what the person is referring to. So they're taking all the data across all these devices and using it to build a map of you. And there are lots of questions. Who owns that knowledge? Who owns the data in the cloud? Say you have a Samsung device that uses Alexa. It's sending everything out into the cloud. Who owns that data? Samsung? Alexa? Amazon? Not you. That's the only thing we know for sure: not you. So your mind is not owned by you anymore; it's being owned by Alexa. And then, what can the government ask? I am sure that Amazon is being pressured a lot by the US government, and by other governments too, to add another set of questions to Alexa that can help figure out, you know, whether you're likely to be X, Y, or Z on that basis. We're particularly vulnerable in this, because at least on the campuses I've been on, a researcher can bring a device in.
And it's not even necessarily on the campus network, because it may be using a low-power network or a cellular network, but it comes back around onto it. And there's no life-cycle management. And yet we're not going to stop the acquisition of these things, because they're intrinsic to so much of our research. So, to wind up: predictions. Self-sovereign identity will find a small place in the internet identity realm. We're already starting to think about blockchains for small circles of trust. We have this problem that, since we got rid of PKI, we have trust roots that are PKI certificates, one per federation. Right now they're self-signed. Can I switch that to a blockchain and have the community sign those things? That would be an actual improvement on what we do today. So we'll find some ways of doing this. We do consent, and users will get better consent and privacy, if only because, again, if Adobe is about to do this for half a billion users, we're going to see it. And again, where mischief will happen is in that purpose field. I can say, well, I'm doing a purpose similar to one this company is okay with, so would you release the information for it? That's where the mischief will happen. IoT security and privacy threats: it's going to be like Y2K. Some of us remember Y2K; well, it's going to be Y2K every day. It's just that pernicious. Profiles will continue to get fatter to cover more concerns. But I don't think we're ever going to move into the land of auditors; at least not in our community, they're expensive. We're going to be self-asserted. And we have a wall of shame: should you ever miss anything, we'll put you up there in the Chronicle of Higher Education, that you didn't do something right, and that will scare you. And lastly, and this is perhaps the worst: will any of these things stop people from sharing too much with Alexa, et cetera? No. Convenience will win out over privacy.
Just the nature of us human beings. I think that's the last slide, yeah. Any other questions? Thank you for coming. Thank you.