So, we'll move on now while the next presenter's slides are loaded. I have the pleasure of introducing our next speaker, Mr. Peter Wynne. Peter currently serves as the Acting Chief Privacy and Civil Liberties Officer of the United States Department of Justice. The CPCLO is responsible for ensuring the department's compliance with laws, regulations, and established policies designed to protect the privacy of individuals, as well as ensuring that concerns about privacy and civil liberties are appropriately considered in the development and implementation of laws, regulations, and policies related to the department's mission. So, a warm welcome from The Open Group, please, for Peter Wynne. I will duck out, Peter, and your slides will appear very shortly and you can take us through them. Welcome, Peter Wynne. Thank you, Steve. I'll wait for my slides to come up, but while I'm waiting I just wanted to highlight a couple of things. One is my apologies for the abstract. For some reason (this is American parochialism) I just assumed that this was a group that was going to be limited to the U.S., and suddenly I discovered it's actually probably more international than U.S., but that's good. Just forgive me for my U.S.-centric points of view here. I also wanted to thank Lisa for the focus of her presentation, because it dovetails very nicely with the kinds of things that I'll be talking about, although I'm afraid you're going to see my presentation as being a bit more law-centric and a bit more philosophy- and history-centric, and hopefully you'll be able to see the connection at the end. The general point I'm trying to make in the talk today is really the transition that's going on between the concept of privacy as a right of individual control and the idea of privacy as something that reflects trust: not only individual trust but trust within a group of stakeholders and a wider society.
And the point that I'll be trying to make is that in that transition, standard-setting organizations are playing, and will play, an increasingly important role, because that transition also involves a shift from more of a top-down approach to law to a bottom-up understanding of law. With that introduction, I also wanted to say that I'm here today really in my individual capacity. I'm not speaking on behalf of the U.S. Department of Justice, but then what I'm going to be saying really doesn't implicate DOJ equities so much in any event. But I did want to highlight that I've been struggling myself, in the role that I play in the government, to come up with a better way of thinking, in a big-picture way, about the role of law, the role of standard-setting organizations, and the way in which that can ultimately improve our understanding of what privacy is and its relationship to trust-based systems. Lisa uses the term zero trust. I'm going to be talking about trust in the same sense that Lisa was using it, which is a multi-stakeholder idea: trust is something that is earned, not something that is assumed. So with that in mind, now I'm going to have to figure out how to advance the slides. Here we go. I want to begin with a story of privacy in a state of nature. And the state of nature here is a schoolyard. Depending on your point of view, Hobbesian or Lockean, it could be a state of nature where life is nasty, brutish, and short on a schoolyard, or it may be a state of nature where everyone's getting along great. But it's a state of nature in the sense that the teacher on the playground is not supervising the kids. So imagine going back to the time when you're five or six years old and you're on a playground at recess, and little Annie tells her supposed friend Sally that she thinks little Timmy is cute. And, you know, Radio Free Sally: we know what Sally's about to do.
She yells out to every kid on the playground, "Annie loves Timmy, Annie loves Timmy." Annie is mortified, Timmy's embarrassed, Sally gets a lot of attention. The teacher does not intervene. What happens? Well, the kids have their own self-governing privacy system. It's called the Blabbermouth system, or the Tattletale system, whatever you want to call it, but basically Sally is stigmatized and there's a graduated system of sanctions. Sometimes she'll get yelled at, sometimes she'll be shunned, and sometimes the kids will use self-help: Annie and Timmy will beat the crap out of her on her way home. The critical thing to be thinking about here, and the relationship to standards, is that the kids' self-governing privacy ecosystem causes the kids to internalize the social norms of trust. They think it's important to be trusted. They think it's important to be a person with whom others can share confidences with security and trust. The only time the teacher really needs to intervene is if the kids' self-governing ecosystem starts looking like Lord of the Flies, you know, if they're going to start eating each other. But otherwise it's a hands-off system, and if the teacher were to intervene, the kids would start playing the teacher against each other and the system would not be nearly as effective at internalizing these norms of social behavior. Now, that's a system where there's no formal law other than a light touch by the teacher to keep people from hitting each other. So what's the relationship between that and formal law? I wanted to just talk about certain kinds of formal law. In the United States and overseas, there are constitutional frameworks, there are statutes, there are regulations that agencies promulgate, and there's common law, which is basically the law that courts create by precedent.
And you can see this in privacy, where the Fourth Amendment of the United States Constitution, or Article 8 of the European Convention on Human Rights, is a kind of constitutional framework. Below that, there are statutes: the Privacy Act in the United States, and various other privacy statutes. The UK, for instance, has the Data Protection Act 2018. These are privacy statutes that either Parliament or the US Congress passes. Then there are regulations issued by government agencies; these can be the HIPAA regulations in the United States, or, in the EU, the General Data Protection Regulation enacted by the European Parliament and the Council. And then there's the common law, where courts recognize and enforce property rights, because for a long time the critical part of protecting privacy was protecting your property. In the United States, we have the development of invasion-of-privacy torts, and in the UK they really developed breach-of-confidentiality ideas as well. These are common law doctrines. Now, I wanted to say something very quickly about the ways people have been thinking about privacy and how it has transitioned from the idea of privacy as control to privacy as trust. Traditionally, going back to the 17th century, privacy was thought of as being protected by property: John Locke's idea that property comes out of the person that you are. And you can see this in the next century in UK common law, in a very famous case, Entick v. Carrington, where the court conceptualizes a fundamental idea of property as protecting privacy: the seizure of private papers, in that case under a general warrant, was declared contrary to the common law. The trespass is the seizure of the paper. The paper is where the information is stored, and the private information is protected by the right of property in the paper.
You can see this reflected in the US Fourth Amendment, where it protects "papers, and effects." But that framework presumes that the primary medium of information is paper, at least after the invention of printing. In the 19th century, with the invention of electronic communication devices, whether the telegraph or the telephone, information was now communicated through wires that were no longer owned by the sender. They were owned, usually, by the communication provider. And so property rights ceased to protect individual privacy in connection with that electronic information flow, because the property right of the individual whose information was being communicated no longer protected the information once it was on the wires of the phone company. Now, the phone company had a property right and could potentially sue for trespass, but it might not want to. So if somebody tapped into the wires, there was no common law trespass action that the courts would recognize. Part of that was recognized early on, also in the 19th century, in a very famous law review article by Samuel Warren and Louis Brandeis. They basically looked back to the property model and said, we've got to do something about this new world of information: we need to create a property right in the information. And they hypothesized, why don't we imagine an intellectual property right, which can apply to information, as applying to protect privacy. Now, that idea never really gained steam; it just never took off. But it became one of those ideas that you can't get rid of. And later, when Louis Brandeis became Justice Brandeis of the United States Supreme Court, he dissented in a very famous case where, again, the government had obtained information to get a conviction through a wiretap.
The court said, hey, there wasn't a trespass, so it's not a violation of the Fourth Amendment. And Brandeis said, no, no, no, we need to imagine this to be a property right. That was never adopted, and ultimately, when the courts began to recognize constitutional rights to privacy, they adopted a different approach. But the basic problem is this: paper-based information builds control out from an individual property right, while information in electronic form is connected to other information. It's constantly moving in a system of relationships which are no longer within the control of an individual. A good way to imagine it: if you're damming up a river to get electricity, you don't want to finish pouring the concrete before you ask how the salmon are going to get upstream. It's not simply a matter between the telephone company, or the electricity company, and the users who are buying the service; there's an ecosystem that you're trying to manage. And just as in that environmental analogy, when you have electronic information, it's flowing like water in the river. Water is generally good, unless it's in your basement. Information is generally good, unless it's, for instance, the name of a confidential informant in the wrong hands, in which case you've suddenly got a dead confidential informant. So it's a management problem as opposed to a control idea. And you start to see the courts and the law do the same thing. When the Supreme Court finally recognizes a privacy interest as being protected under the Fourth Amendment, it actually rejects the idea of privacy as individual control and goes to a more social idea: reasonable expectations from a social point of view. And you see that same idea developed in what are called the Fair Information Practice Principles. These are the principles that underlie most privacy frameworks.
Notice, no secondary use without consent, rights of access, rights of amendment, security, and enforcement. Now, if you go back to the origins of that idea in 1973, in what's called the HEW Report that we saw on the earlier slide, they again reject the definition of privacy as individual control and say privacy needs to be redefined around the idea of mutuality, or trust: protecting the shared interests of all stakeholders. So where does this take us now? The challenge is: how do we govern personally identifiable information, information that identifies somebody, or says something about somebody, or would if it was combined with other information? This is a resource. How do we govern that resource? Now, the General Data Protection Regulation, for instance, implements these Fair Information Practice Principles by creating a right of individual control. So it takes that Brandeisian idea of individual control and jams it onto an idea that was developed as basically a stakeholder trust model. There you have an individual right enforced by government agencies, individuals can go to court to enforce their right, and the control structure is based on consent. But that idea is fundamentally pushing up against what is basically a relational model of knowledge when we go to electronic information. You can see this if you just look at Claude Shannon's famous article, "A Mathematical Theory of Communication." We measure information not in terms of a thing, like a piece of paper; we measure information in terms of the communication. And for there to be communication, there has to be a relationship. And Friedrich von Hayek, in almost the same year that Shannon published his famous paper, highlights the same problem, in a different context. He distinguishes scientific knowledge from practical knowledge, and he highlights how important practical knowledge is as a social resource.
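To make the Shannon point concrete: his 1948 measure quantifies information by the probability distribution of the symbols exchanged between a sender and a receiver, not by any property of the paper or wire carrying them. A minimal sketch of that entropy definition (an illustration added here for clarity, not anything from the talk):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits.

    Shannon's 1948 measure: H = sum_i p_i * log2(1 / p_i).
    It depends only on the symbol probabilities, i.e. on what a
    receiver can expect, not on the physical medium of the message.
    """
    counts = Counter(message)
    total = len(message)
    # p_i = count/total; each symbol contributes p_i * log2(1/p_i)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

# A perfectly predictable message carries no information:
#   shannon_entropy("aaaa") is 0.0 bits per symbol.
# Four equally likely symbols carry 2 bits per symbol:
#   shannon_entropy("abcd") is 2.0 bits per symbol.
```

Nothing in the number depends on where the symbols are stored; the measure lives entirely in the distribution, which is the sense in which information is a property of the communication rather than of a thing.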
But if you just look at that idea of knowledge, or information, that he begins with, it almost perfectly overlaps with the idea of personal data under the General Data Protection Regulation. Practical knowledge is nearly always information that will say something about somebody, or could if it was combined with other information. So you have to start to see what we think of as privacy, or quasi-property, as something that's really a social resource. Okay, assuming that this is a social resource, how do you manage it? How do you control it? How do you keep the resource from being polluted or overused? There's a common assumption that if you've got a social resource and you want to keep it from being overused or mismanaged, you need to invoke the tragedy of the commons. The tragedy of the commons is the idea that if you don't have a property right, or some kind of administrative command-and-control structure, one or the other or both, then the commons, the pasture that anybody can graze their cattle on, will be overutilized and the soil will ultimately be ruined. What's interesting is that this was assumed to be conventional wisdom, and it's still widely assumed, and it's actually utterly false. Elinor Ostrom, who won the Nobel Prize in Economics in 2009, went around the world looking at self-governing ecosystems, her focus primarily on natural resources: fishing waters, forests, irrigation systems. She realized that most of these things were not governed by property rights or by command-and-control structures; they were governed by something a lot closer to the Blabbermouth system we saw with the kids. And she highlighted certain basic criteria that indicated when those systems would be successful and when they would not be. When they were successful, they were much more efficient; they were much better at managing the resource.
But one critical aspect she highlighted was the importance of having that self-governance system recognized by higher-level authorities. In other words, the teacher needs to recognize that the kids are running their own Blabbermouth system. And that's really what I got interested in, and this is where standards come in, because a standard-setting organization, and the standards that result from it, are much more bottom-up. They are much more like the kinds of things the kids come up with on their own, and they become much easier for businesses to internalize when the businesses adopt them. And given that standard-setting organizations are coming up with these really robust governance structures, what's the relationship between those governance structures and formal law? These are both kinds of law, but lawyers tend to think of law as something that's segregated from the bottom-up stuff. So what the U.S. government has in OMB Circular A-119 is the idea that we want agencies to have more interface with these private-sector standards, to have public-private partnerships. These were developed earlier on; consumer product safety is a good example, as is food and drug regulation, but you're starting to see those things being explored in privacy much more. Article 40 of the GDPR uses codes of conduct, and in the U.K., Liz Denham, the Information Commissioner, has done some really remarkable work in this area, interfacing with business to integrate the GDPR principles with codes of conduct or standards. In the United States, the Children's Online Privacy Protection Act has also developed so-called industry guidelines, but they're basically the same thing. These guidelines are recognized by the Federal Trade Commission, and you get high levels of compliance. And you're starting to see that in proposed legislation as well.
The SAFE Act, which is a bill in Congress right now, talks about voluntary consensus standards. And then there's a draft Uniform Law Commission model privacy statute at the state level that also has this idea of an interface with private-sector standards. This is all pretty good. The question is: where is this going next? Assuming we're able to have a less dysfunctional interface between formal law and private-sector standards, where is this going? The next stage is really the stage that Lisa was talking about. You see this in the U.S. government with OMB Circular A-130, which thinks about information governance as a strategic resource: information is a resource that needs to be managed. I would refer you to Appendix II, where they address privacy: you recognize that personally identifiable information is also a resource that needs to be strategically managed. And this is something I think I borrowed from NIST, which talks about the way you run the old Plan-Do-Check-Act cycle as you're implementing this from a total enterprise point of view. And of course, The Open Group itself is approaching this in the risk management frameworks that it's pioneering. So where does this leave us? It leaves us getting away from the idea of privacy as a thing that belongs to a person and thinking of it more as a form of knowledge governance. We're thinking of information as always entangled, as opposed to controlled. We're asking how trust is earned, and what people care about, as opposed to thinking of privacy as protected through adversary relationships in court, for instance. We're thinking of information in contexts, as opposed to rights that belong to individuals. We're thinking of an ecosystem approach, as opposed to controlling the physical world.
We're thinking of it as emergent, dynamic, and constantly moving, as opposed to fixed and static. And we're thinking about standards, as opposed to rules. And this, I think, is the future. So I'd be interested in any questions. We don't have very much more time, but I would invite any questions. Peter, thank you very, very much for that. I don't usually admit this, but since I'm among several hundred friends here: I'm an attorney by training, so I like the story behind this a lot. It's great. So yes, we'll move to questions. The GDPR that you referenced several times is a four-letter word, a four-letter acronym, for many companies. Do you think it's fundamentally a good or bad thing from the individual's point of view? I think that's yet to be seen. I don't think the compliance rates under the GDPR are very high. I'm only responsible for the Department of Justice's compliance, but the compliance framework in the federal government is very similar to the GDPR: almost the same kind of fair information practices are required, as well as the information governance and risk requirements under the E-Government Act. So we have a very similar framework. And I'm struggling to get the Department of Justice compliant, right? I know where I'm not compliant. On a good day, I could imagine maybe 90 percent; on a bad day, about 80-something. But we've got a plan to get there, we're getting resources where they need to go, we're getting the training, we're measuring, and we're getting better every year. My budget at the Department of Justice, where I'm responsible for about 150,000 people including contractors, is less than half the budget of many of the data protection authorities in Europe. They're responsible for millions of systems; I've got about 650. I don't think their compliance numbers are very high.
And so that's a challenge. The way to solve the challenge, I would argue, is what my colleague and friend Liz Denham in the UK is doing: working with industry to come up with things that work, that industry buys into, through codes of conduct. I think she's one of the very few data protection authorities within the GDPR world that's really exploring and developing these codes of conduct that are available within the statute, within the regulatory framework of the GDPR. To the extent that you can create a good interface with private-sector standards, you can make the GDPR work. To the extent that they do not coordinate with standards to get real bottom-up buy-in, I think they're going to have the problem captured in a Texas expression, from where I practiced law for a while, called walking in the graveyard: you're over a lot of people, but they're not all listening to you. Yeah, absolutely. I was reading, either last night or first thing this morning, that we might get a real legal test of the GDPR regarding Uber drivers who've apparently been fired according to an algorithm rather than a human decision. That's Article 22 of the GDPR, I think, something like that, that's going to get tested for the first time. It will be interesting to see how that plays out. A question from one of the attendees today: how about shared personal private information, Peter, where two people have equal rights, like twins, for example? Well, I'm not sure that twins are the best case. What about when I send an email to you and we both share that information? Whose information is it? I think that gets to the point I was making about the nature of electronic information, and really Claude Shannon's point about the nature of information itself. Information, when you drill down, is always a communication.
It's always going to be a communication from one person to another, one thing to another thing, a person to a thing. It's going to exist in a relationship, and relationships exist in contexts. When you start thinking about information that way, you can't think of it anymore as just a data point that belongs to any particular person. It explodes. And this was the point that the HEW committee made in 1973. The chairman of the HEW committee was Willis Ware, who was a friend and contemporary of Alan Turing; they both studied with von Neumann. They were doing the first computers in the 40s. And Ware could see that computers were fundamentally different from paper. He knew that this information lived in relationships, and he knew that the game was trust and getting away from the idea of individual control. That was what was so essential. Understood. If we can squeeze in one more question, Peter, from a long-time, valued member of The Open Group who cares very much about our O-DEF data standard: how do standards that address information management get exposure within the DOJ? I'm probably going to decline to comment on that. One of the problems that lawyers have, and I can say this to you because you've just confessed to being one, is that standards are generally invisible to lawyers. Lawyers think of law as something that's just kind of out there: law is what judges do in fact, right? Actually, law is what people do in fact. Standards understand that, and standards are a bottom-up structure because of that. Interestingly, Justice Holmes is often thought of as the source of the idea that judges enforce the formal law; that's what people understand when they hear the expression "law is what judges do in fact," not what's in the statute books.
But actually, Justice Holmes, when he was on the Supreme Judicial Court in Massachusetts before he joined the U.S. Supreme Court (the justices there actually sit as trial judges as well), was the first judge in the United States to recognize an industry standard as establishing the standard of care in a tort case. Right, I did not know that, but that's interesting. Peter, we have to leave it there.