So we are back. One small message — actually, let's do the messages later, because we don't have a lot of time. The next talk is titled Privacy: An Unequally Distributed Resource, presented by Katharine Jarmul. Please give her a big round of applause. Thank you so much. Thanks for your patience; I had an emergency run across camp, which takes longer than 10 minutes. So today we're going to talk about privacy as an unequally distributed resource. And the first question I have for those of you here is: how many people here are camping? Like everyone, right? No, nearly everyone. I'm camping too. And when I was telling some of my friends and family that I was going to be camping with 5,000 of my best friends, they were like, why would you do that? How many of these people do you really know? Why do you want to go camp out in the middle of Brandenburg with a bunch of people that you don't know? And I think this brings up some interesting conversations that came up in the research I did for this talk, which is that for quite a lot of people, the concept of privacy is built around the idea that we have walls. In one study of privacy that I found, the researchers were essentially studying privacy and sex. And what they found is that people who have private rooms prefer to have sex in private rooms. And although this is definitely not what this talk is about, it got me thinking about the idea of privacy and closed or private spaces. Essentially, the thesis is: if people have access to closed or private spaces to do private things, they prefer that; that's their choice. And it reminded me of the common phrase, good fences make good neighbors. Now, how many folks have heard this phrase before? It's often quoted. And how many people actually know the context of the poem it comes from? OK, a few.
And the problem is that this quote is often used to argue: oh, we should have walls, we should have divisions, and this will guarantee some level of privacy. But the poem, which is Robert Frost's Mending Wall, is actually about a division between the narrator and his neighbor. The narrator is mending this wall with his neighbor, and he's a little aggravated because he has to do it every year, it's always the same thing, and he doesn't see the point. But his neighbor says, good fences make good neighbors, and he says it in kind of an ominous way. So in the poem itself, this is an ominous phrase that makes you think there's something the neighbor wants to hide. And yet it's interesting that it's used so often in discourse around privacy and, essentially, property. However, there's also this separate movement around communal living, which is now a very hip buzzword. This is from a startup in Berlin that's making a lot of money off of WGs — shared flats, which have been around for a long time — by making them hip and cool again. Isn't it so fun to live communally? You should pay us lots of money to do it. And the same goes for co-working, right? Essentially, we're sharing a big room with some people we might know and others we might get to know. But the idea is that we're sharing space to do something that may or may not be inherently private. And for many of us who have worked in co-working facilities, I'm sure there have even been times when you overheard or saw something and thought: hmm, maybe I shouldn't have seen that. Maybe I shouldn't have heard that. Maybe that was something private that I should now try to forget. And this is touted in the media, of course, as some sort of millennial trend. This is an article from a few years ago saying that millennials are now trying to live like folks from the Middle Ages. Those super weird millennials, they don't like walls.
And even though this is somewhat comical, we can always count on Reddit and other message boards for snark. This was one of the top comments on that story, which essentially says that millennials don't have resources, and that it is because they are poor that they live communally. And so my thought when I read articles like this is that privacy and property are being inherently linked here: that one does not have privacy unless one has four walls, ownership, some sort of domain or control over one's physical space — and maybe, let's say, the physical space that our traffic runs over. And this is probably valid for some people; it probably fits for some people. But for me, this is inherently problematic, because it means that as long as we have unequal access to property and private spaces, we will always have unequal access to privacy. And I don't really feel more private when I'm alone in a closed space. I feel most private when I'm sharing something intimate or something private with somebody else — when I'm able to share some piece of me or my history or a moment with someone, something that is essentially just ours, shared between us. And for that reason, I think I'm not alone in believing that privacy doesn't have to mean property. Privacy can also mean trust. Privacy can mean the ability for me to share something with you, and for you to understand — based on social norms, based on our relationship, based on our trust — that this is something private. And it's not just me. danah boyd is a researcher who has been studying privacy for two decades. In her work on privacy, she specifically focused on young people and their interactions online; this was at least 10 years ago now, maybe more. And what she found when she studied these interactions — and I'm going to read a longer quote from her — is that privacy is indeed much more based on trust.
Privacy is not about control over data, nor is it a property of data. It's about a collective understanding of a social situation's boundaries and knowing how to operate within them. In other words, it's about having control over a situation. It's about understanding the audience and knowing how far the information will flow. It's about trusting the people, the situation, and the context. People seek privacy so that they can make themselves vulnerable in order to gain something: personal support, knowledge, or friendship. Essentially, she goes on to say that privacy is violated when that trust is violated — that privacy is violated when your friend, who you thought you could trust, goes and tells somebody else something that you thought was private, whether you explicitly said so or not. Finally, she goes on: understanding the context is not just about understanding the audience. It's also about understanding the environment. Just as people trust each other, they also trust the physical setting, and they blame the architecture when they feel that they were duped. Consider the phrase "the walls have ears," which dates back to medieval times. The phrase highlights how people blame the architecture when it obscures their ability to properly interpret context. Now, what she's saying there is really important, because we've been talking about privacy as property and as walls. But what danah boyd points out is that walls can actually obscure context, and that privacy is instead about the context of our interaction. Walls, therefore, can make us feel like we have privacy even when we don't. I think that's an important distinction. And so for this reason, we're going to move forward with two definitions of privacy — of course, there are probably more. We're going to go with the definition of privacy as some sort of private space, as ownership or domain over that private space. And we're going to go with the definition that, for some, it's more about trust.
And this trust means that we share information with the understanding that you and I both know what is private, and that we will not violate that trust. For both definitions, we hope this is not violated; we do not expect it to be violated. And when it is, we will feel as if our privacy was breached. Which brings us to privacy as an auxiliary for privilege. So now that we have privacy defined, we're going to talk about how it acts as an auxiliary or a corollary for privilege. I feel like you can't talk about privacy without mentioning Assange or Snowden or somebody like that, so consider this my obligation fulfilled. Snowden's tweet here is about "nothing to hide." A lot of people use this argument: I don't have anything to hide, why should I care about privacy? And what he says here is that surveillance is never really about privacy; it's about power. I disagree slightly. I do think it's about power, but I think it's about the interaction of power with privacy. If you have power, it gives you the ability to violate other people's privacy without asking them, and without even saying sorry afterwards. And this means that surveillance — whether from a government, a boyfriend, or a corporation — is about some level of power over another person, which I believe is closely connected to privilege. So here, indeed, we see privilege giving people power and allowing them to violate other people's privacy without their knowledge or consent. Another way that privacy acts as a corollary for privilege is an argument that I hear, unfortunately, a lot in communities like ours, which is what I call "just use" and "don't use." Just use Threema, and everything will be totally OK. If you just download Threema, we solve the privacy thing. Everybody can go home. We solved it. Let's go. But no, right?
If you use Threema, if you use Signal — and yes, I am extremely happy that these exist, and yes, we should have end-to-end encryption, and yes, this is all good — but we didn't solve privacy. The only thing we solved is: if you use Threema, and most of the people you talk to are also on Threema, guess what, your messages are probably private. So don't act like we've solved it, when privacy is a much larger part of your life than one messaging app. Secondly, "don't use." Here's a meme, and it's funny. I can laugh, and I can talk to my friends who have an Alexa and say, please turn it off before I get there, and please don't use it. To some degree, with my friends, I can have this conversation. But if you go to a stranger or a co-worker or somebody you don't have that level of trust and communication with, and you start shaming and blaming them for how they live their life, how is that going to be a productive conversation? How do you expect it to end? Do you think they're going to say, ah, yeah, great idea? No. What they're going to do is probably never talk to you about privacy again. And if they can avoid talking with you, they may never talk to you again, right? So this isn't what we want to do. We want to take the knowledge we have about private and secure messaging and other services and communicate it to other people, and that means not blaming them. And privacy as a corollary for privilege continues when we look at privacy in a legal context. The right to be forgotten has been around for quite some time now. How it generally works, let's say in the Google context, is that you say: I want this and this and this link removed, because I have the right to be forgotten. If Google says yes, they remove them and let you know that your request was successful. And if Google says no, you need to sue them.
And I've been following some of the lawsuits against Google in this area, and here we have two of them, brought by two UK-based businessmen. And I was thinking to myself, of course they are, right? Because who has the time and the energy and the money and the resources to sue Google? Definitely not me. But we should be suing Google a lot more. We should all be suing Google, and other companies, as much as we can. However, a lot of us will need to figure out how to pool our resources, or get better access to resources, so that we can do such a thing. And we see here that even one of the businessmen lost his case. So even if you have the privilege and money and time and opportunity to sue somebody like Google, it doesn't necessarily mean you will be successful. Hopefully it's somewhat clear by now how our own privilege allows us to gather more privacy, or have greater access to privacy. But now I want to talk about privacy in and of itself as a privilege. Here — sorry, some of the text is small — we can see the big logos: on the right-hand side is the PGP encryption scheme, and at the bottom is private DNS. Who here knows or uses some of the technologies on this screen? Yes, the vast majority. And this was probably not without the occasional error; we had to learn how to use some of these. It helps that we like playing with computers. This has helped us figure out how to secure our communications, how to make our browsing perhaps more private, and so on. But this is a privilege. This is technical ability that we had to learn. We weren't born knowing how to run BSD, right? So when we think about the privacy we have access to because we know about and are able to actually use these technologies, we must also acknowledge that we are essentially part of the 1% in this case. And it continues with understanding how the sausage is made.
And what I mean by that is that we have the ability to understand how privacy attacks work. We've probably learned how de-anonymization attacks work: by linking different pieces of information that have been collected over time. And we have the ability to understand how much data might already be out there, and therefore how it might be linked to us, or to the people in our family, our friend group, and so forth. Being able to do this is something we had to learn. And I'd like to bring up the privacy paradox in this context. The privacy paradox essentially says that people say they care about privacy, but they actually don't, because they still use things like Facebook. I find this tremendously problematic in a lot of ways. But the most important way is that people can care about their privacy and still use Facebook, because they do not necessarily connect the two automatically. They use Facebook because their mom is on Facebook, or their friend is on Facebook, or their favorite band is on Facebook. And they can also care about their privacy. It doesn't mean people don't care about their privacy just because they use a popular service. Instead, it means perhaps we should do better outreach and communication on how one can assess and quantify privacy risk over time, and how we can make this a more understandable and more observable risk that people engage in every time they do or post something online. And this privacy-as-privilege extends, of course, to those of us who have the opportunity to live within the EU, with GDPR and other regulations. What GDPR was supposed to bring, hopefully, was more data rights for everyone worldwide. The hope was that it would be the precursor to a worldwide movement for data rights.
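The linkage attacks described above can be illustrated with a minimal sketch. All the names, records, and quasi-identifiers here are invented for illustration; the point is only that joining two "anonymized" datasets on shared attributes can re-identify people:

```python
# A public, seemingly harmless dataset (e.g. a voter roll).
# All data below is fabricated for illustration.
voter_roll = [
    {"name": "A. Mueller", "zip": "10115", "birth_year": 1985, "gender": "f"},
    {"name": "B. Schmidt", "zip": "10115", "birth_year": 1990, "gender": "m"},
]

# An "anonymized" dataset with names stripped (e.g. health records).
health_records = [
    {"zip": "10115", "birth_year": 1985, "gender": "f", "diagnosis": "flu"},
]

def link(records, roll):
    """Re-identify records by joining on quasi-identifiers."""
    reidentified = []
    for rec in records:
        key = (rec["zip"], rec["birth_year"], rec["gender"])
        matches = [p for p in roll
                   if (p["zip"], p["birth_year"], p["gender"]) == key]
        if len(matches) == 1:  # a unique match defeats the "anonymization"
            reidentified.append((matches[0]["name"], rec["diagnosis"]))
    return reidentified

print(link(health_records, voter_roll))  # → [('A. Mueller', 'flu')]
```

The same join, done over many datasets collected over years, is what makes seemingly harmless data releases dangerous.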
And instead, what it has often become is what I like to think of as first-class and second-class data citizens, where those of us who have the opportunity and the privilege of living within the EU have the right to file deletion requests, ask about automated processing, and so forth, while those of us who do not live there are second-class citizens. We do not have those rights; they were not extended to us automatically by GDPR implementations. So this is another way that privacy acts as privilege. And that brings me to the final example of privacy as privilege, which is the selling of your own data. Now, this is not a new idea, but it has been greatly popularized by cryptocurrency technologies, the blockchain, and so forth. People say: Google and Facebook make money off of my data, why don't I make money off of my data? I'm just going to go sell my data. And here's a headline from one of these startups: unlock the value. Sounds so good, right? And the problem we see here is: whose data is going to have the most value? Is it going to be, let's say, the 50-year-old white male CEO with a lot of money, or, let's say, a refugee who just arrived here? Who is going to be able to unlock the value? We already see in the market economics of data buying and selling that those with more money automatically get more value or money from their data. And when we double down on this, we are essentially reinforcing it; we are making sure that the haves and the have-nots — when it comes to data control, privacy, and the wealth derived from it — continue to operate as they do today. Which brings me to privacy and oppression. How does privacy interact with oppression in our world? First and foremost, probably the most obvious example is the so-called gig economy. Work for yourself! You don't have a boss! You can choose your own times — perfectly flexible hours, right?
But the problem, of course, which I'm sure most of us know, is that the gig economy does not mean that you don't have a boss. It just means that an optimization algorithm is your boss. And as we all know, the algorithms want to be fed. Got to feed the algorithms the data; that's how they live. So when you sign up to ride for Deliveroo, when you sign up for a ride-sharing app, when you sign up to watch people's dogs while they're on vacation — when you do any of these things as part of this VC-backed gig economy — what you're essentially saying is goodbye to your privacy in exchange for, let's say, flexible work hours. But that may not even be the case, because of course your privacy is gone. They will watch where you are, and when there is surge pricing in your neighborhood, they will immediately say: would you please get on your bike? Would you please get in your car? You could make a lot of money right now. You should go get in your car. And this is inherently the trade-off you're making. And for the most extreme example — we can always trust Amazon to bring us the most extreme examples of lack of privacy. Here's a US patent. How many people here have seen it? Let me tell you about it; it's very exciting. Several years ago, Amazon filed this patent for a very special bracelet that you get to wear, available exclusively to Amazon warehouse workers. And when you're an Amazon warehouse worker with your nifty little bracelet on, what happens when you reach for the wrong item is: oh no, you're not going fast enough. You essentially get to shock yourself as part of your daily interaction with your work. Which is fantastic, right? What a new American dream. And what we have here, essentially, is one of the largest concentrations of corporate surveillance of employees. You're not even going to be able to move your arm without Jeff Bezos getting an update.
Of course, he's too important for the updates, but eventually it'll trickle up. And I bring this up to show that when we exercise the privilege of ordering, in our underwear, the toilet paper we forgot to buy — because now it's Sunday, and the stores are closed, and we need toilet paper tomorrow — then we are essentially, as a society, oppressing other people's privacy via Amazon, hopefully not many of us, but in general. Finally, a good example of oppression and privacy is anybody who has ever needed to receive aid or assistance. Here I have some details, and I'll clarify for the non-German speakers in the room: these are some of the guidelines for receiving Hartz IV, which is essentially the German version of long-term unemployment benefits. And what you're going to have to hand over — just like in a lot of countries, if you want to receive unemployment assistance or aid, and increasingly with NGOs and other organizations providing aid — is where you live, what it costs, how much money you made in the past, how much money you have, how much money your parents made, and so on and so forth. And you have to present all this information before you are even eligible for the aid. And what are we creating here? That you must choose between eating, or being able to pay rent, and your privacy? Are we treating the most vulnerable in our society with respect for their own right to privacy, or are we saying: well, we don't really trust you, so you have to show us everything you do in order to get this? Which brings me to anonymity and privilege. I want to take a slight detour here, because I think anonymity is something that we regard highly within communities like the CCC, and I want to touch on how it interacts with privilege. Namely, I want to address the idea of quote-unquote hacker anonymity.
Now, for the most part, a lot of so-called hackers and hacker groups might be called anonymous, but they usually consist of individuals who go by some sort of hacker name, code name, crypto name, whatever you want to call it: pseudonyms. And normally, rather than being truly anonymous and rotating pseudonyms every time they're online, they stick to a particular set of pseudonyms, or one pseudonym in particular. So that wouldn't be what we would consider mathematically anonymous, but nevertheless, let us accept that definition. Now, if anybody knows Kevin Mitnick, here in the photo: he's a hacker from, I think, the 90s. Yes, the 90s. He was very good at social engineering. And he went to jail, I think until 2000, for some crimes he committed in terms of stealing identities and so forth. And while he was in jail, people were saying: free Kevin, he's so great, let's free Kevin. And I don't think it's wrong for people to protest against imprisonment, or wrongful imprisonment, of individuals. In this case, he was probably in prison far longer than was due, given his crimes. But the point is that after prison, he went on to found a security company, and he now advises the NSA. And I've got to ask you: how many people can do that? Let's say, instead of being arrested in 1999, he was arrested in November 2001, or in 2002. And let's say, instead of being a white man, he was a Muslim man of color. Do we think he would have been released? That there would have been a protest against his wrongful imprisonment? That he would now be rich and advising the NSA? And in this same realm, I must say that what we have created with our goal of an anonymous internet is an internet where some people feel the freedom to openly dox and stalk others. Here is one example post — some information is redacted, of course — from 8chan.
And 8chan, which is now offline, was hailed for the ability to post anonymously, right? And the people who felt empowered to post anonymously on places like 8chan are the people who posted on places like 8chan. And what they used the power and privilege of their anonymity for was primarily to target those with less power and less privilege: to dox them online, stalk them, abuse them, and so forth. Which begs the question: where did we go so wrong with anonymity online? How did it come to be that anonymity online is most accessible to those with the most privilege and the most aggression? I want to talk about who deserves privacy, and I hope that we have some agreement, and maybe some disagreement, in this conversation. So this is somewhat interactive. If you feel like standing because you've been sitting for a while, feel free; otherwise, you can raise your hand. How many people think — either by raising your hand or standing — that your mom deserves privacy? Who thinks your mom? Two hands if you have two moms; double the fun. All right, how about your children? Your children, or the children that you plan to have, or the children that you know: do they also deserve privacy? All right, what about your representative? And here I mean your political representative. Do Mutti Merkel, Donald Trump, and so forth deserve privacy as well? And what about criminals, or suspected criminals? Do they deserve privacy? How about neo-Nazis? Who wants to give the neo-Nazis some privacy? And here is the famous slippery slope that everybody likes to bring up: how can we apply privacy when we should probably apply it equally? And this is a hard one for me too, because neo-Nazis in the country of my birth are deciding to go on mass shootings and commit other acts of domestic terrorism.
And there is some piece of me that of course wants my family and friends to be safe, but there is a much larger piece of me that says we cannot apply privacy unequally, and that there should be equal access to privacy. However, what I ask here is: do we already have equal access to privacy among all these groups? Does everyone in this question have equal access to privacy? Does your mom have the same access to privacy as your local neo-Nazi? Because I think the answer here is probably no. And so the problem we have is that privacy, as we know it and as we understand it, is already unequally distributed. And the problem we have in trying to apply it fairly is that certain groups are already more active in what we would consider privacy practices than others. And again, I'm not arguing that we should backdoor encryption, but I am arguing that we should think about this as a community, and think about how we can make sure that your mom has the same access to privacy as a neo-Nazi. And what I'm really asking is: what are we going to do now? We are among the technologists and experts who help make things like online anonymity and privacy technology available. And we have seen here — I hope it's fairly evident — who is mainly using those technologies for privacy. I would say that the people who are most able to access and use those technologies include quite a lot of people that we did not initially intend. It doesn't mean we're going to block them. It doesn't mean we're going to disable all privacy technology and research and so forth. But it does mean asking: what did we do? And how do we right this imbalance? How do we make sure that the people who feel OK being anonymous online are not essentially just violating the privacy of other individuals? How are we going to come up with a solution for that? Obviously, it goes way beyond this talk.
But hopefully, with lots of smart people in this room, we can start to work on ways to distribute privacy more equally, so that it's not only a resource for those who are able to get paid or pay in cryptocurrency, and for those who feel OK, or are able, to hide their IP address and take their hatred online. I have some suggestions; I hope that you have some too. What I'm hoping is that after this talk we can have a conversation — move it perhaps to one of the workshop tents — and discuss and debate how we can make this more equitable. But a few thoughts. We can meet people where they are. This means figuring out how to interface well with popular technologies like Facebook. Yes, it's icky, but you're going to have to do it. We need to avoid blaming and shaming people for operating normally online. We need to figure out how to have a productive and inclusive conversation with other folks, and use our education and privilege to spread that knowledge to other communities. We need to engage in collective organizing, probably with lawyers, in such a way that we can form things like data trusts and other vehicles for collective bargaining, or for access to and control over larger amounts of people's data — and this means non-EU residents too. We need to develop privacy tools, and attacks against invasive algorithms. I gave a talk on this at 34C3, and I still have lots of side projects; if you want to talk about how we can use things like adversarial machine learning for social good and evasion, then let's talk. We can fund and foster projects and conversations around tooling in new locations and communities. And this means not only having Europe and the US and a few other places represented in the conversation; this means engaging with folks across Asia, engaging with folks in the global south, and making sure that privacy concerns from different cultural perspectives are actually addressed.
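The adversarial-evasion idea mentioned above can be sketched in a few lines. This is not the speaker's 34C3 material; it is a toy illustration of the general gradient-sign (FGSM-style) evasion technique against a hypothetical logistic-regression "tracker" whose weights, bias, and inputs are all invented here:

```python
import math

# Hypothetical fixed classifier: the weights and bias are assumptions
# invented for this sketch, not a real tracking model.
w = [2.0, -1.0, 0.5]   # assumed model weights
b = -0.25              # assumed bias

def dot(a, c):
    return sum(x * y for x, y in zip(a, c))

def score(x):
    """P(the model assigns the 'tracked' class) via the logistic function."""
    return 1.0 / (1.0 + math.exp(-(dot(x, w) + b)))

x = [0.8, 0.1, 0.4]    # original input: scored above 0.5, i.e. "tracked"

# For a linear model, the gradient of the logit w.r.t. x is just w,
# so an evasion step moves each feature against the sign of its weight.
epsilon = 0.6          # perturbation budget (illustrative)
x_adv = [xi - epsilon * math.copysign(1.0, wi) for xi, wi in zip(x, w)]

print(round(score(x), 3), round(score(x_adv), 3))  # perturbed score drops below 0.5
```

Real evasion attacks use the same principle — follow the model's gradient within a perturbation budget — against far larger models, where the gradient must be computed or estimated rather than read off the weights.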
Be a privacy advocate; call out privacy privilege. When you see somebody shaming someone, when you see somebody saying, how come you don't know how to use Tor? — then you say: hey, you didn't know how to use Tor once. Let's talk about it. Let's be a welcoming community. And related to that is the whole privacy-versus-security debate, and the security-versus-security debate. This is raging once again in the US because of the comments about "military-grade" encryption versus other grades of encryption, which is, of course, a false distinction. And the argument is that we should ditch the whole privacy debate and just switch to a security-versus-security debate. The problem with this is that we're essentially saying privacy is no longer a good argument. Maybe that's the case in the US, but I still think we should make sure this doesn't distort our understanding of privacy. Again, let's meet afterwards; I want to hear your thoughts. But to close, what I want to bring up is the fact that privacy is becoming, essentially, an advertising word. Apple is leading the charge, in the US at least, saying "privacy" in pretty much all of their advertising now. And although I think it's great that Apple is investing some of its money and time in privacy technologies and research, what I don't want is for privacy to move from meaning property — although here it is clearly property as well — to just meaning: let's throw some encryption on it, let's drop a few data points here and there. Because what I don't want privacy to lose is the idea that privacy is built on trust. So what if we could sue Google for violating our trust? They wouldn't know what to do with that. In fact, they would have a very hard time building an algorithm to infer what we deem private, what our definition of privacy is. And what if we could say: hey, that was private, you should have known that?
The fact that you sold that, or leaked that, or made that available to these advertisers: that's not OK, and I'm going to sue you. That would indeed change the discourse of privacy, at least in a legal context. So words mean things because we believe they do. And if we allow privacy to mean information security, or if we allow privacy to mean property, then that's all it will be. It will only be available to those with the most wealth, the most privilege, and possibly the most aggression online. We must actually take the time to define privacy the way that we experience it, and the way that many people experience it today, which is trust: an understanding of social norms and context, and obligation to one another as humans — and affirm that as a right for everyone. Because everybody deserves the right to feel safe online. Everybody deserves the right to feel safe in their house and to go about a, let's call it, normal life. And for that, we're going to have to figure out new ways to redistribute privacy: to share some of the privacy access that we have in this room with the most vulnerable populations of the world, and to figure out how to take some of it from those who are most angry and privileged online. This is something I believe we can do, and I hope that we can work together to make sure that privacy is a more equally distributed resource. Thank you very much.