I'm a privacy designer, and I wrote a little book about that called Design My Privacy, which is used in a lot of schools in the Netherlands to teach privacy design. But more importantly, I'm a technology critic, which means I try not to take everything Silicon Valley tells us at face value. I also work for the Ministry of Technology and Innovation in the Netherlands, which doesn't exist, but the website does. The main question I work with in technology is: how do I get a wider audience to understand what's going on, and to care about it? One of the places I do that is SETUP, the media lab and cultural organization here in Utrecht that I co-founded. We're a nonprofit, and a lot of what we do uses humor: humor to explain things, humor to reach a wider audience and help them understand what's going on. We do a lot of crazy projects, but I won't really go into those here, I'm afraid.

What this talk is about is this: what does it mean to be free when surveillance is the dominant business model of the internet? I think it starts with a button a lot of us click all the time, the "I agree" button. We don't really know what we're agreeing to. What I'm interested in is the long term: what happens if we keep clicking all these buttons, all the time? I believe that in the long term we could see the rise of something I call social cooling, a name that deliberately echoes global warming; I'll get to that later. The core idea is that we change our behavior a lot, and that we start to apply more self-censorship. But I'll get into all of that. This is what we're going to talk about.

So what I worry about is this: when I want to explain new forms of surveillance to my mom, how do I do that? That's why I think we need words like this. This is my hack, in a way: what metaphor can politicians understand? That's why I tried to coin this word. Like I said: as oil leads to global warming, I think we can build the narrative that data leads to social cooling. I started by building a website, socialcooling.com, just a pet project, and it went viral, which was really nice. Well, then things exploded.

I'm going to talk to you about three things. First, the rise of the reputation economy, because that's the core foundation of this idea. Second, what social cooling exactly is: what happens in the long term once this reputation economy arrives. And third, what to do about it, because I don't want to leave you all depressed.

But it all starts with an idea that more and more of us feel: maybe I shouldn't click on that little link. You're on Facebook, you're browsing, and you come across this link. You think: I could click on it, but someone might remember that, and it might look bad. Who here recognizes that feeling? See, the more I give this talk, the more I find people feel this, and it's really growing. This is, at its core, the smallest version of what I'm talking about. I call it click fear, or in Dutch, klikvrees.
And scientific studies point out that this is happening more and more: a lot of people are feeling this, and we see these chilling effects pop up, where people stop clicking on things or stop doing things because they think it might harm them or hurt their chances. Of course, the idea itself is not that strange. We all understand that if you feel you're being watched, you change your behavior. The philosopher Jeremy Bentham designed a prison around exactly this idea: the panopticon, a concept you probably know. The core idea was that the prisoners could be watched by the jailers, but not the other way around: you couldn't see the jailers, so you couldn't see whether they were looking at you. If you were in this prison, you constantly had to ask yourself: am I being watched right now? Am I doing something forbidden? Even when nobody was actually watching, you started internalizing the question: is what I'm doing right now acceptable? That was Bentham's whole idea. He said these prisoners, even after they leave the prison, will have internalized this jailer who is constantly watching them. They will keep asking themselves, even out of jail: is what I'm doing right now okay? Is this acceptable? Of course, these prisons were eventually closed because they turned out to be quite inhumane, but in a way this is exactly what we're building now on a societal scale, and that's because of the rise of big behavioral data, the gathering of all this data about our behavior.

And we do this because big data carries the promise of behavioral change. The most basic version, which you all know, is advertising: advertisers gather data to serve better ads. We all know that story, but it goes a lot further than that. What most people miss in the story of why our data is gathered is that it's also about risk management. If you look at the numbers, the bulk of the money made with data is made not in advertising but in risk management: by people who want to manage you, citizens and consumers, as a risk.

And if you want to see the extreme version of this, all you have to do is look at China. In China they're building something called the social credit system, which is basically a score that says whether you're a good citizen or not, a rating of how well behaved a citizen is. This is what they say: when people's behavior isn't bound by their morality, a system must be used to restrict their actions. So China fundamentally doesn't trust its citizens to make the right choices on their own. They say: we'll nudge you, we'll help you make the right decisions. The system will be based on various criteria, ranging from financial credibility and criminal record to social media behavior. And from 2020 onwards, each adult citizen should, besides an identity card, have such a credit code. So they're making this mandatory. And they're wasting no time, because the first version already exists. It's called Sesame Credit, a commercial variety built together with Alibaba Group, which you might know from AliExpress and websites like that. And already you see students in bars comparing scores: how's your score? Oh, you have a good score.
Even more impressive: this system is already connected to the largest dating website in China. So if you want to date someone, you can actually look up their score and see whether they pay their bills on time and buy the right things. That already exists. I'm going to skip this one. So what you see is that buying and saying the right things becomes very important in China, because getting a government job, a loan, or a visa will be a lot easier if you have a good score. The Chinese government doesn't want citizens with a low score to leave China, because they might represent the country badly.

But this is the part that gets me most. Up to now you could say: well, that's your own fault. If you're a bad citizen, you get a bad score and fewer chances; that's fair, in a way. The thing is, your friends' scores are also a factor. If your friend's score is bad, that drags your score down, and vice versa: if you have a bad score, you might drag your friends down. That's where it gets really insidious, and where you really start thinking: who are my friends? And that's where you see the rise of things like data discrimination. You could talk about data stratification, where certain people will cluster together. One of my new hobbies is to take really bad stock photos and do something to them. Here's the before, and now I'll give you the after; just see how it changes. Right, that went from a nice couple to a kind of sassy romantic love story where she wants to be with him, but he's five-star and she's only three-star. And in a way, this is what they're building in China.

Okay, so you could be thinking: that only happens in China. Oh, those Chinese, they're so crazy. But the thing is, we're building the exact same thing over here. Only we're calling it the reputation economy, which means we let the market build it. To give an example, there's a Danish startup called Deemly, and I quickly want to show you a small piece of their promotional video. Is there any sound, by any chance? No? Then I'll tell you what it says. This is Cheryl: she has a really good score on her ride-sharing app, but she doesn't have one on another website. So what if you could connect those? Then you could have one score that represents you as a person. Wouldn't that be great? The video then goes on to suggest using your Deemly score on your resume to get a better job, like in China; using it to get a bank loan, like in China; using it on your dating profile to get a better date, like in China. So we see the exact same thing happening here, except the market is building it.

And most of this whole ecosystem of ratings is actually invisible to us. We know our Uber scores and our Airbnb scores, but those are the tip of the iceberg. Most of these scores are invisible. For example, Tinder has internal ratings of how attractive you are. You don't know about those, but they are there. And that's just one example. More and more you see these data brokers, companies that specialize in gathering your data and making these scores, taking your data and combining it into scores. Some of them maintain up to 8,000 scores per person right now.
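To make that mechanism concrete, here's a minimal sketch in Python of how a composite reputation score with a friend factor might work. Everything in it is hypothetical: the weights, the inputs, and the formula are mine, for illustration only; neither Sesame Credit nor Deemly nor the data brokers publish how they actually compute their scores.

```python
# Hypothetical sketch of a composite reputation score with a friend factor.
# All weights and inputs are invented; real systems don't publish formulas.

def composite_score(own_ratings, friend_scores,
                    own_weight=0.8, friend_weight=0.2):
    """Blend someone's own platform ratings (0-5) with the scores of
    their friends (0-5) into one number 'representing them as a person'."""
    own = sum(own_ratings) / len(own_ratings)
    # The insidious part: your friends' average pulls your score up or
    # down, so a low-scoring friend literally costs you points.
    friends = (sum(friend_scores) / len(friend_scores)) if friend_scores else own
    return own_weight * own + friend_weight * friends

# A great ride-sharing rating, a mediocre rating elsewhere,
# and one low-scoring friend dragging the total down:
print(composite_score(own_ratings=[4.9, 3.1], friend_scores=[4.5, 2.0]))  # 3.85
```

Note what the friend term does: the cheapest way to raise your score isn't to behave better, it's to drop your low-scoring friends. The incentive to ask "who are my friends?" is built right into the arithmetic.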
And here's the thing people don't understand: all of that is your data going in, but those scores coming out are their data. Yeah, they're already up to 8,000; these slides are out of date. And they're gathering scores about things like your IQ, but also your psychological profile, your sexual orientation, your economic stability: things you might never have explicitly shared, but that their algorithms are able to deduce from the patterns they see in society. Someone else might have a profile similar to yours and have shared that information, and that's enough for the algorithms to deduce it about you. So: up to 8,000 things about you.

Okay, another video we don't have the audio for, but it's about a famous study showing that your Facebook likes can be correlated with your personality and many other traits. And that's just from Facebook: from your likes alone, they were able to deduce your gender and all these other attributes with really shockingly high accuracy.

An important thing to understand, especially in America (in Europe it's a little bit different), is that these companies will always say: these scores are our opinions about these people, not facts. If you present them as facts in the United States and they're not true, you can get sued for libel; so they always call them opinions. Of course, their clients treat these opinions as facts, so in practice it doesn't really matter, but that's how they do it. So when people say, "I don't want them to use my data or share my data": your raw data is actually quite safe at Facebook, because they won't share it. Raw data isn't what's valuable. What's valuable are the deduced, derived scores, the distilled product. And that's no longer your data; that's their data. That's a really important distinction to make. A lot of current projects are about giving you ownership of your own data, and that kind of misses the point. And in the United States these opinions are protected as corporate free speech, so there's really not much you can do about them.

So what you see is that people are becoming aware that this is going on, that these ratings are being made, and you also see that people start to optimize their behavior. For example, if loans start being assessed based on your Facebook friends, you're going to start thinking, like in China, about who your Facebook friends are. You also see that these algorithms are rarely fair, but I don't really want to go into all of that too much; the point is that this is starting to have a really big effect on us.

Can you see that? Yeah, here are some current examples. Spotify said a couple of weeks ago that what you're listening to will now be sold as a rating of your mood: your mood is now available to advertisers. And of course you can go further than that. There's an interesting case where people got physical letters in the mail about a skin condition they had never told anyone about. All they had done was go online and look up information about the condition, and data brokers were able to connect the people searching online to their physical home addresses, and then sold this connected data, and people got physical mail. So you can forget about being anonymous online.
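Mechanically, that likes study worked roughly like this: compress the enormous user-by-likes matrix into a handful of latent dimensions, then fit a simple model from those dimensions to a known trait, and apply it to everyone who never disclosed that trait. Here's a minimal sketch on synthetic data; every number in it is made up, and the real study worked on Facebook's actual like matrix.

```python
# Minimal sketch of the likes-to-traits pipeline on synthetic data.
# The real study (Kosinski et al., PNAS 2013) used dimensionality
# reduction plus regression on real Facebook likes; nothing here is real.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 200

likes = (rng.random((n_users, n_pages)) < 0.05).astype(float)  # 1 = liked the page
trait = (likes[:, :10].sum(axis=1) > 0).astype(int)  # hidden trait that happens
                                                     # to correlate with 10 pages

# Step 1: compress the sparse like matrix into a few latent dimensions.
latent = TruncatedSVD(n_components=20, random_state=0).fit_transform(likes)

# Step 2: learn the trait from 800 users who disclosed it, then
# predict it for the 200 users who never did.
model = LogisticRegression().fit(latent[:800], trait[:800])
print("accuracy on users who never shared the trait:",
      model.score(latent[800:], trait[800:]))
```

That last line is the whole point of this section: the held-out users never shared the trait, but the patterns of people similar to them give it away anyway.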
Already there are companies that offer to reveal the real identity of the people surfing your website. And this is another mind-bending video that I recommend you watch. It's about psychographics, by a UK company called Cambridge Analytica, which was very influential both in getting Trump elected and in the Brexit campaign. The video is mind-blowing because of how honestly the speaker explains how they use this data, these scores about you, essentially the psychological profile they've built from your online data, to shape messages to reach you better as a voter. They know that if you're highly neurotic they'll show you one message about gun rights, and if you score high on some other trait they'll show you another. They micro-target you, based on your psychological profile, to get you to vote a certain way. And of course, a week afterwards, we saw that this data set was actually leaked: modeled data, described as modeled voter behavior, ethnicities, and religions, on about 200 million Americans, including all the psychological profile material. That's scary.

Now, when I give this talk at tech conferences, I always hear: oh, this is nothing new, we've been rating people forever, your bank always had these ratings. No, it's not new, but this goes deeper, and it is everywhere. Now your greengrocer has scores not just on your creditworthiness but on your psychological profile. That's a whole new ballpark. And this market is huge: some estimates put it at 150 billion dollars in the US alone in 2015, including companies like Facebook. That's just to give you a sense of the scale. Okay, let's skip this.

In the short term, you could say these systems do have some advantages. They can create trust: you might rent your room to someone on Airbnb through the trust created by these ratings, and that's valuable. They might be able to reduce crime. Those advantages are being extolled a lot; people talk about them constantly. But I want to talk to you about the long-term negative effects of living in a reputation economy, because those are rarely talked about. If you want it summarized: all these companies are interested in risk, in managing you as a risk. And what you see is that this pressure to manage risk is put on the shoulders of the consumers, of the citizens, who start to manage themselves as a risk. I don't want to be a risk; I want a good loan, a good this and that; so I'll make myself less of a risk, so the company will accept me.

And that brings us to social cooling: the long-term effects. I'm going to talk to you about three aspects that I think we'll see in the long term. I've already touched on the first one: the rise of a culture of self-censorship. We've seen it with people not clicking on links as much anymore, and loads of studies are pointing it out now. The one about spring break is what really started this exploration for me: the students don't go as wild as they used to on spring break, because they're being watched there.
The videos will go online the next day. So yeah, that's fascinating. This is what one of the girls at spring break says: "We are very, very reserved." You don't want to have to defend yourself later, so you don't do it. That's one of the quotes from these students who just don't party as hard anymore. In scientific circles this effect has been known for a long time as the spiral of silence: you don't want to say something that seems weird or falls outside the mainstream opinion. So what you see here, with this first point, this self-censorship, and what I find so horrifying about it, is that you do have the freedoms. You have freedom of expression, for example, but you don't use it, because you feel the social pressure not to. That's really insidious, in a way. That's really weird.

The first part was about how individuals react to this reputation economy. The second is what it does to a society in the long term: societal rigidity. Society becomes less able to change, and that's very much where we see the societal value of privacy. I think privacy is what allows minority values to become, in the long run, majority values. We see this time and time again in society. Let me give an example: weed, here in the Netherlands. In the Netherlands weed was illegal, but then, of course, some people tried it at home: hey, Janine, you should try a bit of weed. And she'd think: oh, that's not so bad. And slowly but surely, society's opinion about these things changed in the background, beneath the surface. After a while, people were like: you know, what are we still worried about? And then society's opinion changes. That's because privacy allows us to do things outside the view of the controlling eyes, just to explore. Weed is interesting, but the far more important version is, of course, what Martin Luther King points out: this is also how society changes its values about things like discrimination and gay rights. There's a great video of Martin Luther King talking about how proud he is to be maladjusted: he doesn't want to adjust to the pressure of society to accept discrimination. So this idea is pretty important to privacy, and to feeling free to fight against injustice.

The third thing that could happen in the long term, if we live in a society like this, is that we'll create a culture of risk avoidance. This is more of an economic argument, you could say. I'll give you an example. There was an experiment in New York, starting in 1995, where doctors were given scores on how they were doing, and if patients died under their knife, that counted against their score. What you found was that the doctors who operated on patients with advanced-stage cancer, the doctors who tried to help those people, got low scores, because those patients relatively often died, something like one in two. So the doctors who had the guts to help people had low scores, while the doctors who turned them away, who said, you know, sorry, I can't help you, had high scores, because nobody died under their knife. Those patients simply died prematurely, untreated.
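You can see the perverse incentive in two lines of arithmetic. Below is a deliberately naive sketch of such a scorecard; the numbers are invented and have nothing to do with the actual New York program, but they show the shape of the problem: a metric that only counts deaths under your knife, and never counts the patients you turned away, rewards refusal.

```python
# Invented numbers illustrating the scorecard incentive described above.

def scorecard(patients_operated, death_rate, penalty_per_death=1.0):
    """Naive surgeon score: start at 100, lose points for every death
    under your knife. Refusing a risky patient costs nothing."""
    deaths = patients_operated * death_rate
    return 100 - penalty_per_death * deaths

brave = scorecard(patients_operated=40, death_rate=0.5)    # operates on advanced cases
cautious = scorecard(patients_operated=0, death_rate=0.5)  # turns them all away

print("doctor who tries to help:", brave)     # 80.0  -> punished for trying
print("doctor who says 'sorry':", cautious)   # 100.0 -> rewarded for refusing
```

The known fix for this, for what it's worth, is risk adjustment: scoring outcomes against the expected mortality of the patients a doctor actually takes on. Without it, the rating measures courage, inverted.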
So you see that these systems create perverse incentives not to take risks, to avoid risk. And of course, with people like doctors and other professionals in our society, we would like them to take risks; taking risks is important. If we create incentives against risk-taking that are too strong, that's a problem. I see a real paradox here for the creative industry that I'm sure we're all part of. Sorry, this slide is in Dutch, but research now points out that having the guts to be different is really important for children in becoming creative. This is from a project by TNO on measuring how children can become more creative. They have to be different, and they have to feel that they can be different. Of course, you can wonder, in a system where it's all about rating you and scoring you all the time, to what extent you can still feel that you can be different.

I want to give you a short example of a company that really freaks me out, because it points to all of these things. It's called Red Owl, and it represents a trend in the security business that really worries me. They're about securing the human layer of the enterprise, about insider risk. What they do is gather all kinds of data about the people, the employees in the company, including third-party feeds, social media data, email, all kinds of things. The system ingests all these data sources, puts the pieces together, and basically creates risk scores for employees, including, for example, who is most likely to leak. I think that's kind of freaky. And this is the one that really blows my mind: they say, "it's about people." Yes, it's about people, but not in a good way. This is really crazy, but it's where the market is going. So you see, again, this pressure of risk thinking becoming dominant.

Okay, those were the three aspects of social cooling: the rise of self-censorship, the lessening of society's long-term ability to change, and the reduced willingness to take risks, which matters for the economy as well. Now we get to the final part of this talk, so as not to leave you all depressed: how do we deal with this? I think there are a couple of areas we should quickly talk about: politics, the wider audience, the economy, and finally you and me.

And of course, the first thing we have to do is wake up politics. Dutch politicians, like prime minister Rutte, have said that we all have to be a little bit more normal. That's exactly what we don't need. The problem here is that we are becoming too transparent; you could say it leads to a democratic deficit, and that's something we don't want. I think politicians have to understand that better. They really have to understand how, in the long run, these systems could damage democracy. It's about a balance of power. Often, the algorithms that rate us are completely opaque: we have no idea how they work. But we, as citizens, are very transparent. There's a power imbalance there that politicians really have to understand better. Luckily, there are things like the GDPR, the new European data protection regulation, that are really starting to fix this. So politicians are waking up at the European level, which is really great.
The second thing we have to do is educate the public. That's one of the things I try to do a lot, so I'll give you a short example of something we did at SETUP: an exhibit last year called Divis Devices. One of the things we had there was a coffee machine called Taste Your Status. It gave you coffee based on your area code. If you lived in a good area, you got really nice, normal coffee; if you lived in a progressively worse neighborhood, you got progressively worse, watered-down coffee. A small Arduino pumped extra water into the cup, depending on the data. I have a short video of how that worked, but it's in Dutch and there's no sound, so that won't really work here. But there were lines for this thing, and people really got it. For us, it was the best shortcut to explain that your data is increasingly influencing your life. That's the thing people don't get: all the things you do come back to you as data. So yeah, it was fun, these people comparing scores. What rating do you think these people got? Any idea? I'll tell you: these are four-star people. They got four-star coffee.

I guess the core thing I try to do in getting the public to understand this is to frame it as an environmental issue. That's the thing I think could be very useful, and that's of course where social cooling comes in. I try to explain to people that data is not the new gold; I think that's a pretty bad metaphor, because gold is great. There's nothing wrong with gold; people like gold. With oil, we understand better that there are downsides. We really understand that now: we're in the middle of moving away from oil, and we see its downsides. I think we need to start seeing the same thing with data, and look at it the way we now look at oil. It leads not to damage of the natural environment, but to damage of our social environment, you could say. So that's again why I believe this metaphor is useful: if oil leads to global warming, data could lead to social cooling. I created a little map, and I think if we look at where we are now, we're only just starting with this awareness; it's early days. In my experience, almost nobody sees this connection yet. And I think we have to understand that in the long term this could damage trust: just as oil damages the environment, trust is something these systems destroy.

The third thing we have to do, after the politicians understand it and the citizens understand it, is create markets for privacy-friendly products. Again, we could use metaphors there: just as organic food became a market, we could have something like organic data. That's probably not quite the right metaphor, but we have to find metaphors that get people to demand good data practices, good services. Just as we started to eat organic food, we need the same for data. Otherwise we could end up with things like this, where your data has severe effects on your chances on the labor market.
And finally, the thing I think we have to stop doing individually is saying that we have nothing to hide. Most of you probably understand why that's a problem. We're going to skip the big picture, because I don't have the time. Very briefly: there's a philosopher, Deleuze, who explains that when people say they have nothing to hide, they mean it with respect to the old system of control: the police pick you up, you go before a judge, you go to prison. We understand that system of control, and that's what people mean when they say they have nothing to hide. But there's a new, subtle system on top of that, where your undesirable behavior gets noticed and you get subtly nudged toward more preferable behavior. That's what Deleuze points to, and I think that's what's going on here. So when people say they have nothing to hide, they mean from the old system, from the police. From the new system, you do have something to hide.

If there's anything I've learned in the past couple of years, it's how to explain in one sentence what privacy is: privacy is the right to be imperfect. And that's a really important right to have, because we're all human, so we're all imperfect all the time. I often hear nerds say: well, privacy was just a phase, we're moving away from it, as if it's some kind of ebb-and-flow thing. I think that's ridiculous. Compare it to other rights we have and you see it: if you say women's rights were just a phase, there for a while and then gone again, you see how silly that is. And of course Edward Snowden puts it even better: arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say. Privacy is a fundamental human right, and there's a reason for that.

In conclusion: we really have to develop a nuanced understanding of these privacy and data issues, all of us, politicians, my mom. We have to anticipate the long-term side effects I've been talking about; we have to anticipate social cooling. And I hope the comparison to global warming is also an optimistic one, because it says we can fix this. We are fixing global warming; we can fix this too. If we don't, we might end up with a society that's more well-behaved, but also perhaps a little bit less human. Thank you.

Thanks. Any questions? Thank you very much, guys, for being here, and thanks to someone who was willing to stand in, coming straight from heaven. No problem, man. Talking to us about the cooling of society: heaven's gift. Okay, we still have 14 minutes before the next lecture, but someone has found the microphone. Are you willing to answer one question?

I couldn't agree more; what you said was fantastic. But my question is: don't they realize that we need creative people? We need people who do unexpected things, people who cross borders, because they are the future and they have to create the wealth. So if we suppress any innovation because it hasn't been seen before... I have worked in research, and it was very curious: they would say, oh, but this is totally new. And I would say, yes, that's why I'm here.
Well, again, that's exactly my point: most people don't realize that. Fear is a short-term emotion and creativity is a long-term one, you could say. This is about a long-term understanding of technology, and that's really something we don't excel at currently and have to improve. And of course, this is not really a story that a lot of tech people want to hear. They want the story where their blockchain is fixing the world and everything's going to be amazing. That's the Silicon Valley story, of course; that's what sells technology. If you say, well, let's be nuanced, let's also look at the negative side effects in the long term, it's always going to be an uphill battle. I'm afraid so.

Okay, guys, welcome to the semi-automated society of today. That's what it is. Thank you, and see you at the next lecture in 13 minutes. Bye-bye.