Welcome to our next talk, Social Cooling. You know, people say: I have no problem with surveillance, I have nothing to hide. But then, you know, maybe the neighbors, and maybe this, and maybe that. So tonight we're going to hear Tijmen Schep from the Netherlands. He's a privacy designer and a freelance security researcher, and he's going to give a talk about how digital surveillance changes our social way of interacting. So please, let's have a hand for Tijmen Schep.

Really cool that you're all here, and I'm really happy to talk here; it's really an honor. My name is Tijmen Schep, and I am a technology critic. That means it's my job not to believe what the technology sector tells us, and that's really a lot of fun. My question is: how do I get a wider audience involved in understanding technology and the issues that are arising from it? Because I believe that change comes when the public demands it. I think that's really one of the important things in how change happens. And for me as a technology critic, words are very much how I hack the system, how I try to hack this world. So tonight I'm going to talk to you about one of these words that I think could help us, because framing the issue is half the battle. If we can explain what the problem is in a certain frame, a frame that already makes certain positions visible, that's really half the battle won. That frame is social cooling.

But before I go into it, I want to ask a question. Who here recognizes this? You're on Facebook or some other social site, and you see a link you could click on, but you hesitate, because it might look bad. It might be remembered by someone; some agency might remember it. I could click on it, but I'm hesitant to click. Is that better? Can everyone hear me now? No? Okay. Should I start again? Okay. So you're on Facebook and you're thinking: ooh, that's an interesting link, I could click on that. But you're hesitating, because maybe someone is going to remember that, and that might come back to me later. Who here recognizes that feeling? So, pretty much almost everybody. And that's increasingly what I find when I talk about this issue: people really start to recognize this. A word we could use to describe it is "click fear". This hesitation could be click fear.

And you're not alone. Increasingly, research points out that this is a widespread problem, that people are hesitating to click on a lot of links. For example, after the Snowden revelations, people were less likely to research issues like terrorism on Wikipedia, because they thought: well, maybe the NSA won't like it if I look that up, so I'd better not. And we see the same in Google searches. So this is a pattern that researchers are pointing to. And it's not very strange, of course. We all understand that if you feel you're being watched, you change your behavior; that's a very logical thing that we all understand. And I believe that technology is really amplifying this effect. I think that's something we really have to come to grips with, and that's why I think social cooling could be useful. Social cooling describes how, in a world where our lives are increasingly digitized, it becomes easier to feel this pressure, to feel the normative effects of these systems. And you see that very much because your data is increasingly being turned into thousands of scores by data brokers and other companies.
And those scores are increasingly influencing your chances in life. This is creating an engine of oppression, an engine of change, that we have to understand. The funny thing is that, in a way, this idea is already being helped along by Silicon Valley. For a long time they said data is the new gold, but in the last five years they've changed that narrative: now they're saying data is the new oil. And that's really funny, because if data is the new oil, then immediately you get the question: wait, oil gave us global warming, so what does data give us? And I believe that if oil leads to global warming, then data could lead to social cooling. That could be the word we use for these negative effects of big data.

In order to really understand this, we have to look at three things. First, we're going to talk about the reputation economy and how that system works. In the second chapter, we're going to look at behavior change, how it is influencing us and changing our behavior. And finally, so that you don't all go home depressed, I'm going to talk about how we can deal with this.

So first, the reputation economy. We've already seen today that China is building a new system, the social credit system. It's a system that will give every citizen in China a score that basically represents how well-behaved they are, and it will influence your ability to get a job, a loan, a visa, and even a date. For example, one of the early prototypes of the system, Sesame Credit, already gives everybody who wants one a score, and it's also connected to the largest dating website in China. So you can kind of find out: this person I'm dating, what kind of person is this? Is this someone who is, you know, well regarded by Chinese society? This is where it gets really heinous for me, because until now you could say: well, these reputation systems are fair; if you're a good person you get a higher score, and if you're a bad person you get a lower score. But it's not that simple. Your friends' scores influence your score, and your score influences your friends' scores. And that's where you really start to see how complex social pressures arise, and where we can see the effects of data stratification, where people start to think: hey, who are my friends, and who should I be friends with?

Okay, now you might think: that only happens in China, those Chinese people are different. But the exact same thing is happening here in the West, except we're letting the market build it. I'll give you an example: a Danish company called Deemly, and this is the video for their service. (video) "...renting apartments from others, and she loves to swap trendy clothes and dresses. She's looking to catch her first lift from a rideshare app, but has no previous reviews to help support her. Luckily, she's just joined Deemly, where her positive feedback from the other sites appears as a Deemly score, helping her to win a rideshare in no time. Deemly is free to join and supports users across many platforms, helping you to share and benefit from the great reputation you've earned. Imagine the power of using your Deemly score alongside your CV for a job application..." Like in China. "...perhaps to help get a bank loan, or even to link to from your dating profile..." Like in China. "...Sign up now at deemly.co. Deemly, better your sharing." Thanks. There is still a difference, though. The thing here is that it's highly invisible to us.
The Chinese government is very open about what they're building, but here we are very blind to what's going on. Mostly, when we talk about these things, we talk about the systems that give us a very clear rating, like Airbnb, Uber, and of course the Chinese system. But the thing is, most of these systems are invisible to us. There's a huge market of data brokers who are, you know, not visible to you, because you are not the customer: you are the product. And what these data brokers do is gather as much data as possible about you. And that's not all. They then create up to 8,000 scores about you. In the United States these companies have up to 8,000 scores about you; in Europe it's a little less, around 600. These are scores about things like your IQ, your psychological profile, your gullibility, your religion, your estimated lifespan. 8,000 of these different things about you.

And how does that work? Well, it works through machine learning. Machine learning algorithms can find patterns in society that we really cannot anticipate. For example, let's say you're a diabetic, and let's say this data broker company has a mailing list or an app that diabetic patients use. They also have the data on what these diabetic patients do on Facebook. Well, then you can start to see correlations. If diabetic patients often like gangster rap and pottery on Facebook, then you could deduce that if you also like gangster rap and pottery on Facebook, perhaps you are also more likely to have, or get, diabetes. It is highly unscientific, but this is how this system works. And this is an example of how that works with just your Facebook likes: accuracy was lowest, about 60%, when it came to predicting whether users' parents were still together when they were 21; people whose parents divorced before they were 21 tended to like statements about relationships. Drug users were identified with about 65% accuracy, smokers with 73%, and drinkers with 70%. Sexual orientation was easier to distinguish among men: 88% accuracy there; for women it was about 75%. Gender, race, religion, and political views were predicted with high accuracy as well: for instance, white versus black, 95%.

So an important thing to understand here is that this isn't really about your data anymore. Oftentimes, when we talk about data protection, we talk about wanting to keep control of our data. But this is their data: data that they deduce, that they derive from your data. These are opinions about you. And these things mean that even though you never filled in a psychological test, they have one about you. A great example of that, and of how it is used, is a company called Cambridge Analytica. This company has created detailed profiles about us through something they call psychographics, and I'll let them explain it themselves. (video) By having hundreds and hundreds of thousands of Americans undertake this survey, we were able to form a model to predict the personality of every single adult in the United States of America. If you know the personality of the people you're targeting, you can nuance your messaging to resonate more effectively with those key audience groups. So for a highly neurotic and conscientious audience, you're going to need a message that is rational and fear-based, or emotionally based. In this case, the threat of a burglary and the insurance policy of a gun is very persuasive. And we can see where these people are on the map.
If we wanted to drill down further, we could resolve the data to an individual level, where we have somewhere close to four or five thousand data points on every adult in the United States. So yeah, this is the company that worked with both the Brexit campaign and with the Trump campaign. Of course, a little after the Trump campaign, all the data was leaked: data on 200 million Americans. And again you see this data described as including things like religion. So this is this derived data.

You might think that when you go online and use Facebook and all these services, it's advertisers who are paying for you. That's a common misperception; that's not really what's going on. According to FTC research, the majority of the money made in this data broker market is made from risk management. So in a way, you can say that it's not really marketers who are paying for you: it's your bank, it's insurers, it's your employer, it's governments. These kinds of organizations are the ones who buy these profiles, and they buy the most, more than the others. Of course, the promise of big data is that you can then manage risk; big data is the idea that with data you can understand things and then manage them. So the real innovation in this big data world, this data economy, is the democratization of the background check. That's really the core of this market: now you can find out everything about everyone. In the past, perhaps only your bank knew your credit score, but now your greengrocer knows your psychological profile. That's a new level of what's going on here. And this market is not only invisible, it's also huge: according to the same FTC research, it was already worth $150 billion in 2015. So it's invisible, it's huge, and hardly anyone knows about it. But that's probably going to change.

That brings us to the second part: behavior change. The first part of this behavior change we already see happening through these systems: it comes from outside influence. And we've talked a lot about this at this conference. For example, we see how Facebook and advertisers try to do it. We've also seen how China is doing it, trying to influence you. Russia has recently tried to use Facebook and companies like Cambridge Analytica to do the same thing. And here you can have a debate about to what extent they are really influencing us. But I think that's not actually the most interesting question. What interests me most of all is how we are doing it to ourselves: how we are creating new forms of self-censorship and are proactively anticipating these systems. Because once people realize that this is really about risk management, they start to understand what is at stake. When people start to understand that, this will go beyond click fear, which you might remember from earlier. This will be about not getting a job, for example. This will be about getting really expensive insurance. This will be about all these kinds of problems, and people are increasingly finding that out. So, for example, in the United States the IRS is now using data profiles to find out who they should audit. I was talking to a young woman who said she had recently posted a negative tweet about the IRS, and she grabbed her phone to delete it when she realized that, in a way, it could now be used against her.
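To make the idea of derived data from the previous part a bit more concrete, here is a minimal, invented sketch in Python of the kind of correlation-based scoring described above. The pages, the trait, and all the numbers are hypothetical, and real broker models are of course far more elaborate; this only illustrates the mechanism.

```python
from collections import defaultdict

# Training data: users whose trait (say, "diabetic") the broker already knows
# from some other source (an app, a mailing list), plus their Facebook likes.
# Everything here is made up for illustration.
training = [
    ({"gangster rap", "pottery", "cooking"}, True),
    ({"gangster rap", "pottery"}, True),
    ({"jogging", "cooking"}, False),
    ({"jogging", "travel"}, False),
    ({"pottery", "travel"}, False),
]

base_rate = sum(1 for _, has_trait in training if has_trait) / len(training)

# For every page: how common is the trait among the people who liked it,
# compared to the population as a whole ("lift")?
liked_by = defaultdict(list)
for likes, has_trait in training:
    for page in likes:
        liked_by[page].append(has_trait)

lift = {page: (sum(flags) / len(flags)) / base_rate
        for page, flags in liked_by.items()}

def trait_score(likes):
    """Crude score for a new user: average lift of the pages they like."""
    known = [lift[p] for p in likes if p in lift]
    return sum(known) / len(known) if known else 1.0  # 1.0 means "no evidence"

# A user who never filled in any medical form still gets a score.
print(trait_score({"gangster rap", "pottery"}))  # well above 1: flagged as "likely"
print(trait_score({"jogging", "travel"}))        # below 1: flagged as "unlikely"
```

The point of the sketch is the same point made above: the user at the end never filled in any questionnaire, yet ends up with a score, and the method is statistical guilt by association rather than science.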
And that hesitation is exactly the problem. Of course, you see all kinds of other crazy examples that the wider public is picking up on, like: ooh, so we now have algorithms that can find out whether you're gay or not. These things scare people, and these are things that we don't understand. So, chilling effects: this is what it boils down to. For me, more important than the influence of these big companies and nation states is how people themselves are experiencing these chilling effects, as you yourselves have as well.

That brings us back to social cooling. For me, social cooling is about these two things combined: on the one hand, the increasing ability of agents and groups to influence you; on the other hand, the increasing willingness of people to change their own behavior, to proactively anticipate these systems. There are three long-term consequences that I want to dive into. The first is how this affects the individual. The second is how it affects society. And the third is how it affects the market.

So, let's look at the individual. Here we've already seen that there's a rising culture of self-censorship. It started for me with an article I read in the New York Times, where a student said she had become very, very reserved about things like spring break: you don't want to have to defend yourself later, so you just don't do it. And what she's talking about is doing crazy things, you know, letting go, having fun. She's worried that the next day it will be on Facebook. So what's happening here is that you do have all kinds of freedoms, you have the freedom to look things up, you have the freedom to say things, but you're hesitating to use them. And that's really insidious.

That has an effect on wider society, and here we really see the societal value of privacy. Because in society, minority values often later become majority values. An example is weed. I'm from the Netherlands, and there you see that at first it's something you just don't do, it's, you know, a bit taboo. But then: oh, maybe, yeah, you should try it as well, and then people try it, and slowly, under the surface of society, people change their minds about these things. And after a while it's like: you know, what were we still worried about? This same pattern happens, of course, with way bigger things, like this: "I must honestly say to you that I never intend to adjust myself to racial segregation and discrimination." This is the same pattern that happens for all kinds of things that change in society, and that's what privacy is so important for. That's why it's so important that people have the ability to look things up, to change their minds, and to talk with each other without being watched all the time.

The third thing is how this impacts the market. Here we very much see the rise of a culture of risk avoidance. An example: already in 1995, doctors in New York were given scores. What happened was that the doctors who tried to help advanced-stage cancer patients, complex patients, who attempted the difficult operations, got a low score, because these patients more often died. Doctors who didn't lift a finger and didn't try to help got a high score, because, well, their patients didn't die. So you see that these systems bring all kinds of perverse incentives. They lower everybody's willingness to take risks, and in some areas of society we really want people to take risks: entrepreneurs, doctors.
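As a small aside, a toy calculation (again invented, not from the talk) makes the perverse incentive in such raw outcome scores easy to see: without any correction for how difficult the cases were, the doctor who avoids risky patients automatically looks better.

```python
# Two invented doctors, scored the way the 1995-style New York scores are
# described above: by raw outcome, with no correction for how sick the patients were.

def raw_score(cases):
    """Share of patients who survived, ignoring case difficulty."""
    return sum(survived for _, survived in cases) / len(cases)

# (case difficulty, patient survived)
risk_taking_doctor = [
    ("advanced-stage cancer", True),
    ("advanced-stage cancer", False),
    ("advanced-stage cancer", False),
    ("routine", True),
]

cautious_doctor = [  # simply turns the difficult cases away
    ("routine", True),
    ("routine", True),
    ("routine", True),
    ("routine", True),
]

print(raw_score(risk_taking_doctor))  # 0.5 -> looks like the "bad" doctor
print(raw_score(cautious_doctor))     # 1.0 -> looks like the "good" doctor
```

Real scoring systems are more sophisticated, but as long as the difficulty of the cases is not fully accounted for, the safest strategy for a good score is to avoid the hard cases, which is exactly the risk aversion described here.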
So, on the whole, you could say that what we're seeing here is a kind of trickle-down risk aversion. The way that companies and governments want to manage risk trickles down to us. We of course want them to like us, we want to have a job, we want to have insurance, and so we increasingly start to think: maybe I should not do this. It's a subtle effect.

So how do we deal with this? Well, together. I think this is such a big problem that it can't be fixed by just some hackers or nerds building something, or by politicians doing a lot. This is a really society-wide problem. So I want to talk about all the groups that should get into this: the public, politicians, business, and finally us.

First, the public. I think we have to extend the metaphor of the cloud and say that we have to learn to see the stars behind the clouds. That's one narrative we could use. I really like to use humor to explain this to a wider audience. For example, last year I helped develop an exhibit about dubious devices, and one of the devices there was called Taste Your Status: a coffee machine that gave you coffee based on your area code. If you live in a good area code you get nice coffee; if you live in a bad area code you get bad coffee. I won't go into it further, but oftentimes you can use humor like this to explain things to a wider audience. I really like that approach. We've got a long way to go, though. I mean, if we look at how long it took for us to understand global warming, to really come to a stage where most people understand what it is and care about it (except Donald Trump), well, with data we've really got a long way to go. We're really at the beginning of understanding this issue.
So the second group that really has to wake up is politicians, and they have to understand that this is really about the balance of power; this is really about power. And if you permit me, I'll go into the big picture a little bit, as a media theorist. This is Gilles Deleuze, a French philosopher, and in his work he explained something that I find really useful. He said you have two systems of control in society. One is the institutional one, and it's the one that we all know: the judicial system. You're free to do what you want, but when you cross a line, when you break a law, then the police get you, you go in front of a judge, you go to prison. This is a system we understand. But he says there's another system, which is the social system, the social pressure system. For a long time this wasn't really designed, but now, increasingly, we are able to design it. This is a system where you perform suboptimal behavior, that gets measured and judged, and then you get subtly nudged in the right direction.

There are some very important differences between these two systems. The institutional system has this idea that you're a free citizen who makes up their own mind, while the social system is working all the time, constantly; it doesn't matter if you're guilty or innocent, it's always trying to push you. The old, institutional system is very much about punishment: if you break the rules, you get punished. But people sometimes don't really care about punishment; sometimes it's even cool to get punished. The social system uses something way more powerful: the fear of exclusion. We are social animals, and we really care about belonging to a group. The other difference is that the institutional system is accountable, you know, democratically, to us, while the social system at the moment is really, really invisible. These algorithms, how they work, where the data is going: it's very hard to understand. And of course that's exactly what China loves so much about it. You can stand in front of a tank, but you can't really stand in front of the cloud.

This also helps me understand when people say: I have nothing to hide. I really understand that, because when people say they have nothing to hide, what they're saying is: I have nothing to hide from the old system, from the classic, institutional system. They're saying: I want to help the police, I trust our institutions. And that's actually a really positive thing to say. The thing is, they don't really see the other part of the system: how, increasingly, there are parts that are not controlled, not democratically checked. And that's really a problem.

The third group that I think has to wake up is business. Business has to see that this is not so much a problem, perhaps, but that it could be an opportunity. I'm still looking for the right metaphor here, but perhaps, if we again compare this issue to global warming, we could say that we need something like ecological food, but for data. I don't know yet what that's going to look like or how we're going to explain it; maybe we have to talk about fast food versus, well, fast data versus ecological data. But we need a metaphor here. Of course, laws are also really helpful, so we might get things like this (and I'm actually working on this, it's funny), or, if things really get out of hand, we might end up here, right?
So luckily we see that in Europe the policymakers are awake, and they're really trying to push this market. I think that's really great. So I think in the future we'll get to a moment where people say: well, I prefer European smart products, for example. I think that's a good thing; I think this is really positive.

Finally, I want to get to all of us, to what each of us can do. I think here, again, there's a parallel to global warming, where at its core it's not so much about the new technology and all the issues; it's about a new mindset, a new way of looking at the world. I think we have to stop saying that we have nothing to hide, for example. If I've learned anything in the past years of understanding and researching privacy and this big data market, it is this: privacy is the right to be imperfect. Increasingly there's pressure to be the perfect citizen, to be the perfect consumer, and privacy is a way of getting out of that. So this is how I would reframe privacy: it's not just about which bits and bytes go where, it's about the human right to be imperfect. Because of course we are human, we are imperfect. Sometimes when I talk at technology conferences people say: well, privacy was just a phase, it's like ebb and flow, we had it and it's going to go away again. That's crazy. You don't say women's rights were just a phase, we had them for a while and now they're going to go away again. And of course Edward Snowden explains it way better. He says: arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say. You're a human.

So I think what we have to strive for here is to develop a more nuanced understanding of all these issues. I think we have to move away from this idea that more data is better, that data is automatically progress. No, it's not: data is a trade-off. For the individual, more data might mean less psychological security, less willingness to share, less willingness to try things. For a country, it might mean less autonomy for citizens, and citizens need their autonomy, so that people can vote in their own autonomous way and decide what they want. In business, you could say that more data might lead to less creativity, less willingness to share new ideas, to come up with new ideas. So that's again an issue there.

So, in conclusion: social cooling is a way of understanding these issues, a way of framing these issues, that I think could be useful for us, that could help us understand and engage with them. And yes, social cooling is an alarm; it's alarmist. I'm trying to say: this is the problem, and we have to deal with it. But it's also really about hope. I trust not so much in technology; I trust in us, in people, that we can fix this once we understand the issue. In the same way that, when we understood the problem of global warming, we started to deal with it (it's slow progress, but we're doing it), we can do the same thing with data. It'll take a while, but we'll get there. And finally, this is about starting to understand the difference between shallow optimism and deep optimism. Oftentimes the technology sector is like: cool, new technology, we're going to fix this by creating an app. We have to be optimistic, but for me that's very shallow optimism, that TED-talk kind of optimism. True optimism recognizes that each technology comes with a downside, and we have to recognize that it's not a problem to point out these problems. That's a good thing: once you understand the problems, you can deal with them and come up
with better solutions. If we don't change this mindset, then we might create a world where we're all more well-behaved, but perhaps also a little bit less human. Thank you.

We still have five more minutes, so we'll take some questions if you like. First, microphone number two.

Hello, thanks, that was a really interesting talk. I have a question that I hope will work; it's a bit complicated. There's a project called ND by a foundation called the Sovereign Foundation. Do you know about it? Okay, great, perfect. So, just to quickly explain: these people want to create an identity layer that will be self-sovereign, which means people can reveal what they want about themselves, only when they want, but it is one unique identity on the entire internet. So that can potentially be very liberating, because you control all your identity and individual data, but at the same time it could be used to enable something like the personal scores we were shown earlier on. So I wanted to know if you have an opinion on this.

Yes. Well, the first thing I think about is that, as I tried to explain, you see a lot of initiatives that try to help you control your own data, but that really misses the point that it's no longer really about your data; it's about this derived data. Of course it can help to manage what you share, but I see too little awareness of that. Second of all, this is very much, for me, an example of what nerds and technologists are really good at: oh, we've got a social problem, let's create a technology, an app, and then we'll fix it. Well, what I'm trying to explain is that this is such a big problem that we cannot fix it with just one group alone: not the politicians, not the designers, not the nerds. This is something that we really have to fix together, because this is such a fundamental issue. The idea that risk is a problem, that we want to manage risk, is so deeply ingrained in people, it's so based in fear, it's fundamental, and it's everywhere. So it's not enough for one group to try to fix that; it's something that we have to tackle together.

Thanks a lot. There is a signal angel who has a question from the internet, I think.

Yes, backing sheep is asking: do you think there's a relationship between self-censorship and echo chambers, in the sense that people become afraid of their own beliefs and thus isolate themselves in groups with the same ideology?

There's a really big answer to that one. I was emailing Vint Cerf, and miraculously he responded, and he said that what you really have to look at is not just the reputation economy but also the attention economy, and how they are linked. So for a while I've been looking for that link, and there's a lot to say there; there definitely is a link. What's important to understand, or what I could nuance here, is that I'm not saying that everybody will become really well-behaved, grey, bookworm people. The thing is that this situation is turning us all into theatre players. We're all performing an identity more and more, because we're watched more all the time. For some people, and I think for most people, that will mean being more conservative and more careful; some people will go really all out and, oh, enjoy the stage. We have those people as well, and I think those people could really benefit, and the attention economy could really give them a lot of attention for that. So I think there's a link there. I could go on, but I think that's where I'll leave it for now.

We're short on time, so we'll take, I'm sorry, one more question, at microphone number one.

I think the
audience you're talking to here is already very aware, but I'm asking for, like, tactics, or your tips, to spread your message and to talk to people who are still in the phase of saying: ah, I don't care, they can surveil me. What's your approach, in a practical way? How do you actually do this?

Yeah. So, I'm really glad to be here, because yes, I am a nerd, but I'm also a philosopher, or a thinker. That means that what I work with are words and ideas, and those, as I've been trying to show, can be really powerful: a word can be a really powerful way to frame a debate or engage people. I haven't yet found a way to compress this whole talk. I was making a joke that I can tell you in one sentence what privacy is and why it matters, but I have to give a whole talk before that one sentence: privacy is the right to be imperfect. Because in order to understand that, you have to understand the reputation economy and how it affects your chances in life. The funny thing is that this will happen by itself: people will become more aware of it as they run into these problems. They will not get a job, or they will run into other issues, and then they will start to see the problem. So my goal is not so much to help people understand it, but to help them understand it before they run into the wall. Because that's how society usually deals with technology problems at the moment: only once they've become a problem. I believe you can see these problems coming way earlier, and I think the humanities, where I'm from, are really helpful in that: in, like Deleuze, really clearly explaining what the problem is, already back in 1995. So, yeah, I don't have a short way of explaining why privacy matters, but I think it will become easier over time, as people start to really feel these pressures.

Sorry. Thank you very much for the question. I think we should all go out and spread the message. This talk is over, I'm awfully sorry. When you leave, please take your bottles and your cups and all your junk. And thank you very much again.

Thank you.