Okay, well, thank you very much everyone for coming today, I really appreciate your time. We also have a couple of other people arriving a bit later, but it's lovely to see you all here. Today we're talking about privacy, and we have three guest speakers who are very well esteemed, but I will introduce them later. First of all, I'll talk a little bit about the organizers of this event. Studium Generale have been organizing a large variety of events on social issues just like this one today, and the range of what they offer is even broader in enhancing the university's culture, with movie nights or open mic evenings, for example. So if you're a student here, please take a look at the university website to keep an eye on what Studium Generale are offering. In fact, if you would like to bolster your CV, events like this have a little check mark online: if you've been to enough of these events, you can get a certificate that can contribute to your CV or be a little badge on it. So make sure you check about that online as well. This symposium is also co-organized by the privacy and security team of Tilburg University. As you may know, it's currently the university's yearly privacy and security awareness week, and this event has been organized as part of that. Also, to introduce myself as the moderator of this event today: my name is Lucas Jones. I've been stuck at Tilburg University for eight years.
I was a student here in my bachelor's in 2015, and then I did my master's degree here in law and technology. On the day of my graduation I was hired as a lecturer for the law and technology department, and I've been working as the thesis coordinator over these few years. This means that I've seen a lot of different topics that the students have been writing about. In particular, I've seen students explore a huge variety of technical contexts: from medical devices to social media platforms to self-driving vehicles, generative AI, and the list goes on and on. But no matter how varied the technical context, there is one central theme that many students end up turning to: privacy. A huge proportion of our students ultimately orient their topic in some way towards privacy, for example trying to see whether regulators can or should ensure a sufficient degree of privacy in a particular technical context. And why is this the case? Why is this such a salient topic area, and what does it even mean to have a sufficient degree of privacy? There are so many important questions in this domain that even the title of this symposium begins with a question: nothing to hide. So I will leave the meat of this matter to the guest speakers today, but let me start off by asking you a few questions. Who here, quite a simple one, has a mobile phone? I'd imagine all of you, right. And who here has, if I go a bit more into your private life, a wearable technology, so a Fitbit or an Apple Watch? Your AirPods count as well. So a decent number of us.
Now, I wonder how many of you are aware that Apple has recently patented technology for their AirPods, so that AirPods in the future may have next-generation sensors able to monitor and measure the biosignals of their users, including the electrical activity of the user's brain. Perhaps this can bring a lot of interesting benefits: maybe you can make yourself seem super clever in a meeting by silently and secretly googling things through your AirPods, nonverbally, or maybe you can do so while you're commuting. Or perhaps your AirPods may be able to tailor specific music playlists for you based upon how your brain responds to certain types of music at certain times of day. So in my view, perhaps this seems quite nice and desirable. Generally speaking, society has been quite accepting of technology that gives us personalized benefits in return for accessing our information. But I don't know about you; when Amazon was developing Alexa, I was one of those people who would say, there's no way I'm going to have that device in my living room listening to everything that's going on in my home. And yet now, with my AirPods, I walk around with two microphones strapped to my face whenever I'm listening to music, and I don't even realize it. My point here is not about Apple's relationship with privacy, but more about how the goalposts have been shifting, and how technology has, in quite a granular way, been encroaching more and more into our privacy, tempting us with exciting benefits in return. While our right to private life is codified in law, privacy in general may mean many different things to many different people. It's a very multifaceted domain, and in that regard it's very important that we take a multidisciplinary approach in tackling questions about it. That's why I'm particularly excited about the diversity on our stage today.
So I'll introduce the speakers individually before they come on stage, but just to illustrate what we have: we have Professor Bert-Jaap Koops, who is a professor of regulation and technology. We also have Dr. Joanna Strycharz, an assistant professor of persuasive communication at the University of Amsterdam, so welcome to Tilburg, or below the rivers. And Dr. Nicole Huijts, who is an assistant professor and researcher in the domain of risk perception, environmental psychology and human-technology interaction. So how will this symposium go? In the first half we will have two lectures of 15 minutes each, each followed by a short five-minute Q&A. Then we will have a break of around 15 minutes, or sorry, 10 minutes. And in the final half we will have our final lecture with a Q&A, and then a more interactive session where we can put wider questions to the three speakers on the stage, and maybe you also have opinions or thoughts that you want to contribute. So without further ado, let me introduce our first speaker, Professor Bert-Jaap Koops, who is a full professor of regulation and technology at the Tilburg Institute for Law, Technology, and Society, the department I also work at. His main research fields are cybersecurity, cyber investigation, privacy and data protection. He is also involved in topics such as DNA forensics, identity, digital constitutional rights, code as law, and the regulatory implications of human enhancement, genetics, robotics and neuroscience. So, a lot. He currently teaches the courses Cybercrime, Law and Technology, and Privacy and the Protection of Personal Data at Tilburg University. In his keynote talk, he will share insights from his NWO Vici project that investigated how to protect privacy in the 21st century. So let's have a round of applause for Professor Bert-Jaap Koops. Thank you very much for the introduction. Thank you for having me.
I want to share my views on privacy, a little bit of my knowledge: what privacy is, why it matters, and why preserving privacy is challenging in the digital age. I was a bit surprised by the overall title of the security week, Nothing to Hide. I thought we had left that behind. But just for argument's sake: who of you has nothing to hide? No hands, which doesn't surprise me. But if I had asked that question 15 years ago or so, probably a few hands would have been raised. Not that people really didn't have anything to hide back then, but people would use the argument with the idea: I don't mind that they process personal information about me, because they would think it's largely the personal information of others that will be processed. And even if it is about me, it will not lead to any damage. And that, of course, is a fallacy. People realize nowadays that information that is out there can be used in some context, by someone, in some way that is detrimental to you. You don't know when, you don't know where, why, or by whom, but it might be. So that makes it important to have a certain degree of privacy. The best quote I know about privacy is that it is like oxygen: you only notice it when it's gone. It's so taken for granted that you have privacy that you don't realize it until you no longer have it. When you have a serious violation of your privacy, you notice that you had it before. And then it's usually too late. You can't get the genie back into the bottle once something is known about you. Certainly nowadays on the internet, it's very difficult to get that removed from collective knowledge. So what is privacy? There are many definitions, based on all the literature. This is the definition I came up with, because it tells something about why privacy matters. For me, privacy is having spaces in which you can be yourselves.
I use the plural because you're not just one person; you have multiple identities: as a student, as someone who works, as a father, as a son, a daughter, a sibling, as someone who goes to the gym, someone who plays the guitar in a band, someone who likes Harry Potter, someone who votes extreme right, all kinds of identities. And you show different identities in different contexts, to different people, in different situations. It is important that you have room for being these different people, being yourself, in different ways, in different contexts. And privacy is what ensures that you are able to do so when you want to, so that you can relax when necessary and not be playing certain roles all the time. You can also drop your mask, you can feel free to say what you want to say and do what you want to do, without fear of others using what you do or say against you in the future. So privacy has two dimensions: a spatial connotation (boundary management is a stream in the privacy literature: you can decide whom to allow access, whom you want to have close to you, and whom you want to keep away), but also the element of being yourself, of self-development. In a typology of privacy that I developed with colleagues a couple of years ago, in a large-scale project we did on how to preserve privacy in the digital age, we tried to systematically map all the different types of privacy that are out there. I won't go into all the details, but it is structured along two dimensions, a horizontal one and a vertical one. The horizontal one is the spatial element, you could say, starting from the far left, where you're all by yourself, alone, in your bedroom, for instance, or in the most private space that you have: your body and your mind. That is eminently the place where you can be yourself, within your body, within your mind, without any interaction with others.
But of course there is social life; people are social beings, and you interact with others. First of all, intimately, with a few intimate others: your partner, your parents, your kids, close family members, where you want to enjoy a degree of spatial privacy in the privacy of your home, which is eminently the place where you can be yourself, traditionally speaking. But it is also where you make decisions on whom to interact with intimately, life decisions such as whether or not to have children, decisions on life and death, which is an element of decisional privacy. In the wider sphere of life, you interact with more people in the semi-private zone, for instance in a lecture hall like this one: it's publicly accessible, but it's a bit private in the sense that not everyone comes here. And this is one example of many places, think of the pub, think of the gym, think of a political café, all kinds of meeting rooms where you meet and interact with other people, with a certain expectation that the things you discuss there remain private (communicational privacy), or that you can associate with whom you want (think of Alcoholics Anonymous, for instance) and share information about your life without fear that the information will be used against you. And then, on the far right, in the completely public zone where in principle anyone can meet you, you still have a certain expectation of privacy in the things you carry with you: you don't want people to just open your bag and look at what you're carrying. Or when you move around in public on the streets, you don't expect people to stare at you; you hope to be a little bit inconspicuous, one among the crowd. So that is the spatial element of privacy. On the vertical side, we've put the positive and negative sides of privacy as freedom: you have positive freedoms and negative freedoms, and privacy is usually thought of as a negative element.
The most well-known definition is probably the right to be let alone, which is a way of warding off other people. Leave me alone. Leave me in peace. And boundary management also has that element of shielding off others. But privacy also has an important positive side: the side of developing yourself, being able to be who you want to be in different contexts, being different selves, changing who you are. So self-development is also an important element of what privacy facilitates. I don't have much time to discuss privacy with you, so I want to share one element of my research of the past years that I thought might be relevant for today: privacy in the digital age. For this, I start with a mapping exercise I did. What are these spaces in which you can be yourselves? What are traditional privacy places? And there is a remarkably large, wide variety of all different kinds of places in society where, at different times of the day, at different times in your life, you want to be able to be yourself. These are the same zones that I explained, but along a slightly different axis. The vertical axis here, at the top, is related to that boundary management, where you can control whom you want to allow access, where you can simply close the door, or open it if you want to give people access to your private life. But down here, at the bottom, there is no room for control. You can't really control how others access you. People can see what you're doing or hear what you're saying. You rely on people's discretion not to use what they hear or see against you, not to tell others. And it is of course very contextual to what extent you can expect them not to reveal what they have seen. Now, the point of this exercise was that nowadays we also have digital spaces: primarily the combination of your smartphone, personal computer and the cloud in which information is stored, seamlessly connected.
I'm sure you don't know exactly what is on your smartphone itself and what is in the cloud, because it's all retrievable at the same moment. But that is your digital space. And in contrast to the physical spaces, where you move from one to another and it takes a little bit of time to do so, in your digital space you move between different areas seamlessly, in a split second. So in a sense, these digital devices constitute, I think, an equivalent of the home. The physical house where you live used to be the place where you could most feel yourself, where you would keep your private belongings: your memories are stored there, your music, your books, your photographs. You leave all that at home and then you move out into the world. Nowadays, all those things are stored in your digital devices, in your digital space. So in a sense, digital space is the equivalent of your traditional home as the place where you can be yourself. But of course, you do much more with your smartphone. You not only store your private things there, you also do all kinds of interactions. If you look at what is being done in digital spaces, you store content there, the equivalent of letters, the equivalent of your diary; you express your thoughts. But in particular, when you look at all the activities that you enact with your smartphone, with digital interactions, you see all kinds of activities that you used to be able to do in very different, separate private spaces. In a sense, your digital devices contain all the social activities that you do throughout life, throughout the day. So the point of this is that privacy in the digital age is complex, because spaces collapse: instead of all these different spaces with different boundaries and different social norms that exist within them, you now have a digital space in which all these boundaries are a bit blurred.
You move easily from one space to another. Information moves easily from one place to another. So the contexts in which you enact your life collapse. And that also means that your partial identities collapse, or that what you do in one situation might also turn up, intended or not, in other contexts, showing other parts of your identity. And this means that identity management, if you want to be yourself, is not only about the contextual self you are at the moment. In principle, when there is a digital component involved, and right now you probably have your phones turned off, but they might still be collecting your location at this moment, so you are generating digital traces at this very moment. All that information might turn up in the future in different contexts. So in principle, if you want to preserve privacy, you would have to think of all the things that might happen in the future as a result of the one thing you're doing at this moment, which makes life very complicated. That, I think, is one of the challenges of privacy in the digital age. So, time is running out. I don't have much time left for this presentation, and time is also running out for privacy. What should we do? Well, I'm a lawyer, or at least I pretend to be, so the first response is: we need to update the law. Fortunately, that is happening. The law is being changed in many ways, up to a point. Not as quickly as you might hope, although sometimes quicker than is alleged; it's not always the case that the law is very slow while technology is just moving quickly forward. Lawmakers can make laws quickly, but there is a battle to be fought here. But just updating the law, I think, is not enough. Because if you look carefully, many of the legal frameworks we have are based on assumptions that still think we're living in the 20th century: namely, that your home is your most private space, and that the contents of your communications are the most private aspects of your life. That is no longer the case. I can't go into that now.
We can discuss it later if you're interested. But there are many ways in which I think the laws should be fundamentally revised, which I also think is not going to happen, simply because fundamental revision of the law takes a lot of time and a lot of political will that is not there. So instead, or rather, besides that, you can also look at the technology itself. And this is happening too. People are trying to enact privacy by design, data protection by design, by building privacy norms and privacy safeguards into the design of technology itself: into the devices, the software systems, the services. Which is fine, but which also works only up to a point. Privacy is a very fuzzy concept and it's particularly context-sensitive, which means that it's very hard to translate into digital rules. So design-based privacy protection is useful, but it's not going to solve the entire problem. Then you can look at the people who produce and process information about you: service providers, the producers of devices and software, and governments, who process a lot of data about you. They have a large responsibility to protect your privacy, partly to comply with the existing laws, partly also for ethical reasons, or simply out of decency norms, to process the data in a way that is fair. They're doing a fair job, but only up to a point. So there is also a responsibility for people themselves, for you, for me, as consumers, as citizens, to try and protect privacy ourselves. And I think that is also what this week is about: to help understand the problems and to see what we can do as individuals. But that is very limited. So I think the discussion might be going in that direction. What can we do? And is that enough? Spoiler alert: no, it's not enough. But at least there are things that we can do. And I think it all starts with awareness. So I'm very happy that this initiative is taking place.
Because you need to be aware of the importance of privacy: why it matters, why you need breathing space, and why it would be unlivable to live in a society where you constantly have to think, oh, what if this piece of information is going to be used against me? That is not doable. But on the other hand, just always clicking yes, installing new apps on your smartphone, jumping to the newest social media platform because all the other people are doing it as well, may not be the smartest move, and might help to make the problem larger rather than solve it. So it starts with awareness and taking privacy seriously, because remember, as I said, privacy is like oxygen. You really need it. Thank you for your attention. Okay, great. Thank you very much. So Hannah is going to go around with a microphone in case anyone has any questions. Please don't be shy. We have enough time for one more question. Anyone has a question, maybe? No? You have a question? Oh, there are two. Oh, that's it. My question was regarding the topic of privacy. I'm not sure if you've seen the news, but there was a proposal from the UK government about ending end-to-end encryption in social media apps. That was pretty interesting, because the goal in itself was good: just to scan for illegal materials like child abuse images. But in the end it also has a negative side, because how do you know they're only going to be looking for that? How do you know they won't search for other materials? What is your opinion on that? It's interesting that you ask. Maybe not everyone will have seen this piece of news, but it's a discussion that keeps popping up every few years. Increasingly, services use end-to-end encryption. So on your device, the data are encrypted, then transmitted in a way that is unreadable: even if people intercept it, they can't make sense of it, because it's encrypted.
And at the other end, it is decrypted. This is very important for privacy preservation, but it creates problems, particularly for governments who want to intercept information from criminals and from other threats to security. But platforms also increasingly have a responsibility to make sure that the content going from one place to another, from one person to another, does not contain child pornography, racism, or other types of offensive material. So there is a tension here, right? And I said it was interesting that you asked, because I wrote my PhD in the 1990s on the problem of encryption, and we had the same discussions back in the 1990s. So basically, my analysis then, and I think it still applies today, was that encryption creates problems for law enforcement, for security services, and for others who have a legitimate interest in getting to know the contents. But if you do something about that by building in backdoors, because that is what this is about (if you remove end-to-end encryption, you build in a way for the intermediary to take knowledge of the information), you create huge risks, because not only the ones with legitimate interests can use that backdoor; others can too: hackers, providers that are maybe less well-intentioned. And the balance is that the risks are far larger than the benefits. So the conclusion I drew back then was that, yes, this may mean that governments get a little bit less information by intercepting it this way, but they have all kinds of other ways of getting the information at the source or at the destination. So they should focus their efforts on getting the information before it is encrypted or after it is decrypted. And I think that still applies today. Removing end-to-end encryption is a very bad idea. I would now like to introduce our next speaker, Dr. Joanna Strycharz, who is an assistant professor of persuasive communication in Amsterdam, at the Amsterdam School of Communication Research.
She is a researcher and teacher there, and she is also a co-director of the Digital Communication Methods Lab. Her research focuses on how insights gained from data can be used to adjust digital communication between organizations and consumers. She is also interested in how such data-driven communication impacts the cognitions, attitudes and behavior of consumers, as well as what unintended effects such communication has on individuals and society. So could we please have a round of applause for Joanna. Thank you very much. Right, yeah, I think everyone can hear me. Thank you very much for the introduction, and thanks for the invitation to Tilburg. Nice to be here. So what I would like to talk about in the upcoming 15 minutes is how we as individuals make privacy choices online, especially when it comes to the different activities we do where we share our data. As we already heard in the first presentation, whenever you are online, whenever we do any activity digitally, we leave, willingly or unwillingly, traces of our behavior. Here are some examples of things that you probably do in your daily life all the time, and they apply to different contexts, different parts of your life. Think about writing emails, which go digitally. Think about using social media like Facebook. Think about looking up information that is later collected by, for example, advertisers to bombard you with advertising for whatever you have just looked up. But also very sensitive stuff, like your dating life; I mean, a lot of young people now find their partners via apps like Tinder. So all of these activities, while they are just communication activities that you maybe once did offline, within the private sphere, like we heard in the first presentation, now happen online, and they leave traces that different organizations and governments can collect. So this means that the need to protect our privacy is rising, and we see that among consumers as well.
Here are examples, from a study conducted a few years ago, of things we can do as individuals to protect our privacy. And you can see a bunch of things; I'm not going to have time to go into them. Some of them are more effective, some of them much less effective, at protecting your privacy online. But you can see that while, for example, deleting cookies is at least seemingly very popular, other means of protecting your privacy digitally, like opting out or using certain software, which are maybe much more effective, are not so widely used. Most people say: OK, I don't even know what that is, and I've never used it. At the same time, we already heard, and I thought it was a nice bridge from the first lecture, that there is a kind of responsibility on us as consumers, as individuals, to protect our privacy. And when you look at regulations like the GDPR, which was introduced a few years ago and focuses on online privacy, when you hear the European Commission or other lawmakers talking about it, they really underline this idea: OK, this is the moment for you all and me to take control. It is centered on individual control, and that's the theme that keeps coming back. And what I wonder in my research is: does it work this way? And what can we do to make this control really happen and empower individuals? What these regulations and this idea of responsibility for all of us have in common is that they depart from what many would call the transparency and choice paradigm. The idea is that there's going to be some transparency: you have probably experienced yourself that when you go to a website, you get some information about what kind of data is collected, how it is collected, and how it will be used. Some information is better than other information, but the idea is that you should have that baseline of information. And then we have a bunch of individual rights, so that we can, for example, say: no, I'm not going to let you collect my data.
In research, we would call understanding this part of the digital literacy of all of us. And the idea would be: OK, once you know what's going on and you know what kind of rights you have, you're going to be able to make an informed choice. You're going to be able to choose: am I going to share my data and leave my digital traces, or am I going to say no, I don't want my data to be shared? So what do we see in practice from these regulations? When you think about your daily life as a consumer, this is the practical side of things, and I think you can all recognize it. What you see on the left is the opt-out functionality that Google offers you. Many of you, I suppose, have a Google account. You can go somewhere into the settings, hidden, hidden, hidden, and there is this button where you can turn off personalization, meaning Google will not process your data for advertising. The thing on the top right is the Meta button, hidden somewhere in their settings, with which you can actually withdraw your consent from Meta, telling them: I don't want you to process my personal information, for example. And the bottom right, I think all of us recognize: that's a typical cookie notice, screenshotted from one of the news websites. So, nice in theory: we get the information, we get the choice. What happens in practice? In practice, we know that these functions are not used very widely, and I have been trying to study for a few years now why that is so. Why do we have all these possibilities, and why do we not take them? Because, I mean, they are relatively simple. So before I tell you what I think happens and why we don't use them, I actually wanted to ask you first. If you can take your mobile phone, and we are all internet users, go to wooclap.com and fill in that code. I'm curious what you think: why?
Why do all of us, or probably many of you, not use these privacy settings available to you? Let me see if the tech works with me. I rejected cookies, to be sure of that. So if everything goes well, you should be able to type something there. You don't have to be a researcher to come up with exactly what we've been studying ourselves: things like lack of awareness; it takes time; it is complicated; we have a lack of knowledge; it's hard to find these options; we are all lazy and just don't do it, out of laziness, ignorance. So actually, I'm going to show you how we have indeed studied all these topics and to what extent they contribute to us not taking care of our privacy. And what else do we have? Too much information. That's also complicated, inconvenient, cumbersome. Yeah. So I think you've already figured out my research. Let me show you how we did that. Let's start with knowledge and awareness. Would that solve the problem? Is the problem that we just don't know? That was actually something I did during my PhD, when the GDPR was being developed and everyone was hyped about it. We tried to see: OK, this law that prescribes companies to give us certain information, is that going to be the solution? And what we did was a number of studies; I want to share two with you. We designed interventions for consumers, where we gave them short videos. In one of the videos, we explained how it technically works that Google collects your information: what type of information they collect, how that works, how they process it, how they store it, and what they in the end do with it. And in another study, we focused more on cookies, something we see all the time. There we either gave you knowledge about what cookies are and how they work, or we explained the legal background of cookies and how you can actually reject them. So, focusing on either of these two things.
And our idea was: by doing this, maybe individuals can be more empowered and can take more informed decisions. I couldn't skip giving you some theory, briefly, on why we thought it was going to work. It was not only because the regulators thought so and put it in the GDPR — that these are the information types you should be given — but when you actually look at the science, it overlaps with a theory we often use to explain why people take action to protect themselves, the so-called protection motivation theory. In a nutshell, the idea is that if you see something as a problem, and at the same time you have the feeling that you can do something about the problem — if these two things come together, that's when you're going to take action. If you see a problem but can't do anything about it, well, too bad. And if you feel you can do something about it, but you don't see it as a problem, you also won't act. So that was the theoretical idea we had there. What we found was a bit discouraging, to some extent. We saw that when you actually explain to individuals how the whole data collection and digital tracing online works, it actually makes them take the issue less seriously. As soon as we told them how Google collects their data, they thought: hmm, I thought it was worse; it's less bad than I thought. And in the end, we actually made them less motivated to do anything about this. That was a bit of a side effect of our experiment. The good news was that when, instead of telling them exactly how it happens, you tell them that they can actually do something about it, because there is a law that gives them certain possibilities, we saw that people started believing in that law. They thought: maybe this actually works. And they started feeling: oh, I actually know roughly how to do it. And in the end, we saw in the long run that people started rejecting cookies, for example.
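The two-appraisal logic of protection motivation theory described above can be sketched in a few lines of code. This is purely illustrative — the function name, scale, and threshold are assumptions, not the speaker's actual model:

```python
# Illustrative sketch of protection motivation theory (PMT), as described in
# the talk: protective action follows only when BOTH appraisals are high.
# The 0..1 scale and the 0.5 threshold are arbitrary choices for illustration.

def predicts_action(threat_appraisal: float, coping_appraisal: float,
                    threshold: float = 0.5) -> bool:
    """Return True when PMT would predict protective action.

    threat_appraisal: how severe/likely the person perceives the problem to be
    coping_appraisal: how able the person feels to do something effective
    """
    return threat_appraisal > threshold and coping_appraisal > threshold

# Sees the problem but feels powerless -> no action ("well, too bad")
assert predicts_action(0.9, 0.2) is False
# Feels capable but sees no problem -> no action either
assert predicts_action(0.2, 0.9) is False
# Both come together -> that's when action is predicted
assert predicts_action(0.8, 0.8) is True
```

The point of the sketch is the conjunction: raising only one of the two appraisals — as the knowledge-only videos did — is not predicted to change behavior.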
So we tracked them — we did what the companies do as well: we simply tracked what they did online for a few weeks. And we saw many more people clicking and looking for the reject button once they knew that the law is behind those buttons — that the buttons are not there for nothing. So we saw in these studies that maybe it's not such an easy story that raising awareness solves it — that as soon as you are aware, you make informed choices. It's a little more complicated than that. But in the end, there is some truth there: awareness needs to be there before you can take any action. That was one of the things we saw in the word cloud. Another one was laziness, and complexity, and being tired — it's all too much. And that's what we wondered about next in our research. We know that awareness does something, but it's not everything. So what could be the other things that demotivate individuals from taking a privacy action? Looking at a lot of different studies and talking to a lot of different people, we thought there seem to be three things, beyond just knowing stuff, that might have something to do with the fact that most of us do nothing. First of all, you have to be aware of how severe the problem is. We heard at the beginning the whole idea of 'nothing to hide' — that was really a thing a few years ago. It came from this idea: I mean, it's not really a severe problem; I have nothing to hide. So we thought: is that maybe still playing a role in people not protecting themselves? Then, if you read the studies, a lot of people say it's complex — and I fully agree, this stuff is really hidden; it is really complex. So maybe there is just a lack of a simple and effective action you could take. Maybe it's just too difficult. And the final thing we've been noticing is what research calls privacy fatigue.
Many of us are simply tired. You hear about privacy all the time, you feel like you cannot do anything about it, and you just get very cynical: yeah, it's important, but, like, whatever — I'm not going to do anything; it's just too much. So we thought: instead of focusing on just awareness and knowledge, maybe it will work better if we focus on these three more feeling-related things — give someone the feeling that there is a problem, give someone the feeling that they can take a simple and effective action, or give them the feeling that they don't have to be tired, that there is something they can do. So we thought: we're going to try this. We collaborated with an NGO that helped us develop trainings for individuals, each targeting one of these different states that can demotivate you. This is what these trainings looked like. A few thousand people were invited, and they participated — just ordinary consumers. The trainings were in Dutch, but basically the first training was about threat: we tried to explain what potential threats could arise from simply sharing your data online. Nothing scary — just: what could be a threat in simple sharing? Then we thought: we're also going to tell you how to effectively protect yourself, practically. So instead of just telling you that you have to be aware, we give you actual steps. And back then we decided on installing a tracker blocker like Ghostery, because that's a relatively simple and effective one-time measure. We told people step by step: hey, this is what the thing does, and this is how you do it. And the final thing we tried was the fatigue case. We know that a lot of people are tired. Will it help if we tell them: we know you're tired, and we know that this is all complex and too much?
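At its core, a tracker blocker like the one taught in the training works by matching outgoing request hosts against a blocklist. Here is a minimal conceptual sketch — the domains and function name are made up for illustration and this is not how Ghostery itself is implemented:

```python
# Conceptual sketch of a tracker blocker: drop any request whose host is on
# a blocklist (or is a subdomain of a blocklisted domain). The entries below
# are hypothetical examples, not a real filter list.

from urllib.parse import urlparse

BLOCKLIST = {"tracker.example", "ads.example.net"}  # illustrative domains

def is_blocked(url: str) -> bool:
    """Return True if the request's host matches a blocklisted domain."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

assert is_blocked("https://tracker.example/pixel.gif") is True
assert is_blocked("https://stats.ads.example.net/collect?id=1") is True
assert is_blocked("https://news.example.org/article") is False
```

Because the blocking happens automatically on every request once installed, it is exactly the kind of one-time, set-and-forget measure the training could teach step by step.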
But there actually are simple and effective things you could do — so as to reduce that feeling of fatigue and tiredness, and maybe make people feel a little better about privacy. So we did that, and we tracked these people for a few months afterwards to see: did they do something, did they behave differently — did we manage to motivate the demotivated to protect their privacy in the end? The nice news was that when people did this whole training we made, they became much more aware of the threats. They really felt: OK, there is something I have to hide. And that worked very well. Telling them how to install tracker blockers, and that this is actually not rocket science — that there are simple steps you can take — helped them feel empowered. They felt: this is actually quite simple, and it feels effective, so I can do it. And when we told people: we know you're tired, but there's no need for it — they actually started to feel a bit less tired. So that was promising. What we again found, though, is that the fact that they now felt they had something to hide, and were a little less tired, did not do anything for the behavior they were showing. We did not find any positive effects of all this stuff we tried, for months and months, on them actually protecting their privacy in the many different ways that we tested. We did see, however — which was very nice for us — that giving someone a very simple training in installing a blocker really helped them. We found that 80% of the people who received that information did it. We were actually surprised ourselves; we did not think that so many people were going to do it. And half a year later they were still using it — they never uninstalled it, and that would have been an extra task. So they did do something to protect their privacy, and it was a long-term thing.
So what can we conclude from these different studies about what demotivates us from protecting our privacy online? We see a lot of what we would call high threat appraisal: people are aware that privacy is something they should protect. If you ask people: do you think it's a problem that companies might collect your data? Do you think it's a problem that you leave digital traces online? — nowadays there's going to be almost no one who tells you it's not. So that 'nothing to hide' idea is really dead; it's not really a thing anymore. And if we measure it on scales, the average is usually above six out of seven on perceiving this as an issue. So there's not much to gain there: everyone knows it's a problem. The problem lies somewhere else — in actually taking some kind of action to protect yourself. I think these next two factors are quite crucial right now, when we look at it from a communication science perspective at least, for why we don't protect our privacy. We saw again and again in these studies that there is a relatively low feeling of empowerment among people. Maybe that's something you recognize. When I showed you the settings, I said they were hidden; you said in the word cloud that things are complex. And that's what people feel: OK, there's nothing I can do about it. We also saw very low trust in the law. There are laws that are supposed to protect you, but no one really believes that they do. We did some interviews, and people said: yeah, these big companies still make money, so there has to be some kind of trick by which they still collect our data, otherwise it wouldn't be possible. So we see a lot of distrust towards the measures we have. And then, at the same time — and I guess this is something all of us can identify with — the cookies come up again and again and again, and you have to click reject all the time, until at some point you're done with it. It's pretty tiring.
So we do observe a lot of fatigue going on there, and that's quite difficult to counter. Our studies would suggest that having a simple measure would basically be the solution — in the sense of helping people do something straightforward and effective. But at the same time, the belief in the law needs to be there, and the fatigue would need to be lowered. So that's how we see it in privacy research and communication science, and that's what I wanted to share with you today. Thank you. Thank you very much, Ioana. So now we have an opportunity for one question, if anyone would like to volunteer something to ask. Don't be afraid. There's a gentleman over there. Oh, yeah. So you said that when you told people about how Google works, they actually became less enthusiastic about protecting their privacy. Why do you think that is? Yeah — we see this more often in our studies. What we notice is that there are a lot of what we would call folk theories about how this stuff works. Because it's quite complex, but you still want to understand what's happening, a lot of individuals will — to put it very simply — just make stuff up to explain it. So for many consumers it feels like more data is being collected than actually is being collected. We did studies recently about, for example, smart devices listening to you, and there is a lot of debate about whether that's true or not. But in the Netherlands, 90 percent of consumers actually believe they have had an experience where their phone listened to their conversation and they then got an ad — because it could not have been otherwise: I just talked to my mother about this, I never Googled it, and I still got an ad. This kind of thing happens a lot, because we all want to explain the world somehow.
And then, once we told them: OK, this is actually what Google collects — or at least what they claim they collect — and this is how they use it, we saw this reaction: yeah, I thought it was more, I thought it was worse; actually maybe it's not that bad, so I'm not going to do anything about it. Did you also discuss what they're going to do with the data after collecting it? As far as I know, they put people in certain consumer target groups, which can actually have crazy names — I saw that once. Yeah, no, that's true. In the first study, what we did was more generic: we explained how it works, that they use the data for advertising, and how they make money off it. But in the second study, we focused more on the risky things, like the crazy-names stuff. We told people: hey, they're actually profiling you in ways you might not want to be profiled. And then people really were like: OK, this is bad. So it's really about how you frame it. If you say it the way Google says it, very neutrally — this is what we do, and we sell ads based on that — then: OK. But as soon as you say that they can actually put you in an 'expecting parent' group and then target you with baby products, people go: OK, this is not so nice. Yeah. Thank you very much, everyone. So welcome to the second and final half of the symposium today. To start off, I'd like to welcome Nicole Houtt — today has been good for practicing my Dutch pronunciation. Nicole Houtt is a researcher in the domain of risk perception, environmental psychology and human-technology interaction. The focus of her research is on the public perception and acceptance of beneficial but also potentially risky technologies in the domains of energy, ICT and transport. Nicole has extensive experience with multidisciplinary research and education, particularly bringing together the fields of psychology, engineering and ethics.
Prior to working at the University of Twente, she also worked at TU Eindhoven and TU Delft. So could we all please welcome Dr. Houtt to the stage. Yes, very nice to be here. I will talk about privacy and security risk perception among users of smart home IoT. Let's see if this works. Yeah. So for the last few years I've been doing research around smart devices in people's homes: a smart weighing scale, a smart lamp, a smart security camera, a smart speaker, smart sensors. Nowadays everything is getting more and more connected to the internet, uploading data into the cloud, able to communicate with other devices. Besides these devices, we have smart TVs, smart ovens, smart fridges. Pretty much everything in the house is getting smart nowadays. And of course, besides benefits, there are also risks. I have some examples that I found in the English-language media. Quite a few smart home cameras have been hacked and the footage posted online. Imagine finding out that for weeks somebody has been watching you online — or many people watching you online — seeing how you move around the house, how you talk to other people in your home. Predators have also been talking to people through smart cameras, even to children. Imagine finding out that somebody has been talking to your child in the bedroom for weeks already. It's psychologically very uncomfortable. Then there are also examples of smart speaker data being recorded and randomly sent to other people, allowing them to hear what you were discussing at home. And people also do not realize how much companies can infer from our data. This one was about a smart vacuum cleaner: it makes a very good floor plan of your house, and the manufacturer can infer a lot about your house from that.
And nowadays we also have smart energy systems in our houses. Imagine living in a cold country, the heating going out overnight, and your baby being at risk of dying because of it. I have even more examples: smart toys spying on kids. So if you think about your house like that, it starts to feel quite dangerous once you realize everything that can go wrong in it. Wait. OK — so not only is your privacy at risk; your physical safety can be at risk too. Imagine your oven going on when a cyber attacker decides to switch it on without you knowing it. And your mental safety, your mental well-being, is also at risk. Hmm, how did this slide get here? It shouldn't be here — that's a little confusing to me. OK, I think all my slides got mixed up, because I was supposed to get a different slide first. So let's go through them. OK, I was looking for this slide. I was working at the Eindhoven University of Technology, and together with cybersecurity experts we were examining how people experience being cyber attacked. We gave people smart devices to try out at home and said: we just want to know how you experience them. We interviewed the participants regularly, and then we simulated cyber attacks. I say simulated because we didn't actually hack them: we kept the passwords of the devices to ourselves — we had some arguments for why we did that — and we used them to do something in their house. So after weeks or months of them trying these devices, the lamp would go on, or we would have Alexa say something to people, or something else. We opened the shutter of the smart security camera. And while we were interviewing people, they went: yeah, if something happened to the shutter, you know, I would really notice it — this shutter gives me a feeling of privacy. But when we actually cyber attacked people as we did, they didn't notice at all that we had done something. They didn't think of being cyber attacked; they usually didn't even report anything to us.
And if they mentioned something, they went: yeah, the lamp was on and I don't know why — maybe my housemate left it on — and then they moved on. They forgot about it completely. So what we learned from the study is that people do not realize when they're cyber attacked. And the underlying reason we discovered was that people just have a really low awareness of the privacy and security risks of smart devices. When we asked people about the negative sides of the devices, they hardly mentioned privacy and security. And when we did mention it, they were downplaying it: yeah, it's not likely that something happens to me; I'm not an important person. They had all kinds of reasons like that; they were not very keen on it, not very focused on it. And when we told them, hey, we actually conducted cyber attacks on you, they thought it was very funny, and they were very interested in trying it again. So we didn't tell them how we had cyber attacked them, but we did it again. And then they said: huh, I noticed the lamp went off or on yesterday, or: hey, my smart speaker suddenly started talking — that's really funny. But then they also mentioned a variety of other things those devices do that people just don't understand. They went: yeah, I always ask this and this of my smart speaker, but when I did it yesterday, it didn't understand me — so that was probably you guys who did that to me. And I said: no, that was not us. So we also learned that even when people are aware of cyber attacks happening, they have difficulty distinguishing them from ordinary malfunctions. And of course, we were actually interested in understanding how people experience being cyber attacked. We assumed that people would get very stressed by it. We had a lot of difficulty getting the study through the ethics committees, and a lot of limitations were put on our study. In the end we were allowed to do it — and people didn't realize they were hacked. They were not stressed about it. They were not upset.
It was just a funny thing that they experienced with us. In other research, I've been reviewing a lot of literature from the human-computer interaction field, where people were asked how they perceive the privacy and security risks. And they actually had all kinds of arguments for why they were not really worried about these risks. Let's see if I can get to the right slide now. This one — they're really mixed up; it's very funny. So this is one slide that I wanted to show. What people said: I have nothing to hide. It's still an argument used by people. They say: I don't do anything secret in my house, I have really boring conversations at home, so it's OK if I get cyber attacked — I have nothing to hide. People also said: I don't think I'll be cyber attacked, because I'm not an interesting target. I don't have a lot of money, I don't have any power — so why would they attack me? I think people in general are underestimating how unpleasant it is to be hurt in your private spaces and to have your privacy affected. It's like what you already said: you only realize what it was worth once it's gone. And people also do not realize that this is a crime of opportunity, just like burglary. People do not only burgle your house because you have a lot of money in it, but also because you forgot to lock the door. So that's good to know. I'm going to search through all my slides to find the right one. With this one I wanted to discuss trust. People argue that they trust the manufacturer of the devices. They say: they have a very important brand to protect, so I'm pretty sure they do everything they can to make the devices safe, and they take care of privacy and security. And they also say: if that were not the case, I would read about it in the newspaper, or my friends would tell me. So, you know, I trust that everything will be all right.
They also trust that the laws are all in order — which I think is not sufficient. The law is behind on this; it's not good enough. So people are putting too much trust in other parties, rather than taking the right actions themselves. And people also just love the benefits and don't want to give them up. The more people say that they like the benefits of the devices, the less likely they are to perceive privacy risks and to take action to protect themselves. And then — you see, here is the slide. Why do people perceive little privacy and security risk? Also because they have a lack of understanding of the devices and of what happens with the data. In my study, for example, one family said: I actually don't know what happens to the data; I think it stays on the device — and they had no clue that it actually goes into the cloud. So there is still a lack of understanding, and also, of course, of how to do something about it: practical knowledge on how to change the settings, how to do things that improve your privacy and security. Something I skipped telling you because the slides were mixed up — maybe I can go there now. OK, I will now say something about a quantitative study we did. You know, people have all these reasons not to worry so much about privacy and security risks, so we studied which of these factors is actually the best predictor of low privacy risk perception. We asked people: imagine you get a smart speaker gifted to you and you install it — how do you perceive the privacy risks, and how likely are you to take privacy-protective action? And then we looked at the factors that made people less aware of the risks and less protective against them. We found it was particularly perceived enjoyableness: the more people like the devices, the less they perceive the risks and the less protective action they take.
And people who felt resignation towards the lack of privacy — yeah, I cannot do anything about it anyway; all my data is out there already; it doesn't matter anymore — those people also perceived less risk, interestingly enough, and also took less protective action. And people perceived more risk and took more action when they had more self-efficacy: when they felt that they were able to do something to improve their security, they were also more likely to do so. We furthermore found differences between people who already had a smart device and those who didn't. Let me see where that slide is — that's this one. So here, indeed, self-efficacy and trust in the smart speaker companies determined people's privacy risk perception when they already had a smart speaker in their home, and security self-efficacy was an important — and the only — predictor of taking protective action. For people who did not yet own a smart speaker, who had to imagine getting one as a gift and how they would go about it — yes, here are the slides — perceived enjoyableness and resignation were the most important predictors, but security self-efficacy did not affect their decision making at all. So yeah, I think smart devices in the home are a very specific type of privacy risk. These devices are not person-specific, so you often do not manage them alone; you have them together with a housemate. But still, we see in our research that very often one person decides about the device, installs it, has the user account and manages the settings, while another person in the house is just a passive user and is much less in control. These devices can bring a lot of risks, but a lot of people are not aware of them. And one of the risks that I haven't mentioned yet, but which I think is also very important to realize, is that these devices allow people within the household to spy on each other.
So housemates can see from each other what questions are asked of the smart speaker. If you have a smart door lock, parents can supervise their children more strictly. And if there are power imbalances in the household, these can even be strengthened by smart devices. An example in one of the papers I read was that a husband decided for his wife what temperature the thermostat was going to be set to, because he thought she always put it too high and 'we're not going to do that' — so he simply did not give his wife access to the smart thermostat. And of course, in controlling households, you can also watch each other through the smart camera and be even more controlling of the other person. That also provokes counter-reactions. For example, where parents supervise their children through the smart lock or the smart camera, the children try to avoid it: they start leaving the house through the window, and then they leave the window open while they're gone so that the parents don't know — and then the house ends up less protected instead of more protected, which is what you were trying to achieve as a parent. So I think smart devices in particular are a very interesting topic for privacy research. And yeah, they pose a lot of challenges, especially because the smart home is really a place where you want to feel private, where you want to feel comfortable and relax, and where you don't want to worry about risks the way you would outside the home. And I think that if you start to feel unsafe in your own house, it will influence you psychologically; it will influence your well-being. So I think it's a very important field of research. Yeah, thank you. Thank you very much, Nicole. So Hannah will go around with a microphone for any final questions before we go to the final stage of this event. Please feel free to ask anything that comes to mind. Someone has a question, maybe? Again, don't be afraid to ask anything. No? Ah — we have one question here.
All right. So you told us about your research in which you gave people a camera and you kept the passwords for yourself, right? So I was wondering whether you did something with — no, let me start over. What I would do if I got a camera like that is change the password first, because I know that the password is usually a standard one. So did you take that part into account in your research, and how? Yeah, so of course, in a study like that, on the one hand we try to make it as natural as possible, so that you get real insight into how people naturally do things. But on the other hand, we got a lot of limitations from the ethics committees. One of the limitations was that we were not allowed to have people install the apps on their phones, because then they might be out of the house, notice that something was going on at home — and the ethics committee was worried that they would rush home and get into an accident, for example. So one of the things we were not allowed to do was have them install the apps on their phones. We gave the participants an iPad on which they had to install the apps and work from that, and the iPad was not allowed to leave the home. So of course the research had a lot of limitations because of all the ethical concerns there are, which makes it less valid than if you could really do it in a natural way. On the other hand, we needed those passwords to execute the attacks: we didn't have the capability to really execute cyber attacks ourselves. So we had to do something, I would say. But I do think that our finding that people just didn't realize they were cyber attacked is a really realistic finding. Of course, we also had the limitation that we were not allowed to execute nasty attacks — turn the volume up really loud, put on really negative music, scream at people, say: oh, I saw that you were naked yesterday, how ugly you looked.
You know, all those things we were not allowed to do — and if we had done that, people would probably have been more suspicious that something was happening. On the other hand, a lot of cyber attacks are not noticeable at all. So the fact that we ran noticeable cyber attacks is also not fully natural, because many cyber attacks you only notice months later, when all your private data is out or when you get extorted. So yeah. That the camera went on all of a sudden without them noticing — that's actually maybe even worse than someone yelling at you. Yes — with someone yelling at you, you know that you have to do something. You probably get over it soon, once you figure out what happened and can fix it quickly. With the other one, it takes a long time before you realize it and can fix it. But I think in both cases you will have a sense of terror and you will feel less safe — especially if it happens more often, or you do not manage to regain that sense of control. If your house is burgled, you can think: OK, I'm going to ask my neighbors to watch out for my house, I'm going to put in better locks, I'm going to put up a security camera, I'm going to put my valuables in a safe — and having done all these things probably makes you feel more in control. With smart devices, that is much more difficult, especially if it gets into every device of your house: your cooking hood, your oven, your vacuum cleaner. It's the future — all our devices are going to be smart in the future, I think. So it's much more difficult to regain that sense of control. And especially if you are not highly educated, if you're not trained, if you don't have a social environment where you can ask people for help — it's something that we should worry about and try to prevent from happening. You believe that everything will be smart? I think the trend is in that direction. Dumb TVs — you can hardly buy them anymore.
And other devices, too, are more and more sold as smart rather than as unconnected. A smart cooking hood? Not yet, perhaps, but I think they're starting. I don't know whether I want that — I'm not sure — but I think they will be there as well. Yeah. Great, thank you so much. So for the final part of the event — the final 15 minutes or so — I'd like to invite the three speakers to take a seat on stage so that we can ask them all some questions together. There's an opportunity here as well for any final remaining questions. Perhaps, madam, you had one earlier that was left unasked — would you like to go ahead when we hand you the microphone? We talked about it a little over the break: a lot of privacy protection is left up to the individual. I was wondering what else would be a priority to get done if it's not the individual that takes care of it. What would you advise, and whom would you advise to take steps? Yeah — is my microphone on again? I can start on it. My PhD was focused on empowerment through knowledge, based on the regulations, and one of the conclusions in my dissertation was something like: we need to be more paternalistic and protect the user, and focus much less on this idea that we're all going to be fine because we let people know what happens and they then decide for themselves. Maybe I'm a bit less radical now, six years later. But I still think there is probably much more room — as we already heard from you — for changes in the law and in protection, especially since I also study what happens with the data in the next step. I mean, privacy and data collection are one thing, but the data is often used for something. What I also study is the manipulation that can be done if you have enough information on someone, and it is very difficult to expect individuals to understand how that happens. We've heard about these creepy profiles that Google makes.
It gets to a point where maybe there is a red line: certain practices, how do you say it, we might not want them as a society. So one part is awareness and knowledge, helping individuals understand. And then, when it gets so complex, at some point maybe you cannot expect everyone to understand everything. That would be my take. If I can add, although, as you may remember, I had a slide already with all the different actors. I think the largest part should fall on the ones processing information, the corporations and governments: to not want to process all the data that they theoretically could. They should see fair treatment of people as their priority, rather than going into the paradigm of collecting as much data as possible and then doing all kinds of magical analysis where we don't know what happens, but the technology tells us these are the patterns in the data. So maybe have a bit less trust in data and technology, and a bit more trust in people, which is not going to happen by itself. So the law is an important stimulant, I think, for pushing corporations and governments themselves to protect privacy better. So privacy by design is an important starting point, and, what we were also discussing a little in the break, privacy by default. I think a lot can be improved by having privacy settings that are by default more privacy friendly. For instance, the devices that you buy with a standard password: if they were programmed so that after three uses they ask you to change the password, and you cannot use the default password anymore, that would incentivize people to at least choose a password that is not easily guessable or findable online. And there are many other default settings, I think, that could incentivize it at least a little bit.
If I may add, I don't really believe in the paradigm of control that is part of the GDPR, the idea that people can really have control over the data that are out there. But I do see that the whole ecosystem is moving towards having your smartphone as the default infrastructure for all kinds of apps, and all kinds of apps becoming the default entry point into all activities and social life. I don't have a smartphone, and it's becoming rather difficult to live. I can still manage, but I'm not sure how long I can keep this up, because for many things you nowadays need apps, or at least some kind of online interaction. So in a sense I do wonder whether you could also do without your smartphone for a while, or without social media. Probably not, because it's becoming difficult to live that life. But if everyone just goes along in this way, we make the problem worse. So if a significant number of individuals would really start leaving their smartphones behind, and I think younger generations sometimes are doing that, maybe that also is a way to change the world. Any final remarks? No? Yeah, maybe I can also add that I noticed from our research that you cannot leave it all up to the people. They're not always aware enough, and it's quite cumbersome, as you also pointed out. So it would be good if we had regulation that steers the design of the devices, so that people indeed get choices when they install them, so they cannot keep the standard password, but also so that they need to go through setup steps where they are made aware of what happens with the data, for example. So I think indirectly, through the law and the design of devices, manufacturers should take much more into account how people think and how they make decisions, so that people automatically make the right decisions when they have the devices. Including, for instance, simple things: if a smart camera in the home is turned on, you might also want an indicator light to announce to people
that the camera is now on, so that people are aware that if they haven't pushed a button, someone else has done this. Yeah, so maybe I can tell you about a smart device that we had. The shutter really made a sound, and then everybody was convinced: OK, if somebody cyber attacks me, I will hear the shutter opening up. But then nobody actually noticed it, because the TV was on and the sound was drowned out. So indeed, if you know that it works like that for people, you should let people make a choice: if my shutter opens, I want to get this signal, so that people can actually make sure they are aware of whether their privacy is at risk or not. Thank you. We have a question over here on the right-hand side. So I've been trying to tell my family about the risks to privacy and why they should protect it. But every time the answer is the same: yeah, I have nothing to hide. So what's really the perfect answer to that, to convince my family? Maybe you can find some newspaper articles about people that actually experienced privacy invasions and how terrifying and stressful they really found it. I think stories about individual experiences are usually a way that people can imagine: OK, this would really be a problem if it happened to me. Yeah, I want to add that in some past studies we've done, the experience of a privacy invasion was the biggest motivation to actually do something about it, because then you notice the problem. And we did some other research where we tried to ask people, in interviews, what they actually see as privacy risks. And it makes sense that the very tangible things, like someone is going to steal my money, were very easy to understand. But a lot of these risks are very abstract. I mean, the risk of me clicking accept on something, or the risk of me having a smart home device, is quite tricky.
So I think that's the challenge: making the risks tangible. And I know that there are some colleagues who are now conducting a study where they actually make people read articles about privacy risks every day for a few weeks, and they try to see whether what they call surveillance shocks, this shock of realizing, OK, there are risks, would help them become more aware. But it's a very short-lived thing: you read something, you worry for a bit, and then it often wears off. So that's also challenging: to make it tangible and to convey that you really have to take action now. A simple device I use is to immediately ask: when was the last time you masturbated? Because even within family settings that is still an awkward question. It makes it very tangible. Yeah. And if you want to be less confrontational, with the same effect: so it's fine if I put a camera in your bedroom and stream it online? No. Because then they start asking, but who would have an interest in putting a camera in my bedroom? And that's exactly the point, right? You don't know what the data is going to be of use for to anyone else. And there are many examples, I think. There was, I think, someone with a Moroccan background who traveled through Spain and was arrested there, probably because he had the same name as someone else. At least I think that was the problem; I'm not quite sure. But because of this, he was in jail for quite some time. He couldn't prove that he wasn't the person they were looking for. Or maybe he was the right person, but it was merely based on certain associations. At least he had no idea why he was arrested, and no one else could tell him. And your family might think, well, too bad for him, but it might happen to you as well when you travel around. Thank you. Are there any other questions at all? OK. The gentleman in blue's hand shot up before yours, I'm afraid, so we'll go to him first.
So my question is: you mentioned in your talk at the beginning that there are certain assumptions in the law which used to be true in the twentieth century but are not true nowadays, and you also mentioned that larger regulation is needed, not just small changes in the law. Could you please explain what you have in mind? What are examples of such large-scale regulations? Because something like forcing people to change the default password sounds like a good idea, but it's a very small thing that on its own will not make a huge difference. So could you explain what you would like to see happen so that big differences are actually made? I will give two examples from my research, both of which are completely unrealistic, I'm afraid. You have privacy and data protection, which are very close but really different animals. Data protection is what most of us have been talking about: your personal data, how they are processed, and the protections around that. And then you have privacy, which, if you remember the slide I used with all these privacy types: informational privacy was a large box on top of that, and that is data protection. But all these other elements, your bodily privacy, the privacy of your mind, the privacy of your relations, also have an element that is not quite connected to personal data as such but merely to a feeling. If a burglar is in your house, you feel violated in some sense, which has nothing to do with what he saw or what he took, but simply with the fact that someone else is in your house without your permission. So that is to explain that you have data protection and privacy. For data protection, I think the GDPR is a bad law, in the sense that it's way too complex and way too bureaucratic.
I would rather have data protection law go back to its roots, namely eight privacy principles that underpin much better why data protection is important: have correct information, tell people what you do, give people the right to object; eight basic principles rather than these hundred and forty articles, eighty pages of legislation where you need a lot of consultants to do compliance, including many people working here. I mean, it does create a lot of jobs. Not that the jobs would completely disappear, but people would hopefully think much more about why they should or shouldn't do something, based on principles, rather than the checkbox thinking of GDPR compliance without any relation to what the actual harm is. So that's data protection. On privacy, my field is largely criminal law, so I've been studying how criminal procedure, the investigative powers of the police, could be regulated. And there you have the example of the twentieth-century assumptions: strong regulation of the police entering your house, only with a court order, only when it's really necessary. But when the police arrest you on the street, they can frisk you, take whatever you're carrying, and look into it, because classically you only carried a packet of cigarettes and maybe a diary with you. But nowadays they can also look into your smartphone. Slowly, laws are setting conditions for the police, when they arrest people on the street, for looking into the smartphone; it's no longer the case that they can look through the entire smartphone, although until a few years ago that was still allowed. But that is an example of the law still being twentieth-century in thinking that privacy is important in your house, in the contents of your communications, and in your body, and that's basically it. Those are the three major constitutional rights.
If you want to protect privacy in criminal procedure, you would have to abstract away from very concrete privacy proxies and have a more abstract notion of privacy. This would go into too much detail, I think, but we can discuss it over drinks if you're interested. I have in mind a more abstract criterion for the police to put into the code of criminal procedure: to intrude into someone's privacy you need to meet certain conditions, and for a really large-scale intrusion of someone's privacy you need even stricter ones. And the law shouldn't say that this particular action has this particular level of privacy invasiveness, because nowadays it all depends on how the police actually do something: what data they collect, and how they bring it into contact with all other kinds of data. The privacy intrusion is too contextual to say up front that entering the house is the most intrusive privacy infringement. A long answer, and maybe not very clear. Read my papers if you're interested. Or discuss it over drinks. Cool. So if there are no more final remarks, I believe there's time for one final question before we wrap up. I was wondering, with smart devices becoming ubiquitous, putting the phone aside: is there any way to actually protect yourself when you have one or two of these devices, like a speaker with Alexa, or is it better to just avoid them entirely? Well, I personally still tend to avoid them. But if you do have them, you can use them wisely. Make sure that you install the updates every time. Make sure you set a good password. Have good discussions with your housemates about what you do or do not want to be known by the device, and then make agreements on that. Usually a device has a mute button. Not everybody believes that the mute button actually works, and a lot of people don't even know that there is one.
But I would recommend using it, or switching the device off completely, when you have sensitive conversations that you don't want other people to know about. Choose carefully where you put your smart device: don't put it in the bedroom, put it in a room where you do less sensitive things. Don't discuss your bank information in front of it. Just being conscious of the fact that the device is listening all the time helps. Usually it's only supposed to store what you ask it after you use the wake-up words, but it might also listen to you accidentally, or, I'm not really sure, on purpose. We don't really know what those devices do behind your back, so to speak. So just be aware of that and manage it. Switch it off when you think, OK, now I don't want service from you, now I just want to have a private life. Just switch it off when you want to. And of course, when you have visitors in the home, inform them that the device is there, so that they're also aware of it and can make their own choice about what to discuss with you, or maybe ask you to switch it off for the evening so that they can relax and feel they are in a private setting. Yeah, those are some things that you could do. Any final remarks at all? In addition to this: read carefully what the device is doing. I assume it comes with a privacy policy which explains which data are being collected, in which ways, how they are processed, and what the different settings are that you could use. This might be difficult and cumbersome to read, and there privacy fatigue enters the picture. You may want to do it for one device that you use a lot and that is very important in your home. But if you have twenty of these digital devices, by the tenth you would say, OK, click, yes, yes, yes, and you use all the default settings, which is why the default settings are all the more important.
But hopefully, for each one that you install, think carefully about what it enables, and just turn it off when you don't really need it. Instead of having it always on and only turning it off when you have a sensitive conversation, I would rather have the default be the other way around: only when you really want to ask it something do you turn it on. Yeah, maybe to add: I think what we see a lot is that people do these things once they've realized, as you said, that privacy starts to be important once you notice it's gone. So do it at the beginning, instead of waiting until you notice, oh, something creepy happened, and then you freak out and pull the plug. If you start the other way around, that's going to help already. Yeah, it's also good to realize that companies do not have protecting your privacy as their first interest. Their interest is actually to get as much data from you as possible. So they're not going to be very forthcoming in helping you to protect your privacy. You have to outsmart them and really be attentive to it, and not just hope that it will go fine by itself.