It's really nice to see all of you here today. My name is Hannah van den Bosch and I'm a program maker for Studium Generale. As Studium Generale we organize all kinds of lectures like this one, as well as other activities, nowadays also cultural activities. Some of them you can see here on the screen, so definitely visit those as well. It's also good to know that this lecture counts towards the Studium Generale certificate that we have. It's a certificate which you can get when you go to five different lectures of ours; you write a small report and then you can get that certificate. If you want to know more, all the information can be found on our website. This lecture is part of the "All You Need to Know About" lecture series, during which we explain a specific scientific topic in one hour. The lecture will take about 45 minutes, and then we have 15 minutes for Q&A, so any questions that you may have, you can ask them then. And today's topic, as you know, will be the Enchanted House: smart speakers and privacy. I think "Enchanted House" is a really interesting title, because nowadays it may feel like that, right? When you have all these smart devices and smart speakers and the Internet of Things in the very heart of your home. But these days people are also really critical and are asking questions, especially regarding privacy and what these devices will do to us. And someone who can tell us all about this today is Silvia De Conca. She is assistant professor of Internet Law at the Faculty of Law of the VU Amsterdam. Her research mainly focuses on the interaction between new technologies and individuals, and the impact of that on society as a whole. Other areas she researches are data protection and IT law. Also nice to mention, I think, is that she obtained her PhD in law at our university last year, in 2021.
So please give a big round of applause to Silvia De Conca. Okay, so thank you everyone. Nice to be back at one of my alma maters. And thank you for being here this afternoon. I don't know about you guys, but I was very nappy today, very sleepy for some reason. I think it's either the weather or the season, I don't know. So thank you for actually making the effort of getting out of the house and coming; I appreciate it. As Hannah was anticipating, this is the topic of my PhD research, which I carried out here at Tilburg University, in the law faculty, for slightly over four years. I have dug a lot into smart speakers and what home means for us and what privacy is. But I'm always very curious to hear from other people what they think about this topic. So if you don't mind, I want to start with a couple of questions, just to understand the audience here, and then we will proceed with the rest of the lecture. So, how many of you have a smart speaker? Okay, I see a couple. And how many of you... So what are you studying? For example, is there anyone studying law, like me? Okay, a few more. Others, you can yell some of the disciplines that you're studying, it's fine. AI, okay. Closer to home, I would say. AI for everybody, or are there other disciplines? Economics, also very close to home here. Very close to home: digital culture studies. I feel that we're all revolving around similar topics. Psychology, very interesting. More or less that? Okay, well, this is actually a great mix of disciplines. Smart speakers and, in general, IoT or smart devices in the home somehow touch upon all of your disciplines, one way or another. I didn't mention all of them in the presentation, for obvious reasons, but if you have specific questions concerning your field, feel free to ask them, because we can actually elaborate on them together, okay?
So for my thesis, I focused on the home, the concept of the home, and how the law protects it, because I'm a lawyer. I had to. No, I liked it. But I have looked into, for example, the human-device interaction and how these devices can be made to manipulate or prompt users. I've looked into their business model, which is fairly mysterious, especially in the case of Alexa. I looked into the AI part, with my very limited knowledge, et cetera, et cetera. So I'm sure that we can find something interesting. Oh, and before I forget: in my old office, where I was before I came here, they still had some copies of my thesis from last year, and you can just pick one up if you want. They were sitting in the office anyway. Now, what exactly are we going to talk about today? I want to see the timing of this thing. Oh yes, a small introduction. As Hannah was saying, I am an assistant professor in law and tech at the VU. I did my PhD here. I also lived in Mexico for a while; I was teaching IT law there before coming to the Netherlands. But I am Italian. I mostly work now on manipulation, but in general on regulating new technologies through the law. And privacy has been a big part of my job for the past years, but not only that. Oh yeah, and I have a cat that I have to put in every single presentation I give. It's what she wants, really. So, finally, today: first, we're going to have a little overview of how these smart speakers work. Then we're going to look a little bit at what kind of values and concepts we associate with the home, as one of the sanctuaries of our private sphere. And then: what are the implications once we put smart speakers into this sanctuary? If there is some time left, or some interest, we can also look at one article of the GDPR, just to have an idea of what the law says about the home, focusing more on personal data because of the nature of these devices. We're going to see where their intelligence is.
And then I really want to hear your questions. Let's start with a short video, a parody of a commercial from one of the Scandinavian countries; I always forget if it's Sweden or Denmark. [Video: "9:30. Fire off. Open door." "Door open." "And we're going to do one more." "Wrong voice command." "Open door." "Wrong voice command." "Open door." "Repeat that." "Open door." "I didn't understand that."] I think you get the gist of it. Now, that is a fake assistant, but it actually is how smart speakers work. What are smart speakers? Well, these are the main models of smart speakers currently sold in Europe. There are others in China, there are others in Russia, et cetera, et cetera. Does this have a pointer? No? Oh, it's fine. Oh yeah, the red thing. Okay. So these two with the four dots are the Google ones. They used to be called Google Home; now they've been incorporated into Nest, which used to be the smart thermostat. So this is the Nest family. I say family because there are many, many devices. This one is the Sonos, which is kind of gaining on the competition. Then there is the absolute queen, which is Alexa. It used to look different; now it's this tiny little ball. And this one is the HomePod, for Siri. So the brands are Apple, Amazon, Google, and Sonos. I always forget; I'm not keeping track of all the brands behind them. But this is not the only shape, the only form that smart speakers can take, because they can even have monitors. This is exactly the same thing. Some are from Sonos, some are from Amazon, some are from Google, but they have monitors on them, and besides the voice, they can give you visual prompts. They can also be incorporated into fridges, ovens, coffee pots. You name it, really: any device. But so, what exactly is inside? Well, let's take the standard smart speakers. This is the old version of the Echo from Amazon, so Alexa.
This is the new one. Now, the real gist of it is: these are just speakers. There are buttons on top; you can mute them, you can turn them on. And inside there are speakers: you see a 2.5-inch subwoofer, a tweeter, and the microphone array, which has at least seven or eight microphones, depending on the model. And that is mostly all there is in there. I know the AI people are already thinking: that's not exactly what's in there. Luckily there are no engineers, or I would have already been booed. There is a little bit more, but the bulk of it is really just microphones and woofers. So if this is what's in there, then where is Alexa, or where is Google Assistant, exactly? Well, it's in the cloud. This is how the European Data Protection Board represents how a smart speaker works. You ask the assistant, the software, so the voice that you hear, a question. Your assistant turns on; a light starts blinking in some way, depending on the model. Your request is sent to the cloud, meaning servers somewhere. And not one server: the operations are scattered across several server farms. First, your sounds, your words, are translated into text: speech-to-text. Then they are processed to look for the keyword, for the command, for the order that you're giving. Then the system understands the service requested and activates it, doing something. For example, if it's something that the software can do by itself, like tell you the weather, or what's in your incoming mail or in your calendar, it is simply going to open that function. Everything is happening in the cloud. And then it replies to you, and the reply is text-to-speech: first a written text is generated, natural language generation, and then it is recited by the voice. If, in order to tell you the weather, they need to open another app (just like your smartphones, these things have apps), or they need to activate another device, like the thermostat if you want an increase in temperature.
Or the coffee pot, if you want your fresh coffee. Then the servers of Amazon or Google are going to talk to the servers of the company that runs your coffee pot or your smart fridge, let's say Whirlpool or Samsung. And then that company's servers are going to talk to the fridge to activate it. It's a lot going on behind the scenes, right? Did you know this? I see some nodding. This is obviously a simplification, but this is roughly how it works. Now, this also means that roughly where these lines are, this is where whatever you said, whatever you asked, and whatever data has been collected, exits your house and goes to the cloud. All the time that the light is on on your device, it means that it's recording and streaming everything live to the cloud, and storing it there. Then it turns off and it stops listening. It only listens for the wake word, meaning that there is a fraction of a second that is constantly analyzed and then deleted if it doesn't hear "Alexa" or "Google" or "Siri". My laptop is closed because this has happened so many times during my presentations. Once the task you've asked for is completed, it's going to go to sleep again, only temporarily listening, a fraction of a second at a time, for the wake word. So here is a myth that we can debunk: it doesn't listen all the time. It's fine. However, sometimes they misunderstand. I was in a presentation once and I said "theory", and, as I'm not an English native speaker, my Siri activated immediately. So sometimes something can go wrong, let's put it that way. Now, as I said, this is how I represent it: you have the assistant in your house. The assistant also has an app on your phone, but you probably also want to have other devices. You want to have a smart TV, the smart coffee pot, the Ring security camera, for example. You want the thermostat, or the media players, et cetera, et cetera. Or the Philips smart light bulbs, et cetera.
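The pipeline described above, on-device wake-word spotting followed by cloud speech-to-text, intent matching, fulfillment, and text-to-speech, can be sketched roughly in code. This is a minimal illustration only: every function name, intent, and reply below is invented for the sketch and does not come from any real assistant's API, and the audio is stood in for by text.

```python
# Hypothetical sketch of a smart-speaker request pipeline.
# All names (heard_wake_word, match_intent, etc.) are invented
# for illustration; real assistants are vastly more complex.

from dataclasses import dataclass

WAKE_WORDS = {"alexa", "ok google", "hey siri"}

@dataclass
class Request:
    text: str    # transcript of the user's utterance
    intent: str  # e.g. "get_weather", "set_temperature"

def heard_wake_word(snippet: str) -> bool:
    # On the device: a rolling buffer of audio (text here, for
    # simplicity) is checked for the wake word and then discarded.
    return any(w in snippet.lower() for w in WAKE_WORDS)

def speech_to_text(audio: str) -> str:
    # In the cloud: the streamed audio is transcribed.
    return audio  # stand-in: pretend the audio is already text

def match_intent(text: str) -> Request:
    # In the cloud: keywords are mapped to a command.
    if "weather" in text:
        return Request(text, "get_weather")
    if "temperature" in text:
        return Request(text, "set_temperature")
    return Request(text, "unknown")

def fulfill(req: Request) -> str:
    # A built-in function, a third-party app, or another
    # device's servers handles the matched intent.
    if req.intent == "get_weather":
        return "It is 12 degrees and raining."
    if req.intent == "set_temperature":
        return "Okay, asking the thermostat to warm up."
    return "Sorry, I didn't understand that."

def handle(snippet: str) -> str:
    if not heard_wake_word(snippet):
        return ""                   # buffer discarded, nothing leaves the house
    text = speech_to_text(snippet)  # from here on, the data is in the cloud
    reply = fulfill(match_intent(text))
    return reply                    # text-to-speech recites this back

print(handle("alexa, what's the weather like?"))
```

Note where the boundary sits in `handle`: everything before the wake-word check stays on the device and is thrown away; everything after it happens on servers elsewhere, which is the point made in the lecture.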
So you start having a lot of these appliances that look like our traditional appliances, but they are connected to the internet. Some of them are compatible with your smart speakers; some are not, and then it's a mess. And many of them support apps: you can have your horoscope, your news, your music, your instant messaging, et cetera, et cetera. Now it's quite crowded in the house with this stuff, because whenever you activate one of these smart devices, especially vocally, as we know by now, we go to the cloud to perform all of those activities that you saw before. And behind each of these, each order, each command that you give to your smart speaker, there are a lot of people, as in companies. There is the company, like Amazon or Google, that is providing the assistant. There are the companies that are providing the services: for example, if you want the news, you might have the New York Times app, so behind that there is the New York Times. Or if you have the smart coffee pot or the smart TV, you also start adding, you know, Philips, Samsung, Sony, whatever, all the big names or the smaller names. So all of these companies are constantly transferring data among each other so that your smart Internet of Things devices can work and can reply to your commands. Obviously, they're not doing charity; this is convenient for them. And while they are at it, they're also keeping and collecting some of these data for their own purposes. Okay. Now, as I said, this all happens inside your home. Now, it's fairly established that the home is a special place. The famous "there is no place like home". Maybe it's going to appear. Yes. Traditionally, in Western cultures, the home is considered a very important manifestation, a physical container, of the private sphere. Okay, but what does this mean, exactly? Well, behavioral scientists have looked into that. They've done a lot of empirical research, and they have come up with some basic theories.
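The chain of companies behind a single command, the assistant provider's servers talking to an appliance maker's servers, each keeping its own records along the way, can be sketched as follows. A minimal sketch with hypothetical names ("CoffeeCo", `AssistantCloud`); the point is only that one voice command leaves a data trail in at least two companies' logs.

```python
# Hypothetical sketch of the server-to-server hops behind one
# voice command ("turn on the coffee pot"). Company and device
# names are illustrative; every hop is a chance for each company
# to retain data for its own purposes.

class DeviceCloud:
    """Servers of the appliance maker (say, a coffee-pot brand)."""
    def __init__(self, name):
        self.name = name
        self.log = []  # each company keeps its own records

    def activate(self, device_id, action):
        self.log.append((device_id, action))  # data retained here
        return f"{self.name}: {device_id} -> {action}"

class AssistantCloud:
    """Servers of the assistant provider (say, Amazon or Google)."""
    def __init__(self):
        self.partners = {}
        self.log = []

    def register(self, device_type, cloud):
        self.partners[device_type] = cloud

    def command(self, user, device_type, device_id, action):
        self.log.append((user, device_type, action))  # and here
        partner = self.partners[device_type]  # hop to the partner's servers
        return partner.activate(device_id, action)

assistant = AssistantCloud()
assistant.register("coffee_pot", DeviceCloud("CoffeeCo"))
print(assistant.command("silvia", "coffee_pot", "kitchen-1", "brew"))
# After this call, both companies' logs hold a record of the
# six-in-the-morning coffee, which is the observation made above.
```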
The home is one of the physical containers, let's call them that, the physical spaces in which our private life, our private sphere, manifests. What does that mean? It means that we attach certain characteristics to the home. For example, at home we don't want to think about keeping control over certain things around us, and especially, we don't want to think about keeping control over ourselves. We need private spaces, for the private sphere to unravel. The literature sometimes talks about removing the mask that we use in public, because we have multiple personas and multiple images that we project outside, and we need, as humans, space to remove all the masks and just be ourselves. We need that time, you know, with no makeup, not changing out of our clothes on the couch, binge-watching really terrible TV shows. We don't want to be judged externally for that. We want to be free to manifest all these things without consequences. Okay, that is necessary. These are called unself-conscious actions: at home, we can carry out actions that we don't have to think about, because we know that nobody is, hopefully, observing them. Why is that? Because one of the theories is that individuals need both isolation and participation to function properly, as individuals and in society. Isolation means: I need some time alone to recharge. Very intuitive. Participation is: however, I don't want to be a hermit. Okay, maybe sometimes. But I don't want to be a hermit; I still want to participate, negotiating the terms of this participation with the community around me. And the home is the physical projection of this. This is why, first of all, the walls of our homes are not transparent. Although, disclaimer: in the Netherlands, frequently they are. I come from Italy, where we do like our curtains, and we definitely do not like to show what's inside our homes, for several reasons. And here it is different, as some of you might have noticed, especially if you're not Dutch.
Or when you're abroad and you're Dutch, I guess. But at the same time, we can get out of our homes, or people can get in. And not just people: all sorts of things can get in. That's why we have telephones, or the television. Technically, these things are also openings that we can use to look at the world outside or to talk to other people. The computer serves the same purpose; the internet serves the same purpose. But what happens is that at home, we have control over who enters and what goes out. Control becomes very important because, thanks to control, there is a mechanism called appropriation. We appropriate a house. A house is just a box, but then we start putting in our favorite decorations, our favorite pictures, the color that we want on the wall; we rearrange the furniture. That is appropriation: we exercise control over it, and then we feel at home. Okay. These are very important elements of the home in the Western tradition. What happens if we lose control over this home? The typical example is a burglar entering your home. Now, Perla Korosec-Serfaty, a very important behavioral scientist, interviewed people who had had burglars enter their homes. It was very fascinating, because one of the things they said was: I couldn't help but think, what has the burglar thought of me after seeing my home? There was a sense of violation of the private sphere after that. Now, when the control that we think we have in the home, the expected control, turns out to be higher than the actual control, because of events like wiretapping, or someone unknowingly taking pictures of us from the outside, or a burglar, that's when we experience discomfort. It can take many forms. One is technically called crowding, which is the psychological perception that someone else is there, regardless of whether that someone is there or not. And then, obviously, we experience the loss of privacy.
Privacy as in our private-ness: something that we want to keep private, something that we want to maintain control over. So what do we do when we experience this discomfort? We spend time, energy (both mental and physical), money, and resources to fix it. We put up curtains. We change the locks. We put a fence around the garden. We wear sunglasses. Or, have you ever wondered why in all the costume dramas a lot of women carry a fan? Yeah, sure, it's a cute accessory, but the reason they do that is privacy: it was a good way to cover their face. So over time we have invented tons of ways to fix this discomfort when we don't have the expected control. And this applies to privacy too. But that means that if we start carrying out all these actions, our actions are no longer unself-conscious. At home, we're no longer relaxing; we're no longer letting go of tension. Okay, and then our being at home is a bit ruined. Does it make sense so far? Now, what happens when we have an Alexa, or a Google Home, or an Apple HomePod? I think you're starting to see what I'm getting at by now. Well, they do look like magic, and that is the main reason why I put "The Enchanted House" in the title of my thesis: I thought of the house in Beauty and the Beast. But what is rule number one of magic? Am I the only fan of sci-fi and fantasy here? Okay. All right, that is not rule number one; that is a very famous quote, I see what you did there: "magic is technology we don't understand". The correct quote is that any sufficiently advanced technology is indistinguishable from magic. Now, what is rule number one of magic? What happens if you... have you watched The Witcher? What happens when the witches gain their powers? You have to give something in return. You lose something. There is a price. That was a serious question: what is rule number one of magic? Now you know what my students have to go through. So, what is the price?
What is the catch, or the curse, behind the magic of Alexa, or Google Home, or the many other models? Well, first of all, Alexa works thanks to data about the environment and the actions we carry out. If you connect the smart thermostat, Alexa knows the temperature in the room. If you ask Alexa to change the temperature, Alexa kind of understands that you're cold. So there is knowledge, and this knowledge is very granular, because of the voice recognition and the voice profiles: they know exactly which inhabitant is requesting what inside the house. This goes beyond your smartphone. Sure, the smartphone knows it's you; you use your account, et cetera. But the smartphone usually doesn't know that you're increasing the temperature at your house, at least not directly. And the smartphone doesn't know if you're home, what you're doing, when you're getting coffee, et cetera. If you observe these kinds of data packages coming in and out of a house for long enough, you know when they're home, what they're doing, when their kid wakes up, when they go to bed, when they're eating, et cetera. This also means access: they are in the right place, your private sphere, your home, at the right time, literally always. Now, for persuasiveness, which is the capability of machines to actually influence our decision-making processes (not necessarily negatively; simply to have an effect on them), being in the right place at the right time and being reliable are two of the most important requirements, features. And now we literally have them around all the time. These principles were established when we didn't have any of this. As a matter of fact, they were established in the 1990s by B.J. Fogg, a behavioral scientist, the father of behavioral design. Then there is interaction. Voice is convenient. Voice is very convenient for certain people with disabilities. But it's also convenient for those of us who are just lazy.
It's very convenient for the elderly and for children, because it is more intuitive. But it also makes us anthropomorphize these devices, because by now it's impossible to talk about Alexa and say "it" instead of "she". The voice is enough. And the little quirks, the snappy remarks, the jokes, even the mistakes: they all keep us really, really entertained and actually make us go back to them. Also, our own voice can tell the companies a lot about us, involuntarily too. And then, persistence. They stay there for a long time. You know they're there. You know roughly what they do. You're not an expert, but you bought them; you put all the settings in. You know, unless you're in a creepy Airbnb. Gosh, always check the Airbnb for these things. Now, you know they're there; you put them there. With time, you start forgetting about what they do, and how. There is a brilliant piece by Kashmir Hill on Gizmodo called "The House That Spied on Me", where she lived for a month in a fully connected smart house. At one point she forgot she had the night security cameras on, walked naked through her living room, and then found out because her husband got the video. Weird, awkward. But they had put the cameras in; they forgot, because the devices moved to their peripheral vision. They look like our traditional coffee pot. They look like our traditional TV. You forget that they're internet-connected, if you really knew what that meant in the first place, which, apart from your AI classmates, many of us actually don't. Okay. So this is what's behind the magic of Alexa, or Google Home, or Siri and the others. I think I'm being too... see, the machine's already angry. Wait, I think... no, yeah, okay. Another short video: "My husband and I would randomly joke sometimes, like: I bet these devices are listening to what we're saying. Until recently, Danielle's home was completely wired up, in every room, with the Amazon Echo. Her family used the Alexa app to do everything, from turning up the heat to turning off the lights."
"But Danielle told us via Skype that two weeks ago, their love for Alexa changed, with an alarming phone call. The person on the other line said: unplug your Alexa devices right now. That person was one of her husband's employees, calling from Seattle. We went around and unplugged them all. He proceeded to tell us that he had received audio files, recordings of what was going on in our house. Danielle says her Amazon device in her Portland home recorded a private conversation and sent that recording to a random contact, which happened to be the employee in Seattle. At first my husband was like: no, you didn't. And he's like: you sat there talking about hardwood floors. And we're like: oh gosh, you really did. Danielle listened to the conversation, and she couldn't believe someone 176 miles away had heard it too. I felt invaded, like a total privacy invasion. Immediately I was like: I'm never plugging that device in again. I can't trust it. Danielle unplugged all the devices (it was one of these that sent it) and she repeatedly called Amazon. She says an Alexa engineer investigated. Our engineers went through all of your logs. They saw exactly what you told us, exactly what you said happened. And we're sorry. He apologized like 15 times in a matter of 30 minutes and said: we really appreciate you bringing this to our attention. This is something we need to fix. And did he tell you why it happened? He said that the device guessed what we were saying. When we asked Amazon questions, they sent this response: Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future. You know, husband and wife, when they're in the privacy of their home, they have conversations that they're not expecting to be sent to their address book. Gary Horcher, KIRO 7." So this was truly an unfortunate accident.
There must have been some misunderstanding, and some of the things that they said triggered some commands. Nobody's actually faulting Amazon for this: it makes sense, it can happen. But pay attention to what the woman said. She said it felt like their privacy was violated. People say private things in their homes. Now, a very common reply to this is: yeah, sure, but you put it there; what were you expecting? Which is quite funny, because we know by now that in the home, what we are expecting is a certain level of control over it, over whatever goes in and out of it, and a certain expectation of privacy, right? And the fact that we want the convenience of a certain device, especially as average users who might not be very tech-savvy, does not mean that, because we bought it, we need to expect that one of our employees in another city is going to receive some of our conversations, right? So one of the things that definitely happens when we insert these devices into the home is a mismatch between the expected control and the actual control. Because, if you remember the overall ecosystem that I showed you before: everything that we're saying while Alexa or Google Home is on goes out, not just what we tell them. When you see the light blinking, it means that whatever sound they're picking up is going to be recorded and stored in the cloud. So all of that is going out of the house and stored in the cloud. And also all the involuntary data and knowledge that we're sharing simply by using these things. Every time the smart coffee pot pings the servers of the company that makes it to say, hey, I'm being used, and it's six in the morning: they woke up. Or the smart TV: it's four in the afternoon, someone is watching cartoons; I guess the kid is back from school, right? So all of that knowledge is leaving the house, outside of our expected control, even though we bought these devices, because the technicalities behind them are quite sophisticated.
Now, there is a part about what the law can do about that. I can simplify it. Do we want to quickly go through that? Can I make it understandable for non-lawyers? Okay. And when you don't understand something, ask. So again, if we look at this, the law can intervene at different levels. One is definitely empowering the user: giving the user more information, forcing the companies to inform the users, giving the users rights. For example, you can ask Google for access to all the data they have about you; same with Amazon. Do you have a Kindle? No? Nobody? Everybody here loves paper? Okay, a few of us do. If you exercise your right of access under the GDPR and you ask for all the data that Amazon has about you, it's going to include all your reading on the Kindle: how long it took you to read a page, every single page. They keep all of that. If you exercise it with Google or Amazon and you have smart speakers, it's going to include all your recordings, which, I'm sure you know by now, you can access, right? You can also have them periodically deleted, et cetera, et cetera; all these functions are being added over time. But in general, the law intervenes at three levels. One is the national legislation: you have the constitution, you have national laws. The constitution protects your privacy too and gives you certain rights. Then you have the European Union, here in Europe. Well, some of us; 27 of us do. The European Union has the EU Charter, the Charter of Fundamental Rights of the European Union. It's like a constitution: it has our fundamental rights. Privacy is one of them. And, interestingly, data protection is a specific, express right, separated from privacy. That is a unicum: only in Europe are the two split, and data protection has been promoted as an independent right from privacy. Fascinating for us law nerds. Then you have international treaties.
For example, still in Europe, but not coming from the European Union: coming from the Council of Europe, which is not the European Union. There is, however, another Council, the Council of the European Union, which is European Union. Yes, we love to complicate our lives. There is the European Convention on Human Rights, the ECHR. It has rights that are substantially the same as those in the EU Charter, because the Charter came after and copied them, so that we wouldn't have any problem of compatibility, except for data protection, which we decided to separate. So there are a lot of fundamental rights. For example, in the EU Charter we have Article 7, which says everybody has the right to respect for private life, family life, home, and communications. Now, interacting with a smart speaker is very interesting from the perspective of private life because, well, they are in our private sphere. Family life? Yes, but it's more about private life in this case; family life is more about not having families separated, and such things. I do think it matters for our home, though. The right to respect for the home applies whether you are renting, whether you're there temporarily, whether you're squatting, illegally occupying: it doesn't matter. It says that you need to be able to enjoy your home, because it is part of your private sphere, undisturbed. So if there is pollution coming in, for example because they open an airport nearby, that's affecting your right to respect for the home. Communications, I guess, speaks for itself: it's about your phone calls, your emails, your letters (it started with letters), your WhatsApps. We assume it also covers things like voice over IP, like Skype; there hasn't been a case about that, but we can roughly assume it's also that kind of thing. Then, in Europe, we separate personal data protection, and Article 8, right after Article 7 on privacy, says that the processing of personal data must follow some principles.
It must be fair, lawful, transparent, have a legitimate reason, et cetera, et cetera. There are some rules. However, by separating them, in Europe what we are saying is: there are cases in which personal data, data about an individual, can be protected whether or not privacy is affected, and vice versa. And then there are cases in which both are affected. Now, my position, which is very obvious in this case, is that whenever we interact with a company like Amazon or Google and their smart speakers, we are in that last situation: because it is about our home and our private life and our communications, and because these devices literally feed on our personal data; otherwise they cannot function. Remember the voice profiles: your voice is your personal data. Whatever choice you make through them is your personal data. It's always associated with your identity, right? They know it's you asking things. So this is part of our fundamental rights, even when we are interacting with companies, because the law, the international treaties, or the EU Charter, as in this case, say that the state needs to protect us and our privacy even when we are interacting citizen-to-citizen, or citizen-to-company; those are called horizontal relationships. So what does the state do to protect us in Europe? They created the GDPR, the General Data Protection Regulation. That one regulates the processing of personal data done by companies, or by other individuals, on the data of natural persons, meaning humans. It's at European Union level, but because it's a regulation, it automatically applies in all the member states. Plus, in the Netherlands, for example, you translated it with the AWG... any Dutch lawyer here in the house? The AVG, okay. It applies when someone is processing data concerning an identified or identifiable person using anything digital, really: a computer, a phone, a server; digital, basically. What does it say? Well, it has a lot of basic principles.
Then it has some rights for the individuals whose data are being processed. You can access your data. You can have them deleted. You can correct them if they are incorrect, et cetera, et cetera. And obviously, the people managing and processing your data then have obligations, so that they have to respect your rights. And then there are fines, there are rules, there is an authority created at European level to have all of this respected, the European Data Protection Board, et cetera, et cetera. It's a big, long, extremely complicated law. The only example I wanted to give you from it, because otherwise it gets really complicated, is the basic principles of the GDPR. There is an article that says there are six general principles, and everybody that is processing personal data of another person must respect these six. Transparency: they need to inform people that they're processing their data, and for what reason, and how. Purpose limitation: before you process the data, you need to decide why you're going to process them. Because once you have decided the why, the purpose, then you make sure that you only collect the very minimum amount of data necessary for that purpose. The data must be accurate. You must make sure that nobody hacks them and that you don't accidentally leak them. And you need to keep them only for as long as it's necessary. Very general. And then there are other rules, obviously, to specify them, but everybody needs to comply with these. But what does it mean? Purpose limitation means that personal data must be collected for specified, explicit, and legitimate purposes. You can't do it for crime. Easy. Maybe. Depending on your business. And they must not be processed for other, additional, further purposes, unless those are logically connected. Which the law phrases as: unless they are not incompatible. Now, have you ever read a privacy policy? I love the chuckling. Please. No? Never.
But with the privacy policy, they also inform you of these specified, explicit, and legitimate purposes. They just do it their own way. Which means: not very specific, not very explicit, and very chaotic and complicated, so that you will give up. They have gotten fined for this reason. This is the Google privacy policy, which applies to the smart speakers too. It applies to 62 services of Google, so it's already not very specific. One of the purposes they state is: maintain and improve our services. We use your data to maintain and improve our services. Now, if you are a user of a smart speaker, reading this should make you aware of what is happening; you should be able to envision the consequences for your data. We found out that maybe that wasn't working super well when, in 2019, there was a tiny scandal. Google, Amazon, and Apple were doing something that is very common, which is: whenever there is a conflict, meaning the voice assistant, the software, has not understood the voice command, that tiny snippet of sound is sent to an employee or a contractor, a human, who solves the conflict and tells Alexa or Google Assistant: hey, this is what they meant, remember it for next time. Sort of. I am explaining it in simplified terms. These small snippets of recordings were sent to humans. The humans would listen in, fix the conflict, and inform the software. Mostly they were anonymous. Mostly. Sometimes they accidentally contained an assault. Sometimes they accidentally contained names. Sometimes they were connected to an address, so there was still some information in there. But overall, technically, we had been informed. This is the way they informed us, with this. Now, unless you are familiar with natural language processing, you don't know that this is a common practice. And so when Business Insider made this scoop, people were upset. There are articles and interviews saying: I felt violated in my privacy. This is a recurring thing.
I felt violated in my privacy: it comes up whenever something goes wrong. So the companies have changed a little bit since. Some put it as an opt-in. Some put it as an opt-out, which is also not compatible with the law; it should be an opt-in. So they ask: do you want to participate in the conflict fixing? And you can select it. With Alexa, you need to deselect it. They automatically put you in and you have to say no. But you see what I mean? This is the kind of way in which the GDPR, the purpose limitation principle, has been enacted, and yet we do find some problems along the way. Develop new services is my favorite one. They tell you: we might use your data. Well, we collect the data for the existing services, but we might develop new ones. And then the example is: how people organized their photos in Picasa, maybe you're too young to remember that, helped us launch Google Photos. That's a lovely example. These are two patents. This one is from Google; you can Google it, it's an existing patent. And Amazon also has confirmed they are developing emotion recognition from voice. Receive voice input, determine the user has an abnormal physical or emotional condition, determine the appropriate audio content, reply to the user. This is not exactly Picasa organizing pictures. But develop new services meant this. Also, they are presenting it as a health and wellness application, but once they unlock emotion recognition through your voice, they can use it for advertising, which is the true business for Google. Also, the psychologists and the AI people in the room really need to talk about it, because emotion recognition is not exactly a science; it slides into pseudoscience or prejudice or discrimination very easily. And I am Italian, so I know, because people think I'm upset sometimes when I'm not. I might yell. But anyway, these are examples of how the GDPR tries to intervene. It does a good job, because it forces them to explain things, but they try to elude it.
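The opt-in versus opt-out distinction described above can be made concrete with a minimal sketch. This is a hypothetical settings model, not any vendor's real API: the only difference between the two regimes is the default value of the human-review flag, so under opt-out your silence counts as participation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewSetting:
    """Hypothetical 'send misheard snippets to human reviewers' toggle."""
    default_enabled: bool                 # the vendor's default
    user_choice: Optional[bool] = None    # None = user never touched the setting

    def is_active(self) -> bool:
        # An explicit user choice always wins; otherwise the default applies.
        return self.default_enabled if self.user_choice is None else self.user_choice

# Opt-in: off unless the user affirmatively enables it.
opt_in = ReviewSetting(default_enabled=False)

# Opt-out (the default the speaker criticizes): on unless the user disables it.
opt_out = ReviewSetting(default_enabled=True)

assert opt_in.is_active() is False   # silence means no participation
assert opt_out.is_active() is True   # silence means participation
```

The legal point maps directly onto the code: with `default_enabled=True`, users are enrolled without ever having made a choice, which is why the speaker says opt-out defaults sit badly with the GDPR.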
And you might think: okay, but then everything is lost and the GDPR is not working. Well, truth is, because there is the original rule in the GDPR about purpose limitation, the authorities can see the privacy policy, and then they can see the patents, and then they can go to these companies and say: wait a minute, let us review what you're doing here. Okay? So this is still a very important threshold that's being established. Plus, there are also more detailed rules, but I'm kind of skipping them for the sake of your brains because it's a bit boring. One last thing: data minimization. Remember, once you have established why you want to process the data, you only collect the very minimum amount of data that you're going to need for that, and you only keep them for as long as it's necessary. But we saw that everything here happens in the cloud. And if Alexa or Google Assistant need to constantly refine their capability of understanding natural language, then the necessity is perennial: you have an indefinite need for these data. So how do we have this respected? The authorities actually clearly said: well, the fact that you might want to continuously refine your service, or find new services like emotion recognition, first of all doesn't mean that you can collect more data than you actually need. Once you actually have the necessity, you might collect more data, but you can't simply keep data because maybe in the future you can do something with them. So that was very important. However, I'm sure your AI classmates are now thinking: okay, but how am I developing the new services if I don't have the data? Well, you are going to go through R&D and collect other kinds of data. It is important to strike a balance between what the technology needs and our homes and the data in our homes, because that is a very sensitive sphere that we have.
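The storage-limitation idea just described, keeping raw recordings only for a bounded period while the assistant keeps what it already learned, can be sketched roughly as follows. All names and the 90-day period are hypothetical illustrations, not how any actual assistant is implemented.

```python
from datetime import datetime, timedelta

# Hypothetical retention period chosen by the user in the device settings.
RETENTION = timedelta(days=90)

def purge_old_recordings(recordings: list, now: datetime) -> list:
    """Keep only recordings younger than the retention period.

    Each recording is a dict like {"captured_at": datetime, "audio": bytes}.
    Note that the trained model is untouched: deleting the raw audio does
    not undo what the assistant already learned from it, which is exactly
    the 'remember what you have learned, but delete the older data' idea.
    """
    return [r for r in recordings if now - r["captured_at"] < RETENTION]

now = datetime(2022, 6, 1)
recordings = [
    {"captured_at": datetime(2022, 1, 10), "audio": b"..."},  # older than 90 days
    {"captured_at": datetime(2022, 5, 20), "audio": b"..."},  # recent
]
kept = purge_old_recordings(recordings, now)
assert len(kept) == 1 and kept[0]["captured_at"] == datetime(2022, 5, 20)
```

The design choice worth noticing is that retention is enforced on the raw personal data, not on the derived model: that is how a periodic-deletion option can coexist with an AI system's ongoing need to learn.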
And the storage limitation: because of this principle, there is now an option in some of the smart speakers where you can set a periodical deletion of the recordings. It's like: you know what, remember what you have learned so far, but I'm deleting the older data. So these principles are actually becoming very useful with the kind of constantly moving targets that the AI discipline has. That's it. So, what do you think? Questions? Yes. Oh yeah, we also have a microphone. Very well equipped. Also, I'm going blind. Yeah. Hi. Thank you for your talk today. Recently I've been feeling like a lot of tech companies are saying they really care about privacy, especially someone like Apple, which is kind of taking privacy as one of their unique selling points now, compared to Amazon, for example. So what do you make of this? Is it just marketing, or do you really see a shift into them feeling that privacy needs to be more of a concern? Very good question. So I think there are two things to consider within a company. First of all, a company is not one brain. It's the brains of all the people working in it. And for sure, the marketing department is the one that says: oh, this is an amazing marketing slogan, we're going with it, right? But that doesn't mean that some of the engineers and the legal department are not on board with privacy for other reasons. However, sometimes their definition of privacy, their definition of what is a processing of personal data, their definition of, for example, necessity, when you need data, is different. Because this discipline, the GDPR, is the follow-up of an older law, a directive from 1995, and the GDPR only became fully applicable in 2018. So it's relatively new for the law, and people had to adjust to it. And the adjustment is bigger than we think.
So historically, when we talked about privacy with people actually working in computer science and information science and AI and programming, we were historically talking about confidentiality, meaning protecting your database from hacking and leaks. The idea that privacy involves the home, the family life, the private sphere came way later. So we really need to align knowledge here. For example, processing for the GDPR means every activity you do with the data: the collection, the storage, and actually manipulating the data. But for computer scientists and programmers, without storage there is no processing. So when you have someone in a company reading the GDPR and seeing processing, and they're only collecting data and maybe not storing them, or storing them in an unstructured way, they think: oh, that's fine, we're not doing that. But actually, under the law they are already doing that. So an alignment of values behind those competences also needs to happen more. And then there are also those companies that objectively don't care. I won't name names, but I'm only going to say I don't have any of these devices at home after four years of studying them. No, but there are also companies that have other interests. There are also companies that think that maybe privacy is important between the citizen and the state, but not between the citizen and the company. I do think there is also a lot of the perception that privacy is lost, that it's gone and we don't have privacy. I feel that's because we are actually discussing it. It's not true. We are simply having a public debate as a society, as a community, each in their own region or country, deciding what is the threshold of privacy that we are comfortable with with these new technologies. All right, maybe another question as well? Hi.
I wanted to ask: whoever bought one of these devices gives consent to the privacy policy, but there are other people at home too. Are they covered by that privacy policy? And we can have guests at home, or workers, and the device listens to them also. I find that problematic. Yeah, I find it problematic too. So first of all, the devices mostly work with what is called the household profile, meaning that you have an umbrella account and then each individual who wants to use the device can register their voice profile, and they will be recognized, identified through their voice. It's basic biometric identification through your voice. And then if you ask, for example: okay Google, tell me my calendar for today, Google is not going to read out someone else's calendar in the house, right? So then you participate. There's also a problem with children, because technically it's the parents that agree for them, but what if the child, growing up, decides that they didn't want that? No child objects whatsoever, because they love these devices, by the way. But so there is at least the possibility to have different profiles in that sense. If you are sharing a house with one of your housemates and they want the device and you don't, you really need to have a talk, because if the device is running and the light is on, whatever sound is picked up by its sensors, including your voice, is being recorded. Even if it's not identified, you're still being recorded. And remember what we said about spending resources and time and mental and physical energy to fix a situation that doesn't match the level of control that you're comfortable with. It means that you might start lowering your voice. You might start avoiding certain rooms. And that's no longer a good feeling in the house, in your own private sphere, right?
And for the guests: the risk, and it's not clear yet, but the risk is that the person who has the device and has the guest over can potentially be subject to the GDPR as a controller of the data, not as the person whose data are being processed, because you are the one activating it. So in a certain sense, you're kind of establishing the purpose of the processing of the data of the guest or of the worker. You're correct. So it's not clear yet whether or not they're going to have this humongous responsibility. My suggestion is: we have etiquette and we have social norms. If someone is coming to your house and you prefer to have their shoes off, you tell them. So it's very important that if you have guests, you ask them, you tell them: listen, I have one of these devices, look, it's here. Do you want me to mute it? Or are you okay if I use it to put on music? So it's a bit of a courtesy, because their voices are going to be recorded accidentally anyway and stored, unless you delete them manually. But we don't know yet; it's uncharted territory, right? And a lot of people don't know. I spent a few days with my in-laws in the States and they had one. I had not seen it at all, and I had been cooking several days in a row in the kitchen. And then I was talking about my thesis back then, funny talks, and I said Alexa, and the Alexa turned on, and I was like: oh, it's been there the whole time. Nobody told me, right? And I was like: I wish I had known, honestly, for obvious reasons. So I do think we also need to develop a little bit more sensitivity. In 99% of cases nobody's really going to complain, but it's worth asking. All right. I think we have to end it there because it's time. I might have run over; I promised a shorter presentation. That's okay. That's fine. Yeah. Thank you very much again, Sylvia, for the lecture. I think it's really striking that you say that you don't have one of these devices at your home after four years of studying them.
Yeah, it also really struck me that we bring these devices into our homes, into our most private spheres, and that we don't really know exactly how they work and what's behind them. So very good to hear about that today, I think. Also, join us for our next lectures that we have coming up. We have many to come in the month of April, and also stay tuned for Night University, which is coming up in May, on May 19th. And thank you again for joining us today, and have a nice evening.