Let's see how it goes. It's been charging. Great, go team. I'm going to be reading or describing some of the slides in case anyone has vision problems or is later watching a video and having trouble seeing what's on the screen, or unable to see what's on the screen. I would also request that people don't interrupt me during the talk. There'll be time at the end for questions. So let's make a start. I want to briefly mention why we're talking about this. When I pitched this talk, the conversation we were having around internet of things devices was very different than the one we're having today. This is because there have been a number of cases of people having their privacy and security violated by IoT, especially surveillance devices in their homes. However, most of these conversations are focused on the individual. They're thinking about your family or yourself. And I think we really need to look at how these things are impacting society at a larger scale and how they're impacting people who are different from us. So I'd like to start by talking a little bit about myself. I work in free software. As we said, this is a screenshot of the GNOME desktop environment. It's pretty cool. I assume many of you use it. At least I hope you do. And I'm a fan. I think it's a great project. In addition to working in FOSS, I also do a lot of other things. I'm on the OSI board. I am a member of the Debian project. I volunteer for the Conservancy. I donate to organizations like Public Lab that are working on free and open source software projects and developing tools that can be used by people to create social and environmental change. And in addition to all of that, most of my friends are people I know from free software. In what little time I have left, I like to not think about technology, which isn't actually true, because I spend most of my time, in addition to that, thinking about technology and ethics and societal impacts.
But I like to sometimes think that I do other things. I like to hang out with my dog and my cat. This is a rare photo of them cuddling me. That is the dog Shiloh and the cat Bash. They are both wonderful. I also really like to bake. I bake basically everything I can. In this photo, in the slide, you can see a cake I made. There are some American biscuits, muffins. I made some kouign-amann that didn't quite work the first time, but it's a lot better now. And pancakes, which aren't necessarily baked, but are still good and kind of count. Because I work from home now, most of the baking goes to my roommates. I have three amazing roommates. I've lived with two of them for about seven years now. We have a really good relationship and we get along well. We were in Boston, which is why, when they bought a house in Providence, it actually made a lot of sense for me to go with them, because living together was so lovely. We have one other roommate who moved into the apartment in early September: Bean, who is a baby. Not my baby, but still a baby. She's, as of right now, eight months old, as of a few days ago, and that's very exciting. It's really exciting because Bean was born at 24 weeks. To provide a little context for that, in the US and in many places, if a child is born at less than 24 weeks, they aren't given lifesaving, extreme medical interventions. The youngest child known to survive was born at 21 weeks. This is not a picture of Bean; this is a 26-week baby that I found on the internet. If you're putting pictures of your sonograms and fetuses and babies on the internet, I think that's a little weird. But it's cool if that's what you're into. They can't consent. They can't consent to having a photo on the internet yet. Another note is that 40 weeks is considered full-term.
So Bean was about halfway through the process of becoming a person who could survive in the world. This means that this child is a special needs child. She has had a lot of complications, interventions, and medical appointments to make it so she can be okay. Two of us work from home, which is really great because, like I said, even though she's not biologically related to me, I help a lot and I'm very attached to this child. She's puked on me twice and I still like her. So we're able to provide a lot of the support, and help her do things, and keep an eye on her, because when you have a special needs child, a lot of nurseries and daycare centers won't take them because of how much care they need. And also it's a bad idea because many of them are immunosuppressed, and their immune systems are especially weak, so they can't be around other children. The number of times Bean has left the house for anything other than going to the doctor is, I think, three or four times since she came home. One of the things she had when she came home was an oxygen tank. This is a picture of a dog with an oxygen tube up its nose. When people are on oxygen, they have a thing called a cannula that goes to their nose. I called it a cannoli for a really long time because I couldn't remember the word cannula, and I practiced it for this talk, actually. So Bean was on oxygen, and what that means is we had this big machine that took air and converted it from what we're generally breathing to a much purer oxygen level, and then that was given to the baby. She additionally had connected to her a big box with a bunch of sensors that let us know things like her heart rate and her blood oxygen level. In one sense, this is actually really convenient, because if there's a problem with the child then you know, because it starts beeping. But it also starts beeping all the time. Therefore, going outside is a really big deal.
Leaving Bean alone, and actually even being alone with her, is a big deal, because you're taking on a much greater responsibility than you would with a regular child. It's possible that her poorly developed little lungs would stop working and that she would need to be rushed to a hospital. That is not an uncommon thing. So for going outside, we use oxygen tanks. In the house we had a bunch of oxygen tanks, and taking her outside would mean disconnecting her from the oxygen concentrator, connecting her to an oxygen tank, putting the oxygen tank in the special oxygen tank bag that you carry on your shoulder, carrying the baby, and carrying the sensor. This means you can't carry a lot of other stuff. Now I'm gonna describe one of my worst nightmares for when I was alone with her: the doorbell ringing. We live on the second floor, and if the doorbell rings (see, IoT, it's coming up), that means somebody has to go downstairs and answer it. And if you're alone, that means leaving the baby alone, and I don't actually know at what age you can go downstairs without the kid, but I'm really scared of killing my roommate's child. So I definitely carry her around the house when I am alone. Something as simple as answering the door would mean connecting her to this oxygen tank, taking the sensor, taking the baby, taking the oxygen tank, going downstairs, and opening the door. So I think things like tech could probably help us a lot and make a really big difference in our lives. I specifically wanna talk about smart doorbells. It would be great; smart doorbells are so cool. There's an episode of The X-Files, in season one I think, where there's this guy who has this fully automated house that's actually evil. Ever since I saw it, other than the evil part, I was like, that's so cool, I want one.
Also, I've seen a lot of horror movies, so I'm a little paranoid about answering the door when I don't know who's behind it, because it could always be someone insidious. Therefore, the idea of a smart doorbell is great, especially for us: we could see who is there. We would know if it was an encyclopedia salesman, if those still exist, but we would also be able to see how we would need to respond. Could we open a window and yell? Would we need to go down? So there's a little more space to feel safer in our decision making. The conversations we're gonna be having for the rest of our time together are not necessarily about tech. They're about the role tech plays in our lives. Specifically, we're gonna be talking about smart doorbells, smart locks, and, in general, home surveillance devices: what that looks like in practice, how that affects you, but also how it affects the people in your lives and your neighbors. This is not a dog, but I loved this photo so much, because it is a horse with a beware-of-dog sign on it. I think home surveillance devices are kind of creepy and really terrible. There have been numerous cases of things happening with them that are not good. I think one of the most well-known ones is when a camera that was set up in a child's room was hacked into. We're not gonna get into right now how creepy it is that the child's parents put a camera in her room, though I have heard arguments that I could understand were compelling for why one might do so. And that was hacked into. What happened was somebody was talking to the girl, trying to make her do things, and using really inflammatory, uncomfortable, and offensive language with her. And that's really scary. And that's really scary for parents. So there's that kind of implication that it has, but there are also ways that it affects the people around you.
One of these: I don't know how many of you saw the Ring commercial that they made about Halloween, where they used video from cameras that were recording trick-or-treaters. A lot of this is coming from weird licensing agreements that come with these devices that require you to allow them access to some of the footage. As an aside, generally this footage is stored, but it's not stored on a server that you have in your home; it's being stored by a third party. And as we know, the cloud is just other people's computers. So you're making a decision about trusting these actors, who have financial and commercial interests in exploiting you, with this very personal information, and hoping for the best, right? One of the things they're pitching is creating a new neighborhood watch, right? The neighborhood watch is when people in a community take turns keeping an eye on things to make sure nothing bad happens. But when you digitize that, you're creating a much broader scale of what's being recorded. And you're recording things. You're not just noticing things, right? To encourage this development, we see things like the Neighbors app, or the Neighborhood app, I forget, I'm sorry, which allows you to make reports to people in your neighborhood who also have the app and ideally also have Ring devices. This creates a higher-tension neighborhood. But it also creates, in general, more monitoring. And these sorts of things lead to chilling effects. Something I'm really interested in, as an aside, is these ideas of benign acts of rebellion, especially among teenagers. We have these cultural narratives very focused on things like sneaking out at night or sneaking in too late, right? One of the things that circulates around the internet every Christmas is about a family that teaches their children to leave people presents around the holidays. And now doing that is suddenly a thing you can't do without people knowing who's doing it.
And that might seem a little silly, but one of the things we're doing is creating a generation, and future generations, of people who won't have access to those experiences, won't understand them, and in general will be much more used to being tightly controlled and monitored at all times. And once something is set as a precedent, it is a lot easier for other people to follow through on that precedent in the future. It creates a group of people who are complacent with the idea of being monitored, and not just being monitored but being recorded, right? So there are these minor chilling effects, but also, knowing you're being recorded does discourage people from taking actions. And this isn't just taking bad actions and being bad actors. We can see how CCTV, closed-circuit television, is really effective in keeping people from committing crimes in stores. But it also discourages things that are benign, or things that are benevolent and good. Something else is police partnerships, right? Amazon, with the Ring device, is really encouraging police partnerships. This includes police going around neighborhoods and encouraging the adoption of Ring devices, and Ring offering discounts for people who purchase in connection with the police force or law enforcement. So, like, financial incentives. And law enforcement organizations are also being offered information that they're not supposed to be given, including things that Ring says they're not going to share. One example of this: during the Ring sales pitch, there's usually included an example, a demonstration or a slide, that gives a sample of what it looks like in the neighborhood. So it alerts the police to which neighborhoods have Ring devices.
So before they've even bought in to the idea, before they've even bought in financially or psychologically to the idea that they now have an investment in what's going on and that they deserve the right to the knowledge of who is doing what, they're given a pre-example of what's happening. One of the things that comes out of this is false arrests, and an increase in false arrests. There's at least one case of a black man in the United States who was arrested because he was falsely identified from Ring footage. And that's really scary, right? One of the things is that people in general are much worse at recognizing people from races other than their own, or people from races outside of the communities they grew up in. So in general, when you're trying to connect things, you're going to have a harder time making visual identifications. But you're also going to have problems with facial surveillance, or facial recognition. Now I would say don't even get me started on it, but I really want to talk about it. There are lots of recorded problems that come with facial recognition software especially. One of the most common is that it's bad at recognizing people from, I'm going to say minority groups, but they're not actually minorities, especially when you look at a global scale. What they really are are people who don't look like the majority of the developers working on these technologies and this software. One of the classic examples of this is that for a while, Google's image recognition software was identifying black people as gorillas. And the last I heard of attempts to fix it, what they actually did was, instead of fixing it, they just removed the ability to identify gorillas from their image recognition software. There are also problems recognizing women, indigenous people, and other people of color.
Trans people have a lot of problems with different types of image recognition, especially when it's trying to divide people into a gender binary, which is a thing that happens. So I want to say: Google, if you did fix the problem with the identification of gorillas and black people, thanks a lot. And if you didn't, I'm going to ask why you haven't fixed it yet. And I think that really should be a priority, because if a technology does not work for everyone, it doesn't work. One of the things that is the dream and the promise of free software is that we're gonna be able to build technology that works for everyone: not just the people developing it, not just the idea we have of who is using it, but anyone who wants to have access to it. We're designing software for people in all countries, people of all languages, people of all abilities. When we talk about the practical benefits of free software, we talk about things like the ability to specialize technology. We talk about the ability to make modifications. We talk about how you can make modifications that make it easier for people who have movement issues to use something, right? I learned recently about capacitive touch recognition, so, like, using touchscreens, and the way that mobility issues tie into that. Ideally what would happen in response is that a community would take that on and make those changes, because they're able to, right? So there's this beautiful promise that we have for the future of technology and the future of building this wonderful, equitable world that comes from free software. And when our technology doesn't work for everyone, it doesn't work for anyone. When I was researching this, Ring did not use facial recognition software. However, they had this guy. This guy, according to the internet, is the head of facial recognition at Ring. So it seems to me that they have a plan, at least, to fully incorporate that into the way that things are working.
There could be some benefits to this, theoretically, where you can have your doorbell recognize the people who live in the house. Oh, that's the next slide. You can have people installing smart locks and then have those smart locks recognize who lives in a house and give them immediate access to getting inside, without having to use a key, without having to store other biometrics. In general, I think a lot of us view our faces and our gaits (because there's also gait recognition, and that's being developed) and our images as things that are a lot more public and that other people just have access to. In a number of places, you lose the right to privacy and to not being photographed once you're in a public space. So suddenly it's these things we value less that are being stored, and that feels nicer than having your fingerprints stored. One of the games I like to play is to look at a technology and ask: how can this be used for a horrible dystopic future? And when I think about facial recognition, and cloud storage that is famously and commonly extremely insecure, you suddenly have this wonderful collection of faces. And now we have deepfakes, which I learned about recently. I mean, I knew what they were, but I was like, oh, deepfakes. Now that we have a really incredible ability to fabricate images, we have the ability to fabricate images with this data we have, right? And I think that's actually really scary, all sides of this. Also, tying back into the fallibility of facial recognition software, we suddenly see a greater likelihood of people being unable to get into their homes because they don't look like the people who are developing these technologies in the first place. Most smart home devices and IoT things in general are installed by men. We're transitioning a bit. And that's just a thing that happens.
This is not true of all cases. I certainly want to give acknowledgement to women and non-binary people who are taking on this ownership themselves, but the truth is that most of them are installed by men. This is where I'm gonna talk about abuse and domestic violence for a few minutes. So if that's a thing you're not comfortable with, that's cool, and you should leave the room or, like, earmuff. The majority of intimate partner violence, IPV, is perpetrated against women. This means that, by and large, when you have domestic abuse happening, when you have people physically injuring their partners, it's being received by a woman. And male-perpetrated intimate partner violence results in more injuries. In cases of intimate partner violence where a woman hits someone, there are in general fewer physical repercussions than when it's done by a man. Additionally, the majority of stalking victims are women. I don't have this number in my notes. I forget, but I think it's something like, oh God, I might be mixing up rape and stalking statistics. No, I don't know the stalking statistics off the top of my head. I'm sorry, I'll look it up later if you're curious. I have it somewhere. So stalking: that's a thing that happens, especially to women. The majority of rapes are perpetrated by men, and the majority of stalking is done by men. And the majority of the victims of these things are women. In terms of rapes in the United States, about one in four women has an experience of being raped over the course of her life. On the other hand, I think it's about one in 19 men. Those are terrible, horrible statistics and they make me really sad. And they're much worse than they should be, but those are the numbers that I've found. So things like smart locks and IoT devices in general enable a new type of abuse that we hadn't seen before.
Some examples of what has happened to people: there are big things, like a woman whose stalker turned off the brakes in her car when she was on the highway. He did this remotely. He did this because he had access to her system. He had hacked into it. He was able to control this. He was able to track her using the GPS in the car. And he shut off the brakes on the highway. Really scary. We have had cases of what might seem like minor domestic abuse, not violence, but abuse: things like people controlling the temperature where there are smart home thermostats, alternating the temperature between being really hot and being really cold, especially at inappropriate times. This creates discomfort. Things like controlling sound systems in homes, which creates a space where maybe you can't sleep, right? Something recognized as a form of torture is preventing people from sleeping by playing loud music or loud noises at regular intervals, to prevent them from having the opportunity to sleep. So these are a few things that I know have happened. And here are some things I don't know if they have happened yet, but there are opportunities to do things like, say, lock someone out of their house, or unlock the door to their house. So let's say you have a stalker or a violent partner that you're trying to hide from, and they can get into your home whenever they want, right? This can be to infiltrate your personal space, but it can also be to harm you. A lot of these kinds of things in general are affecting our sense of security in the safe spaces we do have. And we have very limited safe spaces, especially as more surveillance is happening. We have very few places where we can be alone, and we have very few places where we can be secure, right? When you lose the opportunity to have that in your own home, you're giving up that kind of space for yourself. You don't feel at ease.
One of the things that women who've been the victims of IoT-enabled abuse have reported is that getting rid of the devices is very freeing. So we have this technology that is actually really cool and can make a big difference in the lives of people, especially people who have mobility issues, who have vision issues, who have hearing issues. They allow you to connect with the world in these much more intimate ways. But these technologies also have the ability to really hurt people. And those aren't the conversations we're having, and those are the conversations we really need to be having. So right now, I hope you're asking yourself or thinking: this is terrible, why did I sit through 30 minutes of you making me feel bad about the state of technology? There are things you can do to make a difference, especially as people who are technologists, especially as people who are invested in technology, who are working for nonprofits, who are activists, who are involved in policy and making changes, but also just as citizens, because as citizens of where you live, you have power, right? One of the best things you can do is just be thoughtful. Think about what you're designing, think about what you're building, think about your community, and make sure that the policies being created, that the laws being created, that the technology being created is equitable and safe. You can't necessarily make parental controls on an operating system unambiguously good: one of the things that could potentially be a problem with parental controls on any type of system is that they can isolate GLBTQ youth. So that's not good, but there's a lot of good that comes out of them too. So you might not have answers to these questions, and that's okay, right?
At the moment, we don't have answers to all of these questions, but if we think about them, if we attempt to incorporate these ideas, and if we attempt to build things like knowledge sharing and education into both our development and the use of these technologies, that's a really great starting point. Something else you can do is put pressure on people and governments when they're designing technologies and policies. Pressure campaigns can work. They have been successful for different things. I would really like, at this point, to draw attention to the organizing that Google employees have been doing around their rights and around labor practices. There's, like, equitability for women, terrible things people are doing, blah, blah, blah. Oh, Dragonfly, that's what I was thinking of. Project Dragonfly was when Google made a deal to build a censored search engine for China, for use in China. That was a thing that Google employees put a lot of pressure on the company about, and the result was that the project was canceled. In Massachusetts, in New England, where I lived until very recently, facial recognition bans have been passed in different places in the state. And that's really incredible and really wonderful to see. The work has been done by organizations like Fight for the Future and the ACLU, but a lot of the work has also just been done by individuals who care, who want to make these things happen and who want to see a difference occurring. You can also just make your IoT open and free, right? All those benefits that we get from having free software hold just as true for IoT. I know that after this, there's going to be a talk about the ethics of AI. And I think that really ties into this when we're looking at things like image recognition, voice recognition, gait recognition, and how that's being incorporated into IoT devices.
And by having these things be available and auditable (auditable is a really big thing here for me), you're creating a dynamic where people are able to make these changes, to know what they're using, to know what they're consenting to use, to know what they're putting into their homes. And I think that's really powerful. But you're also making space for people to make the kinds of additions and modifications to these technologies that make them safer for their own homes. I think the argument that, oh, you can modify this technology if it's free, is not the best one, because that's great, but a lot of us, myself included, don't have those skills, and I don't have the resources to pay someone who has those skills to spend the time developing something for me. But I have found that there are people out there who really care about what's going on and who will make those kinds of development choices. Something else, which I know Matthew Garrett has been doing, is IoT security testing, and then writing Amazon reviews explaining the problems. Now, there are provisions to allow security testing and finding vulnerabilities, but it's a lot easier if the technology is available under a free and open source license. Something you can do is investigate and understand software and what's happening with it. Like I said, I like to play this game where I think about the dystopic future and dystopic uses of software and technology. You can play that too. We can play it together. It'd be fun. I think it's really fun, at least. So think about the way that the things you're building are impacting people. Look at the papers and the research around image recognition technology and where it's failing. Find out the spaces where things are being used for abuse, and find opportunities to take that into account in your development. I feel very strongly about this largely because I think a lot about user autonomy and user consent.
We want people to be able to choose the tools they're using from a place of understanding what's going on with them. And if we can't understand our technology, we can't truly say, I'm fine interacting with this. That's kind of important. I certainly know people who are uncomfortable being in homes with IoT devices that listen, because you don't really know what's happening with them, and there are lots of terrible stories about things being recorded and things being shared and responses and advertising and marketing. So knowing what's happening allows us to make decisions about what we're using ourselves, knowing the potential risks of what might happen. If someone could have the ability to remotely turn off the brakes of my car, I might not choose to buy that car. Boop. You can also build freedom into your jobs and into your contracts. Shameless pitch: you have the opportunity, when making contracts with employers, to ensure that the work you're doing is free and open, right? You can get your coworkers together and use those pressure campaigns to make differences in the stuff you're building and in your jobs, which will also help make sure that the things you're developing are useful for the future. Something you can do is build on top of other tools that already exist, especially ones that are free, because this enables the continuation of those license terms, especially under copyleft licenses. I think that's pretty cool. You can also support organizations that are doing work on this, like the ACLU or whatever your local equivalent is. These are organizations that provide support, advocacy, activism, and also legal support in taking things to the next level when necessary. The Electronic Frontier Foundation is also involved in these kinds of activities. So by supporting orgs, you're helping to make these things happen. You're helping to make sure this research happens. You're helping to make sure this awareness happens, right?
Like, it's pretty cool that the nonprofit I work for is paying for me to be here to talk to you. And we're, like, educating people, I hope, right? So together we can build a better future for technology, for ourselves, for our neighbors, for society. And that's wonderful, and it makes me so happy that that's a possibility. And I'm really glad the microphone didn't fly off my face when I did that. So that's what I have. If you have any questions, we can talk now. We can talk later. I don't have my email address up here, but you can reach me at molly at opensource.org or mdeblanc at gnome.org. That's my Twitter handle again. And that's my name, in case you forgot it or want to know how to spell it. So we have about 10 minutes left. I'm interested in questions. If you want to make comments, please come find me later and we can chat. It'll be great. I'll be around here for about half an hour. Then I have a meeting; I'll be around more time after that. And if you're really nice, we can even plan a meeting. So, questions. Anyone? Oh, clapping. I like clapping. I know we already have the first question, but I'd like to especially open the floor and encourage questions from women, people of color, people who come from outside of Europe and North America, and non-binary folks, just because they're less likely to ask questions. Hi, so thanks for your talk. Thanks. I have a question. Do you know of any existing solution for home surveillance which is open source and open hardware? Open source alternatives to what? Or open hardware. Oh, open hardware. So some of that exists. I know FOSSASIA is working on a home assistant right now. I don't know a lot about open hardware, though. I mostly focus on software. So I'm sorry I don't have a better answer for you. I see a question here from one Deb Nicholson. And I see a lot of people moving around, which is cool too, you know, voting with their feet. Oh, and a question over there in the corner. Not corner, but side.
Hi, my immune system tries to murder me, so this is very dear to my heart. Are there any projects or organizations that have a disability-focused lens on this issue? Because you mentioned a lot of accessibility reasons why people would use IoT devices, and a lot of those discussions are being drowned out by open source and privacy purists who are not even interested in that conversation.

I don't know of anyone focusing on accessibility, actually, especially around the relationship between accessibility and free and open source software. I have found a lot of really great support and responses to questions from #a11y Twitter, and someone there might know something more.

Hi, thank you for your talk. My question is about intersectionality: when we're engineering solutions to all of our problems, how are we supposed to prioritize which protected characteristics we go for first? Because it's not always possible to have an all-encompassing piece of work.

So I think one of the most important things for building solutions is to try to build diverse teams of people who are working on your stuff. Sometimes building diverse teams is really hard, for a variety of reasons, and sometimes it can even feel impossible. I have lots of suggestions and ideas about how to do that better, and other people do too. A proposal that came up recently was consulting: the idea of hiring consultants to come into your projects to help you understand these things, to do audits, and to provide you with the resources to make those differences.

Sorry, I'm having trouble hearing you. For me, I absolutely don't share your enthusiasm about these kinds of things. For me, there are a lot of disadvantages. You said nothing about electrosmog. You like, you love some animals, but bees are very sensitive to this kind of thing.
And among humans, also, there are electro-sensitive humans, so they cannot do anything in a context of electrosmog. I think this is a really good point, but is this a question? Do you have a question? Yes, there is also the problem of privacy. Can you jump to the question? So, the power of transnational companies, Big Brother in Hong Kong, the government of China being able to spy on everybody. All these kinds of things. You don't think there is some advantage? But for me, instead of using WiFi, you could also use infrared communication. Do you have a question? No, I have no question. Okay, great. Let's move on to the next question. That's what I would like to focus on. I see in the front we've had a hand raised a few times.

I do actually have a question. You mentioned that the recommendation, when stalking is happening via some of these IoT devices, is just to get rid of them. Is that advice from law enforcement? Or do you know if there's any work being done to get law enforcement to understand this problem a little better?

I would not say that it's the official, definitive recommendation and the only one. It's something that people have reported doing and having success with, in terms of the psychological benefits of getting rid of their devices. I know that Eva Galperin from the EFF is doing a lot of work, especially with abuse survivors and abuse victims, on how to do a better job securing their stuff. She gave a TED Talk recently, which I have not seen yet, but I'm excited to watch it at some point. I think there are other people interested in it. I know I have had initial conversations with some women's shelters that are interested in talking about these topics more.
Hi, so you mentioned consulting as a way to make better products that take these things into consideration, but so often the responsibility for making things better and more inclusive lies with the people who have had these often awful experiences. Are there any resources or trainings out there that people who haven't experienced these things could look to, to try to get better at this without relying on the people who have experienced them?

These are great questions, because they're hard, and most of the answers I have are like, "That's a great idea, I don't know." What I've been doing is reading more books relating to these topics to educate myself. I've been working my way through So You Want to Talk About Race and Yes Means Yes. Yes Means Yes is a book about consent, specifically sexual consent, but it ties into a lot of these issues as well. And I think that's a good place for all of us to start. I hate saying social media is the answer, but there's also following diverse people on social media, especially diverse people in tech, because they will talk about these issues all the time, since the issues impact them personally. This is still those people taking on the emotional labor of trying to make these changes, but it's not you expecting them to give you a 101 course every time.

So is there anybody actually compiling a list of recommendations of products that do work well, in terms of actually not just giving your information away, and helping people to act? We could really do with good recommendations to give to our friends and family about the right products that are not just going to fail in these ways.

I know that there have been academic studies on security issues around devices, so that can provide you a list of what not to go for, which is very different from what's better.
I know that Eva Galperin (talk to me later if you know how to pronounce her name) has been developing resources focused on IoT safety and risk mitigation, and that's a really good place to start checking those things out and learning about them. I'm afraid we're out of time. Great, thanks so much everyone.