My name is Jane Prusakowa. I'll be talking about privacy. Thank you for coming. This is the weirdest space I've ever spoken in; I haven't done anything like this before. I usually speak at user groups, which is a small room with desks.

So, privacy. I'm not going to tell you to delete your Facebook account. This is not that kind of a talk. But somehow, 15 or 20 years after we got Google, and Facebook, and Twitter, and all this nice stuff, we have a society that runs on a surveillance business model. So let's talk about that.

I work for YSAT. I've spent 10 years as a consultant, and the rest of my career working as a product architect and developer. And I'm interested in behavioral science, which is about people. My experience in technology is that technology becomes easy. People do not. People get more and more complicated the more time you spend around them.

So let's talk about why we should care about privacy. This talk was inspired by a very interesting book, Data and Goliath by Bruce Schneier. Have you heard of Bruce Schneier? Great. He is a scholar; he works, I think, at Harvard University, among his other titles. He has written very interesting books, and he is an original implementer of many security features that we still use. But he's also a big thinker about what privacy is.

So surveillance is a business model. We know all these logos. We all live with them every day, and this is our normal. This is free; you just have to tell them who you are. It's not new. We've always had surveillance. Nielsen reports have been going out since the early 20th century, where people filled out what TV programs they liked, where they lived, and what their phone number was, if they had a phone number. You have the post office. You have to pay extra to opt out of the yellow pages; you will be listed if you don't do anything about it. That's the default situation. And then there are credit reports. These people know what you buy, how much you pay, and when you pay for it.
Store loyalty cards. Anyone have one? Anyone put their name on their card? But somehow we went from this to this. Anyone here who doesn't have a smartphone? This is a DevOps conference. Excellent.

So our smart devices, which are only incidentally phones at this point and mostly computers, collect pretty much everything about our daily life. They capture what we type. They know what we query and what we care about. They know who we talk to and who we get email, texts, or phone calls from. And finally, they know what we're buying if we're using Google Pay, Android Pay, or Apple Pay, or any other kind of pay that's built into the phone. There is self-censorship on the slide, which means you typed it and you erased it. Software can capture that too. It doesn't exist for real, except it does: it got recorded.

It started innocently enough. Back in the 90s and early 2000s, suddenly we got a useful internet which provided entertainment, which was fun. Let me stop for a second and say that there is no Q&A section at the end. Please raise your hand and I'll try to see it. It's OK to share comments, because this is the stuff of life; I'm not teaching you technology here.

It started innocently enough. We shared what TV we liked with Nielsen. We started telling what we liked to read. We started researching our travel plans so the sites could collect this information. Now it's everything. It moved to appliances, so it knows about our dirty laundry and what's in our refrigerator, all the way to full cloud control. Now you can have the cloud controlling whether your locks are closed or open, whether your air conditioning is running, whether your dogs get food, maybe. And the next one is what you're talking about. Anyone have an Amazon Echo or a similar listening device? Does it listen a lot? There was a story on NPR about a parent who introduced an Echo Dot to her three-year-old. The three-year-old started treating the Echo Dot as a member of the family within 24 hours.
That is how little time it takes to get used to talking to an inanimate object, except it's not really inanimate: someone on the other side is listening. This data is processed by other people, and it can be shared with a lot of people for somebody's gain, because, well, possession is ownership.

Has anyone bought a doll like this for their kids? This is a famous doll from Mattel from a few years ago. It made the list of the top toys for Christmas. It's fairly expensive. It talks back to the kids; it's very intelligent. In fact, it captures what it's hearing, sends the information over the network to a third party for processing, and replies back. It's so intelligent that a person is actually involved; it's not just AI. The data is going to other countries. It's going to places that are not published. It's basically introducing a listening device into your most private of situations: what the kids are saying. Not every Barbie doll is like that. This one was actually pulled from the market after there was an uproar, though it took a few months before people caught on to what was happening.

But we keep doing this. The Samsung TV is another listening device; you can talk to it. It will do what you want, but it will also capture conversations that are not intended for the TV. Again, it's your living room. It's your family room. Do you want technology listening? Do you want who knows how many people having access to this? Okay, they covered the cameras. Now they are getting the visual in addition to the audio.

And the car. The car knows quite a bit. It knows how you drive. More importantly, it knows where you drive, and how long you spend there. There was a study at some point where the government collected metadata on phone calls. Metadata is like the car: it doesn't know why you're going there or which exact door you are knocking on, but it knows how long you spent and the general neighborhood, within a few parking spots or a few streets. Metadata is important.
We learned that some people called divorce lawyers. Some people called abortion clinics. Some people called funeral planning services. Some people called lawyers or courts about inheritance. It's important. Even if we didn't capture the contents of those conversations, it's enough to build a picture of the person. And it's all out there.

So what does this data cost? What's the value? What are we giving out for free, intentionally, by signing all those user agreements saying, okay, you can have it? Facebook is worth $470 billion; I checked last night. United Airlines? Not so much. United Airlines has property. It has airplane slots. It has licenses. It has equipment. It's got a brand. Facebook has also got a brand, but more importantly, it has data on billions of people. That is how much the market values it. It's not the servers; the servers are cheap. It's not the software; the software is fairly cheap too. It's getting people to share. That's where Facebook succeeds and makes its market cap, makes its value.

So what are we getting for it? We are getting servers. We are getting a place to type what we think to our friends, and maybe their friends, and maybe our friends at Facebook, and everybody who is a friend of Facebook, not necessarily our friend. We are not really getting compensated for our data other than with the servers. The data is valued at the same nominal minimum: you can sign up for multiple services and share a lot, or you can share a little; you can sign up for everything or just one. All you are getting is the same service, no matter what.

Okay, what is on the screen? This is not mine. Facebook, and everyone who collects data, is also in a great position (and Google is actually the top company here, not Facebook). You can use that data for good. You can figure out traffic patterns. You can cure diseases. You can do market pricing in a smarter way.
You can do credit in a smarter way. But you can also hunt people down; you can collect more money from those who can afford it, and you can collect more money from those who cannot shop around. So there are good ways to use this and there are less-than-good ways to use it. The problem is we don't get to know any of this. There is zero transparency. When you sign up for these services, you give up your data, and it took how many years for Facebook to give it back to you? A lot of services still don't have to give it back to you, and don't. There have been multiple lawsuits of people trying to get Google to remove some information from its search results. Some have been won, some have been lost, but it's expensive. It's not something every person can do.

This is not normal. Many years ago, Mark Zuckerberg said that privacy is no longer a norm. As a society, we get to decide: is this normal? Is this ours? Who is this guy to decide for us? Well, that is the opinion of an interested party. You have to decide for yourself what's important. This is what I think kids should be taught: privacy is important. Don't share everything. There are limits to what you want to share, and more importantly, you have to work hard for it. It's not easy anymore.

I've had many conversations about privacy with different people. In fact, I had one this morning with my family, who like to share their location data with each other. "I'm not doing anything wrong, therefore why not let the entire world know?" There is a reason we want privacy, and here I'm going to ask you to read this very long statement. Privacy is not about hiding something wrong. It is about having personal space. It is about having the autonomy to figure out what works for you in your own space before you go out and share it with the world.
We're losing that space because, well, we are getting services, but the reality is that we cannot live in society and be part of it without giving up our data, without giving up the privacy that we used to enjoy without thinking very much about it.

Here is another one. This one is from a scholar and a lawyer. That was hard reading; thank you for bearing with me. Pictures of puppies are usually a good thing.

And then there is a conversation about what privacy means. Privacy means different things. The government argument is: if we don't have access to everybody's communications, it makes it harder for us to catch terrorists. There are people who agree with this. There are also plenty of experts who say that if the pile is too big, it gets harder to find anything useful, and you get too many false positives for what you find to be useful. So here is something from James Comey, FBI director at the time: there is no such thing as absolute privacy; there is no place outside of judicial reach. You can be compelled to say what you saw, which is not the same as a recording of what you saw being produced, right? They can ask you, nicely or not, but they cannot make you.

Awareness that the government may be watching chills expressive freedoms. This exists; this is real. You have to watch what you say in public space, except suddenly every space is public, because there is a Barbie doll, there is a Samsung TV. We are losing the space that is not public. Everything goes up to the cloud, and it can be retrieved with a sufficiently valid request from the government, or by someone willing to pay for it.

So here is advice on how to get your privacy back, and I think this is silly. Yeah, we all have ad blockers, but paint on your face is a bit much. You don't want to avoid other humans; they are part of society, and if you want to be part of society, this is how we are the most productive. Don't move to a remote forest. This is a nice place to be. But this is important.
Ephemerality is something we used to enjoy as a default in our lives: conversations were just conversations. Except this talk is being recorded, and the conversations that you may have here may or may not end up on the audio and video recording. This is great; we are in a public space, and we want to share the knowledge from here. Except we should have the option of a private conversation, one which is not being recorded, which is definitely outside of the public eye. And sometimes we don't know whether we have it. In the building where I used to work, as in most office buildings, the landlord or the employer can record all audio and video everywhere except the bathrooms. They posted a sign: no phone conversations in the bathroom. Which was kind of ironic; I'm sure the landlord didn't think about it. But yeah, you have to go outside, and that's not always a comfortable solution. No private conversations during the eight hours you have to work.

So conversations get recorded, interactions happen over text, Facebook, Twitter, email, what have you, and everything leaves a trace. We have a lot more pictures. So when secrecy is truly paramount, go back to face-to-face conversations. The phone is better: a regular old phone is better than your smartphone, and audio is better than a text message because it's harder to pick up. Not everything is recorded yet, but we are moving there, and we'll be moving there ever faster as technology becomes more available and storage becomes cheaper and more plentiful.

We used to have exceptions to ephemeral conversation. Court proceedings have been recorded for as long as we have had court proceedings. Performances were recorded, like this talk. Portraits and photographs, which were taken what, three times in a lifetime? Maybe six times in a lifetime. And letters and diaries, which is how we know how people lived. And now we have stacks of emails.

The right to privacy is implicit, but it's always been there.
It's in the Bill of Rights, the Fourth Amendment: the right of the people to be secure in their persons against unreasonable searches. This is only against the government. There is nothing here against private enterprise, against the corporations, because they didn't exist; there was nothing like them to protect against. We need to rethink this. And we don't have a chance, because we are busy typing away on Facebook.

So, do we have data privacy laws? No. We have an ad hoc approach. Laws and regulations apply to very narrow classes of data, like your medical records and some kids' data, and we expect the industry to self-regulate. Well, the industry self-regulates to maximize its profits, which makes sense; that's what industry does. Can we do better? Yes, we can do better. In Europe, there is a new set of laws that regulates what data protections should be in place. This is the second iteration of those laws, and it took effect earlier this year. It makes it harder for industry to handle customer data, which I think is a good thing, because there should be a cost to data ownership. Right now, data is owned by possession: if you have it, it's yours. You have zero responsibility to those who the data came from or whom the data describes.

So for the company, yes, there is negative PR; they have to notify customers if there was a breach. But the cost of this negative PR is not very high, because, well, it happens quite a bit, and where else are we going to go? And they don't have to make restitution. A few times, my data has been breached by different providers, and I have received invites to credit monitoring services, which, by the way, say that they are going to collect all your data, and whether you leave them, stop paying them, or your free program ends, they are never erasing your data. They are going to keep it forever and ever.
So if you are signing up for LifeLock, read the privacy statement, which is subject to change at any point, at will, and read the end user agreement. Sometimes you can get your data back; not always, and not in America.

So if we get to decide, privacy versus surveillance: privacy may not be absolute, but is it valuable? Is this something we want as a society? Are we willing to put in the effort to get it back? Because technology makes it easy to make privacy disappear, and it takes work to step up and request it. This is asking for your citizen activism. This is asking for you to have a conversation, especially with your kids, about what they should and shouldn't be sharing, especially if they are on Facebook or on some chat.

So, that's all I have, I think. This talk was inspired by Data and Goliath. It's a great book with lots of stories and specifics, by Bruce Schneier; please read it. I'm not getting royalties from the sales. I just read this book and thought it made perfect sense. Did you have a question?

Your talk was very much aimed at consumer concerns about our personal privacy. But this is a room full of people who are in the business of violating people's privacy, in some cases. I know there are probably some people who worked on Alexa here in this room, and if there aren't, that would surprise me. I would be curious to know if you have any guidance for people working in DevOps, perhaps a code of conduct that you like, that people should abide by, any guidance for those of us in the field?

I would suggest thinking as a consumer. Would you like the right to have your data forgotten? Would you like to be able to turn off Alexa easily so it stops listening? Would you like it tested as much for what it shouldn't capture as for what it should? Because we tend to test for functionality: it has to capture this, so let's make sure it does. We don't test the other side.
So as engineers, we have an impact. But more importantly, talk to your company, talk to the product owners, talk about requirements, figure out how to market this to your customers, and see if you get demand for it. So this is my advice to engineers: think as a consumer, and ask how you would like your data treated. Enjoy your lunch, and thank you so much, Jane.
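[Editor's note] The "test the other side" advice in the Q&A can be sketched as a negative test. This is a minimal illustration, not code from the talk: the `VoiceAssistant` class below is a hypothetical stub standing in for a real device, and the point is the second assertion, which checks that nothing is captured when no wake word was spoken.

```python
class VoiceAssistant:
    """Hypothetical stub: keeps audio only for the command after a wake word."""

    WAKE_WORD = "hello device"

    def __init__(self):
        self.recorded = []   # everything the device has kept
        self._awake = False  # True only right after the wake word

    def hear(self, utterance):
        if self.WAKE_WORD in utterance.lower():
            self._awake = True      # wake word itself is not stored
            return
        if self._awake:
            self.recorded.append(utterance)
            self._awake = False     # capture one command, then stop


def test_captures_command_after_wake_word():
    # The functional side: the device should keep a command it was asked.
    device = VoiceAssistant()
    device.hear("hello device")
    device.hear("what's the weather")
    assert device.recorded == ["what's the weather"]


def test_does_not_capture_private_speech():
    # "The other side": private conversation with no wake word
    # must leave no trace in the device's recordings.
    device = VoiceAssistant()
    device.hear("let's discuss the divorce lawyer")
    assert device.recorded == []


test_captures_command_after_wake_word()
test_does_not_capture_private_speech()
```

In a real product the second kind of test is the one that tends to be missing; writing it forces the team to state, in executable form, what the device must never keep.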