Okay guys, we are ready to start. Our first speaker is Justin, and the talk is about Surveillance Capitalism: The Surveillance Will Continue Until Morale Improves. So give him a big applause and welcome. Okay, I'll get there. Yes? No? Hear me? Thank you. More or less. Thank you all for coming out. This is really cool that you're here. Really cool that Crypto Village is doing this. Okay, yeah, I'll get started. So really quick, all my slides are already online: slides.secureutah.org. I'm from Utah. I work in security. I like to promote security. So as a side project, I'm trying, little by little, to teach people about security and increase people's awareness of it. The slides are full of links. This is a huge topic, a lot to cover, so everything I talk about is linked and you can read it yourself. By all means, check my work and form your own opinions. My name is John. I spoke at the Crypto Village last year on HTTPS. Had a great time, and so I wanted to come back this year and talk about something else I've kind of been tracking for fun: surveillance capitalism. The title of the talk is a play on the phrase "the beatings will continue until morale improves." What it means, essentially, is: you're going to be punished, and it's not going to stop, so you might as well get used to it. And, being a bit fatalistic after researching this talk, that's kind of how I feel about where surveillance capitalism is heading. We're kind of stuck with it. Might as well get used to it. Maybe not. Maybe someone else can find a way out. So real quick, for the time being, I've got my face covered. We'll get to that in a little bit. But first, I'd like to lay a foundation for the talk. Surveillance capitalism: what is it? It's a term coined in 2015 by Shoshana Zuboff. She's a professor at Harvard Business School. There's a lot of meaning and consequence packed into this definition.
Professor Zuboff founded the concept upon, and as a reaction to, Google's accumulation, management, and processing of immense volumes of data. So this is one way to sum it up, but there's a lot more to it than that. I'm not doing her work justice by reducing it down to just a few quotes here, so I encourage you to go read what she's written. But essentially, surveillance capitalism is a way to gather information, gather data, and monetize it. Not just for money's sake, but to influence consumers' behavior, to influence market behavior, or for market control. So as I mentioned, she formed this in response to Google. Here are a couple of quotes from people who work at Google: Amit Singhal, a senior vice president of engineering, in 2012, and Hal Varian, Google's chief economist, in 2014. These quotes are about the immense amount of data that they have to handle. And this is several years ago, so I'm sure it's gone up by an order of magnitude or two. 20 billion URLs crawled a day. 100 billion search queries a month. How do you search across a trillion records in a few seconds? I mean, that's an amount of data that we've never dealt with before, and Google has been doing this for years. So from this, her conclusion: data is becoming everything. This big data is plucked from our lives without our knowledge and without our informed consent. The informed consent part is central to all of this. Her conclusion: The Matrix had it close. They thought we were batteries. We're just apparently data generators in real life. That's what we're here for. So, Maciej Cegłowski runs the website Pinboard. He spoke earlier this year, a talk called Build a Better Monster. He's a great speaker, very insightful. He talked about the same topic. The economic basis of the internet is surveillance. Every interaction we have leaves a data trail. Whole industries exist to consume this data. He talked a lot about Facebook and Google specifically; they're the leaders in this. But this is kind of what they want to do.
They want to know where their users are. What are they looking at? Who are they with? Their purchasing habits, everything about what you're doing, anything else they can discover. These two companies together have, and these numbers have grown, more than 65% of the online ad market, and I think it's something like upwards of $60 billion a year. And why are they doing all this? To sell advertising and to train their machine learning algorithms. So, let me go back real quick. Again, look up Maciej's talk on this. He expands in a lot more detail, paints a much broader picture; I'm just briefly summarizing what he talked about. So, performing mass data collection in order to understand the behavior of people is not a new concept. The 1870s are the origin of today's credit bureaus; that's when they started to collect data on people so you could make decisions about them. In 1988, Roger Clarke introduced the term dataveillance: using personal data systems in the investigation or monitoring of the actions of people. In 2002, the Department of Defense started their Total Information Awareness program. This was in response to the September 11th attacks: collect all the information and analyze it for suspicious behavior patterns. In 2005, Christopher Slobogin wrote about transaction surveillance: government, law enforcement, could go in and get information about people's financial purchases, financial dealings, without a warrant, sometimes just by asking for it. And we all know about, well, should know all about, Snowden's revelations about the NSA's global surveillance programs. So there have been lots of cases of, obviously, collect data, figure out what people are doing and why they're doing it. So why is this different? What makes this new incarnation of data collection and analysis worthy of attention? To me it's three things: the combination of the magnitude, the methods, and the motivation.
It's the sheer scale of the data collection. It's achieved via an uncomfortably pervasive penetration into the previously mundane activities of our lives. It's the unavoidability of being a data point. It's the commonality and concealment of the collection methods. It's the unashamed application of incredibly advanced algorithms that are designed to learn from and then shape human behavior, all in the pursuit of profit. It's not in the pursuit of national security. It's not to lower the infant mortality rate. It's just to influence consumers so they buy what you want them to buy, when you want them to buy it, and you can be there offering the best deal. So again, being a bit fatalistic, to me it seems like a weird offspring of late capitalism and Moore's law. It's an odd outcome of dirt-cheap processing power in the service of greed. So from my talk, just remember this: Google kind of showed the way, showed that big data accumulation and analytics can be highly profitable, and since then a flood of other companies have been desperately trying to copy their business model, collect all the user data that they can, and monetize it. So in a minute I'll run through some of the many current forms of surveillance capitalism, the current methods that it uses, and offer some defensive options. But first I want to talk about privacy. No talk about surveillance can be had without talking about privacy. That's why you're here today. So, well, yeah. Daniel Solove wrote a paper in 2007. It was a rebuttal to the phrase "I've got nothing to hide." We've all heard the phrase: if you've got nothing to hide, then you have nothing to fear. What do you have to fear? And he talks about that, he dissects it. The question implies that an individual engages only in legal activity, that if you engage in only legal activity you have nothing to worry about.
When it comes to the government collecting and analyzing personal information, many people contend that a privacy harm exists only if skeletons in the closet are revealed. And the way Daniel Solove put it, the problem, in short, is not with finding an answer to the question "If you have nothing to hide, then what do you have to fear?" The problem isn't the answer. The problem is the question itself. It's the underlying assumption of the question: that privacy is about hiding bad things. And it's not. And maybe to the older generation, maybe in the past, privacy was about secrecy. But privacy now is about control. It's about consent. I'm going to illustrate that a little bit. First, a quote from Michelle Dennedy. She's the chief privacy officer at Cisco. She does wonderful work, very pro-privacy. I highly encourage you to follow her and follow her work. But she talks about this specifically. When you say privacy is dead, if what you mean is secrecy and hiding away and not being connected, I agree with that. That version of privacy is dead. But privacy is a function of how we define ourselves, how we live, what we can expect in morality, respect, ethics, and security. It's a new version of privacy that's alive and well. Bruce Schneier wrote about this several years ago. Privacy is about control. We don't mind sharing our lives and thoughts, but we want to control how and with whom. A privacy failure is a control failure. When your health records are leaked or stolen, that's a privacy failure because you've lost control of them. Another way to put the same thing: privacy is the right to consent. It's also the right to withdraw consent, to only provide information to the people you want to provide it to, when you want to provide it. Sarah Jamie Lewis is a security researcher from Canada. She also does amazing work.
She just put a book out called Queer Privacy, with stories about privacy from the margins of society, the people who aren't normally protected or considered when privacy solutions are created. I highly recommend the book. If you want to come talk to me afterwards, I have a few copies I can give away also. But she pushes this a lot, and I totally agree: privacy is the right to consent to information being released and shared. So privacy is evolving. Privacy is about control. It's the right to consent. And from what I've seen, it's a battleground that most people aren't aware of. So, a case study in privacy. This is from 2014. This woman became the subject of paparazzi and got lots of media attention due to her involvement in a situation that's not important here. What was interesting to me was that in public, when she had to go to court a few times, she started wearing a sun visor. These sun visors gained popularity in Asia. They have a UV coating; they keep the sun off your face when you're outside. What was interesting is that she had more than one. She'd coordinate with her outfit. She'd go out in public when she knew the media was out there. She wasn't unknown before the event. You can search for her name online, find photos of her all over the place. But the article that someone wrote about it dubbed it the "face privatizer." Not to protect herself from the sun's glare, but rather from the media glare. She was misusing the visor rather effectively. I hope that, being hackers, you all can identify with that. We misuse things in interesting ways because we can, because it's interesting, because it's fun. But I saw this and I'm like, it's totally genius. It's pretty cool. So early this year, I'm from Salt Lake City, we have a BSides there, and I gave an earlier presentation of this talk. Wearing one of these had been on my mind since 2014, and I did it as kind of a proof of concept. It's a surreal experience.
I parked down the street. I walked up to the con building with this on my face, down the sidewalk, walking by. Everyone's looking at you like, what is that? That's weird. But it's like sunglasses for your face. I can see everyone else; they can't see my face. It's a shield of privacy. And it's a weird way to get it, because everyone's looking at you, but I still have that sense of a barrier. So it's kind of interesting to try these. You can order them online from Amazon. They're $20 or so. Yeah, so I'll get to that in a second. So again, afterwards, I've got a few extra with me. You're welcome to come meet me outside, try one on, see what it's like. They're pretty cool. And they get a lot of looks. Real quick, let me take this off. Okay, that's better. More fun coming in. Oh, let me make sure you can hear me. Okay, I'll try to speak up. You good? Okay, yes, no. Things aren't the way I want. Okay, I'll give this a try. So, let's talk about surveillance capitalism: the methods. In the summary for the talk I outlined some of these. We'll get to all of them, with some examples of each. First one, the most fun one, the most interesting to me: facial recognition. We'll run through all of these and talk a bit about the defenses for each of them. So, facial recognition. This is a comic that illustrates it really well. Here's Lois Lane. She's on her Facebook page. She just posted a picture of her and Superman. Right away, Facebook says, hey, do you want to tag Clark Kent? It matched the face pretty quickly. Her reaction is quite appropriate. The illustrator, Max Karsten, did a little interview about this; it gained quite a lot of attention online. The question in the interview was, was there an incident that spurred the idea to draw this? And he said that he'd posted a photo on Facebook of himself making a rather goofy expression and, to his surprise, Facebook's recognition system still identified him.
So, he thought it was funny that he made this weird face and still got identified, and that led to this picture. To me, it illustrates it rather well, but it's kind of funny that in the 21st century, Facebook has become more powerful than Superman. So, there are lots of examples of facial recognition being used. In 2015, Facebook had a research paper where they could identify people without seeing their faces, by things like their ears, their clothes, their posture. 83% accuracy, I think it was. When you have a picture of someone and you have this big data set, you can start to match them even if you can't see their face in subsequent pictures. FindFace is a startup from Russia. VKontakte is a social media service in Russia; most everyone has a profile there. It's like the Facebook of Russia. FindFace developed an app that integrated with VKontakte: you use the FindFace app, take a picture of someone, and it would find their social media profile pretty quickly. So you can see the obvious privacy concerns with taking a photo of someone on the subway and getting their social media profile right away. And early this year in Russia, there were many protests that are unsanctioned, that the government doesn't like. So you go to a protest and you make it home safely. You aren't arrested, you aren't beaten up. Great. But photos were taken there, and some people started pulling the faces out of the protest photos and finding their VKontakte profiles. So you can see very quickly: you thought you were anonymous at a protest, a picture is taken, your social media profile is found, and bad things can happen. But facial recognition is being used in lots of other mundane or kind of normal ways, mostly in the government and law enforcement area. Chicago and New York both wanted to add it to their city-wide camera systems. Oh, yeah, Caesars Palace.
So if you go to their privacy policy and you search for facial recognition, they've got a section there on self-exclusion. Self-exclusion is when a person has a gambling problem and they say, hey, I shouldn't be here; if you see me here, please escort me out. Caesars Palace uses facial recognition to identify those people and help find them quickly. I've asked the security guards here about it; assume your face is being scanned. I haven't tested it. I'm not sure I want to. But I actually found references back to 2005 of casinos using facial recognition to identify people who shouldn't be gambling, or who self-identify as shouldn't be gambling. They also say they use it to help find criminals, to help police when they're looking for a criminal. So given the capabilities of the technology that's there, assume your face is being recorded and tracked at casinos. Lots of other examples. What's most interesting to me is, early this year, 2017, there's a startup in Missouri that sells facial recognition solutions to gas stations, to jewelry stores. If you want to go into the gas station after 11, you have to scan your face. If you're a known shoplifter, if you have a warrant out for you, they won't open the door for you. Cuts down on crime, makes lots of sense, but the fact that the technology is that affordable, that cheap, that franchise gas stations are going to install it, it's pretty commonplace now. There's been a big explosion this year of airports using it for convenience: make boarding faster, check your bags in faster, scan your face and you get on right away. And airlines and airports don't roll this stuff out just for the fun of it. It's expensive, it takes a lot of time. The fact that they're testing it at a few airports means they like it, they're probably going to go through with it, and they're probably going to expand further. DHS has a program called Biometric Exit where everyone leaving the country, if you're not a citizen, gets their face scanned.
You have to have your face scanned. No choice about it. If you want to fly out, your face is getting scanned. Lots of examples of law enforcement doing this. The most, well, I say interesting, but kind of the most uncomfortable one is Taser. They make the Taser devices; they make body cameras. They incorporate live streaming of body cameras back to the police station, so you can watch live what the police officers are seeing. They want to add live facial recognition to these streams sometime next year. They have all this data, all these hours and hours of video of people doing things. They want to apply machine learning, AI, to it, all that fun stuff, to figure out what leads to criminal behavior. How do you track people across time? Now, there's an article from earlier this year about facial recognition in China. It's part of daily life there. They gave lots of just mundane examples. You jaywalk across the street, the camera takes your picture, puts it up on a big screen, you get shamed for jaywalking. You go to KFC, the camera scans your face, guesses your age and your gender, and it serves you a customized meal based on what it thinks you might like. We'll go with that. It stops toilet paper theft at parks: it scans your face, and you're limited to two feet of toilet paper every nine minutes. According to the article. The stated purpose is to influence behavior and identify lawbreakers. All these worries about facial recognition creeping into daily life and how it's going to play out? It's already happening in China, and when you have that kind of control, you can implement it everywhere. So, some defenses against it. Don't have a social media profile. Don't get arrested. Don't travel by plane. Don't live in China. Don't go out in public. That's kind of where it's heading. You can wear a mask in the U.S., except in 13 states there are laws against wearing a mask in some cases. Most of them are like this.
You can't wear one when you're in the commission of a crime, or when you're trying to run away from a crime. If you're wearing a mask, it's like an additional charge. Some states, some laws in the South, made these laws in response to the KKK. They always wore those silly hoods, so they passed laws saying you couldn't cover your face like that out in public. So there's... I think DC has some laws too. There are some places where you can't cover your face at all in public areas for security concerns. Sorry, I'll get to that in a second. And some countries have even more restrictive laws, so masks can't cover your face at all. So, other ways to thwart facial recognition that don't involve masks. A gentleman by the name of Adam Harvey did some work several years ago. He looked at how the facial recognition software does its thing. How does it identify a face as a face and match it to another face? And it's essentially, I want to make sure I get this right: it's software that's looking at it. It's not a person. It doesn't interpret faces like we do. It looks at differences in color in the areas of the face. You know, it goes from an eye to a nose to an eye, the color changes by such a degree, and the two sides are symmetrical. So he did some experiments with hair and makeup to confuse facial recognition. You have high-contrast areas; you have hair or something that comes down and covers the bridge of the nose and breaks up that symmetry. And they're all highly effective. He spoke at the 33C3 conference; watch the video, he kind of explains how it works, goes through it. Super interesting to see how it's done. But this is one way you can wear makeup, and not wear a mask, but still go out in public. 3D-printed face cages: same type of concept. Just a little concept thing. But again, it's how to make a product that's easy to use, easy to make, and does the same thing.
Just break up the image of the face. Some researchers did the same thing: again, they saw how the facial recognition software works and asked, what can we do to thwart it? And they designed these glasses with a color pattern that wouldn't fool a person, but the way the algorithm works, it took a picture of the top and it returned the bottom as the match. Yeah. It's not like the same pair of glasses works everywhere; you have to know how the software works to do it. But it demonstrates something I'll touch on later: feeding bad input into an AI to fool it, to trick it. Lots of other possibilities. I was doing some research for this talk into how you thwart facial recognition, and came across some cool ideas that might work. All kinds of weird masks, but maybe they would work in some places. T-shirts you can buy. Yeah, you can buy them on eBay. They're awesome. There's like 30 to choose from. I imagine this would give Facebook fits, a picture of you wearing this. Like 15 bucks. Building privacy into clothing: I'm wearing this shirt now. It's pretty cool. It's like this, it's down, and if you want, you just pull it up and it pulls right over your head into a hood. Would that count as wearing a mask? Wintertime, you'll probably be okay. Summertime might be a bit weird. But I have seen just a little bit of this, of fashion geared towards privacy, or a jacket that's highly reflective: when you take a picture, it reflects all the light back into the camera and overexposes it, and you can't see the person's face. So it's kind of a new and interesting niche of fashion. Okay. So let's imagine a world where everyone cares about anonymity so much that you wear masks out in public all the time. There's a really cool comic book called The Private Eye. It's a detective noir story, but the world it's set in is in the future, and when someone's out in public, they're always wearing a mask, all the time. You only take it off in front of your family.
And it's not really central to the story; it's kind of the background to it. But it's super interesting just to envision that kind of a future where, in public, you're always covered. You can get a PDF copy, pay what you want, or you can order a printed book online. Again, really cool reading. Really fun to envision that kind of future. So, yeah, facial recognition. That's why I had these other experiments in legally thwarting facial recognition. These little filter masks you wear in public to prevent SARS and so on. They make black ones, blue ones, whatever. Honestly, I have like 40 of these. If you want one, come see me afterwards. I talked to security guys at the casino; they were like, yeah, these are probably okay to wear. How are you going to tell someone not to stay healthy when they're in public? Novelty glasses. I mean, they're kind of silly, but when I saw these at the store, I held my camera up to take a picture and it couldn't find my face. With the blue and green and black ones, it found my face; with the white and gold, it couldn't find my face at all. So again, the high contrast, the weird lines, throw off basic facial recognition software. And it's not going to get you arrested. So, I hope that if facial recognition spreads, then an awareness of how to get around it and thwart it, and research into that, spreads too. Also the legal and policy complications: do we outlaw masks? What kinds of masks? When can you wear one? When can you not? That may become an issue in the next 5 to 10 years. Okay. Moving on. Geofenced content delivery and user identification. Geofencing, a quick history of it. In the 90s and early 2000s, it was used for very basic stuff: location-based tracking, monitoring your vehicle fleet, sending out emergency notifications to everyone inside an area. Geofencing is basically: you put an imaginary circle around GPS coordinates, and any phones, any devices in that circle get some kind of special notice, some kind of special activity.
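Under the hood, that "imaginary circle" is just a distance check between a device's reported position and the fence center. A minimal sketch in Python (the coordinates and radius here are made up for illustration):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(device, fence_center, radius_m):
    """True if a (lat, lon) device position falls inside the circular fence."""
    return haversine_m(*device, *fence_center) <= radius_m

# A fence with a 100 m radius around a hypothetical storefront
# (made-up coordinates in the Salt Lake City area):
store = (40.7608, -111.8910)
print(in_geofence((40.7609, -111.8911), store, 100))  # nearby -> True
print(in_geofence((40.7700, -111.8910), store, 100))  # ~1 km away -> False
```

Real platforms layer enter/leave events and dwell-time rules on top, so a phone drifting along the fence edge doesn't fire notifications over and over, but the core test is this simple.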
So, when someone enters the circle, maybe they get a text message; when you're leaving, maybe you get a text message. When a fleet vehicle drives out of that circle, the company gets a warning: hey, this vehicle just left your area. So, a very utilitarian origin. In 2002, it first started to be used as a way to identify mobile device users and send them custom content. Let's send them a message: hey, Bob's Pizza has a special, right down the road from you, come on by. And only in the past five to seven years has it become something more, an enhancement feature: your apps do something extra when you're near a location, an area. Businesses can do really focused targeting of people who come by where they are. But there's this example of how it can be, well, depending on how you feel about abortion, used or misused, but it's certainly a controversial use case for geofencing. In 2015, an advertising executive had an idea. Instead of using his mobile surveillance techniques to figure out which consumers might be interested in buying shoes or cars, what if you could use the same technology to figure out which women were potentially contemplating abortion, and send them ads on behalf of anti-abortion organizations? The targeting of women seeking abortions presents serious... well, sorry, I'm getting ahead of myself. So, his advertising technology company can target people, send ads to people in a certain area, because apps on your phone are free and they send data back to an ad company: this person has this phone, they're at this GPS location. And they tie that in with other information to build an advertising profile, so they can say, hey, there's a woman inside the GPS coordinates around this abortion clinic. She's there, she's sitting there, she might be there considering an abortion.
Here's what the guy realized: I'm going to sell ads to anti-abortion groups, so that this woman, when she's on her phone looking at web pages, gets ads telling her to go see the anti-abortion clinic in her city. So it's a very, very creepy thing to do, a very targeted thing to do. I'm not a woman; I can't imagine what it would be like to sit there in that kind of situation and get ads targeted at you trying to tell you to go do something else. You're vulnerable, you're there for something medical and personal, emotional, and you get that kind of messaging. So it was legal, but ethically questionable. Okay, so he got called out on it, and they stopped doing it. There weren't really any ramifications for it other than just some bad publicity for them, but he showed it was really possible to target people in specific ways at certain locations. The Attorney General of Massachusetts didn't investigate him; she did something proactive. She said, you know what, this activity is not allowed in our state; consumers are entitled to privacy in their medical decisions and conditions. They had a settlement with the company saying, you can't do this, you can't specifically target people going for medical procedures in our state, and the company agreed to that. So it wasn't punitive, it was just a proactive thing, and I think they're the only state that's done anything like that. Okay, so other examples of using geofencing. Uber has had a bad year for a number of reasons, a bad press year, but one thing they got called out for doing was trying to identify city regulators, the people who go in and investigate whether Uber is following good business practices or not. So Uber wanted to know when these regulators were using their app, and they had a variety of techniques to do it. One was geofencing government offices: when someone in a government office calls an Uber car, they're probably someone who works for the city.
Let's look at a few more things to see if they're a regulator. And Uber did things like matching the owner of the credit card to a city organization. The city regulators have city budgets, so if they want to get a phone to use Uber, they would go out to the nearest store and get the cheapest Android phone. So Uber sent people out to see what Android phones were being sold at the stores near their offices, and collected a list. So when someone installed Uber on one of these devices, again, another red flag for Uber: okay, cheap phone available near the city regulator office, they just called an Uber car from this geofenced area, they're probably a regulator. So we're going to run our service differently for them, so we don't raise any red flags for them and we fly under the radar. So geofencing hasn't been used maliciously a lot yet, but I certainly see some malicious potential there. Spearfencing, geophishing, choose the term you like. But again, playing off the advertiser who targeted women at abortion clinics, you could easily take that a few steps further and target people at a polling station. Rehab centers, a domestic abuse shelter, a church of whatever kind you don't like. And not just serve them advertising, but serve them malicious advertising, serve them phishing websites. Target journalists, target legislators. Put a geofence around the capitol and serve up whatever kind of ads you want when the legislature is in session. All your legislators are on their phones. The chances that any of them are using ad blockers are probably very, very low. Unfortunately. We should educate them, but that's another argument. I haven't seen examples of this, so maybe one of y'all can do this for a talk next year at DEF CON. So, defenses for this. Disable location services. Don't install the free apps. If it's free and it's a game you're playing, woohoo, you might see ads, but it's usually collecting information on you and about your phone.
You can turn off the permissions if you want. You can install an ad blocker in your browsers; uBlock Origin is kind of the running suggestion in the infosec world, and I suggest it. Or just be aware: hey, I'm getting a weird ad I haven't seen before, and it's related to where I am. Okay, well, let me not click on that. Okay. Device fingerprinting. So, the idea here is to identify a device over and over again. Collect information on the hardware, the software, how it connects to your web server. It's mostly used to prevent fraud. Some of the research I've seen about this comes out of India, but I know companies in the US and elsewhere use it to identify when fraudulent transactions are happening. How do we know it's the same person? How do we know it's fraudulent? You have to have some kind of warning or trigger or red flags. One way to do it is to identify a device: it's the same device being used over and over again. An extension of that is browser fingerprinting: identify the same person coming back to the same website, even if they can't be tracked with cookies. There are a couple of ways to do this. You go to a website. It runs HTML5 code, maybe in the background where you can't see it, that tells the browser to draw a small image. Every computer is different. Every GPU is different. Every graphics driver is different. All these small differences can make the image be rendered differently. And the website measures how that image is rendered and says, okay, this browser renders this image this way and it has these characteristics. We also saw those characteristics a week ago; we think it's the same browser, the same person coming back. Another way to do it: let's say you use Firefox for some tasks, Chrome for other tasks, and IE for other stuff. How does an advertising company correlate those three browsers to one person? Same type of thing. All these browsers are running on the same computer. They all use the same hardware resources.
So again, they're all going to have the same kind of signature. Once you understand how browsers work and how they render images, you can do some complicated coding and correlation to find out: yes, we think these two or three browsers are actually the same person on the same device.

And I say "unblockable" because it used to be: erase your cookies. Then it was: erase the supercookies and clear your Flash settings. Then it was: use a proxy, use a VPN. It's definitely a cat-and-mouse game, with websites and advertisers trying to identify the same people coming back. That's their business model; they want to identify who's using the site.

These are just a few small examples of websites you can go to that will demonstrate what a website can learn from your browser. What kind of information can it collect? What is your browser leaking? There are hundreds and hundreds of data points here. You can block some of them, and there's still enough left to collect to make a correlation. So blocking alone isn't really an option; you have to understand how the measurement works before you can provide data that looks real but is fake and actually fools them.

The last website, clickclickclick.click, is a gamified version of this. You go there and it has a scrolling screen, with a voice that reads out loud what you're doing. Oh, you've moved the mouse to the top right. You've been there for five seconds, very good. You've clicked five times, very good. It keeps a little score, and it has achievements too: click a hundred times, click five times in less than a second. It's just a way to demonstrate how websites collect everything you're doing on the page.

So, defenses for this. You can try to play the game: compartmentalize even further, with different browsers in different VMs and a different VPN for each one, and switch up the configuration of your VMs. Try to figure out what they're measuring.
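The linking step described above reduces to hashing a bag of collected attributes into a stable identifier; the hard part, which this sketch skips, is gathering enough high-entropy signals (canvas and WebGL renders, fonts, audio stack quirks, and so on). All attribute values below are invented:

```python
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Reduce a bag of browser/device attributes to a stable identifier."""
    # Canonicalize so the same attributes always hash the same way.
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visit_monday = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/55.0",
    "screen": "1920x1080x24",
    "timezone": "America/Denver",
    "fonts": ["DejaVu Sans", "Liberation Mono", "Noto Color Emoji"],
    "canvas_hash": "9f2c1ab0",  # stand-in for the hidden test-image render
}
visit_friday = dict(visit_monday)  # same machine, cookies cleared in between
different_machine = dict(visit_monday, screen="1366x768x24")

print(fingerprint(visit_monday) == fingerprint(visit_friday))       # True
print(fingerprint(visit_monday) == fingerprint(different_machine))  # False
```

Clearing cookies changes nothing here, which is why the defense has to be either blocking the collection scripts or deliberately varying what gets measured.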
So you won't be tracked by advertisers, there are a few extensions kind of built for this: EFF's Privacy Badger, and again uBlock Origin, which does a lot more than blocking ads. It blocks all kinds of trackers and lead-generation scripts, and other stuff as well, which makes web pages a whole lot cleaner and a whole lot faster. You can block JavaScript, or you can use the Tor Browser. You can also visit your local library and use their computers for your activity; lots of people use the same computer, so you get lost in the crowd. Plus, local public libraries are very cool.

Okay. Cross-device user tracking, an extension of the same idea. Instead of just trying to identify different browsers on one device, you want to track the same person across multiple devices. A lot of us have a phone, a laptop, a second laptop, a TV, a camera, all kinds of connected devices that we use in different ways. Advertisers want to track the same person across all the devices they use: at work, at home, on their work phone, on their home phone.

There are a couple of ways they go about this: probabilistic and deterministic. Probabilistic: you collect large amounts of data and say, this device was coming from this IP address and has this browser fingerprint; it's probable this person used this laptop on a home IP address and visited some of the same websites, so we think this phone and this laptop are the same person. The other way is deterministic: I log in with my email address at site A, and I log in with the same email address at site B. It's determined that it's the same person on the two websites, even if one login was on a laptop and one was on a public library computer.

One way cross-device tracking is done is ultrasound beaconing. A device, usually a TV, plays an advertisement with an ultrasonic audio signal embedded in it.
The audio is something we can't hear, but an app installed on your phone, or code that's part of an app on your phone, is sitting there listening for it. The app hears the ultrasonic signal and sends a message back to the advertising company: yep, the person with advertising ID such-and-such heard this ad at this time, on TV, while watching this show. That goes into the advertising profile, and the company can better sell its advertising services because it can paint a fuller picture of who you're selling to.

In 2015, the Center for Democracy and Technology, a whole bunch of really cool and smart people, called attention to a company called SilverPush that did this; that's kind of the news story that broke this activity. The FTC produced a big, long warning letter against it. SilverPush is the named company, but this kind of technology, from what I've read, hasn't caught on much in the U.S. because of the FTC and others. It's pretty popular elsewhere, though; it's used in India and other countries. Other countries don't have the same protections that we do, or the same concern about this kind of tracking.

As a quick aside: a lot of what I talk about, both the collection methods and the defenses, is really U.S.-centric. There's not a lot of data on how collection is done in other parts of the world, or what kinds of defenses and technologies people there have.

It's really interesting: the website uBeacSec collects a lot of the privacy and security research into the ultrasound tracking ecosystem. The FTC also had a report earlier this year looking at cross-device tracking and user tracking in general, and its conclusion is kind of obvious to anyone who's been paying attention: a lot of tracking takes place without our knowledge.
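To make the ultrasound beaconing idea concrete, here's a toy sketch: a short beacon ID is encoded as two near-ultrasonic tones (one per bit value) and recovered with the Goertzel algorithm. The frequencies, bit rate, and framing are invented; a real system would need error correction and robustness to room acoustics:

```python
import math

SAMPLE_RATE = 48000
BIT_SAMPLES = 2400       # 50 ms per bit
F0, F1 = 18000, 19000    # near-ultrasonic tones for "0" and "1"

def encode(bits):
    """Render a bit string as a sequence of (inaudible) tones."""
    samples = []
    for b in bits:
        f = F1 if b == "1" else F0
        for n in range(BIT_SAMPLES):
            samples.append(math.sin(2 * math.pi * f * n / SAMPLE_RATE))
    return samples

def goertzel_power(window, freq):
    """Signal power at one target frequency (Goertzel algorithm)."""
    k = round(BIT_SAMPLES * freq / SAMPLE_RATE)
    coeff = 2 * math.cos(2 * math.pi * k / BIT_SAMPLES)
    s1 = s2 = 0.0
    for x in window:
        s1, s2 = x + coeff * s1 - s2, s1
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def decode(samples):
    """Decide each bit by which tone carries more power in its window."""
    bits = []
    for i in range(0, len(samples), BIT_SAMPLES):
        window = samples[i:i + BIT_SAMPLES]
        bits.append("1" if goertzel_power(window, F1) > goertzel_power(window, F0) else "0")
    return "".join(bits)

beacon_id = "10110001"   # hypothetical ad/campaign identifier
print(decode(encode(beacon_id)))  # -> 10110001
```

The listening app only needs microphone access and a little DSP; the phone's speaker-and-mic hardware passes these frequencies even though our ears mostly don't.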
We have limited choices to control that tracking, and the collection of all this data results in more and more sensitive data that needs to be protected.

There's an article asking: how much is your consent worth? There's a company that says, hey, we will pay you if you install this device on your TV and let us track what you watch. It's not the Nielsen company; Nielsen's tracking isn't as effective anymore. This company said: install this device and we'll pay you five to twelve dollars a month to record everything you're watching. That's the going rate for giving consent to be tracked.

Defenses against this: again, ad blocking; uBlock Origin is the way to go. If you want to, you can add some background ultrasound noise to your life; I don't know how effective that would be. There's been some talk of phone manufacturers disabling these frequencies, but that's up to the hardware makers. You can also get earbuds that have a mic in them and disable the mic. Your phone says, hey, earbuds are plugged in with a mic, all audio goes through that mic; then you break the mic physically and your phone can't pick up these signals anymore. I've heard you can buy devices that do the same thing.

Okay, retailer and municipal location tracking. Wi-Fi tracking: you have sensors spread across an area, and they record the MAC address broadcast by your phone or your laptop as it moves around an area, a building, a subway station, a city. Each sensor records the MAC address and the signal strength, and that lets you map a device's movement. I'll have some examples here, but essentially a store wants to know how you move through the store, through Nordstrom, for example. Where do you stop and look at a certain display? Which displays do you skip? They care about that stuff so they can make their displays better and their selling more effective.
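A crude version of the movement-mapping just described: at each timestamp, assign the device to whichever sensor heard its MAC the loudest (less-negative RSSI means a stronger signal, so roughly closer). The sensor names, MAC, and readings below are all made up:

```python
from collections import defaultdict

# Each reading: (timestamp_s, sensor_id, mac, rssi_dbm)
readings = [
    (0,   "entrance", "aa:bb:cc:11:22:33", -45),
    (0,   "shoes",    "aa:bb:cc:11:22:33", -78),
    (60,  "entrance", "aa:bb:cc:11:22:33", -80),
    (60,  "shoes",    "aa:bb:cc:11:22:33", -42),
    (120, "checkout", "aa:bb:cc:11:22:33", -50),
]

def track(readings, mac):
    """Map one device's path as the strongest-signal sensor at each time."""
    by_time = defaultdict(list)
    for t, sensor, m, rssi in readings:
        if m == mac:
            by_time[t].append((rssi, sensor))
    # max() picks the highest (least negative) RSSI at each timestamp
    return [max(v)[1] for t, v in sorted(by_time.items())]

print(track(readings, "aa:bb:cc:11:22:33"))  # ['entrance', 'shoes', 'checkout']
```

Real deployments refine this with trilateration from several sensors, but nearest-sensor is enough to answer "which displays did this device linger at."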
There aren't a lot of public examples of this, but it definitely happens, some kind of good and some bad. The London subway authority did this at 54 locations. They put up signs, full disclosure, to let people know what they were doing. And this kind of use makes sense: how are people moving through our subway stations? Where are they stopping? What services do they use? They didn't track websites or usage or anything; it was just to map people's movements. There are definitely some useful applications of this kind of technology.

2016, this one is interesting. It's not Wi-Fi tracking; it's tracking via your phone's camera. The light bulbs in a store would flicker very quickly. Our eyes can't pick it up, but a smartphone's camera can. If you have an app that has access to the camera and is looking for those fast flickers, it can track where someone is moving through the grocery store, again to see how consumers use the store and how to make it better. There's a really cool book about this called The Aisles Have Eyes, about how stores track and profile customers; it expands on this topic a whole lot.

The defense to this is pretty obvious. Turn off your Wi-Fi when you're out of the store, when you're out in public. Turn off your Bluetooth; if they aren't on, you can't be tracked this way, and you can turn off your camera permissions as well. Don't use the store's free Wi-Fi. Yeah, you go to a store, Target, Nordstrom, Macy's, whatever, and they say: use our free Wi-Fi. Well, read their terms of service. Read their privacy policy. They're going to say that they monitor it in some way and collect the data in some way. Some people think MAC randomization is the way to go, but there's a research paper I found early this year showing that the way MAC randomization is done is kind of broken. It's not truly random, and if you record long enough you can figure out which device is which.
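One detail behind that randomization research: randomized probe-request MACs set the "locally administered" bit (bit 1 of the first octet), so that bit alone tells a sensor whether it's seeing a randomized address or a device broadcasting its real, vendor-assigned one. A sketch (the example addresses are illustrative):

```python
def is_locally_administered(mac: str) -> bool:
    """True if the MAC sets the locally-administered bit, as randomized
    probe-request addresses do."""
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)

print(is_locally_administered("da:a1:19:12:34:56"))  # True: randomized-style address
print(is_locally_administered("3c:5a:b4:12:34:56"))  # False: vendor-assigned style
```

That's only the first crack; the papers go further, using probe-request timing and information-element fingerprints to re-link a device across its randomized addresses.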
Other options: disguise yourself when you go into the store, or don't go to the same store again, or don't go in at all, choose to boycott it and buy everything online at Amazon. But then Amazon tracks you, so what are you going to do?

So behind all this, all this data being collected in all these different ways: data brokers. The companies that do business in data. There's this really cool paper, "Data Brokers in an Open Society." Let me make sure I get this right; I believe the company is called Upturn, I'm checking my notes here. Okay, so it's a really great report, and they define data brokers this way: a company or business unit that earns its primary revenue by supplying data or inferences about people, gathered mainly from sources other than the data subjects themselves. In other words, it's data gathered about you by someone else, tied to your identity, and then sold to someone else who wants to use it.

One application of this is micro-targeting. There's an article that talks about it, and it kind of summarizes everything I've been talking about: the marketing industry is trying to profile and classify us so that advertising can be customized and targeted. Thousands of companies, Google, Facebook and so on, are in the business of profiling us just to make a better sales pitch. Again, this is all just to sell us things better and increase their profits. Which isn't necessarily bad, I'm just saying; that's their motivation for doing it, and it should be considered.

A few examples of this. Target. This article came out in 2012, and the story actually starts back in 2002. Some marketers at Target went to one of Target's statisticians and said: if we want to figure out that a customer is pregnant, even if she doesn't want us to know, can you do that? Can you tell us that information? The statistician started to work on it. Target obviously collects vast amounts of data on its customers; they want to know which ads to show at which time to get people to come into the store.
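As a toy illustration of this kind of purchase-based prediction (every item, weight, and threshold here is invented, not Target's actual model, which reportedly scored a couple dozen products):

```python
# Hypothetical weights: positive items suggest pregnancy, negative ones argue against.
WEIGHTS = {
    "prenatal_vitamins": 0.40,
    "unscented_lotion": 0.15,
    "cotton_balls_large_bag": 0.10,
    "zinc_supplement": 0.15,
    "beer": -0.20,
}
THRESHOLD = 0.5  # invented cutoff for "probably pregnant"

def pregnancy_score(purchases):
    """Sum the weights of a shopper's recent purchases."""
    return sum(WEIGHTS.get(item, 0.0) for item in purchases)

def likely_pregnant(purchases):
    return pregnancy_score(purchases) >= THRESHOLD

print(likely_pregnant(["prenatal_vitamins", "unscented_lotion", "zinc_supplement"]))  # True
print(likely_pregnant(["beer", "cotton_balls_large_bag"]))                            # False
```

The point of the sketch is how little machinery is needed: a weighted sum over a purchase history, fit against customers whose outcome you already know, and you get a sensitive inference nobody consented to.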
Target and the statistician created a pregnancy-prediction model: if someone is buying these items, we think they're pregnant. Prenatal vitamins, diapers, baby clothes, stuff like that; if you buy those items in a short window of time, that's a big red flag for Target that you're probably pregnant.

This is important to Target because there are only a few times in our lives when we're really open to switching brands and trying new things. Pregnancy is one; marriage is another; there may be one or two more. Most of the time we go to the store and buy the same stuff over and over again. It takes something kind of big for us to try something new. That's why Target cared: they want to know when people are pregnant so they can send them the flyers in the mail, the emails, that sell them the new baby stuff Target wants them to buy. And they want it to be appropriate; there's no sense in sending me prenatal vitamin ads, obviously, so it's about knowing who the women are and when they're pregnant.

They built the model and started rolling out the ads, and it backfired horribly. About a year after they rolled it out, a man walked into a Target outside Minneapolis and demanded to see the manager. He was clutching coupons that had been sent to his daughter, his teenage daughter, and he was angry: my daughter got this in the mail, she's still in high school, and you're sending her coupons for baby clothes and cribs? Are you trying to encourage her to get pregnant? The manager had no idea what was going on. He apologized profusely and said he'd find out what happened. Then, a few days later, the father apologized to the manager: turns out there are some things that have been going on in my house that I didn't know were happening.
The teenage girl was pregnant. She had been buying some things at Target, I think with her parents' card, and so she got the ads in the mail, because Target's model said this person is pregnant without realizing it was a teenager. Target knew the teenage girl was pregnant before her dad did. That's obviously not the way you want these things to happen. The article goes on to say that partway through the reporter's research, once he got to this part of the story, Target cut off his access to the statistician. Target has since made some statements and changed the way they advertise; they've refined their approach. For example, the flyer they send in the mail won't be all baby stuff; it'll be a bottle of wine, baby shoes, a lawn mower, so it's not as obvious. So that was kind of the fallout: a horrible example of micro-targeting gone wrong.

A few more examples, and there are lots more out there; one problem I had with this talk was trying to fit in everything I wanted to talk about. Snapchat made a statement this year about data from offline purchases, like the stats from a loyalty card, the grocery store card you swipe to get a discount. Those companies sell that data, and they have advertising profiles on people: this person usually buys this thing with this thing and that thing. Snapchat buys that up so it can use that data to better target its advertising when it sells ads to the people who advertise on Snapchat. So again, this weird, roundabout chain: how is Snapchat going to sell better ads? It needs better data on people. It gets that data from the things you buy, combines it with what you like, apparently, and everyone's happy.

A couple more examples of micro-targeting. Gizmodo and the Freedom of the Press Foundation both ran targeted ads to federal employees earlier this year. What Gizmodo did: Facebook has a way to select the demographics you want to target, I want to sell ads to these people, in this area, with this kind of income, and so on. Gizmodo bought Facebook ads for federal employees in the DC area, specifically targeting them and saying: hey, please leak this information about President Trump. The Freedom of the Press Foundation wanted to do the same thing, so they did it on Twitter. Twitter is the same: you can use some really fine-grained targeting for which market you want to reach. They wanted to be even more specific: employees of the EPA and NOAA, just them. They tried to find out what those employees' Twitter accounts were, and they served ads just to them, asking them to leak information to the Freedom of the Press Foundation. So whether you agree with their methods or not, the fact is that you can target ads to a specific group in a specific area; it can be used for good, for bad, or for whatever, but it's happening and it's possible.

And these aren't even that new of examples. One topic of discussion after the election last year was a Muslim registry: should the government build it or not, can they do it, is it legally allowed? Well, the government doesn't have to; it's already been done. There's a company, exactdata.com. You go to it, you sign up, and they've already collected data on 200 million US people. You can filter down by 450 terms to get specifically the kind of people you want to target. 1.8 million names at a price of $140,000: I think it was Amnesty International that did the research, went in, and said, we want to advertise to people who identify as Muslim. That's the price to get a list of Muslim people in the US and advertise to them. There's no need for the government to do it; companies have already done it, and have had this information, for years. And this type of collection is happening in all industries, all verticals: identifying who owns a phone number, the people-search websites you can't escape from. It's happening with medical
data, with insurance, children's toys, your porn viewing habits, scraped Facebook profiles, scraped Twitter. Data is being collected, lumped together, and cross-correlated across services, again, just to sell you things.

So if you don't like it, what do you do? Don't install the free apps (I'm repeating myself at this point). Restrict what your apps can do. Use an ad blocker. Buy everything with cash. Don't install apps, live in a cave. Or just deal with it. This is the fatalistic part I was talking about: it's been going on for years, it's legal, there's not much oversight, and it's just kind of here.

A few more examples I'll try to run through. Biometrics: it's easy to extract biometrics from a photograph. The bottom example is HSBC. They ran this big ad: you can use your voice to sign in for phone calls. They've obviously never seen the movie Sneakers; they don't get the reference. "My voice is my passport." They say your voice is your password, and voice is not hard to fake. UGA, the University of Georgia, added iris scanning to the cafeteria. You go in, and instead of scanning your card, which takes 10 to 15 seconds, they now scan your eyes to make the payment, which takes 3 seconds. What was interesting to me was the cost, something like $78,000. A university will invest in iris scanning just to make the cafeteria line go a little bit faster. That's where the technology has progressed; that's how easy it is to use and access.

IoT. The P in IoT stands for privacy: there is no P in IoT. Yes, this topic needs no introduction; it's something we're all familiar with. The newest example is Roomba selling maps of your home to IoT manufacturers so they can better design their devices to use your home. The fact that Roomba makes maps of your house and can sell that data, that there's a market for that data, just kind of illustrates the silliness of it all.

This is just a big wall of text: links to articles that came out this year about the data economy, the
surveillance capitalism, all the things that AI is going to learn from us. All of it is relevant, but I just couldn't get to it. There's so much research, so much discussion, so much developing in this space right now that all I can do is link to it and say: here, drown yourself in it and see what's coming.

The good news is it's not entirely bad. A few fun examples of this technology being used in cool ways. Syrian refugees: the UN distributes money to refugees, but how does it track that money, how does it audit that the same person is getting the right amount? With ATMs and debit cards that's hard to do. With iris scanning, the same person scans their eyes each time, and the UN knows this person is getting these funds; they can track it and account for it. So that's a pretty cool use of iris scanning to do something good. Or: your toaster burns your toast. The toaster checks the internet, checks your cloud reputation, your online reputation, and if your reputation is bad, it burns your toast that day, or makes your coffee really weak or something. There aren't a lot of these good examples; there are some fun, silly ones, but unfortunately not too many.

Are collection and misuse inevitable? Some people don't think so. What's really cool, and what I've seen in the past year, is people talking about how to avoid this type of stuff. Privacy by design, privacy engineering: you build privacy into the product from the beginning because you care about that as the outcome. Privacy tech is getting some investment, and terms like data ethics, data protection, and data avoidance are being talked about by people in the space. I mentioned Michelle Dennedy, a vice president at Cisco; she talks a lot about this and pushes for it a lot. So there definitely are people who care about this and are trying to change it. If you're one of them, please read up on it.

I mentioned adversarial input earlier: feeding bad data into AI and
machine learning in order to fool it. Optical illusions for machines. Instagram has a filter: you can't show nipples. So this person started an account that's just close-ups of nipples. You can't tell if each one is a man's or a woman's, and man nipples are okay, so Instagram's filter doesn't know whether to block it or not. A fun, silly example, but the point is: when you rely on AI to do something, to make a choice, there are people who are going to subvert it and put bad information in. Adversarial input is a really cool, interesting field that's just now developing, because we've only just gotten to the point where AI and machine learning are actually functional. Now the developers have to start caring about the next step: okay, it works; now we should worry about people breaking it.

And of course, Calvin and Hobbes, filling out a reader survey for Chewing magazine. He fills out the little survey card: let's see, I spend $500 on gum, I'm 43 years old, I love garlic- and curry-flavored gum. "This magazine should have some amusing ads soon." And Calvin, of course: "I love messing with data." Hopefully all of you do too; find some new ways to do so.

Okay, so I'll wrap this up really soon. I'll talk a bit about fears versus wants, and kind of the future. You've seen all the madness that's happening out there, and there are lots of possible reactions; I've had several myself as I prepared this talk. Here's the first-world reaction: I didn't install an ad blocker, so I'm seeing ads for shoes on Facebook that I shouldn't see. I felt a lot of this going through the research: are all these concerns about being tracked just first-world problems? If someone tracks us and tries to sell us something better, is that something we've got to worry about when there are much bigger concerns out in the world? Or you can take a philosophical approach and care about the privacy violation itself, the principle of the matter: bit by bit our privacy is being violated and released, and eventually it's going to cause some major damage. Likewise, you might worry about
government surveillance: we're here, we're the government, trust us, we're going to watch everything, don't worry, it's okay. Those are the things we worry about. But here's the reality of what we want; I want to make sure I get this story right. Okay, so this is a story from Utah. A five-year-old child reported being sexually assaulted after entering a bathroom at a library. Surveillance video shows a man, whom library employees say frequented the building, following the boy into the restroom. A few minutes later the man leaves the bathroom, and the boy comes out and tells his sister and his mother what happened. Absolutely horrible. And this is a case where 90% of people say: yes, we want cameras. We want to identify his face, identify his gait, figure out where he bought those clothes. Let's find this guy right away. We want the surveillance data so we can find him, and so we can stop and prevent situations like this.

So we've got these two extremes: worrying about ads and surveillance on one end, but wanting surveillance to help us in situations like this on the other. We've got this spectrum; what are we okay with, and where are we right now?

The reality of our world is that the online world forces us to make irrevocable decisions about our online footprint. What you put online stays online; the internet remembers. The choices teenagers make now follow them for years and years, in ways that teenagers of the 80s and 90s didn't have to deal with. The surveillance economy works on information asymmetry: companies know everything about us, and we know very little about them. There are hundreds, maybe thousands, of companies of all sizes involved in this space, collecting data, reselling it, finding ways to gather data that are unknown to most everyone. I've got a small chunk of it from doing the research for this talk, but there's this huge economy out there that people don't know about.

I mentioned DHS's biometric exit program. This is a quote from the Privacy Impact Assessment they did to assess the necessity of this
service: "the only way for individuals to ensure they are not subject to the collection of biometric information is to refrain from traveling." So if you don't want your face scanned by DHS, you don't leave the country. You have no choice in the matter.

[Audience comment] Is it actually happening? That's specific to leaving the country; I could be misremembering, but I thought it said that when non-citizens want to leave the country, their face has to be scanned. Okay, yes, thank you; this applies to non-citizens. Again, please check me on this. The point I'm trying to make is that government authorities are putting these collection methods in place and making them mandatory with very little knowledge and input from the rest of us. So this is kind of the reality of the position we're in right now. [Audience comment] Okay, so maybe I'm mischaracterizing it; they do go through an assessment. But how many people here know about DHS's PIA? A handful, a small handful. So how many of our legislators know about it and give input to it? Do we feel this is correct? This is our country, right? Is this something we want our country to do? I'm not here to pass judgment; I'm just saying this is the reality of what is happening and this is the path we're on.

Zeynep Tufekci, I hope I say her name right, is from Turkey. She talks a lot about social media use, how it affects revolutions, and how technology affects our lives. This quote was in response to Amazon's new camera: you stand in front of it, Amazon takes a picture, and it recommends new clothes for you. I didn't talk about it much, but earlier in my talk I kind of alluded to it: from a photo you can get someone's age, their mood, their demeanor, maybe a medical condition, what's in the room next to them. AI and the algorithms can identify the other objects in the space. So there's a lot of information you can capture from a photograph,
and a lot of conclusions can be teased out from just a photograph. Her take on this: we're just walking into surveillance capitalism without noticing what's going on, and it's evolving into data- and computation-driven authoritarianism, one cool service at a time. We buy these things and use them because they're neat, because they're fun, because it's convenient, but we don't realize the implications and what it could lead to.

So that's kind of it: here's what's happening, here's where we are. Where we go from here, I don't know, but it's interesting. Thank you all for coming out; I hope it made sense.