So, welcome to the stage, Esther Payne, who's going to be talking here at FOSSASIA about RFC 1984, or why you should start worrying about encryption backdoors and mass data collection. Good afternoon, the audience of tomorrow, literally tomorrow, because I'm currently recording this the night before, and the reason for that is over there. Safety is very important, so we'll just get started from last night. Before we begin, I want you to have a look at the picture of this lovely French frog. No frogs were harmed in the making of this presentation, but concentrate on that frog, and on the metaphor of putting that poor frog in a pan of water and then slowly heating that pan up, while I say a few statements to you. So, do we need privacy? All of our family and friends are using social media, and they're putting all sorts of information up about you and their friends and other members of your family, so everything's just up there. And teenagers, well, since the early 2000s teenagers have been putting everything up about themselves and communicating with each other using social networks. So it's pointless. Throw privacy out the window, we don't need it, we're in a connected world now. And because of all the threats to our way of life, like terrorists, like pandemics, we need to give up a little bit of that privacy for our own security. So, is that water feeling a tiny little bit warm now? Maybe. We've got a very difficult problem space. We've got to try and explain privacy in ways that people can understand, how what they put online can affect them in real life. Now, this isn't the first time humanity has had to face this. At the time when cities were being built and civilisations were continuously improving, how did you convince your general populace that perhaps they should follow the rules? Ancient Greece managed it with things like Aesop's fables and, to some extent, with Greek mythology, which leads us on to this guy.
He's not Greek, he's actually Roman, and his name was Ovid. He was famous because he was a poet in the latter part of the reign of Augustus. Augustus, who had learned from his predecessors not to openly grab power and declare yourself dictator, had hoodwinked the Roman Republic into thinking that it was still a democracy and everything was all right, because he was merely the first citizen. Ovid tended to write about gods and goddesses and how their whims and decisions affected those lower in the pecking order: humanity, and lower orders of supernatural entities like nymphs. Now, you might think that this is just mythology, but in Roman times mythology was intrinsically tied in with the state. Julius Caesar had recently been made a god, and Augustus after his death would also become a god. So when Ovid was writing about mythology and criticising the father of the gods, Jupiter in Roman terms, Zeus in Greek, he was explicitly critiquing the regime. And unfortunately he did just that, and it ended up with him being exiled. The main reason was a publication he wrote called the Ars Amatoria, which was about how to get women's interest, pick them up, and then get rid of them again when you were bored of them. And there was also a little bit of a rumour about him having a good time with Augustus's granddaughter Julia; no one's really quite sure. He ended up in exile on the Black Sea, in modern-day Constanța, and he kept occasionally asking to be let back in from exile, and Augustus said no. And his successor Tiberius also said no. So Ovid died salty and pretty upset about it. But why am I telling you about Ovid? What Ovid did was remix Greek myths and make them palatable for a Roman audience. And for privacy, there's really only one myth you want to consider: Io and Argus. Io was a beautiful nymph. Zeus, doing what Zeus does, decided that he wanted to have a good time with Io.
She said no, and then there was a bit of a MeToo moment. Zeus then decided he needed to cover this up quite badly, so he did that by covering the entirety of the land in a massive cloud. Now, that's what happens when someone in authority decides to cover things up: someone notices. In this case his wife Hera noticed, and he had form. So she went down to investigate what was going on, and she went, hey Zeus, what are you up to there? And what she saw was Zeus standing there with a cow, because for an extra bit of covering up, he had turned poor Io into a heifer. So Hera went, oh, she's ever so pretty. And Zeus went, yeah, yeah, yeah, have the cow, absolutely, a gift from me to you, my lovely wife. And off she went. And Hera, being the victim blamer that she was, decided to keep the cow but put it under constant surveillance, using a giant called Argus Panoptes. Argus Panoptes was a massive giant, a shepherd, who had a hundred eyes covering the entirety of his body, all around his body, which meant he had a 360-degree panoramic view. And he never really needed to go to sleep in order to rest; he only had to close two eyes at a time. So poor Io: she's a cow, and she's under constant surveillance, after having been violated. She got word out to her father, a river god, who, along with all the other members of her family, finally put enough pressure on Zeus that he felt a tiny, tiny bit guilty and thought, I should try and get her out of this captivity where she's being surveilled constantly. And what he did was ask another god, Hermes, to sort it for him. And how Hermes did it was he ran an exploit on Argus. He came up in disguise as a shepherd and told him a big long story, which bored Argus so much he fell asleep. Hermes then killed him. Exploit done.
Io was able to run away, but Hera, being like so much of our society, which loves victim-blaming whistleblowers and others, sent a gadfly to pursue the cow all the way to the banks of the River Nile, where Io finally collapsed and was then restored to her natural form. Now, I've been talking about a Greek myth, so what exactly did the Roman poet bring to this thing? What he brought was peacocks. Hera, being incredibly upset that her tool of surveillance had been murdered, decided to memorialise him forever by putting all of Argus's eyes on the tail of a peacock. That was Ovid's extra little finesse to the legend, and even though peacocks aren't actually a Roman bird, he just thought it was a cool metaphor. But is this actually relevant nowadays? Well, it is, because if you're a security firm that, say, likes doing video surveillance for companies, what you do is call yourself Argus Security and bang a big eye right in the middle of your company logo. Argus Panoptes has stayed within our public consciousness, with famous television reality shows that explicitly reference him. But how relevant is it nowadays, really? It's a great myth, but peacocks aren't that appropriate around the world; they can mean different things to different religions. So we'll leave that aside and try something else, because the trouble is that everybody focuses on the technology. They're not focusing on the cow. We've forgotten to focus on the person who's being surveilled, the poor cow. So we'll try George Orwell next. We'll try 1984. Now, everybody remembers the telescreens, Ingsoc, the idea that we've always been at war with Eastasia. But we're missing one of the main points of this story, because it's not just about totalitarian regimes. It's a philosophical exploration of an idea from the 18th century, by a philosopher called Jeremy Bentham. What Jeremy Bentham designed was a panopticon, and it's a very simple idea.
You have a dome, or a building of cells, all around a central tower. And within this tower you have one individual, and they have a light that can shine on individual cells. Now, you have the apparatus of surveillance there, but there's one slight problem, because we're focusing on the technology. We're not focusing on what the idea was meant to do. Bentham designed the panopticon with the idea of controlling the human factor. He was around at the time of the Industrial Revolution in the UK, when they were trying to mechanise industrial processes. When you have to control a large body of people, be it factory workers, be it prisoners, or, rather relevant now, quarantined people during a pandemic, you want to maximise your resources without overworking them. And the idea of the light being shone into your cell is that you're aware of being surveilled. You may not know when or from where you're being surveilled, and that's the point. Because you know you could be looked at at any moment, and you don't know who's looking at you, it gets you to modify your behaviour, which is explored quite deeply within the book. But it's very hard to focus on the person being surveilled, because Winston Smith is part of that apparatus; he's still a human being, but he's dull as ditchwater. So it's relevant, but we have to think about other ways to explore this idea. And Orwell didn't just inspire popular culture and the shows it's given us. He also inspired the IETF, who noticed that there was a very important number coming up in their RFCs. And it came up at a time when the US government was trying to impose a trade embargo on encryption. People had been worried since the 60s that, as the network of information between consumers and the government grew, there could be a very real threat to the privacy of those consumers.
And the US was very much trying to limit the creation of more widely available sources of encryption as a way to do that. In 2015, rather impressively given what happened later that year, the IETF decided to make RFC 1984 a Best Current Practice, the reasoning being that people had been treating this RFC as best current practice for years anyway, so let's just make it official. The IETF and the Internet Architecture Board had been very concerned about the need for increased protection of international commercial transactions on the internet. This was the birth of e-commerce, and they realised that they needed encryption so that things like credit card details didn't leak everywhere. But the US government didn't agree, because they also knew that something else happened on the internet that perhaps needed encryption: private messages between people. So for a start they pushed key escrow and export restrictions, which meant that a US firm trying to export encryption had to leave clues for their clients in Europe, through commentary and instructions, because they had to strip out the encryption first and it needed to be coded back in later. The US government was also trying things like the Clipper chip, which was a kind of virtual crocodile clip, so that they could still listen in on conversations while, in theory, the information stayed encrypted so that the bad guys couldn't get at it. Just the government could. And sometimes they just said, why do you even need to make the encryption that strong? You don't, really. And of course, some regimes simply don't agree with encryption at all. So why am I so worried about this? Why do I think there's such a threat to privacy? Well, at the moment we have various terrorist attacks, people trying to plan shenanigans, and governments want to know about that, but it's not just targeted at individuals or organisations.
They want a dragnet of everybody's information, so that they can just search it at their convenience. Snowden has shown us this, and there's a heck of a lot of intelligence that can be gathered from your communications on Facebook: what you put up there, what you react to when people post memes. And your friends and family are so curious about DNA services: where do they come from, are they really from Ireland originally, as their family history says? And they don't understand. They think of it as just a physical DNA test that will be stored somewhere. They don't realise it will be encoded and compared later on. And as I said, governments are very, very interested in knowing what their populations are up to. In 2015 the UK Prime Minister didn't want encryption to be a factor in communications, and neither did the Australian Prime Minister, who in fact said that the laws of mathematics were subject to the laws of Australia. And both the UK and Australia had metadata bills designed to force large broadband providers to collect data about you. The data being collected is things like when you're connecting, where you're browsing to, what your IP addresses are. These are all bits of information that on their own don't seem that bad, but tied together they profile you. And of course the US, from 2015 and now in 2019 and 2020, is incredibly keen on weakened encryption and backdoors into encryption so that they can have it easier. What they don't tell you, though, is that there's a spectrum in terms of their access. A Vice article found that the police can often get into your mobile devices quite easily; they just want it to be easier still. And the NSA has had access to America's domestic communications for a while, even though they're not meant to, because the NSA's remit is meant to be international, not domestic.
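To make that "tied together they profile you" point concrete, here's a toy sketch in Python. Everything in it is invented for illustration (the log entries, the IP, the domain names, the category table); it is not any ISP's or agency's real tooling. It only shows how a handful of "harmless" metadata records, just timestamps and domains with no message content at all, combine into a behavioural profile:

```python
from collections import Counter
from datetime import datetime

# Invented connection-log entries of the kind a metadata bill would have
# an ISP retain: timestamp, source IP, domain contacted. No content.
LOG = [
    ("2020-03-12T01:14:00", "203.0.113.7", "forum.depression-support.example"),
    ("2020-03-12T01:52:00", "203.0.113.7", "forum.depression-support.example"),
    ("2020-03-13T02:03:00", "203.0.113.7", "jobs.example"),
    ("2020-03-14T01:47:00", "203.0.113.7", "payday-loans.example"),
]

# A hypothetical category table; real data brokers maintain far larger ones.
CATEGORIES = {
    "forum.depression-support.example": "mental health",
    "jobs.example": "job seeking",
    "payday-loans.example": "financial stress",
}

def profile(ip):
    """Combine individually 'harmless' records into a behavioural profile."""
    visits = [(datetime.fromisoformat(ts), dom)
              for ts, src, dom in LOG if src == ip]
    topics = Counter(CATEGORIES.get(dom, "other") for _, dom in visits)
    night = sum(1 for t, _ in visits if t.hour < 5)  # visits between midnight and 5am
    return {"topics": dict(topics), "night_time_visits": night}

print(profile("203.0.113.7"))
```

Four log lines are enough to suggest a sleepless person under financial and emotional strain, and nobody ever had to read a single message to get there.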
And we have things like the Online Harms Bill in the UK and the EARN IT Act in the US, which are supposedly designed to protect the most vulnerable in our society, children, from child trafficking and child pornography. But these bills will not protect them. What they will do is threaten those vulnerable children's privacy in the first place, because it means that children, as they grow up, will not be able to investigate the parts of themselves that they'd like to. They will have cradle-to-grave surveillance. And political parties want to know how they can make you vote for them. Now, most of us are pretty steady in what we believe, how we think a party should be, and what we should vote for. But there are voters in the middle who can swing from side to side, and when political parties are collecting your data, they're really collecting it to find those voters who can be manipulated. And with us being in the middle of a pandemic at the moment, this is where the panopticon really comes into play, because governments do need some way to track you in case you are infected with COVID-19. The difficulty is that we're giving them all of these powers, but we don't know when we're going to get back the power to not be surveilled. And we have to consider what happens once that data is all gathered up. Is it going to be kept safely? Is it going to be destroyed? Is it, like with the NHS in the UK, going to be sold on to pharmaceutical and health firms? And are hospitals actually storing that data properly? Because German researchers found medical records on an open file store online, and that was a particularly bad thing in Australia, because that data is not meant to leave individual states. And of course, government officials can access commonly shared data stores and use them inappropriately.
Take the UK, where data that was supposed to be accessible only to the UK Border Force wasn't being handled just by public servants; it was being handled by third-party contractors, with firms like IBM, which means that data from the Schengen Information System has ended up in US hands, breaking GDPR. The Republican National Committee was also briefly very careless with voter data. That demographic data included the normal things, like your age, where you live, and what demographic you're in, but it also included things like magazine subscriptions. These are things that give political parties a really good insight into who their potential voters are. And in a very 1984 way, the Home Office in the UK destroyed the landing cards of the Windrush generation, thereby deleting the very evidence that members of the UK population who'd come over from the West Indies needed in order to prove that they had the right to stay in the UK. All of these bits of individual data are being used by commercial firms to help officers of a government track illegal immigrants, or to track anyone who's considered not quite part of society, who someone thinks should be thrown out or discriminated against in some way. And you definitely do not want to trust UK officials. At my last check of the Wikipedia article, there were about 30 instances of UK officials at all levels of government being plain careless with individuals' data, from a laptop being left in a taxi to someone handing in a thumb drive going, did someone drop this? Officials are not trained to think about data in the way that we think about data. It doesn't feel real enough to them, so they don't care. And we have another threat with CCTV all around us. The cameras have been around for years; we've grown up with them. They're innocuous. We don't notice them in restaurants. We don't really notice them out in the street anymore.
We're told they're just there to stop crime, that it's for our own good, for our own safety. And that's all very well, but we have a network of cameras that is growing every year, particularly with devices like Amazon Ring, which are being installed willy-nilly all over the place. And Ring has a very cosy partnership with law enforcement: over 800 agencies in the US, and they're getting very cosy with UK agencies as well. The way they get sold is that the police can offer them on to customers who've been burgled: here, you can have this at a discount. And what's insidious is that unless you explicitly opt out, the data that your doorbell collects will be shared with that police force. Or you can choose to upload it to a neighbourhood social network where everybody's paranoid and sending in reports about innocuous things, like someone delivering a package. You're putting into individuals' hands the power to effectively be judge and jury, and when people have that power, there are some very real consequences for those who are on the margins of society. One fear I have is that we have this network of CCTV cameras and doorbells, and the next logical step is: you've got a picture of someone, so what can you do with that picture? I know. We can make door entry frictionless. We can do facial recognition. How happy does she look, that lovely white woman? She's just so happy. Everything's just so much easier for her. But there's a slight problem with facial recognition, and certainly with AI comparisons: there's bias built into these systems. In the US, the programmers and the datasets are mainly of white people, which means that African-American and other minority communities aren't identified properly by these systems. But Google allegedly thought, we can solve this. Let's go get some third-party contractors.
And according to those third-party contractors, they were told to build up the dataset of minorities by any means possible. So they did it two ways. They went to university campuses and got students to play a game, for which they got a Starbucks card. And they did the same thing with the homeless population, who also got a card, and the contractors were allegedly told to focus on that homeless population because they were less likely to ask questions. So there's a mismatch in thinking there. Technologists are thinking about the benefits of frictionless facial recognition without thinking about the consequences. Governments want facial recognition as well, for you to be able to access public services. In France, Alicem launched in October, and you have to use an Android phone to take a picture of yourself from several angles, so that the government now has a biometric record of your face. And amazingly, they've rolled this out to French citizens, not to third-country nationals, though I honestly thought they'd do it to them first. And schools in China, and to some extent in America as well, are starting to implement this technology. One of the systems trialled in China is being tested on two-year-old toddlers in Japan, to check if they're engaged with the education system. For the high-school students who had the system trialled on them in China, AI was combined with facial recognition to send a report home to parents if they weren't paying attention properly. So students aren't paying attention; they're performing the appearance of paying attention, and they're knackered. They're not learning anything. And facial recognition will very soon be coming to Amazon Ring products, because Amazon at the moment is developing Rekognition, which it's going to trial in its brick-and-mortar stores first, so that you can have frictionless shopping. And it will be able to detect your emotions, whether you're happy or sad. Isn't she so happy?
But yeah, let's continue the discussion, on to biometric, DNA, and medical information. DNA testing kits, as I mentioned earlier, are pretty innocuous in theory. GEDmatch in the US has a very cosy relationship with law enforcement, and they've recently been bought by another company that's even closer to law enforcement. They've never had a problem handing over a massive dataset of DNA information to help law enforcement, because they've always believed it will help solve crime. And it did solve a crime: it solved the Golden State Killer case in California, because the perpetrator's relative had uploaded their data. For other DNA testing services, like 23andMe and Ancestry.com, you still need a police warrant at the moment, but I wouldn't feel too complacent about that, because 23andMe recently sold access to its DNA dataset to a pharmaceutical company to develop more drugs. Now, you can see how that sounds lovely, it's helping people, but you've got a precedent there. You've just gone and sold a dataset to a commercial company. For now it's for developing drugs; where else could they sell that data, now that the precedent has been set? And of course, with more and more of our healthcare systems becoming private, there's a concern about your own medical records. We're gathering so much data. We're digitising so many medical records. And in theory, when these are released to researchers, the medical data is de-identified. But last month Professor Teague from Melbourne University got sacked because she'd made people aware of the fact that the data of 2.5 million Australians, going back to 1984, can possibly be de-anonymised, and individuals can be identified by their medical conditions and other bits of data. And everybody gets really excited about being able to identify people by how they walk, how they express themselves: fingerprints and gaits. But these are things that you cannot change about yourself.
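To show why that de-anonymisation warning matters, here's a minimal sketch of a linkage attack in Python. Every record in it is invented for illustration; it is not the researchers' actual method or any real dataset. The trick is simply that a "de-identified" medical release still carries quasi-identifiers such as postcode, birth year, and sex, and some other public dataset carries the same fields plus a name:

```python
# Invented "de-identified" medical release: names removed, but
# quasi-identifiers kept so the data stays useful to researchers.
deidentified_medical = [
    {"postcode": "3052", "birth_year": 1971, "sex": "F", "condition": "epilepsy"},
    {"postcode": "3052", "birth_year": 1984, "sex": "M", "condition": "asthma"},
]

# Invented public dataset, e.g. an electoral roll or a social media scrape,
# carrying the same quasi-identifiers alongside a name.
public_register = [
    {"name": "Jane Citizen", "postcode": "3052", "birth_year": 1971, "sex": "F"},
    {"name": "John Someone", "postcode": "3199", "birth_year": 1984, "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def reidentify(medical, register):
    """Join the two datasets on quasi-identifiers alone, no names needed."""
    matches = []
    for med in medical:
        key = tuple(med[q] for q in QUASI_IDENTIFIERS)
        for person in register:
            if tuple(person[q] for q in QUASI_IDENTIFIERS) == key:
                matches.append((person["name"], med["condition"]))
    return matches

print(reidentify(deidentified_medical, public_register))
# With these toy records, Jane Citizen is linked to "epilepsy".
```

Two toy rows are enough to re-identify one patient; a release covering 2.5 million people gives an attacker vastly more joins to try. And notice that the fields doing the work, postcode, birth year, sex, are again attributes you mostly cannot change about yourself.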
Or if you do suddenly change something about yourself, you're going to have some awkward conversations with the system when it doesn't recognise you. So, on its own, the collection of medical data, or putting your DNA data into these sites, is innocuous. But once the data is up there, it's up there, and it can be combined with other datasets. And all of a sudden you've got a medical profile and a DNA profile of an individual. Imagine what health insurance companies could do with that. This is the modern panopticon: this semantic map. That little stylised brain there is you. All these little data points; all the people that you're connected to, and their data points. You have a whole series of information, profiling, and pressure points. How is that water feeling? So: do we need privacy? Your family's on social media? Yeah, they are, and there's still a risk. And we've been just as bad. It's too late for us; we have to think about the next generation, and the next generation after that. Teenagers don't care? That's not true. Teenagers absolutely do care. They've grown up used to working around surveillance. They've performed a form of steganography in order to encode messages to their friends that their parents won't recognise. And with tracking systems and facial recognition systems being rolled out across campuses in the US, students are starting to protest about it, and organisations like the Electronic Frontier Foundation are giving them tools to be able to do that. So yeah, they do care. And do we need to give up a bit of privacy for our security? Well, I'm going to refer to a long-dead American president for that one, because Dwight D. Eisenhower was incredibly clear. He said that if all Americans want is security, they can just go off to prison. You know, they'll have enough to eat, a bed, and a roof over their heads. Maybe not in the US now.
But if an American wants to preserve his dignity and his equality as a human being, he must not bow his head to any dictatorial government. And he's right. What do we value more? Do we value being in that panoptic cell? Or do we value having a bit of space to ourselves, a little bit of thinking time to ourselves? Do we really want to sacrifice our freedom and our personal dignity because we're a bit scared? Because I do not want the future to be that panopticon. I want free speech. I want free debate. I don't want censorship. So is there hope? Yes, there's hope. The mainstream press is starting to wake up, mainly because journalists are starting to realise what the consequences are. Some cities in the US and in other places are considering a moratorium on facial recognition. And you're all here listening to this talk, whether you're in the room, joining remotely, or watching it later. You care enough. So what can we do? Well, the coverage is all out there online, and you can expect to see more of it, but it's good if you go and do your own digging as well. This needs to be a group effort in order for us to make a difference. Another thing: don't wait for legislation to make you care about building that panopticon. There are people who care about privacy, and if you want to do something about it, design for privacy from the beginning. You're making life a lot easier for you and your customers in the future. Follow the privacy-focused organisations online: Privacy International, the Electronic Frontier Foundation, and certainly organisations like the ACLU. They provide lots of material to help you understand all this.
Downstairs, for example, there was the Glass Room stand, which explained quite a bit about what your routine can look like when you're doing things online. But like anything, you've got a bit of a job to do yourself. You've got to do your own data detox. You've got to consider all the ways you've been interacting with the virtual world yourself. It might be that you've had a Google account for years, and in order to be able to make the argument to your friends and family, you've got to have an idea of how to de-Google yourself and what difficulties they're going to face. And if you want the world to change, help support existing decentralised networks, like Mastodon, like diaspora*. Look at the ActivityPub standard. Look for applications that provide real-world alternatives. And to some extent, you can't just point people at a video like mine and expect them to understand. It might work for a lot of people, but everyone's an individual. You want to think about ways you can use personal stories and shared fandom moments to frame a privacy argument. You've got to bring some empathy to your friends, your family, your chosen folk, in order to get them to understand why this is a big deal. It's not enough to negotiate with them not to put your stuff online. You have to get them to care about what they put online. Because you want them to focus on the cow. You want them to realise that they are Io, whether they know it or not. Don't focus on the surveillance technology. It's cool and it's seductive and it's all about visuals and things, but it's a lot harder to focus on people than it is on tech. Because I don't want the future to be that panopticon. Think about simple stories and ways you can frame individual conversations about privacy. You don't want to come at it with: you're wrong, Google's evil, they're just swallowing up your data. You've got to try and frame the thought differently.
Acknowledge that they've been using these tools for years, and help them find a way to use another tool. For example, I posted two posts on Facebook around the theme of this talk. The first one didn't get a lot of reaction on Facebook, because it's quite dull, doesn't have a picture of me, doesn't really say much about what I'm doing, and doesn't really bring empathy into the post. In the second post, I acknowledged that tools like Facebook and Twitter are community tools; people interact with their communities using these tools. And people did watch the video, and they interacted with it on Facebook. So there was a huge difference in uptake. And everybody's culture is different. Everybody has different myths and legends and family history to help people engage with the subject matter. This is the reason why myths were developed in the first place: it was social engineering. And of course, once you've got your family and friends on side, all on board, you can start contacting your political representatives and start to push back against the power of the big data lobbies. Because political representatives often aren't that tech-savvy either, and because you've been getting your friends and family to do this, they'll add even more pressure if you ask them the right way. And the more people put pressure on our political representatives to consider the people they represent, the more likely they are to consider other sides of the data debate, of privacy. Because perhaps we can use that peacock as a symbol. Perhaps we can use it for the reason Ovid thought of it. We can use those feathers to represent a moratorium on facial recognition, to represent being able to stop companies like Clearview scraping your data. Because human rights violations are happening right now, around us, and it's only going to get worse as this pandemic continues. And we cannot normalise this data collection.
This huge, massive profile of each of us that is the virtual panopticon. And use that hashtag. The point of RFCs, as the IETF framed it, is to have a common framework for things like protocols, and we need to consider a framework for how we handle data and privacy. This could be one way to do it. But it needs to be decentralised, and it needs us to use it. Thank you.