Thanks for coming out, everyone. My presentation is called Beyond the Technology: Privacy, Trust, and Security in the Cloud. It takes a look at the social and political aspects of the cloud. It mainly focuses on the public cloud, not just in terms of infrastructure as a service but also the SaaS layer, and it will touch on the internet and privacy in general. Given the potentially controversial topic, I'm going to throw some disclaimers out there. Any opinions expressed in this presentation are my own and do not represent the views of my employer. Also, to avoid any awkwardness, I've tried to remove most negative references to companies that actually participate in OpenStack, so you'll need to do your own digging there. And lastly, I did not fact-check every single item in this presentation, but I have references, and I think they're listed as well, so you can follow up on that. Just to introduce myself, so you can figure out if there are any biases in this presentation: my name is Goren Chung. I've lived in Canada my whole life. I'm an engineer, not a law major or anything that actually relates to this field. I'm not an activist or anarchist. I don't have a black bar tattoo on my body. I've never participated in a protest. That said, I'm not completely apolitical. I've taken a single university course on Canada's role in global development, and I did pretty well in that course. Granted, it was a first-year course and I was a fifth-year student at the time, but I did pretty well. And lastly, I'm aware of the optics of a privacy talk coming from a person who works for a Chinese-based company and whose LinkedIn profile just says "working on data collection...". But with that said, let's get started. I don't have to tell you about the cloud. It's the Cloud Summit. Everything is being adopted in the cloud, new businesses are being built as cloud-only enterprises, and it's a big business opportunity.
Microsoft, Google, Amazon: they all saw huge spikes in revenue in their cloud divisions. In their latest earnings from a few weeks ago, they all declared over 20% increases in cloud revenue. The research firm Gartner estimates that over a trillion dollars in IT spending will be directly or indirectly affected by the shift toward cloud over the next five years. This is echoed by Bain, a management consulting firm, which estimates that public cloud spending alone will reach about $390 billion by the year 2020. So over the last few years, everyone has been hyping up the cloud and telling you that same story about how amazing it is. The reason for this talk is not to tell you the cloud sucks and that you should avoid it; I mean, I work in the cloud. But I do feel there's another story to be told, one for you to consider. We in technology are always talking about disruption: Uber disrupting the taxi model, Theranos disrupting healthcare, cloud disrupting IT spending. It seems to be ingrained in us as technologists that we can do pretty much anything in the cloud or on a computer. In the book Hackers, among the items of the hacker ethic it describes are that you can create art and beauty on a computer, and that computers can change your life for the better. Often, as technologists, we take this to the extreme and hold an unshakable belief that we can do anything in the cloud and that everything we do in the cloud is beautiful. To put it bluntly, sometimes we as technologists are kind of full of ourselves. So now that I've alienated myself, let's talk about the cloud. Because of our arrogance, we don't really consider the ramifications and side effects of what we do.
Justin Kan, the co-founder of Twitch, said that "there are things we were optimizing for that had unintended consequences" to describe how social networks had inadvertently created these hives of tribalism. Even Stephen Hawking gave a similar warning recently. I should add that Hawking's quote has nothing to do with cloud technology; he just said that we as humans are innately aggressive and that technology helps amplify that. But acknowledging our flaws aside, let's step back and consider the biggest concerns people have about the cloud. In the same Bain report on IT spending, while there was more confidence in the shift toward cloud over the last three years, they found that 35% of individuals still cite data security as their biggest concern regarding cloud computing. In addition to data security, there are a lot of concerns about corporations undermining users, because users are unsure what level of privacy is maintained; everything in the cloud is hidden behind pages of legalese. If you look at the iCloud terms of service, it contains a lot of text about how Apple reserves the right to remove or access anything it deems objectionable or reasonably necessary. Regarding governments, as a side effect of the NSA leaks about PRISM, many people are unsure how far-reaching the arm of the government is. It raises fears that using a central public system enables those with authority to shut down anything they deem unfit. And there is a lot of precedent for this. Recently, Turkey blocked access to Wikipedia. And in 2011, the Egyptian government shut down all internet access so citizens could not communicate and discover news during the Arab Spring. This is done in the Western world as well.
For vastly different reasons, during the London Underground bombings in 2005, the UK government shut down mobile networks to avoid any bombs being detonated remotely. With all these actors, the public is left unsure whether the public cloud is something it can fully trust. Intel Security surveyed over 2,000 professionals for its annual cloud security research study and asked: to what extent do you trust the public cloud to keep your organization's sensitive data secure? While they found a growing percentage of respondents had more faith in the cloud, 29% of respondents voiced some level of distrust of the public cloud, and only 23% said they fully trusted it. When asked what cloud architecture they had deployed, the survey saw a greater shift toward hybrid solutions instead of fully private or fully public ones. All these concerns were summarized by Chantal Bernier, a lawyer at Dentons. She reiterated that the biggest concerns regarding cloud computing are accountability, complexity, loss of data sovereignty, and lack of transparency, in short, the challenges of safeguarding information on the internet. So let's dive deeper into each concern, starting with data security. Over the past year, multiple studies have revealed a marked increase in cybercrime. The most famous breach would be the one that affected Yahoo: a 2014 breach of a reported 500 million accounts, which they disclosed in 2016, followed by the disclosure this year of another breach of about 32 million accounts. And as we move more and more critical information into the cloud, such as healthcare records, passports, and IoT devices like city infrastructure, the details are sparse about exactly how organizations are going to protect that data.
Research from insurance specialist Lloyd's suggests that 92% of companies in the EU have suffered a data breach in the last five years. (It should be noted that Lloyd's is an insurance company that wants to sell you cyber insurance.) With all the breaches that have happened recently, cybersecurity has become a huge business in itself. In 2013, the Wall Street Journal estimated the cost of cybercrime in the US alone at approximately $100 billion. In 2015, Lloyd's estimated that number at $400 billion. In 2016, Juniper Research estimated that the cost of data breaches may reach $2.1 trillion globally by 2019. And recently IBM's CEO echoed this, saying that cybercrime may be the greatest threat to every company in the world. In addition to cybercrime, the role of corporations also raises issues. Even setting aside the active participation of many corporations in the surveillance program PRISM, corporations have been actively undermining the privacy of citizens for business. In the PBS documentary United States of Secrets, a California state senator recalls the time she met Sergey Brin from Google. She says, quote: "We walk into the room, and it's myself and two of my staff, my chief of staff and one of my attorneys. And across from us is Larry, Sergey, and their attorney. All of a sudden Sergey starts talking to me. And he says, Senator, how would you feel if a robot walked into your home and read your diary and your financial records, read your love letters, read everything, and before leaving the house, it imploded? And he said, that's not violating privacy. I immediately said, of course it is, yes it is. And he said, no, it's not. Nothing's kept; nobody knows about it. I said, the robot has read everything. Does it know if I'm feeling sad, or if I'm feeling fear, or what's happening? And he looked at me and said, oh no, that robot knows a lot more than that." So in addition to software companies, it's also ISPs.
In a very recent FCC proceeding on allowing ISPs to sell customer information to advertisers without consent, the CTIA, which is the main lobbyist group representing mobile broadband companies such as AT&T, Verizon, and T-Mobile, argued that web browsing and app usage history was not sensitive information. It went on to suggest that consumers should not be wary of this newfound ability to sell customer information, and that the more substantial privacy threat to consumers was not the ISPs but the largest email, search, and social media companies. The official response from the ISPs themselves was that they do not sell customer data. But previously, in 2013, AT&T had charged their internet customers about $29 extra per month unless they opted into a system that would scan their internet traffic and deliver personalized advertisements. This was stopped by the FCC. While corporations do exploit their customers, they also do try to protect them from the government. Over the last few years, and especially in 2016, technology companies have started to encrypt their networks and communications. Beginning in 2013, in response to the NSA leaks, Google encrypted all their internal traffic. Last year, Apple refused to create a backdoor for the FBI to bypass the iPhone's encryption. Companies have also done good things like pledging to refuse to help identify individuals the government wants to suppress because of race or beliefs. And they've also tried to help us directly. Facebook has started using machine learning algorithms to leverage their data trove to help identify individuals who are suicidal. And suicide is a very important topic: it's the second leading cause of death among 15-to-29-year-olds. It's definitely a promising idea; researchers at Florida State University used machine learning to predict with 80% to 90% accuracy whether or not someone would attempt suicide as far off as two years in the future.
That said, it was discovered last week that Facebook was also targeting ads at these depressed teens. In sales pitches to advertisers, they boasted about how Facebook's algorithms could determine, and allow advertisers to pinpoint, moments when young people needed a confidence boost. So, talking about governments now. Let's see if my computer explodes. OK. The justification for surveillance is always the same: it's for your own security. Following the recent Westminster attack, in which four people were killed, the Home Secretary, Amber Rudd, told the BBC: "We need to make sure that organizations like WhatsApp don't provide a secret place for terrorists to communicate with each other." It is on that reasoning that governments try to get corporations to work with them. In 2015, the head of China's Cyberspace Administration proposed a pledge to American tech companies that they would comply with Chinese information policies. It included a lot of good things, like protecting the privacy of their citizen users, but there were also parts of the pledge similar to the NSA's PRISM program. A little closer to home, the US is currently in a legal battle over the validity of warrants for data stored in foreign countries, specifically data in Ireland hosted on Microsoft's email systems. At each level of the judicial system, different verdicts have been awarded, sometimes in favor of Microsoft and sometimes in favor of the government. So right now there is no concrete answer on whether the government has the right to access information outside of the US. In addition, there currently exists no single comprehensive set of rules governing how to protect cloud data in the United States. There is a proposed Email Privacy Act, which would update the current privacy act from 1986, but it would not cover cloud computing specifically, and it's not likely to pass given the current administration.
And because of that, a lot of Americans consider Canada to be a safe haven. As a Canadian, I'm here to tell you: it's really not. It's estimated that 90% of all Canadian internet traffic runs through the US at some point. And CSEC, which is Canada's NSA, was found to have very little oversight in a report from 2013: its oversight committee was a single retired judge who produced an annual review of CSEC conduct, and he never found a single issue. Also, as Canadians, we're not Americans, so all the NSA surveillance tactics are actually legal against us. One of the big problems with government is the belief that it is infallible and has complete foresight. In the book Superforecasting, Philip Tetlock gathered a big group of experts, academics, pundits, and the like, and had them make thousands of predictions relating to the economy, wars, and elections over decades. The result he found was that the average expert was roughly as accurate as a dart-throwing chimp; basically, they were no better than random guessing. He did find conditions that could improve an individual's prediction accuracy, but he also found that the intelligence community in the US did not have an environment optimized for them. Also, there's a really stupid person in your Congress right now who said that. So even though the government does a lot of things to take privacy away from us, it does help sometimes. In the EU and Canada, there is legislation that sets rules on consent for the sale of personal information and imposes limitations on what can be collected and for what reason. The EU legislation specifically was passed in 2016 and will be enforced starting May 2018. It includes conditions under which, if a company fails to protect users' data, it can be fined up to 20 million euros.
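For context on how Tetlock's "dart-throwing chimp" comparison is actually measured: Superforecasting scores predictions with the Brier score, the mean squared difference between the probabilities a forecaster assigned and what actually happened. A minimal sketch (the example numbers are illustrative, not from the book):

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and what happened.

    forecasts: probability (0.0-1.0) assigned to each event occurring
    outcomes:  1 if the event occurred, 0 if it did not
    Lower is better: 0.0 is perfect foresight, and hedging every question
    at 0.5 (the "dart-throwing chimp" baseline) always scores 0.25.
    """
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A confident and correct forecaster beats the chimp baseline...
print(round(brier_score([0.9, 0.1, 0.8], [1, 0, 1]), 3))  # 0.02
# ...while a confident but wrong one does far worse than guessing.
print(round(brier_score([0.9, 0.9], [0, 0]), 3))          # 0.81
```

The point of the metric is that it punishes confident wrong answers much harder than timid ones, which is exactly why "no better than random guessing" is a damning result for paid experts.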
And there's a talk about that legislation specifically later in the week. So with all those concerns, here are some ways to address them as users and as administrators. You can start by defining policies and educating yourself and your company on the rules. You can decide the value of specific data and whether it's safer to store it in the cloud or internally. Frank Abagnale, the person the movie Catch Me If You Can is based on, believes that technology alone will never defeat a good social engineering game. He says the only answer is to educate your employees about how to protect themselves and how to protect their company. We can also improve the transparency of our agreements. The Electronic Frontier Foundation publishes easy-to-understand scorecards to help consumers figure out how well a company supports their privacy. Similarly, there's the ISO 27018 standard, which applies to companies that adhere to a set of privacy and security requirements. Collaboration helps too. Building an open agreement between multiple parties makes it more difficult for a single bad actor or a single company to do bad things. It also makes it more difficult for those who want to do harm, as they now have to attack multiple companies. This works for users, too, by leveraging multiple cloud providers. Boeing uses a technique called "shred and scatter", where they spread data across multiple clouds so that no single cloud has all the data. That makes it difficult for an outsider to piece the data back together in a comprehensible way. And if you're like me, apathetic about a lot of things but still recognizing that issues exist, you can get others to fight for you. There are multiple organizations set up for this.
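Boeing's actual implementation isn't public, but the shred-and-scatter idea can be sketched with simple XOR-based secret splitting: every share but one is pure random noise, and the last share is the data XOR-ed with all of that noise, so any single cloud holds nothing but randomness. A minimal Python sketch under that assumption (function names are mine, not Boeing's):

```python
import secrets

def shred(data: bytes, n_clouds: int) -> list:
    """Split data into n_clouds shares; ALL shares are needed to rebuild it."""
    # All shares but the last are cryptographically random noise.
    shares = [secrets.token_bytes(len(data)) for _ in range(n_clouds - 1)]
    last = data
    for share in shares:  # XOR the data with every noise share
        last = bytes(a ^ b for a, b in zip(last, share))
    return shares + [last]

def reassemble(shares: list) -> bytes:
    """XOR every share together to recover the original data."""
    result = shares[0]
    for share in shares[1:]:
        result = bytes(a ^ b for a, b in zip(result, share))
    return result

# Scatter one share per cloud provider; no provider can read the data alone.
shares = shred(b"flight telemetry", 3)
assert reassemble(shares) == b"flight telemetry"
```

Note this split is all-or-nothing: losing any one provider loses the data, so a real deployment would layer a threshold scheme or erasure coding on top for availability.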
The EFF, for instance, is lobbying tech companies to protect, encrypt, and delete user data that could be exploited for policies like mass deportation and increased surveillance. The ACLU is working on passing city ordinances that would rein in police use of hacking and social media monitoring tools. There's also stuff you can do with technology, starting with basic good deployment strategies like ensuring virtual networks remain isolated from each other. Google has a custom chip that enables authentication at the hardware level, and Intel has a chip feature called SGX that does something similar with hardware-level encryption. And instead of the government monitoring you, you can monitor them. An organization called Digital Democracy created a bot that records every single word a politician says so it can be fact-checked; it also pulls in the politicians' financial ties to provide additional context for their choices. There's also an app called Stance that lets you record messages to your elected officials; it sends your message at night, when the lines are less busy, and keeps sending it until it actually gets through. Or you can go all in. Kevin Mitnick, the famous hacker, outlines some good practices for going invisible online using Tor and encryption. He goes further to say that you can get a burner phone from Walmart, but instead of going to Walmart yourself, you hire a homeless man to go for you to avoid being captured on camera. Or you can just buy fashion accessories that hide your face from face detection algorithms. But even if we address all the concerns we have about the cloud, does it even matter? Is it actually worth the trouble?
In a report examining insider threats, Forcepoint found that 14% of European employees claimed they would sell their login credentials to an outsider for £200. In the same report, they found that 32% of employees were unaware or unsure of the potential consequences of a breach. Similarly, a Pew Research survey found that 69% of us don't actually worry about our own digital security: 25% of us will knowingly use a weak password, and 54% of us use insecure public wifi. In another Pew survey, they found that, depending on your age, about 40 to 50% of people believe the government should be able to access encrypted communications when investigating crimes. And maybe this ambivalence toward privacy is not our fault. During the Second World War, the psychologist Erich Fromm, in his book Escape from Freedom, thought about why large chunks of the Western world were embracing authoritarianism. He said it was tempting to blame a few madmen who gained power over the vast apparatus of the state through nothing but cunning and trickery. But Fromm argued that there is something inherent in humanity that fears true freedom, and that we would prefer to be dominated. In other words, Fromm thought that our ambivalence and submissiveness is a feature of humans in general. So why does privacy matter? And this is where it's going to get a little more opinionated. The government's determination to get its hands on internet data in the name of national security is eroding trust with other nations. And this is why it matters to corporations: as a reaction to the revelations of NSA snooping, companies in the UK and Canada have added language to their contracts stipulating that their data must not be stored on US soil.
The British defense contractor BAE specifically revealed that it had killed plans to adopt Microsoft cloud services because Microsoft couldn't guarantee that their data wouldn't leave Europe. In a 2013 survey by the Cloud Security Alliance, 56% of companies said they are now less likely to consider a US-based cloud provider, and 10% had canceled a project in order to switch to a non-US-based cloud provider. And according to Forrester Research, this lack of trust resulting from the PRISM program will cost the cloud industry about $47 billion in revenue over the next three years, which is not bad considering the original estimate was closer to $180 billion. So why does privacy matter to us as individuals? One of the most referenced papers on this is entitled "Why Privacy Matters Even If You Have Nothing to Hide", by Daniel Solove, a law professor at George Washington University. In the paper, he references Franz Kafka's novel The Trial, which centers on a man who is arrested by a bureaucracy but never told why. It describes how surveillance breeds a sense of helplessness and powerlessness that alters the relationship between the people and the government. He goes on to address the relevance of the argument that if you've got nothing to hide, you've got nothing to fear. The problem with the nothing-to-hide argument is its underlying assumption that privacy is about hiding bad things. It also makes the false assumption that those prying are not doing bad things themselves. And that is one of the main issues with surveillance: those doing the surveillance do so because they don't trust the people, but the people have to trust those doing the surveillance.
In the documentary Do Not Resist, during a ride-along, a police officer showcases the camera system in her vehicle, which scans all nearby vehicles to see if they're under investigation and scans bystanders to see if there are any warrants out on them. She says: "If you're out in public, there's no expectation of privacy. People say, you can't just scan my plates for no reason. Yes, we can. You just have to hope that everyone who runs them is running them for the right reasons." And there have been many cases in recent times showing that they aren't running them for the right reasons. That's a concerning issue, because there's a lot of data to be misused. A report issued by the Office of the Director of National Intelligence revealed that the NSA collected over 151 million phone call records while tracking 42 individuals. In previous years, that was billions of records that could have been potentially misused. After the NSA leaks, Obama famously said that you can't have 100% security and also have 100% privacy and zero inconvenience; we're going to have to make some choices as a society, because there are trade-offs involved. James Comey, the head of the FBI, said in reference to encryption: "Even our memories are not absolutely private in America. Any of us can be compelled, under appropriate circumstances, to say what we remember, what we saw. The general principle is one we've accepted in this country. There is no such thing as absolute privacy in America. There is no place in America outside of judicial reach. That's the bargain. We made that bargain over two centuries ago to achieve two goals: the very, very important goal of privacy, and the very important goal of security. Widespread default encryption changes that bargain." And even though I'm giving this presentation, I kind of agree with them.
In the book Walden; or, Life in the Woods, the American philosopher Henry David Thoreau goes into the woods to live for a few years, to see if he could not learn what it had to teach, and because he wanted to "live deep and suck out all the marrow of life." It's a very quotable book. But even living in the middle of the woods in the 1850s, this dude was running into people left, right, and center. And that lack of privacy in the woods still exists today. At a conference in 2014, Arthur van der Wees of Arthur's Legal in Amsterdam said that privacy is a matter of the person himself or herself; a person should be educated about the implications of what they share. I think that simple statement sums it up pretty well. Privacy isn't an absolute condition; all of us in this room have different opinions that fall somewhere on a spectrum. Richard Clarke, the former national coordinator for security and counterterrorism in the United States, said during his keynote at the Cloud Security Alliance summit, quote: "I believe we can have both security and civil liberties, but we can only do that if we keep a very close eye on the government and demand transparency and oversight, and tell them we are not willing to trade our civil liberties for greater security." Going back to Solove's paper, he warned that privacy is rarely lost in one fell swoop; it is usually eroded over time, little bits of it dissolving almost imperceptibly, until we finally begin to notice how much is gone. So there may come a time when too much of your civil liberties have been taken away from you in the name of security. And when that time comes, I think the movie Captain Fantastic has a good response. When teaching his children the realities of life, the protagonist says: we have to do what we're told. Some fights you just can't win. The powerful control the lives of the powerless. That's the way the world works.
It's unjust and it's unfair, but that's just too bad. We have to shut up and accept it. Thank you.