So, as you see: I love getting really good, qualified questions at the end, because that is where you make me say all the things I have forgotten to say. I also tend to surprise myself with what I say sometimes, and it's kind of funny when that ends up on Twitter. So if anyone likes Swedish licorice: you will get some if you ask a good question or write a good tweet. There, this one. Who am I? I'm a software tester, a security tester to be, a little bit of DevOps, a little bit of everything. I'm also a maker. I really, really love making wearable LEDs like those, and I have a chip implant, which I sometimes give talks about as well. It's too bad: I had a really nice presenter mode before, and I don't have that anymore, so I'll have to manage without it. This piece of art is very well known in Sweden. It was painted by an artist called John Bauer, and it's been said that the model for the little elf in the middle was his wife, Ester Ellqvist Bauer, who was also an artist. In 1918, the two of them were going to take a boat over the big lake Vättern in Sweden, up to Stockholm. They had the option of going by train, but about one month earlier there had been a huge landslide. The railroad simply disappeared in that landslide, and 42 people died in a fiery inferno of a train accident. So going by train did not feel safe to them. Boats are much, much safer, and so they bought tickets for that boat. Another person who bought a ticket for that boat was Viktor Fong, my great-grandfather on my mother's side. He was a brass founder, a worker in yellow metal; basically, he made chandeliers for churches. One of these days, he was going to take such a chandelier up Lake Vättern on the boat. He also bought a ticket. The weather was very bad; it was November.
Sweden in November is well known for not being like Thailand, but rather kind of awful. And the boat sank. The people at the shipping company went through the passenger manifest, saw Viktor Fong on it, went to his wife and said: we are very sorry to inform you that your husband is dead. This would have been devastating for them, because they had seven children. And his wife, Ellen Fong, said: no, he isn't dead, he's in the workshop. It turns out that he was the only person who had been denied boarding, even though he had a ticket. For some reason the skipper had said: this is not the boat you are going on. We don't know why, because that skipper also died. So Viktor Fong was the sole survivor of the boat accident of 1918. What's also interesting about this story, and the reason I want to tell it, is that 360 days later, my grandmother was born. So that's kind of crazy. There was an investigation into what had happened in this accident. As much as two thirds of the cargo had been put on deck, and that cargo was also unlashed: it wasn't anchored to anything, it was just standing there. Sewing machines, potatoes. It would have been a chandelier too, but it wasn't. All of it standing on deck instead of down in the hold. And I think even at the time it was common knowledge that on a boat you need to put the weight down below, because if you don't, the boat will tip over in bad weather, which is exactly what happened. The accident investigation blamed the captain for all of it. They called it bad seamanship. But in reality, the shipping company had a bonus system, and this was systematic throughout the industry, that encouraged putting cargo on deck instead of down below. And if you look at this boat, who would blame them? In 1918, Sweden was one of the poorest countries in Europe.
We didn't have democracy; people didn't have food. A fourth of the population had emigrated to America. So everybody was working very, very long days. And look at the boat: they didn't even have stairs to carry cargo down, you had to climb a ladder. If you're carrying a big sewing machine, how are you going to get it down there? If you're designing a system, or in this case operating a steamboat, an old steamboat at that, built in the 1850s or so, it's just very, very hard to do the right thing. These people were already working very, very hard for very little wage, and the tickets were already expensive. So there was a best practice, a best practice that would save lives, and still it was disregarded by most. As far as I know, shipping didn't become much safer until we got a technical solution that made it drastically cheaper to load and unload ships: you don't have to balance on ladders anymore, you solve the problem with technology. Ships like that are much, much easier to operate safely, and as a result we also move much, much more cargo, because it's cheaper and better. So I would argue that the IT business today is about as mature as other businesses, like civil engineering or shipping, were 100 years ago. And my hope is that the General Data Protection Regulation, which comes into force now in May, will be one of many steps towards making IT safe and secure. But we don't know this yet. I'm very enthusiastic about the GDPR, but that is also simply because it isn't there yet: no one has been fined, and we don't know how it will be interpreted. We do know what the law says, however, and I think the law in itself is quite radical.
So, since we don't know how this will actually be interpreted, I can give you absolutely no answers about what the GDPR actually is, but I can define a set of questions to assess your own systems by. For that, let me borrow from one of my biggest idols, Bruce Schneier, an IT security guru, or whatever you want to call him. He is writing a book right now on how we can regulate the Internet of Things, and he argues that there are two paradigms of security. One is the highly regulated security of dangerous things. Dangerous things are things that will kill you if something goes wrong: boats, medication, buildings, electricity. This paradigm works with strict legislation and pre-certification. For example, if you have a medicine that is miraculous and cures cancer and HIV and everything at the same time, you still have to go through a ten-year process before you may bring it to market, whereas we in the IT business can ship anything, all the time. This paradigm of highly regulated security inhibits innovation, but it prevents failure, and therefore it also prevents death. On the other side, we have the agile, patchable security of previously benign things, and that is software and the like. With environmental certifications, for example, we see much the same thing: a voluntary approach, or an approach of shaming, of trying to pressure an industry into doing something. The agile, patchable security paradigm promotes innovation, and it accepts failure. That was okay as long as we didn't have internet-connected cars. That was okay as long as, well, take Tesla: during one of the storms in Florida, they simply sent out an over-the-air patch so that people driving Teslas could go a longer distance on their batteries. That was really nice in that particular case, but they didn't ask for permission in advance. Is this something that we really want? Do we want companies to be able to update our software over the air, just like that? I don't know.
So how come dangerous things are reasonably safe? In the case of the boat in 1918, those tickets must have been expensive, something most people could barely afford. So there must be some kind of equilibrium, where the service is secure enough that people will buy tickets at a price they can afford. Then an accident happens. Most of the time there is no accident, but sometimes there is. And when an accident happens, usually a cause is determined, and hopefully the cause they find is the one that actually caused the accident. Then markets and people push for better security, and we get better technology, or a more expensive product, or some kind of new legislation. Most of the time we get all three at once, and we don't really know for sure which of them is the one that makes things start working, or start getting better. The new technology will also make some things worse, and very often the new legislation solves one problem but causes another. Either way, we end up at a new equilibrium of profit versus caution, and we will probably have something that is a bit safer. We can see that medicine today is probably safer than it was 50 years ago; we don't have a new Contergan (thalidomide) scandal every week. And this feedback loop works because we humans care about dying. We don't want to die. So there is market pressure, because people don't want to die. Politicians know that people don't want to die, so they push for better legislation. Whereas in IT, we still don't really care about the fuck-ups that happen. But hopefully we will, very soon. So let's move on to the GDPR. The most important thing about the GDPR is that European citizens own their own privacy data.
And in order for us to invoke our rights as owners of our data, IT systems need to be built with privacy by design. So what is privacy data, and what is privacy by design? We also need informed and granular consent. And we who collect and process data can no longer say: oh, I didn't know that an unencrypted S3 bucket was unencrypted, I'm sorry. Now you will actually be fined, where before you wouldn't have been. An interesting thing is that you can have two equally bad systems, and only one of them has a breach. That is the one that will probably be fined; the other one just goes: oops, let's patch this, no one noticed, great, we won't be fined. That is something that might happen, but we don't really know yet, because, as we know, none of this has happened yet. When it comes to the GDPR, we often talk about personally identifiable information. That term actually doesn't appear in the European legislation; personally identifiable information is what they talk about in the US, for example, and there it is very clearly defined what is and what isn't personally identifiable. Here, instead, "account should be taken of all the means reasonably likely to be used". What does that mean? Is the data that my air conditioning is collecting personally identifiable information? Let's see what an air conditioner may know about me. It needs three sensors to function: a temperature sensor, so it can regulate up or down, whether to heat or to cool; a carbon dioxide sensor, so the air inside doesn't get too bad; and a humidity sensor, so we don't feel super dry or super humid. And what can we learn from this data? Well, since a person gives off heat, carbon dioxide, and humidity, we can see when people come home, and we can see how many people are in the room or in the house.
We can see where they are. We can see when they shower; there is a very, very interesting peak when people shower. We can see whether they are sleeping. We can see when they are working out on their indoor treadmill. We can also see when they are having sex, how many people are involved, and in what room. So, looking back at that definition, I would argue that all the data your air conditioning collects on you will be regarded as personally identifiable information, and is therefore data covered by the GDPR. So, if you are going to look into what your own IT system is doing at the moment, to see whether or not you are GDPR compliant, you first need to know what privacy data your project handles right now. That turns out to be quite hard for many people, if you have a legacy system, for example. But you can start by doing an inventory of what kind of data your system currently collects on people, what you have in your mailboxes, and so on, depending on what kind of company you are. You have to know your system and what it does. You have to know how it is supposed to work, and whether it actually works that way. Then you also have to apply a hacker mindset and try to break it: are there side channels, does it actually work the way it is supposed to, and only that way? And if you find that this kind of assessment is too hard to do, well, there is a very simple solution: throw everything away and start over. Because if you don't have these things in the bag, it is very, very hard to become compliant; but if you do, it is easy. The next questions to ask yourself are: how do we ensure privacy by design? Who can access the data, and why? And can we limit the amount of data we keep?
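To make the air-conditioner example concrete, the kind of inference it describes can be sketched in a few lines. Everything here is hypothetical: the baseline, the CO2 rise per person, and the readings are invented numbers, and real indoor-air models are far more involved; the point is only how little it takes to turn "harmless" sensor data into occupancy data.

```python
# Toy sketch: inferring occupancy from air-conditioner sensor data.
# All numbers below are made up for illustration.

def estimate_occupancy(co2_ppm, baseline_ppm=420, ppm_per_person=180):
    """Rough head-count estimate from a single CO2 reading (ppm).

    Assumes a fixed baseline for an empty room and a fixed rise per
    person, which is a gross simplification of real indoor-air models.
    """
    excess = max(0, co2_ppm - baseline_ppm)
    return round(excess / ppm_per_person)

# A day's worth of hypothetical hourly readings:
readings = [420, 430, 425, 440, 600, 950, 1150, 800, 620, 450]
occupancy = [estimate_occupancy(r) for r in readings]
print(occupancy)  # [0, 0, 0, 0, 1, 3, 4, 2, 1, 0]
```

Even this crude threshold model shows when people come home, how many are in the house, and when they leave, which is exactly why such sensor data falls under the regulation.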
On the first two questions: I don't know if anyone here has heard about the journalist who was using Google Docs to write an article about wildlife crime. She was suddenly blocked from using her work tool, Google Docs; they said there had been a violation of the terms and conditions. This turned out to be a bug, and Google very soon said: oops, here you go, here's your account back. But it raises a very interesting question: why did Google go through her data in the first place? Of course there is no person doing this; it happens automatically. This is a design paradigm that I would call privacy invasion by design: you intentionally give the company that owns the software the possibility of automatically reading whatever anyone puts into the system. Then there is the municipality of Gothenburg in Sweden, where I come from. They were rolling out Office 365 to tens of thousands of employees. The problem was that they hadn't told the information security people about it, and those people pulled the emergency brake and said: stop, stop, stop. Office 365 is not designed with privacy by design; anyone with high enough privileges within Microsoft can read sensitive data about the citizens of Gothenburg, and that was not acceptable to them. So those are two examples of systems made with that design paradigm. There are obviously many nice things about keeping a lot of data: you can do a lot of quantitative analysis. But what can also happen is that if an American company has the possibility of reading the data, that company can be subpoenaed to hand the data over, and that will break other laws, like Swedish secrecy laws protecting citizens. And if you use this design paradigm, you also have a lot more data to protect, and to ensure isn't being used in a bad way.
This is the design paradigm we have been living under for a long time: data is the new oil. On the other side, we have systems like the Signal messaging app, which I think many people at this conference are using. I use it every day, basically. Here, data is treated as toxic waste, and almost everything is thrown away. Signal knows essentially two pieces of information about me: when I signed up, and when I sent my last message. They don't know to whom. They don't know the content. They don't keep any kind of metadata about anything. This makes them virtually immune to subpoenas from the American state. It also means they cannot collaborate in fighting crime, which is obviously something you have to think about. But one good thing is that data you don't have can't be compromised; it can't be taken by hackers, and therefore this is GDPR compliant kind of by design. Now, over to the question of informed consent. The end-user license agreements we are used to, the ones where we simply click "I accept", don't really work under the GDPR. I like to think of this as martial arts versus assault. When you are doing martial arts, there is a limited scope in time and space in which you are allowed to beat someone up. When that limited scope is over, you have no right at all to fight that person in any way. In the same way, under the GDPR, we are allowed to give our personally identifiable information to another entity for a limited scope in time and space, and we have the right to revoke that consent, because the data is basically ours. Even if someone is a boxer, they don't have the right to beat someone up in the street. And even if someone is a boxer, they don't have it coming if they get mugged.
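The martial-arts picture of consent, limited in time and scope and revocable at any moment, can be sketched as a data structure. The field names and shape here are my own invention for illustration; the GDPR does not prescribe any schema.

```python
# Hypothetical sketch of granular, revocable consent as a record.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Consent:
    subject_id: str        # whose data this is
    purpose: str           # the specific, informed purpose consented to
    granted_at: datetime
    expires_at: datetime   # limited in time, like the martial-arts bout
    revoked: bool = False

    def revoke(self) -> None:
        # The owner of the data can withdraw consent at any time.
        self.revoked = True

    def is_valid(self, now: datetime = None) -> bool:
        now = now or datetime.now()
        return not self.revoked and self.granted_at <= now < self.expires_at

consent = Consent("user-42", "newsletter", datetime.now(),
                  datetime.now() + timedelta(days=365))
assert consent.is_valid()      # inside the agreed scope
consent.revoke()
assert not consent.is_valid()  # the bout is over
```

The design choice to check validity on every use, rather than once at signup, is what separates this from the click-once "I accept" license agreement.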
Under the GDPR, the data that we do collect under consent, after having asked ourselves what we really need and what we really want, we have to protect. We cannot claim that we didn't know anymore. And how do we do that? Well, it's easy: encrypt all the things. My last question is: will any of this even matter? Will the law be circumvented anyway? I don't know. I think the Cambridge Analytica scandal came at a very, very good time, because now it is top of mind for very many people that privacy data, when used wrongly, can do very real damage. So I hope that, since this is top of mind, people will start to care, in the same way that people care about not dying when they take the boat. They also don't want their privacy data used for nefarious purposes. So I really hope that I am right in thinking that the GDPR is awesome, and that it is going to be part of the solution for making the IT business a little bit more secure and safe. Thank you. Well, thank you. Are there any questions? I think the implications for businesses are something we are watching with interest, as private persons and so on. But what does it mean for open groups? Say you have a topic group, and you have a mailing list, and a website, and so on. Basically, this applies there too. Do I have to shut down these services? And what about third-party services, like a third-party tool that I use? Is there already something that can be said about that? Well, I think it's a good idea to simply do an inventory of what you have, of what your system actually looks like. It's a very simple thing: just start with a checklist. What are we doing? Do we need this thing here? When was the last time we asked the people on this mailing list whether they wanted to be on it?
Maybe we should ask people to opt out, or maybe even require them to opt in again in order to stay on the list, for example. That said, I sincerely hope that open groups are not going to be the first ones to be fined, simply because there is no money there to fine. But we don't know. More questions? You were talking about the feedback loop, because people don't want to die, and how that improved security on ships. What do you think about the difference in the feedback loop regarding security for normal people and for technologists? Normal people think: okay, I'm typing messages into something, I don't care, I just use it. And technologists think differently. Do you think it's a problem that the feedback loop does not work for normal people? I'm not fully sure I understand the question, actually. I'm sorry. The feedback loop is easy to understand with ships and "I don't want to die". But for security and protection of my personal data, there are many customers of services like chat systems who don't care about it. So we have a feedback loop for the customers, what they think and how they react; we have a feedback loop for politics; and one for technical people like us. Three different loops, and they all react differently. How do we bring them together? Well, I think that... well, I don't know. I don't have to know everything, I'm sorry. We are all different; we all react in different ways, obviously. And I am also guilty, as a security person, of the fallacy of thinking that because I consider end-to-end encryption the most important feature of any kind of product, naturally everybody else will think so too. And then all of a sudden it turns out that, oh, usability is the thing that people actually care about.
I think it's important that we security people collaborate more with user experience people, and that we throw in the bin the fallacy that security and usability don't go together. I don't know if that was an answer. There was another question, yes, there. Good evening. Until this moment I didn't know this new data protection regulation was coming, or was planned. I have many questions; we could talk here for another two hours. But one small question, if an answer is possible: how long has the EU been preparing this new regulation? How many years? Well, the old legislation, the old data protection directive, is from 1995, I think, something like that, and a lot has happened since then. The GDPR itself has been law in Europe for the past two years, but no one has been fined under it yet. The deadline is in a few weeks. The deadline is in a few weeks indeed. I have a lot of questions. I'd love to speak to you afterwards and answer any other questions. So, more questions. Good evening. I'm an old IT guy, and I'm watching the hype in the IT area around storing data in the cloud. Big companies are offering us all kinds of features to store our data in the cloud; you mentioned Office 365 in your presentation. What do you think about it? I think in the US they have no problem with this, but I see big problems here in Germany and in Europe for this business. Well, the privacy law that preceded the GDPR stated that the one who collects the data is the one responsible for it being stored securely. And after that legislation was passed, we migrated everything into the cloud, meaning that the ones who had the legal obligation to secure the data were the ones oblivious to how it was secured, because they had handed that control over to the hosting providers.
In the new GDPR, this responsibility is shared between the one who collects the data and the one who stores it. Was that some kind of answer to your question? Yes, but it's not only about the GDPR specifically; it's the overall view of storing data in the cloud. If you store your data with a cloud provider, and that provider is located in the US, I think the US government has access to your data. Yes. Any kind of system that is not designed with privacy by design in mind, or that doesn't allow for privacy by design to happen, is in my view a very big problem. And we don't know yet whether these systems, the ones designed with privacy invasion by design in mind, are even going to be legal. I think they will be, because these companies are rather big and have a lot of lobbying money. But I also think we are such an innovative community, and because we don't have to work with pre-certifications, we can just try things and see if they work; we are using the patchable, agile paradigm of security. We can just try something and see what happens, and many of these systems will actually be very, very good. So even though things like Google Docs and Office 365 will still be there, and will still be very attractive to use, because they might be more usable, for example, there is still the possibility, for anyone who does care about privacy, to create systems where the hosting provider knows nothing about your data. There are lots of these systems; we implement them every day, and they are also in the cloud. They are also on someone else's computer, but that someone else doesn't have access. First, thank you for the talk. I really enjoyed it. Thank you. I especially liked the feedback loop and the example with the ship at the beginning. The GDPR is a regulation, so it's a law.
So there are basically two ways for companies to tackle that. One is IT: technology, encryption, whatever, hosting your own data center. The other one is contracts. If I have a contract with my cloud provider, and the cloud provider agrees to certain terms, it's perfectly fine for me to store customer data in the cloud, as long as the cloud provider is not allowed to access the data. And if the cloud provider breaks the contract, I cannot be held responsible for that. So the question is: of the two venues, IT versus contracts, do you have any hunch which way it will go? Will only the contracts be different while the IT stays the same? Or will real money go towards good privacy in IT? So you're saying: we have systems that are not designed with privacy by design, and these systems come with a contract saying "we will not look at your data". There is nothing technically prohibiting them from doing it; we just have their word. They just promise: we're not going to do this, we promise. What do I think? Since I'm a security geek, I of course think that everybody will realize that end-to-end encryption is the only way to go, and that we will stop creating systems that give people the possibility of unauthorized access, so that we don't have to trust people's word, so that we can assume breach, and so on. Of course that's what I think, and of course that's what I'm hoping, because I'm an optimist. Just one last word: I think our job as IT specialists is to show companies that the cheaper way is to make it really secure. Because in the case of, you said, Signal: if they get a data breach, they couldn't care less. But if other companies get a data breach, contracts don't help, and then it can get really expensive. I think Signal would care if they had a data breach. But they think in advance, and they also have processes in place to mitigate when there is a potential...
That's true, but GDPR-wise they wouldn't have to care. Actually, I wanted to come back to your question concerning the cloud topic. I would argue that it's actually a lot easier to comply with the GDPR if you rely on a cloud system. Imagine you want to run, let's say, a multinational company, and you have customer data in Russia, China, Germany, wherever. Then it's a lot easier to have a cloud provider with separated data sets than to build and run your own on-premise systems in all of those countries. So would you agree that the cloud helps, or do you see it as a threat to the GDPR? I always use the cloud. I think the cloud is here to stay, basically, and it makes a lot of sense to let specialists do what they are specialists in. From my experience, most programmers aren't very interested in doing systems administration, and why should people be generalists and do things they find boring when they could do things they find fun? So I think the cloud lets very many people focus on what they like: cloud providers can nerd out about their thing, and product people can nerd out about theirs. I'm pro-cloud, with privacy by design in mind. No more questions? I have more candy here. But you don't get two candies. Isn't it "cloud by design"? Let me formulate my question. When we talk about how Google used the data: isn't it by design that these cloud services are used as the next stepping stone for extracting data about people? Somebody at Google had a great idea: we make our services, we make smartphones, we make Gmail, et cetera, and then we say we need fuel, diesel or petrol, so we use these cloud services, and from these cloud services we use the data for analysis. So are you saying... Yes, analysis by design. Are you saying that there is nothing that technically prohibits a service like Google Docs from doing industrial espionage?
Exactly. Yes. Now, I am definitely not saying that Google is doing that. I am saying that I have heard of companies that decided against using this kind of system, because nothing technically prohibits it from being used for industrial espionage. Yes. Will the GDPR make data more valuable? Well, take this paradigm of data is the new oil: collect everything just in case, what could possibly go wrong if we collect all these things and sell them to a third party, whatever. Anecdotally, I have heard a lot about how hoarding data just in case you might need it will instead inhibit you from doing the interesting machine learning that you actually wanted to do with it in the first place. So if you actually go through a privacy impact assessment first, and ask what you really want to know and how you can really find that out, you may discover that the quantitative assessment wasn't what you wanted in the first place; maybe you want to do more interviews. Then you end up doing a better analysis of the people who use your product, while also not invading their privacy. And if you actually look at what your research question is and collect only that kind of data, you won't hoard data anymore; you will collect the data that you need, and you will actually build the minimum viable product that you wanted to build in the first place. So: maybe?
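Several answers above circle around the same architecture: the hosting provider stores only ciphertext, and the key never leaves the client. As a toy illustration of that idea, the sketch below XORs data with a hash-derived keystream. This construction is not real cryptography and must not be used in production; a real system would use a vetted library such as the `cryptography` package's Fernet. The point is only the shape of the design: the "cloud" side never holds anything it can read.

```python
# Toy illustration of "the provider only ever sees ciphertext".
# NOT real cryptography: in production use a vetted library
# (e.g. the `cryptography` package's Fernet).
import hashlib

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random byte stream from key + nonce + counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; applying it twice restores the plaintext.
    stream = _keystream(key, nonce, len(plaintext))
    return bytes(p ^ s for p, s in zip(plaintext, stream))

decrypt = encrypt  # XOR is its own inverse

# The key and nonce stay on the client; the "cloud" stores only the blob.
key, nonce = b"client-side-secret", b"unique-nonce-01"
blob = encrypt(key, nonce, b"citizen record: very sensitive")
assert blob != b"citizen record: very sensitive"
assert decrypt(key, nonce, blob) == b"citizen record: very sensitive"
```

With this shape, a subpoena or a breach at the provider yields only opaque blobs, which is exactly the "assume breach" posture discussed in the Q&A.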