Okay, welcome everybody. I'm very happy to see you all back. This is the hardest slot of the conference, I would say: right after lunch, on the last day, on a Friday, in Santa Barbara. So I'm really happy that everybody could make it back for the panel discussion. My name is Axel Poschmann, and I'm going to host this session for the next 60 minutes. When I'm not moderating panels, I'm leading the security concepts team at NXP.

When I discussed with Benedikt, my co-chair, the necessity of a panel versus a second invited talk, we very quickly concluded that we would go for a panel discussion. The reason is that we believe the changes in the next ten years, and the challenges ahead, are so profound and so diverse that a single person would have a hard time addressing them all. That's why we assembled a panel of four very different people with different views, so that we get different perspectives on all the challenges that are ahead.

One example of what we believe is a very important paradigm shift is offline versus online, or isolated versus complex systems. For example, if you look at this payment device: it was engineered for this purpose, standalone, offline, and secure, I would say. Then you look at a modern online payment device that has so many more features and a huge additional attack surface, but is also very usable. So there are opportunities and threats to this paradigm shift. The threat is that the attack surface is much, much bigger, while at the same time the opportunity is that you can also mitigate the risk with online updates, anomaly detection, and these kinds of things.

And finally: if data is really the new oil, then privacy is the new green. I think that's a very good quote, because if we collect all this data and make use of it to enable new use cases, then there's a risk and a threat to privacy.
The same is true if we want to identify fraud: then we need data, and that could also potentially violate privacy.

Now I would like to introduce our panelists, in alphabetical order. First on the panel is Gemma Galdon Clavell. She's a director and founding partner at Eticas Research and Consulting. She's also a researcher at the University of Barcelona in Spain, in the sociology department. She used to work in different places; one I'd like to highlight is the United Nations Institute for Training and Research. She holds a PhD on surveillance, security, and urban policing, and in her role as a policy analyst at Eticas she works on the social, legal, and ethical impacts of surveillance technology.

Should I go for my few minutes? Okay. So, I'm the social scientist in the room, and I hope that more people like me are going to start coming to these conferences. But just for the sake of waking you up a little bit: I think that I'm here, and I think that people like me from the social sciences are becoming more and more relevant to what you do, because you screwed up. We left you to your own devices: engineers, mathematicians working on technology, IT experts. We trusted you with our digital futures, with the future of technology, and you promised us a great new world where technology was going to make the world a better place. And what do we have? A world riddled by security breaches, but also by algorithmic discrimination that sometimes worsens some of the divides we were already concerned about. We have a world where privacy is eroded. We have a world where some of the devices being produced do help us lead better, happier lives, but we also see a lot of junk: a lot of devices that solve no real, actually existing problems but create new problems. A world of data breaches, as I mentioned. So there are things about the promise that don't really stand, and you here are very aware of that, because you work on those security issues.

So there's one part of this dark side of technology that you are very familiar with: you realize that some of the promises of these technologies don't hold because security is an issue. But while focusing on security, sometimes you forget issues like privacy and data protection, or some of the ethical debates that we need to have around whether a technology is even desirable. Should we build some of the things that we are building? What futures are we creating for our sons and daughters? Do we want to be part of that? How do you build your own values into the technology that you are contributing to construct? These are the things you should be thinking about, and at the same time we should be understanding better how to build our concerns into what you do. So we need to talk more, and that's why I'm here, and that's why I hope you get more people to talk to over the years in your different environments.

So if you ask me what the challenges for the next 10 years are, I think there's one that stands out, which is the need to create socio-technical architectures for technology. We need to make sure that whatever we build has at its heart not only the latest in terms of technological capabilities, but also the latest in terms of our ability to understand the world around us. We need technologies that understand the needs and the rights of the users, technologies that listen to users; and not only because we want to do good, but also because not listening to users can lead to reputational and financial losses. If that's what it's going to take to convince you that you need to look at this, then pay attention to the privacy disasters that keep happening: every time there's a security disaster, a data breach, it goes hand in hand with a privacy breach as well.

We need to build technology that is accountable; technology needs to be able to be audited. We need to open the black box of algorithms: if we want to develop technology for the people, the people need to know what's inside that technology. We need to predict the world that may emerge because of the technologies that we are building. If we are to substitute many jobs with new technology, then what's the future of work? What will our grandchildren work on? Have you ever thought about that? What is the future of sociability, where distance disappears when making friends?
What is the future of autonomy, or the future of growing up in a world that does not allow you to forget? In a world where memory is compulsory, because every single thing you've done in your life has been tweeted, recorded, or stored by someone, somewhere. What will growing up mean in this new world that we're all contributing to create? So ultimately, I think we need social analysis to find its way into the technical specifications that you are designing, but you also need to find more allies in people like us, to reach community organizations, policy agents, governments, but also corporations, to make sure that we address security risks while adding this other layer, which is not just about security; it's also about how technology contributes to building new worlds. I'm going to leave it here, because I only had three minutes, and I hope that's enough to get you interested in what the panel could bring about.

Thanks, Gemma. What I'm going to do is introduce each panelist individually; they give their opening statements, then afterwards we discuss a bit, and then we open the floor for questions. Next on the list is Alex Gantman. He's a vice president of engineering at Qualcomm. He holds a master's degree in cryptography and network security from UC San Diego, and in his current role he oversees product security support for mobile, computing, networking, automotive, healthcare, smart home, wearables, and IoT.

Thanks, Axel. That was quite an act to follow. I think my statement will be a lot more prosaic, but still probably quite a bit depressing, because I'm assuming none of you came to a security conference for your dose of optimism. When Axel invited me to the panel and I thought about the next 10 years, I first thought about the previous 10 years. How many of you were at CHES in 2006? All right. Do you remember any papers from CHES 2006? All right, there were no hands, I think. So, 2006, just to set the stage:
This is more than a year before the first iPhone was announced, and two years before the first Android phone came out. That's how far back that is; that's how much changes in 10 years. So I'm a little bit skeptical about our ability to accurately predict what's going to come in the next 10 years.

But also: how many of you were already working in the field 10 years ago? All right. So, the problems you were working on 10 years ago: how many of you have solved them? The solutions have been deployed, the world sort of got it, and you moved on to the next challenge? Paul is half raising his hand. Okay. How many of you have realized it was an idea before its time, the world wasn't ready, and you decided to move on to something else? Okay, one or two hands. And how many of you are still working on more or less the same challenges? Okay, a lot more hands. All right. So that gives you some indication as to what you can look forward to for the next 10 years.

I think part of the reason that is the case is that for a lot of the challenges we're attempting to solve, the real roots are not in the technical essence of the problem, and the solution is not going to be purely technical. As a community, we like to come up with new and more and more technical approaches, but ultimately a lot of times it's not the technical solution that really solves it, and we have to be cognizant of the social factors and the business factors that impact the problem, and understand how the technical solution fits within that context.

I think that's... oh, I guess, are you not depressed enough? You want one more? Okay, one more, since I'm looking at Paul. Paul Kocher's original differential power analysis research is now 20 years old, right? 1996,
I think, was the first paper. You know, "Smashing the Stack for Fun and Profit" was also 1996, so it has also just passed its 20-year anniversary. So 20 years ago we discovered side-channel attacks and memory corruption through buffer overflows, and we're basically not really much closer to addressing either of those. I mean, solutions exist, but not as far as actual elimination of the problem in deployed products.

Anyway, okay. Thanks for the uplifting words. Next on the list of panelists is Daniele Perito. He's a senior software engineer at Square in San Francisco, where he leads the security and anti-fraud team on Square Cash. He was a postdoc at UC Berkeley, where he researched, among other things, machine learning and security, and he holds a PhD from INRIA Rhône-Alpes in France on software-based attestation of embedded devices.

All right. When I was invited to this panel, I thought it was going to be very focused on hardware security, and I haven't been touching that subject for a long time, so I felt I was going to be a fish out of water. But Axel assured me that different perspectives were welcome. In the past few years I have been working on product security: how do you design a product that is secure? How do you make sure that people cannot get in, and how do you make sure that there is no fraud in the system? Square is a financial company, so these things are very important to us.
So I think I have a perspective from an adjacent security field that can perhaps be useful to this panel. Just as a few starting points that I think are worth discussing about what's going to be important in the next 10 years, there are a few high-level themes that I see.

First of all, the amount of code that is getting online is staggering. Every year there is more code, written by more people, getting connected to the internet. Everybody knows, and has internalized, that defense and attack are asymmetrical: you only need to attack one point, but you have to defend every point. But with the number of people writing code, and the amount of code being written, there is another point that is very important: by definition, the engineers writing code are average. An attacker only needs to target the least common denominator of this vast population of people writing code, and the more code that is written, and the more people writing it, the lower you can expect that least common denominator to be; that's how these things work. On the other hand, attackers are very talented and very knowledgeable in what they do. So, in a sense, the top 1% of attackers are attacking the bottom 10% of software engineers, and this only gets worse as more code gets committed. So the type of solution I would encourage, and would like to discuss, is safety-by-default frameworks, languages, and libraries. They are the way to go: it needs to be as simple as possible to make things secure.

And there is one more thing I want to bring up. Historically, I think the biggest motivators for attackers have been either intelligence or financial gain. With some of the internet-of-things and car attacks, we've seen more attacks that are potentially motivated by mayhem: what will happen if somebody hijacks a car, steers it, and kills somebody? I fundamentally think that these attacks are less likely, because financial motivation is a much, much more enticing incentive that people pursue, and intelligence is also something that people pursue, because nation states are interested in it. Mayhem attacks will happen for sure, but they are just less likely. I think what that means for us is that attack research has a much higher likelihood of finding vulnerabilities before they're actually exploited in the wild: if attacks that pursue mayhem are less likely, it's much more likely that a white-hat researcher or an academic researcher can find the vulnerability first and make the world safer. So attack research, I think, is going to be very important as well.

Okay, thanks. The fourth panelist is David Uze. He's the president of Trillium, an automotive startup company based in Japan. He has more than 25 years of experience in the high-tech arena in senior executive positions; previous stints include Freescale, AMD, and Dell, amongst many others, and before that he worked in management consulting at Deloitte & Touche. He believes the future of cybersecurity for embedded devices is software-based.

Hi. So yeah, I'm here from Japan. I'm with a company called Trillium, and we entered the cybersecurity market with a solution for encrypting and authenticating on classic CAN. So most of what I'll talk about concerns automotive and transportation systems. What are the security challenges that we're going to see over the next 10 years?
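Trillium's actual scheme for classic CAN isn't described in this transcript. Purely as a hypothetical illustration of what "authenticating on classic CAN" involves, here is a sketch of one common pattern: a classic CAN frame carries at most 8 data bytes, so some payload bytes are given up for a truncated MAC computed over the CAN ID, a freshness counter, and the data. The function names, the 4+4 byte split, and the counter format are all assumptions for this sketch; real designs (e.g. AUTOSAR SecOC) differ in framing and key management.

```python
import hmac
import hashlib
import struct

def make_frame(key: bytes, can_id: int, counter: int, data: bytes) -> bytes:
    """Pack 4 data bytes plus a 4-byte truncated MAC into an 8-byte CAN payload."""
    assert len(data) == 4  # 4 bytes payload + 4 bytes MAC = 8-byte classic CAN frame
    msg = struct.pack(">IQ", can_id, counter) + data  # bind ID and counter to the data
    tag = hmac.new(key, msg, hashlib.sha256).digest()[:4]  # truncated MAC
    return data + tag

def verify_frame(key: bytes, can_id: int, counter: int, frame: bytes) -> bool:
    """Recompute the MAC with the receiver's view of the counter; reject on mismatch."""
    data, tag = frame[:4], frame[4:]
    msg = struct.pack(">IQ", can_id, counter) + data
    expected = hmac.new(key, msg, hashlib.sha256).digest()[:4]
    return hmac.compare_digest(tag, expected)
```

Because the counter is part of the MAC input, a replayed frame fails verification once the receiver's counter has moved on; keeping the counters in sync (and distributing the keys) is exactly the key-management problem raised below.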
First and foremost: who's going to pay for the cybersecurity? Automotive makers don't want to pay for it, tier-one suppliers don't want to pay for it, governments aren't going to pay for it. Cybersecurity is not free, so who's going to pay? Who's going to own the responsibility in the case of a breach that turns fatal? How much regulation is going to be required in order to get security deployed in vehicles? Then there's key management: the who, what, where, when, why, and how much of key management, and how that's going to be managed by the industry. And then the trade-offs between hardware- and software-based cybersecurity, and the fact that a consumer vehicle has a 20-year life expectancy and a commercial vehicle has a 35-year life expectancy. If you're plugging in chips to solve the problems, you're plugging in problems that can be reverse-engineered and brought to bear on those systems over time. So I think those are the major challenges that we see, at least in the transportation industry, for cybersecurity going forward. Thank you.

Okay, thanks. Now I'd like to encourage the panelists: I guess you took notes and have comments on the other statements. Maybe we can pick up on one of your statements, who is going to pay for cybersecurity, and I would be interested in the policy analyst's point of view. Where should the incentive be put? Do we need regulation, law? What is efficient? How do we get there? What are the metrics to measure the impact, these kinds of things?
Maybe you can start, and then you guys can jump in. Sure. There's a lot to say about this, and let me maybe start with something a bit more general. There are a lot of people saying that one of the problems we have is that technology moves a lot faster than the law, and so we don't have the necessary legal tools to address the challenges that technology is posing. That's a lie. We have all the laws that we need; what we lack, oftentimes, is the political decision and drive to use them and mobilize them. We also lack judges and lawyers who understand the nature of the laws, and not just the letter of the law. Sure, we will not find a law that addresses smart cars, for instance, but we do find laws that talk about the rights that need to be preserved when you're on the road, laws that define privacy, laws that define lots of little things that intervene here. So we just need to mobilize existing frameworks. We do have those frameworks; that's something you need to be very, very clear about, and then on that basis we can provide answers. But I would throw the question back at you and ask: what is the problem?
What are we trying to solve with automated cars? If you give us the problem, we can start looking at whether the solution is adequate and how we need to regulate around it. But we haven't found anyone in the industry who is able to identify the problem, and identifying the problem will determine what solutions we get. For smart cars, for automated cars: if the problem is that we want to avoid casualties on the road, and we are developing these cars because we want fewer people to die on the road, that would be one problem to solve. Then the ultimate goal of these new cars would be to save lives, and we would regulate around them on that basis: the more lives you save, the more you contribute to the common good and to the goal that you have, and regulation has to follow this. If the problem is pollution, if we want to reduce pollution, then we would have to promote collective vehicles, for instance, not so much individual vehicles, and more efficient routes, and on that basis we can provide the regulation. If the issue here is how the car industry can make more money, if that is the problem, then the solution is different: you need to make sure that you encourage everyone to change their car and buy this new technological discovery, this new tool, this new thing, and then we build regulations around that. But oftentimes the problem for the regulator is to understand what problem you are trying to solve, and the industry is often not very clear about this, and the reasoning around technological innovations seems to change depending on the audience you're talking to. They want to convince everyone that this is desirable from a social point of view, and so they mobilize any reasoning that will fly with a specific audience. But that doesn't help us have answers. So we need specific problems, specific challenges, before we can actually respond; but we do have the tools.

Can I jump in? So that seems like a very, maybe appropriately, but very policy-centric view of things. I think the problem that the government would want to solve with regulation may have nothing to do with the problem that the business is trying to solve by releasing a product, right? So I'm not sure it makes sense to base regulation on what the business was trying to do. I mean, I think you hit it on the head with your last hypothesis: the business most likely is just trying to generate revenue, right? Satisfy a market need, whatever it is. And if it can do that by decreasing pollution or saving lives, that's great, but that's not what should be driving government policy. Government should have its own position on what's important, and if the business's goals or direction are at odds with that, that's when regulation comes in. It shouldn't be the other way around.

What do you mean by the other way around? It's that the burden of proof is on the industry, in a way. We have a legal framework, and if you want to create a new tool that you feel stretches the limits of that legal framework, then you have to determine how to make your tool legal, too. But that's a big "if"; it wasn't apparent to me in the beginning that we were talking about something that is currently considered not allowed within the legal framework. Okay, but I didn't realize that's the kind of case we're discussing.

I would challenge the idea that regulation is sufficient, is already there, and that the frameworks are already in place, because I think fundamentally security is a new thing. Let me explain. Say that there are regulations around building safe bridges, and those regulations are quite precise. They're precise because you can precisely define what a safe bridge is in terms of engineering, in terms of forces and things like that. There isn't such a definition for security.
That's a very big thing: we only have precise definitions for very narrow things, for very narrow protocols and cryptography. But when a system is very complex, you just don't know what you're actually defining; you cannot mandate "you shall make your system secure," because what does that mean? So for things like vehicle-to-vehicle communication, and things where different companies need to participate in a network, I honestly think that industry self-regulation may be more effective. The reason is that the industry wants to create a platform that it's safe to be in, and that's not unlike what card processing, for example, is, where there are self-regulating security standards. Security is a very hard thing to measure, as I said, or even to certify against, but I feel like an independent body can keep up with policies, whereas the government would have to keep updating the laws, or set up an independent agency, perhaps. So I can see both models work; I just feel one model has a higher likelihood of working.

I challenge you to convince Taylor Swift, who had her naked pictures stolen from her cloud, that self-regulation is enough. I think that everyone who's been the victim of a security or privacy breach realizes that self-regulation hasn't worked, and that therefore there's maybe a need for some help. You're making the big assumption that regulation would have helped. Yes, it would have. It would have forced you to explain the things that are happening in terms of security in the cloud, for instance. Right now, users don't even have access to the information on what's actually happening with their data, so they cannot make an informed decision, and that is a problem.
I just think the problem is of such complexity that it's hard to even state in language what it is we're talking about. I feel like there is that impedance mismatch, and therefore it's hard to imagine an amount of regulation that would have prevented somebody's phone from leaking photos; I just don't see it. With a technical background, maybe you're right that the industry is not doing a good enough job of educating people about the dangers of what they're doing and the limits of the security of the systems they use. But at the same time, we have a harder job than, say, builders of bridges, because we cannot actually define it. It's like a Turing-complete problem; it's a problem with no general solution, really.

To add to that: I've been working on the defensive side of security for 15 years now, and I've never, ever heard anybody come to me and say, "you know what, we need to add this additional security mechanism, because without it we don't meet the regulatory or compliance or certification standards." I've heard the opposite a lot: "you don't need to do any more; the vulnerability that you're pointing out is clearly not a real vulnerability, because we already meet the bar set by the certification or by this regulation." So it's more often used as a ceiling: you don't need to be more secure than this, because you've already met certification.

But I guess the other point is, I think there is something unprecedented here in terms of the legal framework and security, and that is: I can't think of another example where we expect consumer goods to withstand a sustained malicious attack from a well-resourced adversary. There's nothing in my house that can survive a five-year-old with scissors, right?
Yet we expect our software to be able to withstand attacks from nation states, and there's nothing else like that that we own. My car is not baseball-bat proof, right?

But don't get me wrong, I don't think that regulation has to solve all our problems. I run a business, and I think that we have a stake in solving some of these problems, and there are lots of other business models that can contribute to this debate. One way of changing the game, maybe, would be to have an insurance company for data, so that every time there's a data breach, it creates liability. Maybe an insurance company for data could change the rules of the game, and then you don't need regulation. I'm just hoping that these kinds of spaces emerge and force change, because what we have right now is very difficult to defend, in a way. You were saying that security often acts as this thing that you just need to comply with, tick a box, and that's it; you don't care about it anymore. We find the same in data protection: some people only care about compliance, just making sure they tick all the right boxes; they don't really care about ethics or anything that goes beyond a strict understanding of data protection, and that is a problem. But we're seeing more and more how, because people's data is so much abused, systems that are fully legal are being rejected by the users, and the users are the clients. So we get more and more industry calling us, saying: can you please help me improve the privacy of my system, because otherwise I run the risk of a reputational crisis or financial losses, because I'm going to have to address these issues once the product is completed. We're still working with the early players in this field, but it's more and more of a concern, and certainly those who are doing things right are worried about the impact on users and clients of the actors that are not dealing with data in a secure and privacy-protecting way.
I think it's important to understand that there are different things at play here. IoT and automotive are very different from the financial world. There, the incentives for an industry-wide type of solution exist, because the business is money. In IoT, the device might be a five-dollar device or a two-dollar device, but it might have video capture of your child sleeping in its crib, or your car being hacked has life-threatening consequences. So it's different. It's not really about who's going to pay; in the end, the consumer is going to pay for cybersecurity. Cybersecurity is not optional for IoT devices or automotive going forward. But we didn't get seat belts until the government said they had to be put in cars; we didn't get airbags until the government said they had to be put in cars; and cybersecurity is going to be largely the same. The government is going to have to put some type of regulatory structure around cybersecurity, and there has already been a bill raised in the Senate, in July 2015 I think, called the SPY Car Act. That's the start of a certain level of regulation that will be hitting cars, with a $5,000-per-car fine for not meeting the requirements by a certain start date to be determined. These are really important things to think about. And we need cyber systems in place that allow us to update our solutions that are in the field. I mean, people buy their watches, their Fitbits,
I mean my people buy their watches their fit bits And they wear them for three months and they throw them away So the cyber security wasn't a big issue there on a phone your phone has got a two-year life Expectancy usually I think is the average people use it for two years So hardware encryption and hardware cyber security can play there But in a 20-year life cycle or a 35-year life cycle of a transportation equipment that has life-threatening consequences It's not a matter of who pays it's when and how fast we can get it in the in into the transportation equipment I'm elevators and drones and planes and trains and and trucks and factory equipment and robots You know ubiquitous encryption is going to be a part of the future and messaging that has anything to do with Actuation is going to have to have both authentication and encryption built into the solution from the ground up I think every time there is a network. There is a Higher incentive for self regulation because you know say vehicle to vehicle The vehicle in front of me selling my car that there is an accident in front that I can still not see and Let you know what happens if the lie to me and you know if the lie again crash So in those circumstances different parties need to trust each other So a Ford car needs to trust needs to trust that the Chrysler car And I think this is a natural breeding ground for them to understand that they need to like Certify each other before they can speak to each other So why I do agree that you know there is potential for government regulation as well like also think that Things that look like that where mutually distrusting parties must cooperate are naturally You know prone to like self regulation in a sense as well But in any in any technology that deals with data one of the things that we find is that in the end your decisions Don't don't only affect your privacy Privacy becomes a collective issue and that's what regulation is needed. 
Every time you download an app on your phone, it's not just your privacy that you're giving up, but also the privacy of everyone in your phone book, for instance, or of all the interactions that you have. Therefore, you're not only making decisions about yourself. So we do need another body that bears in mind the needs of everyone involved in the transaction, because they might not be aware of what is happening with that kind of data. Yeah, I think what I was referring to is mostly applicable to security. I don't know that privacy follows the same dynamics, because privacy is sort of a shared good, like the quality of the air you breathe. But I think every time you say security, most of the time you could say privacy; they really go hand in hand when we work on data-intensive technology. That's not all technologies out there, but for the technologies that deal with data, any security issue is also a privacy issue.

Okay, I think this might be a good breakpoint to open up the floor for questions for the first time. I have a couple of questions that we could discuss later in case none come up, but if you have questions, please make your way to the microphone.

Hey, so I wanted to ask: you're just now making this distinction between security and privacy, and it feels to me like maybe the distinction here is that security, particularly in a financial system, is about the things that the organization making the security decisions cares about. That's a kind of convoluted way of saying it: if I'm a bank, I care a lot about whether or not money gets stolen. Also, if I'm a car manufacturer, I definitely care if there's some sort of security or safety problem with my cars that kills customers and ruins my reputation.
It's maybe a little more removed when you talk about privacy. About 99% of the time you're talking about my company, assuming I have a company, dealing with other people's data, and my incentives usually go exactly the opposite way to theirs. It's easier, and I make more money, if I can sell their data. At most I might say, well, I'm not going to sell it, but I don't really care that much if their privacy is violated. There are probably other places like it, but it seems like there's a real split there. If I have an incentive to protect something, then it's pretty likely that I care about it; maybe I still don't know how to do it, and maybe there's some other role for regulation or standards groups there. But in the other environment, where I'm protecting something that I don't really care whether it gets violated, that's a place where I think the case for either regulation or some sort of industry standards is a lot stronger. It's just a distinction; maybe it's useful.

But that's why we have a problem with incentives, and that's where regulation can play a role. When you find that the incentives that self-regulation creates affect people's rights, then you need regulation. So if you find that the incentive most companies have is to sell the personal data of their clients, because there's no protection, then you need to step in, because something's gone wrong in self-regulation. So we need to look at exactly this chain of incentives and decide as a society: what do we want to see happen, and at which moments can we intervene?
I can intervene through my company, a government can intervene through regulation, somebody else could intervene through a different mechanism, like the idea I mentioned of insurance for data. There are different ways you can intervene, but we definitely need to make sure that the incentive structure feeds societal goals, that we are contributing to something that is desirable and within the limits of the law.

I guess the flip side of that: in the crypto community we have been worried that this is going to come back as a big struggle, like before. In the US, the FBI director has talked about this desire to put backdoors in encryption, some sort of special access for the government. This seems like the bad side of the same trade-off. When you say, okay, we want to control some of this technology to make sure it matches social goals, there's a downside, which is that the people making the decisions about how to control it may not have your interests at heart either. You're probably actually less at risk with the NSA having all your personal data than with ad companies, but you might not want to give it to either one.

Definitely, though I think that depends on what color your passport is. For encryption, I think we're fighting the same fight, and that's a good thing. That's why I mentioned the need to build bridges and to find new allies, because we can really benefit from having you on board, and I think you can really benefit from having us on board, because we are really fighting the same fight. Encryption should be on by default.

Yeah, just to add one more point to the regulation comment: as David mentioned, we have safety regulation with seatbelts and airbags, but in those cases the outcome was clearly measurable, right?
The regulatory agencies could measure the impact on fatality rates and show the benefit of the regulation in objective, indisputable terms. And because regulation in many cases ends up being kind of a nuclear option, written by lawyers, negotiated by lawyers, enforced by lawyers, it becomes fuzzy; it may not always do what you want it to do. So unless you have that kind of feedback loop, where you can measure the outcome and see that it's actually getting you the benefit you intended, it's very dangerous to unleash it without that closed loop.

Thank you. There's obviously a huge number of topics raised here. I'm interested in the question of regulatory failures in particular. In the United States there's one class of company which is not allowed to sell its personal data, companies like Netflix; there's a history behind this particular regulation on video rentals, but all the other companies don't have these statutory protections. Likewise with the financial industry: we've seen how a financial industry standards committee has been used to put backdoors into crypto systems that luckily weren't widely deployed, but it did in fact happen; the X9 committee was instrumental in bringing Dual EC into the standards. And with regulation I think there's a third sort of question. We've been talking about regulation in the sense that there's an agency that makes rules and you have to comply with the rules. Well, there's another kind of regulation, particularly in the United States, and that's tort, where the car company makes the car, the car is defective and crashes, and you sue them. That's different from what the National Highway Traffic Safety Administration does. So, would you like to comment on any of the things
I've just listed? To give other people time to think, I can pose a hypothetical problem about cars; some of you may know this one. So, self-driving cars: let's say they deliver on the safety promise. There's a huge attack surface, obviously; it's connected, lots of software, lots of computers. But as I said, let's say they deliver on the safety promise. I think the number of driving fatalities in the US right now is around 30,000 a year, so let's say it goes down to 10,000. That's 20,000 lives a year saved. But then somebody hacks a car remotely, and four people, or even a hundred people, die from those hacks. Is that a horrible disaster, and we should never have allowed those cars on the road? Or is that a huge win, and we should have released them even sooner?

That's exactly why I prefer tort over positive regulation: a regulator has an incentive to prevent those hack deaths at any cost, whereas under tort law liability is spread across the 30,000 accidents, some of which are caused by human error. Auto insurance companies will say, hey, if you drive a self-driving car we're lowering your rates, and manufacturers can charge more for that. So I think getting the economics right matters, and tort is decent at that.
It's not perfect, but it's probably more important than positive regulation, where I think there are incentives to react politically.

Yeah, I think there is a spectrum in making software. On one end of the spectrum, software is written fast but not rigorously proven to perform the task it was supposed to, and on the other end there is the way that NASA writes the software that controls the shuttle. Those are probably the two reasonable ends of the spectrum, and the industry at large, the technology and services industry, has decided that it would rather write software fast than take years to make small changes. I think it's an interesting balance, because who knows, maybe if you think on a very long horizon, in a hundred years' time, you would have been better off trying to write things very slowly and very carefully. But the reality is also that if you write things faster, you can experiment more; you can see what works and what doesn't. We probably wouldn't have an iPhone if we had written every piece of software since the 50s with the amount of care that NASA takes with the shuttle software.

So I think there is a fundamental technological direction here, and perhaps we start seeing it in research, which is provable frameworks: you don't have to spend years and years making sure that the software is correct. You code it, there are checks that can be done statically or dynamically, and you get a higher level of assurance.

So, bringing the discussion back a little bit to the technical side, I think the industry has clearly been doing a lot of work pushing on the safety of the things we use to write software.
C++ has become a much safer language than it used to be. The libraries that we use have safe defaults, built so that even people who don't understand the cryptography, or the reasoning behind it, can use them safely, because they are both the easiest thing to use and the safest thing to use. So from a technological point of view there isn't really a dichotomy: there is a third choice, which is working on technologies that are safer by default and simpler to use for a wide range of engineers, who don't need to understand the deep security implications behind them.

I think we tend to see anything that happened in the past as linear and easy. We were talking about security and safety in bridges or cars before, and we imagine that someone came up with the idea of a car and the engine, and then overnight someone thought, well, actually we might need speed limits and roads and zebra crossings and red lights and seatbelts. But it wasn't like that. We have all these things because there was a long process of social negotiation about where the limits should be and what the regulation should look like. Mistakes were made in that process, but also some things were done that have lasted a long, long time. It's only recently that we got airbags; new things come up and become additions that become the new standard. These are long processes in which lots of people screw up: the industry, the regulator, the startups, and the companies that provide the solutions. It just takes time for us to agree that this is a decent enough solution for a new technology, be it bridges or cars or anything else. We're in the middle of the same debate for many of the technologies that we are developing. But that's why my argument is that we need to have this debate. The decision cannot be made only by the
engineer or the company or the government, because we tend to screw up when we work on our own. If we work together, we can make better diagnoses of what the problems are and what people expect from us, and then our technologies will be a lot more successful because of it. And do not get me wrong: I think that the future of privacy resides in privacy-enhancing technologies, not the government, and I think that the future of security resides in crypto solutions, not the government. But the government can help in understanding and identifying when collective problems and collective rights are being abused by those that don't play by the rules of the game, or by those that minimize the impact of their devices on people's rights.

But this debate has to be rooted in hard data, right? Not just anecdotal references to well-publicized cases, but actual hard data. When we talk about the safety regulations, once again, you're right, those evolved over time and through a long dialogue, but also based on real fatality data, and we don't have much in terms of that in computer security these days.

That's why an insurance company for data would be a good idea, because it puts a price tag on a data breach or a privacy loss.

We would definitely like that. So what is the cost of a privacy breach? How much does my privacy cost? Many people in my community are against this debate because they think we cannot monetize a fundamental right.

Well, listen, that's the EU versus America debate, right? In America you have to show real harm; in the EU it's a fundamental right.

So, there are no further questions from the audience; I'm really surprised. Okay, I really enjoyed the discussion so far, and I think we touched a lot of strategic dimensions and epic problems that we need to solve. Maybe we can try to get a bit more concrete.
For example, the hardware versus software trade-off. I guess that could be interesting for the audience. I liked the point that some devices are only two years in the field, so we can put it in hardware and then throw it out; smart cards are three to five years in the field and then you throw them out. But in the long run you need software. So how do we get software more secure? We have hardware virtualization like TrustZone, yes, but then people come up with all kinds of fancy new attacks: Flip Feng Shui, Rowhammer, cache-timing attacks in the cloud, these kinds of things. And then again, attackers are always ahead of engineers; engineers luckily react rather quickly, compared to politicians. So that's another question: how do you make politicians decide faster, and then train judges to act on the law? But maybe we should focus on the engineering challenges. So do you have any thoughts on what could help in hardware to make software more secure? Do we need that? Is it possible? Can we just do everything in software, white-box crypto and so on, or is there still a place for some hardware anchor?

That's a broad question. I think there are lots of challenges. There are areas where hardware could help, and areas where hardware crypto may not help that much. If we look at the car hacks from Charlie Miller and Chris Valasek over the last few years, I'm not sure how crypto would have helped with any of that, or maybe even hardware roots of trust or anything like that. So a lot of what we need to do on software is just mundane secure development lifecycle stuff that takes time and effort. At the same time, for the most fundamental software problem, the stack-based buffer overflow, you can say that it's a hardware failing: hardware doesn't provide us with an abstraction to call a function and return to the place where it's supposed to return, right?
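The cache-timing attacks mentioned above exploit subtle microarchitectural state, but the root idea behind timing leaks can be shown with a much simpler toy: a comparison that exits at the first mismatching byte reveals, through its running time, how many leading bytes of a guess are correct. This sketch uses illustrative function names, not any real API, and counts comparison steps as a stand-in for elapsed time:

```python
def naive_compare(a: bytes, b: bytes):
    """Early-exit comparison. The step count is what a timing attacker
    observes: it equals the position of the first mismatching byte."""
    if len(a) != len(b):
        return False, 0
    for i, (x, y) in enumerate(zip(a, b)):
        if x != y:
            return False, i + 1
    return True, len(a)

def constant_time_compare(a: bytes, b: bytes):
    """Examine every byte regardless of where a mismatch occurs,
    so the step count leaks nothing about the secret."""
    if len(a) != len(b):
        return False, 0
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y
    return diff == 0, len(a)

secret = b"correct-pin"
_, steps_wrong_first = naive_compare(b"xorrect-pin", secret)  # mismatch at byte 0
_, steps_wrong_last = naive_compare(b"correct-piX", secret)   # mismatch at last byte
assert steps_wrong_first < steps_wrong_last  # the timing difference an attacker exploits
```

This is the same reasoning that motivates constant-time implementations in cryptographic libraries, even though real cache-timing attacks observe memory-access patterns rather than loop lengths.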
I'm going to interpret the question really broadly, to touch a broader software subject. Two points. One: we talked about being able to patch software. I think that will actually remain a big challenge, and the challenge is not going to be getting the bits to the device in a secure way; that's a simple engineering problem, and we know how to do it. The challenge is going to be getting the patch in the first place, right? And that ties back to the business models, because maintaining software and developing patches costs money. So business models need to evolve that provide a continuous revenue stream to the vendors of the product to keep producing the patches, especially in a field like automotive, where products are going to be in the field for 10 or 20 years. That's a lot of engineering time and effort to supply patches.

One more quick point I would like to make: I think reverse engineering is going to get increasingly important. Systems are getting remarkably complex, a lot of times bordering on the complexity of natural biological systems, and the only way you can really study systems this complex is to treat them as a black box: form a hypothesis and test it, which is how science works, how you study the natural world. Thinking about man-made systems this way, we rely more and more on reverse engineering, and I see reverse engineering really becoming a much bigger part of the academic field and of this industry, both hardware and software.
Yes, both hardware and software. Picking up on that, you mentioned the Jeep hack and patching, so I want to tie it to hardware and software and the things I was saying before about secure defaults. Part of the kill chain of getting full control of the CAN bus was an insecure firmware update on one of the chips in the chain, right? That was the bug they used, though they said that if that hadn't been there, there was a range of other ones available to them that didn't require it. So that was part of the kill chain. And why was the manufacturer left to design a secure firmware update protocol? These things should be designed once by the chipmaker and done right, so that you can just plug in a key and everybody can do it.

The problem is there are twelve chip makers in the car.

Sure, and maybe you update each one of them using their own protocol, which gets designed once and done right. But why do you leave it to the car manufacturer?

It wasn't the car manufacturer, right? I don't even remember if it was a tier-one or tier-two supplier.

I guess the point is: can we make sure that these technologies are done right, once?

Yeah, I think we can, but again, to stress: I think that's the easy part of the problem. Solving the secure update is easy. The hard part is getting the updates to actually be continuously pushed, and getting them through the supply chain.
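A minimal sketch of what a "designed once by the chipmaker, done right" update acceptance check could look like, in Python for illustration only. Real secure-update schemes use asymmetric signatures (for example ECDSA) with the verification key anchored in the chip; the symmetric vendor key below is merely a stand-in, and all names are hypothetical:

```python
import hashlib
import hmac

# Stand-in for the chipmaker's key material; a real chip would hold only
# a public verification key, burned in at manufacturing time.
VENDOR_KEY = b"\x01" * 32

def make_manifest(firmware: bytes, key: bytes) -> bytes:
    """Vendor side: authenticate the SHA-256 digest of the firmware image."""
    digest = hashlib.sha256(firmware).digest()
    return digest + hmac.new(key, digest, hashlib.sha256).digest()

def accept_update(firmware: bytes, manifest: bytes, key: bytes) -> bool:
    """Device side: install only if the manifest tag and the image digest
    both check out; otherwise reject the update entirely."""
    digest, tag = manifest[:32], manifest[32:]
    if not hmac.compare_digest(tag, hmac.new(key, digest, hashlib.sha256).digest()):
        return False  # manifest was not produced by the vendor
    return hmac.compare_digest(digest, hashlib.sha256(firmware).digest())

image = b"\x90" * 1024
manifest = make_manifest(image, VENDOR_KEY)
assert accept_update(image, manifest, VENDOR_KEY)
assert not accept_update(image + b"\x00", manifest, VENDOR_KEY)  # tampered image
```

The point of the sketch is only that the acceptance logic is small and generic: it is the kind of thing that can be written once by the chipmaker rather than re-invented by every car manufacturer or supplier in the chain.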
That's going to be the challenge for the next decade.

Well, there are going to be IT systems in the vehicular and transportation industry that haven't existed before, specific to cybersecurity in a lot of ways. But the issue that is always going to be there is that there's no silver bullet; everybody knows that every system has a weakness. So the solution that you're going to have in vehicles is a layered approach. You're going to have a layer of authentication, a layer of encryption, at least one. You're going to have asymmetric key generation, symmetric key exchange, lots of different layers, including an IPS (intrusion prevention system) to provide a feedback loop to improve your cyber system. So it's about layers, and the more layers you have, the more complexity you add. But if you do it in software, you can engineer or architect out a lot of that complexity, which you can't architect out with hardware.

I think there is also a point there. I encourage everybody to see the USENIX Enigma presentation by the chief of the NSA's TAO, where he describes their offensive techniques for getting into other people's networks; a very, very interesting talk. One of the points he brought up was that reputation should be used more, even, I think, potentially in low-level systems. For example, when a hack happens, the attacker is probably going to try to exfiltrate data or get commands in from outside. It's a hard problem to try to whitelist all the sources that can send you commands, but you can have some centralized source that says: well, I have seen commands from these different sources, and now all of a sudden I'm getting a command from this one new source that no other car like this has gotten commands from. What's going on? Why am I getting updates from some completely new
IP address, or a host that has no reputation whatsoever? I thought that was an interesting avenue of assurance, because it's basically offloading the problem of security and making it a problem of reputation, which I think is a very interesting way of going about it.

Maybe to build on this: the main technical challenge from the point of view of privacy and ethical impacts of technology is anonymization. Hacks are going to happen, and information is going to be hacked. I only work with data-intensive technologies, technologies that are intensive in the use of personal data; there are other hacks where the attackers are looking for money and things like that, and I don't work on those. But if systems are going to be hacked and hackers are going to have access to data, then how do we make sure that that data cannot lead to the re-identification of the individuals? How do we anonymize the data? How do we make sure that the data that we input into our systems can never re-create a physical person with their name and their address and their sexual preferences and their medical data? That's a huge challenge for us, and the EU is making a big effort in that respect, at least in terms of regulation. There's a position piece from the Article 29 Working Party on anonymization that talks about differential privacy and attribute-based encryption, and these are supposed to become the standards of the new era. However, while we have some mechanisms and some techniques that work, solutions are always concrete, and we still lack many more techniques and methodologies. So developing these anonymization techniques, and designing anonymization methods for specific technologies in different fields, is a very clear challenge for us.

Okay, as we have time, I'd like to have each and every one of you give me one final answer.
In the invited talk we heard the belief that in three to five years it's going to be worse, but that there's hope that in ten years it might be even better than now, security- and privacy-wise. I'd like to get your simple answer, better or worse: how do you see the security and privacy situation ten years from now?

I think it's going to be better. I always say it's because when I started Eticas three years ago, I had left university and I thought it was going to be just a thing for myself; maybe I'd be able to employ a couple more people. Three years later there are 50 people working in 10 countries on these issues. And I often say that in our field we don't need to do any commercial activities; we just need to sit back and wait for privacy disasters to happen. Every time there's a data breach and personal data gets out there, the phone rings. It might be a municipality hearing that London had to remove the spy bins after newspapers picked up on the amount of information they gathered from people's mobile phones, and they call us saying: can you please help me make sure that the same thing that happened to London doesn't happen to me? When New York had to remove the beacon-based systems on phone booths, the same thing happened. So every time there's a privacy disaster, the phone rings, and luckily for us, I guess, this happens daily. And I think that in the field of security it's pretty much the same. There's going to be more and more awareness, led not by governments, not by regulation, not by the industry, but by people realizing that this brand new world of data has its downsides, and therefore demanding solutions, answers, and alternatives. And I cannot highlight this enough: we need alternatives. We can't keep telling people you need to become an engineer in order to protect your privacy or protect your data. People need to have solutions, alternatives. We need standards that are less privacy-invasive, and we see a huge market failure here.
There are not enough engineers working on privacy-enhancing technologies, and the market for these is just huge.

Okay, that's one-nil for better. I would like to have it unanimous.

I'm an optimist, so I'm going to say better, and it's because if you look at Windows 10, it is way more secure by any single metric than Windows 95 was. One hypothesis you need to hold in order to think that things are going to get worse is that there is an infinite supply of categories of attacks. You know, there's smashing the stack, and there is return-oriented programming, and then you can get incredibly sophisticated, and then you have Rowhammer, which challenges everything you thought about memory and boundaries. But is there an infinite supply of these things? I don't think there is. At some point we're going to run out of them, or else we will never understand how computers work. It will become a trickle; right now it's like we're discovering this whole new thing. And I hope, being an optimist, that 50 years from now somebody will not come up with something as insanely clever as Rowhammer. Or if they do, it will be a much rarer thing, and not something that just happens every year. So I hope things get better.

Okay, that's two-nil. David?
Yeah, well, I've got an answer on both sides. For people in the cybersecurity business, especially in embedded, ten years from now you're going to be looking at a hundred-billion-dollar-a-year business that today is less than a ten-billion-dollar-a-year business, so from a business perspective things are looking up. From a chaos perspective: we talk a lot about Windows and Linux and all the solutions that exist to protect servers and workstations and desktops and notebooks, but when you get down to a Cortex-M0+ or a 16-bit microcontroller, you don't have any of the security that's available today in the mainstream. And most of the devices that we're going to be interacting with, and that are going to be actuating in our lives, are going to be low-performance devices with no OS, or at most an RTOS, and that's going to make for an enormous amount of chaos, and opportunity for the people in this room. So it's worse from a chaos perspective and better from a business perspective.

Okay, worse.

So I'm an optimist too, but maybe I define the term slightly differently: a pessimist is somebody who thinks that things can't possibly get any worse, and an optimist knows that they can get much, much worse. I think up until maybe a year or a few months ago I would have said better, but I'm still not quite sure how to internalize the Sony Pictures and the DNC hacks. If the attribution stories are to be believed, and this is really civilian collateral damage from nation-state attacks, then it's a very troubling development, and it's not really something that we've seen before; it may change our thinking in very new and different ways. So right now I'm worried that it will get worse.

All right, so it's a draw, and I'm here to decide. I think it's going to get much worse over the next couple of years, much, much worse. But I'm also an optimist, and I believe that in ten years
We will have solved it, or at least the security level will definitely be higher. Or I'll have retired. But then the big question is: how do we make that happen, and what should we be looking at, research-wise, to do that? I took a couple of notes during the discussion. I think one of the most pressing issues is how we get security out there efficiently and quickly. Then provable frameworks for secure coding, and better code, would help; that's a good PhD topic, I guess. Then we have software patching, but that is more on the business side, so maybe a PhD in business economics could help there. Reverse engineering of hardware and software, pushing the boundaries, that helps too. Then I think a really big challenge, and probably an interdisciplinary one, is how to measure security and the impact of breaches, and then we need policies and the like involved. And finally, how do we make politicians decide faster and train judges and sociologists and everybody else? So I'd like to thank everybody here for the participation, and in particular I would like to thank the panelists. Please join me in giving them all a big round of applause.