Hi, everybody. I'm Kevin Bankston, the policy director at New America's Open Technology Institute, and I'll be moderating the last panel of the day. Thank you for sticking around for it; we'll have a reception after. So far today we've talked about security at the personal level in our first panel, and at the international and national levels in our second and third panels. Now we're going to jump back a couple of levels to talk about security at the institutional level and the threats that companies and nonprofits are facing from their attackers. We have a wonderful panel of some really high-powered folks who are going to talk to you about those advanced persistent threats, and I'm going to briefly introduce them. Cheri McGuire, here on the end, is vice president for global government affairs and cybersecurity policy at a tiny little security company called Symantec, where she leads their global public policy agenda, including cyber. Angela McKay, right next to me, similarly works for a tiny little scrappy startup called Microsoft, where she's director of cybersecurity policy and strategy on the global security strategy and diplomacy team. Many of you are now familiar with Alex Stamos, who, after co-founding two successful security consulting companies, iSEC Partners and Artemis Internet, recently joined a little web play called Yahoo as their chief information security officer. And finally we have Morgan Marquis-Boire, who last year became director of security for First Look Media. Congratulations to your colleague Laura Poitras on her Oscar win for Citizenfour last night. He joined First Look after spending many years at Google working on fending off threats from nation states, including working on the team that dealt with the Aurora operation, which originated, as far as we know, with Chinese intelligence, and which at some point all of the companies represented on this panel either investigated or were targeted by.
And it is a good example of the topic of this panel, which is advanced persistent threats. We have a wide variety of threats online. In terms of cyber, there are unsophisticated attacks by script kiddies doing denial-of-service attacks, and there are opportunistic attacks going after particular vulnerabilities at particular times. But then there are the very well-resourced or sophisticated players, like nation states or organized crime, that pick a particular target for financial or ideological or other reasons and then pound and pound and pound on the door until they get in. So I just wanted to start the panel with the basic question: what is an advanced persistent threat, and is it actually a useful way of talking about cyber threats, or should we just go ahead and go to the reception? Angela.

What you said, Kevin, in your description of well-resourced actors working over time and targeting a particular victim is a good way of thinking about the so-called advanced persistent threat. But I do think that the term itself is a significant misnomer. There are adversaries who are very agile and adaptive; they want to go in, get their stuff, and get out. There are ones who are determined and, again, working toward a particular target, but not necessarily using really advanced techniques. If you are looking across the attack surface of not just an individual company but also all of their vendor relationships, ultimately there are going to be some challenges in managing those environments, and so not all actors have to be particularly advanced. So I think it's helpful to understand that there are sophisticated and well-resourced actors, and then different tactics and techniques that they use to execute. One of the things that I think is really funny is that, as a security community, the phrase "increasing number and sophistication of cyber attacks" has been something everyone has been saying for over 20 years.
But what has really changed in the environment more recently, and particularly in the awareness in the public domain, is the role that nation states play and how that really escalates the overall threat landscape. One of the things our general counsel has said is that government snooping could potentially be characterized as an advanced persistent threat. And in many ways it's not just government snooping: when you have a particular adversary, whether it's a government or a criminal unit, that can really dedicate time, resourcing, and programmatic effort to executing an attack, that's something to be seriously concerned about.

Well, in terms of government snooping or state-sponsored hacking, this is something that Morgan has a great deal of experience with: defending Google's network against the Chinese, doing research on how Syria uses malware to infect the laptops of dissidents and rebels. And The Intercept at First Look just published a fascinating and worrisome story about GCHQ, supported by NSA, stealing the encryption keys of millions of SIM cards. So your specialty is dealing with state-sponsored attacks, which, if you use the APT terminology, are definitely advanced and definitely persistent. What insights do you have into government as an APT?

So I think the term APT was actually pretty useful around the time of the Aurora incident, when it was really popularized, I think, by a lot of the work that Mandiant did at the time.
I think that it's less useful now because it's sort of inaccurate, right? People talked about the Sony hack — advanced? Highly debatable. A lot of the time, what we see from state actors is heavily persistent, and I'll talk about the Gemalto hack a little bit later when we actually talk about encryption. But I had a teammate who characterized the APT thing like this: advanced — US, Russia; persistent — China; threat — Israel. Which I always kind of liked. I think in the previous panel Nick, or Nate from Endgame, talked about the blended threat landscape that we're seeing now, so calling any state actor an advanced persistent threat actually skews how we appreciate their craft, because you only really want to be as good as you need to be in order to be effective, right? And it lends itself to this trend in reporting, which is really odd, where whenever a reporter wants to discuss a nation-state hack, it's "so how sophisticated do you think it is?" Not "how effective was it?" but "how sophisticated is it?" — like that's the magic thing everyone wants to get at when they talk about an advanced persistent threat. So I don't actually find it a very useful term these days. Having said that, what we describe as APT is state-sponsored hacking, and I would say that's obviously exceptionally prevalent at the moment.

Either of you want to jump on the bandwagon and dump on the term APT? Is it abused?

I do think it's highly overused these days. If something is advanced, and persistent, and a threat, then it could come from anywhere. By designating it almost by default these days — oh, if it's an APT it must be a nation state — sometimes we might actually give too much credit to the nation state that's launching the attack, because it may not be very advanced; it may simply be taking advantage of holes in a network or a not very secure system. So I think we probably need to ratchet down on what that definition actually means. It does give folks a general sense of whether or not something is serious and should be taken seriously, but once it becomes too overused, nobody knows what it means or what to do in response to it. So: too overused these days.

I think it would be useful if we actually updated our thinking on what "advanced" means, because the term APT is still stuck in the 2009 definition, whereas what the Chinese did in Aurora is now within the capability of every Eastern European gangster in an Adidas tracksuit, right? Things like — computationally, can somebody crack an unlimited number of NTLM hashes? — used to be interesting, and now anybody can either buy the GPUs or rent them from Amazon Web Services with a credit card. Bruce Schneier, I think, had a good point: what we see from the NSA right now is what we'll see from criminals three or four years from now. So if we want to continue to use the term advanced persistent threat, I would redefine "advanced" as semi-autonomous, disconnected malware — malware that has the ability to jump air gaps and to carry out its mission without command and control, when 90% of the anti-APT stuff that you buy for a network is all about command-and-control detection. Whether or not your systems are actually air-gapped, the techniques that you see in Flame and Stuxnet and such — extremely quiet command-and-control systems — are going to be useful for criminals, not just for governments attacking disconnected nuclear facilities. And then things like persistence off of the hard drive, right? If you look at a MacBook Pro — like this lady's using up front, a MacBook Air — that has something like 25 different places where you can put state on it outside
of the hard drive, right? The Thunderbolt controller, the reprogrammable keyboard controller, the boot PROM, and all this crazy stuff. And so what we've seen out of the NSA — the Equation Group has been discussed — is the ability to have pluggable modules to take over hard drive firmware. I think we'll start to see firmware attacks that survive reboots, that survive rebuilds, that exist in the frame of the computer for the lifetime of that system. So I think APT could be useful if we actually redefine it as whatever the cutting edge is now, which honestly is no longer the Chinese or the Russians; it's what we're learning about the NSA.

And does APT do anything for us that "state-sponsored actor" couldn't really do? I mean, essentially we used APT to discuss state attacks from China.

Really, it's to get venture capital, right?

Yeah. So perhaps APT is not a useful term unless you tag it with a year — this was an APT as of 2014; this year is a different matter. But okay: we have advanced threats, we have persistent threats, and we have some serious threats that are neither. So the question is, what do we do in the policy environment — since we are in DC, we should talk about policy — to try and address that? Perhaps the most discussed policy proposal for dealing with cybersecurity threats is the set of proposals around information sharing. At this point we're four or five years into a legislative debate over whether or not we need legislation that would provide a lot of liability protection to incentivize and allow private companies to share more information with the government. The president just signed an executive order also encouraging such sharing and trying to make it easier for the government to share information with companies. CISPA — the somewhat notorious CISPA, at least in privacy advocacy communities — was reintroduced for the third time earlier this year; there's a new bill from the Senate Intelligence Committee coming any moment; and the administration itself has its own proposal. What do we make of these proposals? Do we need this type of legislation, and what privacy concerns, if any, might there be around it? Cheri, you just testified on the topic; I'm wondering if you have some thoughts you'd share.

Sure. So first off, information sharing will not cure all the world's ills in cybersecurity — I just want to put that out there first. Everyone seems to think it is the magic pill that will cure all the problems; it will not. It is simply a tool to give those who may be participating in whatever information-sharing organization is set up a way to hopefully learn more about a common operating picture, or common attack picture, and then maybe some ways that you can protect yourself. So first off, make no mistake: information sharing will not fix everything. That being said, there are some valuable lessons learned from information sharing that's gone on in the past. If we look at what, for example, the financial services sector has been able to do with its Information Sharing and Analysis Center over the last seven or eight years, they've actually been able to share some of the common attack patterns, some of the malware samples, et cetera, and that helps all of them protect their networks more effectively. When we get to talking about information sharing in the context of privacy, though, that really is about the private sector sharing with the government, and there are some significant concerns there around privacy. We've called for many years now for any information that is shared by private industry with the government to be minimized and stripped of all PII. Not everybody is of that same mindset, which frankly has been a bit of a surprise to us in the private sector; it seems like a natural thing to do to protect our customers and their information. The other thing that I will say is that we've
also had some great success working with law enforcement. Some of the big botnet takedowns that we've seen have in fact been international consortiums: Symantec has worked with Microsoft and a number of other companies, with the FBI, with Europol, and with academic organizations to identify the command-and-control servers, sinkhole the traffic, and take some of these botnets offline. So there are real positive proof points. But there are also the privacy concerns: any legislation that is passed frankly needs to take those into account and have privacy and civil liberties protections as a fundamental foundation, and I don't think that should be up for debate.

I tend to agree. I will say, as an aside, there was a panel that your CEO was on where he agreed with Jennifer Granick that in the vast majority of cases, the information that the companies want to share and want to receive is not going to include any private content, is not any PII. Which leads to the natural question: why is it so hard to convince the Congress to include strong protections in these bills? If most of the time you're not even going to need to remove anything, why is it such a big deal? A mystery, perhaps, for one of the sponsors of the bills.

See, I actually found that very interesting as well, as someone who spends a lot of time on the incident-response side. When we share indicators of compromise, which is presumably what the government is asking people to share, traditionally this takes the form of an address of a command-and-control server, or a hash or identifier for a piece of malware that an adversary has used. Whether you're purchasing intel from a company that's providing you threat data or sharing it between yourselves, this is by and large 90% of the data that's shared, and it doesn't contain email content; it contains no email addresses and that type of thing. So it actually is a very interesting question why the bill needs to be that broad, why there aren't those protections, because I don't think that breadth is needed at all.

Alex, where do you think information sharing falls in the hierarchy of needs when it comes to cyber?

Well, I was a computer science major, not a psych major, so I don't have the exact wording of the hierarchy memorized, but it's definitely not one of the important things. I think it's pretty clear that information sharing is what we all talk about because it's easy and milquetoast and doesn't require almost any work, and unfortunately we've been talking about it for years. Of my top ten things the government could do, it wouldn't break the top ten; maybe it's in the top twenty. We already do information sharing privately — with mining companies, with tech companies, with all the companies up here — of indicators of compromise that, like Morgan said, have no PII or privacy impact, and nothing has stopped us from doing that. We're not asking for any legal protection there. So I don't think information sharing is going to do much. I think there are other places where the government can actually put its muscle to bend the curve a little bit.

If you could pick one place where the government would put some muscle — or two or three?

One of the biggest things, as we heard from the last panel, is that it's pretty much impossible to hire folks right now with any kind of background in this. There are maybe four or five thousand people in North America that I could hire to do the highly technical work that our team does, and I think that means we need to have security engineering as a basic part of people's computer science training. The fact that computer science departments all over the country are graduating people without any security exposure is like a truck-driving school teaching you everything but how to use the brakes
right? And that's a place where government has a really good track record of enticing schools to teach certain things, of giving grants. There are guys like Matt Bishop at Davis who have good ideas about how you weave security into an undergraduate curriculum in a way where it's not a separate class but something you learn the entire time. I think with a little bit of funding and a little bit of enticement you could see that done; you could see a lot more cybersecurity standards of excellence. I think if we had comprehensive immigration reform, we would no longer have to hire our best security experts in other countries — they could come here and become Americans and have American kids and pay American taxes. Just like the brain drain helped us win the 20th century, it can help us win the 21st century. The fact that everybody wants to come here is a soft-power benefit that we should utilize, and in cyber that's completely true. And honestly, where I'd really like to see the government just put some dollars: cancel one F-35 and put that money toward public bug bounties for open-source software — bugs like Shellshock and Heartbleed. The government pays for bugs, but it pays for bugs for offensive purposes, like Katie talked about. It'd be nice to see an open-source public bug bounty where there are clearly defined rules about what happens with the bugs, where they're never used for offensive purposes, where somebody outside the government runs it but the government puts the money behind it, so that we can have real top-to-bottom reviews of the critical open-source software that is running on every single phone in this room and on every single server all of you are talking to right now. If you spent $500 million on that, you would increase by an order of magnitude the amount of money spent in the United States on software security. And 500 million bucks is what the Pentagon pays for Post-its, right? If this is truly a critical national priority, then we should put the money behind it, commensurate with what we spend on other national defense programs.

Since you brought it up, let's talk about this market in vulnerabilities. Katie did a talk earlier about bug bounty programs, which are one type of participant in the market; there are also states in the market buying vulnerabilities for offensive use. That leads to questions about the appropriate role of government, either in participating in this type of market or in regulating it. It also leads to what is now being called the vulnerabilities equities process: when the government is aware of a vulnerability, when should it hold on to it and use it, and when should it disclose it so that it can be patched? Any thoughts on the panel on that family of issues?

I certainly have some thoughts here. I think there's a series of related issues. There's — and I loved the term earlier; Alex really hit on it — how do we incent behavior toward the right kinds of things? In the bug bounty space, there's certainly a role for bug bounties in the ecosystem. I think we have to think really carefully, because governments are already participants in this process: referencing back to the earlier conversation about nation states leveraging vulnerabilities to conduct advanced attacks, they're already in the ecosystem. So is there a government role? They're already in there. So I think we have to think about the constructive way to incent the right behavior in the bug bounty space. One of the things that was really helpful when Katie was with Microsoft was thinking not just about incenting vulnerability researchers to find vulnerabilities, but also incenting researchers to think about creative ways to defend systems, and about what different attack methods can bypass certain
mitigations. So how do you get the security researcher community not only to look for the vulnerabilities but also to help raise defenses? On a related note, when you find a vulnerability, the bug bounty programs are generally saying: report that to the vendor. And I think it is really important to think about that vulnerability disclosure piece as well, because there are different philosophies around how vendors should deal with vulnerabilities once they're reported to them. And that rolls into the vulnerabilities equities process from government. The US government actually has done one thing that I think is particularly helpful in this space: they are the only government I have seen so far that has laid out — it was just a blog statement from Michael Daniel in April of 2014 — how the government is going to think about what they do with vulnerabilities that are reported to them. Are they going to report them to the vendors? Are they going to hold them for future use? Some of the criteria they listed: Is the vulnerability really easy to reuse? What is the scope of consequences? A vulnerability found in commercial off-the-shelf technology from Microsoft or others is very different from a vulnerability in a unique government off-the-shelf technology; those have different consequences, and you can direct actions in different ways. And then there's also a piece about what the cost to society is relative to the gain you get from using the vulnerability. The example here is really clear if you think about Stuxnet and the set of issues that were there: there were four Windows zero-day vulnerabilities and two in the industrial control systems. I think the challenge was in weighing the cost to society versus the intelligence gain, and I'm not going to say the determination was right or wrong, but I think what was lost in that determination was that it's not just that these are vulnerabilities, so Microsoft has a cost to fix them; then we have all of the enterprise customers who have to go patch their systems and redirect resources to that, all the way down to the consumers — my grandmother, who's running a system, also has to patch the system. So bringing in the idea of the full range of consequences from these vulnerabilities, relative to the gain, is something that we've heard the US government is doing, and we'd certainly encourage other governments to think more about.

Wasn't there historically a program — you have MAPP, right, where you notify large enterprise customers that some of these vulnerabilities should be patched, so they have time to get their ducks in a row and that sort of thing? But wasn't there an additional program where you disclosed advance notice of security vulnerabilities being patched to an undisclosed five countries? And you did that for years prior to the Snowden revelations, right? I hear you've stopped doing that, but wasn't that sort of part of the evaluative process you've just described? Because it was definitely going on when Stuxnet happened.

So there's a series of programs, MAPP included, where we notify folks who develop intrusion detection and intrusion protection systems, and we give them information so that when the patch comes out, they can deploy protections in advance of being able to update their systems. And there are national CERTs who are members of that program, but there are a series of criteria to be part of it, and part of that is actually understanding how they're going to use the information for defensive purposes.

So, I mean, it can be — I'm sorry, I've been handed a second microphone, because apparently I'm not loud
enough, which is not a complaint I've ever heard before. So this obviously can be a complex question, whether you're a vendor or a researcher or the government: do I disclose this? Do I disclose it just to the vendor? Do I disclose it to the public? Then there's also the question of complexity — Bruce Schneier was talking about how complexity creates security problems, and we're looking at a really complex ecosystem. For example, Android gets patched very quickly, but most cell phones running Android are running fairly heavily customized versions of it, so the phone vendor actually has to customize patches, and they are fairly slow in getting your phone patched. So we have a pretty serious security problem with many of our mobile devices. It's also worth noting that the president's review group on this subject suggested that the pendulum should swing pretty far in the direction of disclosing, at least to the vendor, rather than holding on to vulnerabilities for offensive use. But in terms of the problem of getting our stuff patched, that kind of falls under the heading of cyber hygiene, a term that is now starting to get more currency — perhaps even overused, like APT. What is cyber hygiene, how do I stay cyber hygienic, and where does it fall in the stack ranking of things we need to do to protect ourselves?

I'm happy to talk a little bit about this as well. First of all, I do think that cyber hygiene is a very important thing. When you did the introduction, Kevin, you said there are opportunistic threats and then there's this other series that are potentially advanced, potentially not, but at least determined or persistent. So there is a great opportunity to raise the baseline of cyber hygiene across the ecosystem, whether it's enterprises or consumers, and there's a lot being done, I think, both in the Windows environment and in other platform environments, with automatic updates. So basically: systems need to be patched regularly, firewall signatures need to be managed, you need to be thinking about configuration of the system — there are some basics that have to be done. One of the things that we're really trying to grapple with as an industry is: what should we be doing on behalf of users, like automatic updates, and where should we instead enable or inform users to make good risk decisions? That's somewhere the pendulum hasn't found a good equilibrium point. Originally we weren't doing automatic updates in large environments because there was some concern, in the antitrust world of the time, about Microsoft being too top-down, but as the security ecosystem changed, we realized we needed to help users in this space. The same thing actually happened with firewalls: it used to be that a lot of systems didn't come with them turned on automatically, and now they do. So I think this is one of the things where we really have to grapple with what we do as a research community to understand how people use systems and make choices, what we need to do in the technology environment to enable that, and then how we actually give people information to make good risk decisions. The example — I know Alex heard this last week at the cyber summit from our corporate vice president — is the thing that pops up in the browser that says "this website may not be safe." Okay, well, what do you do with that as a user? So I think this is where you start to see the multidisciplinary aspect: we're going to have to bring in things like behavioral psychology to understand how users interact with technology and give them appropriate information to manage risk.

So I've evolved in my thinking to the point where I'm a pretty strong security paternalist. I think there's almost no decision a user can make, when you talk about helping people at the billion-user
scale, that is safe for them to make. Who here has gotten the "this is a bad certificate" error? Everybody in the room, yeah. If you click "more information," it gives you a bunch of text, and what the browser is asking is: please use your knowledge of discrete mathematics and the X.509 standard to make an intelligent decision. And since Bruce Schneier has left, pretty much nobody in the room is qualified to make that decision. We shouldn't ask that. It's kind of a CYA move as engineers: when there's a problem like that, we like to punt and put it in the lap of the user, and as an industry we have to move away from that. Security decisions like that need to be made automatically, and they have to fail toward the safe way — and if that means things break, they break. One of the things I hate to hear, and I heard it earlier today, is the idea that security and usability are orthogonal things to shoot for. They're not. If a product is easy to use, but when you use it you get in trouble, then it wasn't usable, right? If it's really easy to drive a car but the car explodes after 30 seconds, that isn't a usable car. We need to start thinking about that from a software perspective: when we build products, they need to ship safe by default; they have to make choices on behalf of the user. And if the user wants an advanced mode where they turn off patching, or they want to be able to accept SSL warnings, then you have to hide that somewhere where an advanced user will find it and 99% of users won't. Microsoft has made movement in this direction, and I hope Microsoft continues to move in that direction, and other companies too; that's kind of more how Apple does things. I had a big fight with the Chrome team: the Chrome team was really proud that they get people to make that decision correctly 68% of the time, and that still means 32% of the time people are in trouble, and that's not an exceptionally high number.

That seems like the magic number. I remember that Twitter would do internal tests just to see how susceptible their employees were to phishing attacks, and about a third of them — and these are IT professionals at Twitter — about a third of them would click that link or open that file.

We do that too. I'm not going to say the number, but it's to the point where you realize that passwords in a corporate environment are completely useless.

I think it's important also to remember that cyber hygiene, or best practices, or whatever you want to call it, is not just about the technology. You can't just set it and forget it; that's not what security is today. It's constantly evolving with the threat landscape, and so we also have the people and process components. We always talk about people, process, and technology, and the people component, which you've just raised, is a critical part of this. You can have the most educated, most cyber-aware group of people, and someone is still possibly going to end up getting duped — not because of any failure on their part, but because the attackers out there are using multiple methods. It's not just a phishing email now; it's a phishing email followed up by a phone call that says, "Hey, I just sent you that invoice, I need you to open it really quickly." Most people wouldn't think twice about doing that. Those are the old, very simple social engineering practices that have been around for ages, and we just have to be aware that if we think something is suspicious, it probably is, and just because you get pushed to the next level to open it doesn't mean that you should.

So one aspect of what you could call cyber hygiene is making sure that your data is encrypted, which is an issue that's been coming up fairly regularly today. We had a debate in the 90s about whether or not we should all be able to access strong encryption
without the government having the keys, and ultimately policymakers fell on the side of information security and economic security and chose to allow us to have that. Those are called the crypto wars. It seems like we may be on the verge of having to refight the crypto wars, and I'm wondering if the panel has any perspectives on what that means for the security of their products, for the security of the US internet industry and technology industry, and so on. So I might posit that while publicly we thought we'd won the crypto wars, the intelligence community never stopped fighting them. I think the entire framing on this issue was duplicitous on behalf of both the NSA and law enforcement, because what we've seen is a great willingness on behalf of the intelligence community to steal encryption keys liberally, where possible in bulk. This was what came out about the hack within the last week, which was that the NSA and GCHQ stole all the crypto keys from the world's largest SIM manufacturer. So publicly you have this "we really need some sort of legal framework to give us access to encrypted communications," and privately it's, well, we already have access to all these encrypted communications, but if we give them an inch then they might actually take two inches, and we'd really like to get more budget and do less work. So I see that, and I see, for instance, Comey from the FBI lamenting the pedophiles and drug dealers who will be aided by end-to-end encryption and Apple's iMessage and things like that, whereas there's public evidence of the FBI's liberal breaking into people's cell phones and computers. I mean, you can always go targeted on targets of interest, right? And so there's a vast industry now that sells targeted hacking for lawful intercept and law enforcement
surveillance, and yet this whole debate is being phrased around the idea that these Silicon Valley companies are doing harmful things, when the truth is, of course, that internally the golden age of SIGINT is being celebrated. Well, it's interesting. Obviously law enforcement does raise particular and particularly emotional examples, often like child pornography, which is a horrible crime, but what they often don't talk about is the crimes that will be prevented by encryption. In fact, fun fact: there's a great FBI advisory to consumers during the holidays that, in terms of avoiding scams, says your phone may have encryption, and you should turn that on in case your phone is stolen. But I guess Director Comey didn't read that advice. I don't know, Alex, did you have anything you wanted to expand on after your exchange? Yeah, you raised a good point about the crimes that encryption prevents. We just had a whole discussion about how APT is not a very useful term because the kind of activity you see from non-state-sponsored hackers is exactly the same, and that cuts both ways. If we build products that are safe against all threat actors, that can include governments, including our own, and there's no good way to build products that are safe against only a couple of them and not safe against the other ones. The metaphor I was using earlier, which I didn't get to expand upon too much, is that the government is asking us to drill a hole in a windshield and saying, oh well, you can only let the US government through that hole. But everybody here knows that if you have a hole in your windshield, eventually the whole thing's going to crack. You can't build a system that intentionally subverts its own security for one purpose and then make the whole thing safe. And the problem is not that technically you cannot ever have a backdoor; it's that entire models of securing users, like the iMessage model of somewhat
end-to-end encryption, are no longer compatible with the desire to have backdoors built in. And that's a great example: who here sends text messages? Who here has an iPhone? Who here ever sends a message that's the green bubble, right? The green-bubble people you talk to are people who aren't on iMessage, and those are the most trivially sniffed things on the internet. One, they're incredibly expensive; on a per-byte basis they're more expensive than the Hubble Space Telescope. But the other thing is, those are sent in plain text on the control channel of the GSM network. Anybody with about 800 bucks of equipment in a backpack... I could have a backpack right here recording every single SMS message. I'm not doing that, to all the federal agents in the room, but I could have a backpack here recording all the SMS messages. That's why Apple built security into iMessage: for a common-sense problem that a lot of people were facing in a lot of places, which was their SMS being sniffed and a lot of bad things happening to them, not specifically for governments. So the next evolution of building secure products means building end-to-end encryption, and there's no way to do that securely if it includes backdoors. There's a reasonable balance to be had here, but the balance is not that we don't build secure products. That backpack is cheaper than a hundred bucks, by the way, really, to build yourself. Oh yeah. I do think this is the issue the ACLU's Chris Soghoian was tweeting about when he asked the director whether or not foreign governments are sniffing cellular traffic here in Washington DC, which they are, right? Everybody who went to the cyber summit saw all of our phones go crazy when the president entered the building, so clearly it's not just foreign governments that are running IMSI catchers. I guess the question is: once the Five Eyes had breached
Gemalto and stolen all these crypto keys, rather than saying, holy crap, a lot of the people we're supposed to be protecting could be targeted if someone else steals all these crypto keys, and we could tell Gemalto how easy it was to steal all their stuff, the capability is kept rather than the hole being sealed, right? So I guess the worry is that if the Five Eyes are capable of doing it, then perhaps China and Russia are as well. There are probably a lot of people in this room who would be fine with, say, the Five Eyes and an information-sharing agreement having access to these keys, but not the Russians and the Chinese, so the thought exercise might become different if you ask: are you fine with the Russians, the Chinese, the North Koreans having these keys as well as the US government? Well, I'll just say that as a certificate authority provider, we're not okay with anybody. We don't even hold our customer keys, so even if we were asked for them by governments around the world, on a purely practical level we cannot provide them, because we don't hold them. There are a couple more policy things I'd love to hit, though I do want to get to questions, so if you happen to want to ask a question about data breach legislation, or about updating the Computer Fraud and Abuse Act and how that might impact security researchers, as Katie was mentioning earlier, feel free to ask about that. But I'm going to ask one more question of the panel, and I think this is starting to become one of the overriding themes of the day, which is the trust and cultural gap at this point between the East Coast and cyber policy on one side and the West Coast, the technology industry, and the security research community on the other. There was a story in The Hill, for example, that argued, based mostly on off-the-record comments from people out West, that the Obama summit was not
actually well received. One person said, in regard to information sharing legislation, why should we want to share information with these people who are weaponizing our products (in regard to the firmware hack)? And yet at the same time we have this desperate need for more technical expertise on the East Coast as we make these important decisions. Can't we all just get along? What do we do? How do we bridge this gap? Are there any ideas on how we bridge this gap? Yes. I feel like the feeling of a lot of people who work in the tech industry is that government doesn't see technology as a real profession, right? The Attorney General of the United States is a lawyer, and the Surgeon General is a doctor, and the Chairman of the Joint Chiefs led an armored division into Iraq twice, and the guy can probably still field-strip an M16. But when you look at the leadership positions in cybersecurity, they're not people who have done the hands-on job of either breaking into or defending networks. That doesn't mean there's not a place for the lawyers and the foreign policy people and all those folks, but right now pretty much a hundred percent of the conversation is dominated by those fields, and it would be nice to see more technologists in place, because I think we wouldn't see as much focus on information sharing and things like liability protections if you had people with hands-on experience, people who'd spent their entire careers trying to stop cybercrime or protect networks and governments. It is a real profession. It's not just something you pick up because you're interested, or because it's good for your resume to spend 18 months doing cyber. And stop using the word cyber. We respect you all. I guess I just have one thing to add, and I think this was brought up on the panel before, but it's a really useful point, which is that the folks I've interacted with in the security research community don't always fit well
within the boxes of the way government jobs are structured. There's a lot of process here, a lot of checking boxes, a lot of socialization. Earlier on the panel, someone brought up the idea of making sure that people are able to work autonomously, and of having a clear destination as opposed to being told all the different steps along the way. So I think you have to think about the things that attract folks to some of the work in private industry, and how to start building those into the way government jobs in information security, or cybersecurity, or security itself, are structured. The US Digital Service did a good job of this, right, bringing in people from Silicon Valley to fix healthcare.gov and now to fix up other stuff. It would be nice to see a US Digital Service version for cybersecurity, so that people who have actually done this hands-on can be a part of the conversation. I find the schism between Silicon Valley and the government and intelligence community to be one of the most interesting things that's come out of the so-called Snowden revelations. Because on one hand you've got a bunch of people trying to pass cyber policy that sticks in the craw, and these discussions about getting Silicon Valley to backdoor encryption and how they need more data, whereas Silicon Valley has taken a massive economic hit in the wake of what's come out about PRISM, being seen as complicit with US government foreign policy interests. We've got Cisco taking big hits, people losing contracts in Europe because of the privacy laws there, losing contracts in Brazil. So Silicon Valley is actually taking an economic hit due to US government actions, and I think that lends itself to a type
of hostility: a bunch of people passing policy they don't really understand, aimed at people who are already losing money from those historic actions. And as Alex so aptly pointed out, there are no technologists in positions of leadership. There's actually been a criticism, throughout the reporting on the way DC has tried to address policy review post-Snowden, that there are almost never any technologists actually involved in these discussions, which I think lends itself to really damaging policies. You do occasionally see the policymakers who say, sort of like in the climate change discussion, well, I'm not a scientist, but I stayed at a Holiday Inn last night... and therefore, this one time, I used a computer. But I do think that if you look at our intelligence community and the others you're talking about, there are a lot of technologists there. The issue is how you get a private-sector perspective around the economic component of this into the equation when those decisions are being made. Right now that discussion does not seem to be happening at all; no one is weighing it. I think the president said it in 2009, when he released the Cyberspace Policy Review, that cybersecurity is an economic and a national security issue, but no one seems to be talking about the fact that right now that economic security piece is somewhat ignored. We've talked about the strong headwinds that US ICT companies are facing around the world because of the lack of trust in our products that has been created after the Snowden revelations, but go to chapter 8 of the President's surveillance review group report, which was all focused on things that could be done for private industry: I don't know that I can point to any of those recommendations on which action has actually been taken. A ringing endorsement of crypto. If you'd like to
learn more about this issue of the economic cost of surveillance, funnily enough, the Open Technology Institute at New America has a paper on it called "The Cost of Surveillance," which I commend to you. But now let's go to questions in the audience. Operators are standing by. Yes, ma'am. Please wait for the microphone; we are streaming live. My name is Nancy Wong, I'm a reporter with Dell news service. A lot of the cybersecurity questions have been going around incidents, for example Lenovo with Superfish. What do you think is the government's place in private-sector cybersecurity incidents regarding consumer safety issues? Do you think government should put more measures in that area to make sure that things like this don't happen again and consumers' rights are protected, or do you think it's something that should be left alone for the private sector to deal with itself? Isn't this what the FTC is for? On the whole Lenovo Superfish thing, when you talk about people who are technologists being in government: at the FTC, one of the highest-up people, whom I respect, is Ashkan Soltani. I would expect you to see FTC action on that; it's well within his wheelhouse. Right here first. I am actually very, very interested in your insight and opinion on amending the Computer Fraud and Abuse Act. I'll start. So I spent my career as a security researcher, in which you're walking a line of finding bugs and accessing systems in a way that exceeds authorization, because you want to make the world better, so I'm not a neutral party on this. I was the expert witness for Aaron Swartz, who ended up committing suicide when he was looking at decades in jail for downloading journal articles. And so it seems to me that if you can catch Aaron downloading journal articles, and he could face decades in jail for it, the CFAA neither needs to be broader nor to have
stronger penalties. Any move to make the CFAA broader, outlawing legitimate security research, is just going to mean that other countries are going to be better at it. England didn't win the industrial revolution by outlawing steam, and I don't think we're going to win the information revolution by outlawing people doing good research in information security. I think the CFAA is already way too broad and is abused in a lot of situations where it shouldn't be. I don't know if anybody else has an opinion. I largely agree with you. I think what happened to Aaron was tragic, and it's actually kind of sad that we had to let it get to that point, because Aaron was a perfect victim, as it were, in that he was clearly doing something where even the victim in the case declined to prosecute. That was MIT; they didn't care that he was downloading journal articles. But prior to that we actually saw overreach under the CFAA. Weev was far less of a perfect victim, being a notorious internet troll; however, the book was thrown at him for simply downloading documents from a website that were not password protected in any way. So there's actually been a history of this abuse, used largely to batter security researchers the Justice Department decides it doesn't like for some reason, which is very, very worrying, and how you reform the CFAA so that doesn't happen is a very difficult problem. Yeah, so it is an ironic situation if the law that is meant to protect cybersecurity is actually preventing research that would make cybersecurity better. Katie? So there were two individual researchers mentioned during that panel, but I also want to call attention to the fact that I've started vulnerability research programs at major corporations. I
started Symantec Vulnerability Research, which would allow the researchers working for the company to report security vulnerabilities they found to other companies, and I similarly started Microsoft Vulnerability Research. The experience, coming from essentially the two biggest software companies in the world, of trying to knock on the front door, or even find the front door in some cases, of some other vendors was horrendous. Can you imagine what it would be like as an individual trying to scale that cliff? That's something I just want to call attention to, because you brought up great points about those individual researchers, and that is primarily who the CFAA is used to prosecute, but if you think about it, the state of security vulnerability reporting and disclosure is still pretty sorry, and that applies across the board. So I just wanted to make that point in case you were thinking it was only about individuals; it's about organizations even trying to report vulnerabilities to other organizations. Thanks. Do you want to ask a question about data breach? This is called a reflection attack. Data breach: there is a proposal from the White House on this issue, basically trying to create a federal law that governs how companies should handle it if they discover that their data has been breached. What does Symantec think about that? Sure. Funnily enough, I testified on this two weeks ago, and we do believe there should be a national data breach law standard. Today we have over 48 different state laws, and it's very difficult for companies that operate across borders to scale to that if in fact they have had a breach. But we also think that any breach legislation should have three components. One is that it should apply equally to industry and to government, or to any organization that holds PII; today that is not the case. The second is that it should be linked to a baseline
standard of reasonable security measures, to help incentivize organizations to actually prepare and protect themselves before they're breached. And the third piece is that if information is encrypted, or uses another type of security technology that makes the exposed data either unreadable or unusable, whether in transit or at rest, that should be a baseline standard for reporting. Essentially it creates an environment where you don't get into over-reporting if in fact the information has not been exposed and there's been no harm. Does it matter if the government has a golden key to that encryption? It's somewhere in Hogwarts, I think. Obviously most people would like to see federal data breach legislation; the big conflict is going to be how strong it is and what the standards for disclosure are. The concern in the privacy community is that right now what's been proposed is not as strong as some of those 48 laws, so we'll see how that goes, although right now it seems like information sharing is the top priority for the White House on this issue, as Alex rolls his eyes. Do we have any other questions, or anything that's appropriate for a G rating on the internet? Or PG? PG, it's an adult audience. Where is it... Hi, Igor Miklich from RAND. With regard to data breach, are liabilities today in our society in the right place? In one particular example, an average citizen who gets his identity stolen as a result of the Target breach seems to be liable for it, or perhaps I misunderstand. Your thoughts? So the liability question goes back to a really interesting question I watched people talk about today, which is: who's liable when you're breached, and breached by whom? People talked about the Sony breach and so forth. If Sony is breached by some random hackers, then it's on Sony; if Sony is breached by North Korea, maybe it's on the
US government, right? Which actually becomes really interesting, especially when you're talking about nation-state threats. When Google is breached by China, the US government is your friend; when the US government is stealing Google's data via unencrypted leased lines, maybe not so much. So you've got this kind of frenemies thing going on, which makes it really difficult to discuss where the liability lies. For instance, was Google liable for all that customer data going missing when the US government took it? Or, again, is Sony liable now? I think it's a difficult question. I'm not a lawyer, so I find it difficult to place the liability line, but as a security engineer, these are the things that go through my head when I think about threats and where my data is likely to go. I just gave a whole speech about respecting the fact that people have professions, so I'm not going to pretend to be a lawyer right now. I would just say I'd be careful in any situation where you want to disincentivize companies from providing free services to billions of people. All right, we are just about out of time and going to be transitioning to some final comments from my boss, Anne-Marie Slaughter, but first I'll tell a little story about her that's relevant to what we just talked about. You mentioned Ashkan Soltani, the FTC technologist. Prior to that gig, he was doing a panel at New America, and we were talking about this problem of the technology pipeline and how we get more technologists into DC. He mentioned this cultural difference, and he said on the panel: for example, I know that you have technologists on staff at New America, but you also have a dress code where you can't wear jeans except on Friday. And Anne-Marie was still relatively new at the time and
was shocked to hear this. We expect people to code in slacks? And the next day the dress code was revoked. So there it is, people: the answer to cybersecurity is letting people wear hoodies in DC.