Okay, Nate's a tough act to follow, but we are going to do amazing things for you this morning. Good morning. Thanks to New America for having us here. We have an amazing set of panelists. I'm going to start out with some high-level introductions, and then we're going to have some opening remarks, but I want you to know that we are going to be swinging back at the end for some conversation with you. So think of your brilliant questions as we're going through this. We're focused on the new ideas, you know, how do we skip echelon, how do we think in game-changing ways about the cyber arena, and of those ideas and concepts, which are the ones we could get this administration passionate about, right? It has to map to some of the initiatives this administration is already focused on; otherwise, they're not going to take it on. So first on our panel we have Christy Rogers. Christy is currently the managing partner at Principle to Principle. Previously she successfully launched two companies, and she is the founder of the Women in National Security Group with Business Executives for National Security. Next we have Eva Galperin. Eva is currently the director of cybersecurity at the Electronic Frontier Foundation, or EFF. Previously she worked in security and IT across Silicon Valley, and she is focused on providing privacy and security for vulnerable populations around the world. And next, you skipped in, I saw that, we have Bobbie Stempfley. Bobbie is the director of cybersecurity implementation at the MITRE Corporation. Previously we all knew her as the deputy assistant secretary and acting assistant secretary for cybersecurity and communications at DHS, and before that the CIO at DISA. Bobbie is focused on advancing the cyber strategy across the MITRE customer set. And last but not least is Herb Lin. Herb is the senior research scholar for the Center for International Security and Cooperation and the Hank J.
Holland Fellow in Cyber Policy and Security at the Hoover Institution, both at Stanford University. Previously he was the study director of major projects on public policy and information technology at the National Research Council. And if I went on and on we'd be here all day, so I'm going to cut it short and start by turning it over to Christy. Christy, if you could give us some opening remarks. Excellent. Can everybody hear me okay? Good morning. The traffic on the parkway was a nightmare this morning, and I think I had way too much caffeine in my car, so I apologize if I go fast; I'll try to go slowly. Well, I'd like to say we all know everything is connected now, and if it's not, it soon will be. I mean, 15 billion devices are connected now. It is estimated that by 2020 anywhere between 50 and 200 billion things will be connected to the Internet. And as my husband likes to joke, in public and at home, he thought the refrigerator was working against him already, and now he thinks it really is. So we have this conversation professionally, and personally at home as well. But if you think about it, the digital economy, which is fast becoming our economy, has the potential to drive one trillion dollars' worth of growth by 2020. That's a staggering amount. The opportunities are incredible, but so are the risks. Now, that one trillion dollars in growth depends on our continued innovation, the continued innovation of our entrepreneurs, our technologists, and our business folks. At the same time, we need to protect our data, our intellectual property, and our crown jewels. So really, if you think about it, the economy and security are two sides of the same coin. They should work together. They should be complementary. We should not think about them as opposing entities or on opposite sides of the universe. And yet it seems as though, in today's debate or discussion, all sides have retreated to their respective corners of the boxing ring.
And it's extraordinarily unfortunate, because that growing divide, the divide between Silicon Valley and other innovation hubs on one side and Washington, D.C. and the policy makers on the other, is stifling innovation and growth. It is leading to reactionary regulation and legislation that is punitive in nature and that actually hurts businesses. Now, if we continue down that path, I think, and so do many experts, that we are not going to reach that potential of one trillion dollars' worth of growth. So this is about our economy, and it's about time that all of the experts and practitioners reframe the discussion. Cybersecurity sounds so passé; so many people from Silicon Valley have told me, you sound like a cyborg when you're talking about cybersecurity. Everything is digital now. It's the digital economy. We need to start talking about it in terms of our economy. And we need to come together as business leaders and entrepreneurs to come up with guiding principles for the digital economy. Our policy makers need a road map. We cannot fall back, rest on our laurels, and let policy makers decide the guiding principles, because it's not going to work. Not only are they going to be five months behind, they will be five years behind, because when they're trying to regulate or set up policies to support a technology, given the glacial pace at which our Congress and the executive branch move, by the time it's enacted, it's obsolete already. So business leaders and activists and entrepreneurs need to come together to set up guiding principles. And we all laugh; maybe that sounds Pollyannaish. But it's been done before. Twenty years ago, in July 1997, and you can actually Google it, there's a C-SPAN video. None of us were alive. No, of course not. There was great hair in the video.
There were five people who sat on the White House lawn, Steve Case, Eric Schmidt, Tara LeMay, and two other entrepreneurs, I'm forgetting their names right now, forgive me, to announce the guiding principles for the Internet of commerce. Now, they said it's going to be transparent. It's going to be open. It's going to offer all of the economic growth and prosperity you can dream of. But they acknowledged that it does not include inherent security principles, and it doesn't include encryption, and they said the word. And they said, 20 years ago, that the policy leaders are going to have to work with businesses to come up with those principles. And we've not really yet done that. So my challenge, the next big thing for cybersecurity, is to align it with the economy, to align the economy with cybersecurity, to develop guiding principles for the digital economy, for the sake of our crown jewels and for the sake of our economy. Thank you so much, Christy. Eva. Hey there. My name is Eva Galperin. I'm from the Electronic Frontier Foundation. So I guess I'm here to represent civil society, or, as you guys probably think of them, the people with no money. Sign us up. I know, it's a really exciting value proposition: protect people with no budget. But that's mostly what I do at EFF and mostly what I think about, which is that right now, if you think that you are vulnerable or that your company is vulnerable, I am here to remind you that there are people out there who are even more vulnerable. So: activists, people working in war zones, women, people of color. Surveillance disproportionately affects people of color and the poor. Women are disproportionately affected by surveillance, not just by governments, but also by abusive partners. These are the things that the people with money very rarely think about.
I spend a lot of time talking to rooms full of 30-ish white software developers from Mountain View, and they make software for 30-ish white software developers from Mountain View with 30-ish white software developer from Mountain View problems. I usually show up in order to remind them that there are people with different kinds of problems, where if your information is exposed, the consequences are much greater than, say, having to call up your bank and change all of your credit card numbers. If you are a journalist, your sources could be unmasked. You could end up sending somebody to jail. If you are a woman or a person in an abusive situation, you could wind up being beaten or killed. If you are an LGBTQ teenager in the middle of the United States who feels that nobody understands you, you run the risk of getting bullied or beat up. So the consequences are very real for these people, who largely have no money and not a lot of understanding of how privacy or security work, much less the kind of... So what are some of the big ideas? We're getting there. The big idea behind all of this is that because these people do not have their own security teams and they do not have the power to represent themselves in the security industry, we have to do it for them. It is our job to think about what they need and to make sure that these protections are built into the products and into the policies from the very beginning. People who are at a disadvantage are not going to show up and say, do this for us, because they don't even know that it can be done. So we need to think about building products and policies that are for people who are not ourselves. Absolutely. That's a great point. Thank you. Bobbie.
Actually, my colleagues' comments resonate with me a great deal, because I've long believed that the security community has got to stop talking to the security community, and I'm very frustrated by the statement that says we've been talking about the same thing for 20 years. Why do we think we won't be talking about these same things 20 years from now? I like to make progress. One of the things that I've been giving a lot of thought to is: what are the narratives in this space? As we've grown awareness, as we have security on the front page of the paper every once in a while, and it's at the boardroom level and all of these pieces, what are the narratives that resonate with people, and how do we find ones that will promote action in that space? So I'm going to do a little query of the audience here, Terri, if you will allow me. Sure. Raise your hand if this is the way you think about it: the cop on the beat. Law enforcement is the big deterrence, right? We've got to put them in jail, we've got to figure out what they've done and investigate them. No? It's just another risk, right? It's just one more thing I've got to manage in my portfolio of risks as a company. Okay, I've got a couple of hands here, three, four, 10, 15. Just one more, right? It's just too bloody hard. I can't even deal with it. I don't know how to spend money on it. I don't know what to buy. It's better if I just ignore it and stay one step ahead of the guy who is less secure than me. Okay, a couple of people have heard that one. It's too bureaucratic, right? It's the government's problem. There's a bureaucracy about it. They're going to tell me exactly what to do, and as soon as I do that, I'm going to be covered, right? There was a little bit of that in your language there as well. This is war. It's the government's problem. This is war. How can they expect me, company XYZ, to handle an attack from a nation state?
This is a war; where is my government? Right? Okay, a handful or two. These are the narratives that exist, and do you notice how much they are actually competing narratives? And you wonder why we can't make progress, right? We can't make progress because we are in a problem of paradoxes, and we have to start figuring out how to hold those paradoxes, how to look at them as paradoxes and not as false choices. Privacy or security, right? That's a false choice. Security or innovation? That's a false choice. There is a whole range of them. Transparency and operational security, right? You can't have both. That's a false choice. I think a lot about both of your comments, because if we could reframe this discussion into something that looks at it in the multidimensional way that it is, I think we could start making some progress in this space. Great points. Okay, Herb. So I'm here, I guess, as a member of the Presidential Commission that was established last year to report to the next president on cybersecurity for the U.S. government. We issued a report in mid-December of last year, and the focus of our report, as called for, was the digital economy: to try to lay out a cybersecurity agenda for the new administration that would support the digital economy. So in that sense, the spirit of the report, I think, is broadly aligned with the comments earlier that cybersecurity is an integral part of the digital economy. It's important, I think, to mention two things that we did not say. We didn't say that government has no role, and we didn't say that government covers everything. We called for working with the private sector and so on. We also emphasized the role of incentives for doing better cybersecurity. There are two kinds of cybersecurity gaps that we thought we had to think about. One was getting the average level of cybersecurity up to the level of the best.
When you have the best cybersecurity company, one that has a really outstanding cybersecurity posture, it's proof that you can get there, and lots of players aren't at that level. So one part of it is getting them up there, and to that end we made a number of recommendations to help them get there, mostly focused on the NIST framework as a way of helping them to think about these issues. Then once you get up to the level of the best, there's another gap between that and the actual threat, and getting even better there is something that companies, almost by definition, aren't going to be able to do on their own. So we paid a lot of attention to what government could do in that space. A big part of that gap between the best and the actual threat can be characterized as the nation state threat and so on; these are really severe threats. So we thought that government had an important role to play there. Something that the report did not explicitly say, but which comes through over and over again if you read the text carefully, is that the market has failed to deliver adequate cybersecurity, both for individual players and for the nation as a whole. We thought that we needed a way of structuring incentives to get the nation up to speed. And we thought that the U.S. government had a long way to go too, so we have a lot of recommendations about how the U.S. government should improve its own cybersecurity. Okay, thank you. So before we open it up for questions, if we could drill down into each of the themes that you talked about and pick one thing that you would like this administration to work on, an executable idea.
We know that there is a focus on the economy, entrepreneurship, extra resources for DOD, perhaps that means rules of engagement, R&D, public-private partnership. So can we have one thing that you would like them to do? Quick. Go. Reframe the discussion to be about the economy, and bring together leaders from the private sector to create the founding principles, or new rules of the road, for the digital economy. Okay, thank you. Usually, when somebody asks me what I would like the government to do, my answer is nothing, or as little as possible. But I do actually have something the government could do, which is that I would really like to see more transparency on the VEP, the vulnerabilities equities process. Given that the government is buying zero-day vulnerabilities, I would really like to have a better idea of how it makes the decision to either use these vulnerabilities or disclose them to the vendors. I am not suggesting that the government stop buying zero-days. Really great comment. Thank you. Bobbie. So I think we should take the idea of public-private partnership to the next level and really think about how we create a near-miss database, or a data exchange that enables shared analytics at scale, right? Yes. Trust at that scale is hard, very hard, but we have to tackle that hard problem, I think, to make progress. That's a really good one. There are many things that the government could be doing, and we talked about many of them in the report. And I think one of the most important things about the report is that it's an integrated whole: you can make progress in area X, but if you don't make progress in area Y, you don't reap the full benefits of your investment in X. That said, the thing that I want to point out as worthy of government attention is something that's apparently in the draft executive order on cybersecurity, which is getting the U.S.
government to adopt the NIST framework for thinking about cybersecurity. There's an order in it that says: government agency, you shall adopt this as a way of thinking about cybersecurity. That's a good thing. There's never been that before. Okay. Great point. Start with yourself. Okay, questions from the audience? We have mics available. You want to raise your hands? Up here? Good morning. My name is Maurice Turner, congressional innovation fellow working in the Senate. My question is directed mostly toward Christy, with a little bit pulled in from Eva. If you're saying that it's up to the business leaders to come together and have these guiding principles really drive the conversation, how do you expect more communities to be represented if there is under-representation at the board level in business? Well, I actually do think the government can provide a forum for that, and I think it will be up to, quite honestly, the business leaders and the innovators and maybe the government to make certain that there is an appropriate and accurate representation of all communities: small, large, and mid-sized companies, rural, urban, students, the disadvantaged, the slighted. I do think there needs to be, because guiding principles for the digital economy, you know, it's not just about tax reform or the right tax structure. It's much bigger and broader than that. So I actually do think that the government can set up the forum and maybe can be responsible for identifying the right people. And maybe saying business leaders isn't the right word. Maybe it's the American public, and leaders from various sectors of the American public, coming together to develop the guiding principles. Maybe that's the better way to say it. That's a good qualifier. Eva, anything you want to add? So there are sort of two directions from which to approach this.
One is to change the makeup of the companies, and the other is to make the consequences of not serving these grassroots and underrepresented communities clear to the companies, and EFF sort of has an opinion on both. One is to encourage companies to hire a more diverse workforce, because people build tools for people like themselves. The other is actually reflected in EFF's own grassroots organizing plans. We're currently working on what we call the EFA, the Electronic Frontier Alliance, which is our network of grassroots organizing organizations all over the United States. We are definitely reaching out to communities who are not like us. We're particularly interested in immigrant communities, and we're reaching out to student groups on campuses and things like that. And I think one of the things we can do is to offer them ways to communicate with these companies about the consequences of their policies. That's really an area in which I think EFF, and indeed all of civil society, can be a bridge. One thing I would add is that we're not demanding that our privacy and our security be embedded in the products that we're buying. So I think a grassroots approach is getting enough education out there saying, hey, if you ask for it and demand it and make buying choices based on it, you start changing the whole dynamic of what's going on today. And we're not doing that as a society. But that's just a footnote. Another question? Yes, over here. I'm Mike Nelson with Cloudflare, and I wanted to pick up on something that Nate Fick had said about how there's going to be more consolidation in the cybersecurity industry and a move to cloud platforms. How can we do better with the new cloud-based platforms that are providing cybersecurity protection, providing the infrastructure that small businesses rely on for their entire business? And I'm talking about government action, business action, and civil society action.
So one thing, just before I turn it to Herb, because I know you've focused on this, and then Bobbie: I always think of secure cloud as more secure than not doing anything. So if someone is not doing anything today, at least that's an improvement. I'm not speaking for the commission specifically in response to what you're saying, but from my standpoint, at least one of the big issues in thinking about security and the cloud is the uncertain legal and regulatory environment that it faces. When you have a cloud service and you, quote, upload it to the cloud, where is it? And whose laws protect you or don't protect you? There's all kinds of uncertainty about that, and I don't think anybody believes that that uncertainty is a good idea as a matter of course. But nobody wants to say, I'm willing to give up my jurisdiction to somebody else; they just don't want somebody else to have jurisdiction over them. And if the answer is world government, that's not going to be the right answer. Bobbie? So I agree that we have to produce some of that clarity, because that's an impediment to the kinds of things that need to happen. One of the things I find so important about cloud is that it is a re-architecting moment for organizations, right? We are rapidly disintermediating everything, and that moment is an opportunity. If you take that opportunity to think about what your business process is, why your business process maybe needs to be operated in that way or not, simplify, clarify, and put it in place, you can in fact make marked improvements in your security, even if the technology is only marginally more secure, right? Because enterprises today don't generally understand the complexities of the operations that they're doing, and having helped large enterprises move from one technology platform to another, you discover a lot in that moment.
And so that lost opportunity has fairly profound consequences if you're not thinking about it, and you'll never take the opportunity if we don't provide some of that clarity. One of the things that we were really clear about in the CSIS recommendations, for the government in particular, and I'm a strong believer in this for industry as well, is that you should not think about cloud exclusively as a business efficiency play, right? Everybody goes into these shared-services consolidations thinking it's going to save me money, and that's the only lens they think about it from. But if you are moving from an unknown, unmanaged environment to a more managed, mature, well-executed environment, even if it's on somebody else's computer, which is what the cloud is, right, it's just somebody else's computer, you can actually make a security argument for it, and it could help you with crown jewels and other areas. So I think that's a great one to push on, because a lot of people really need to do it, but our policies and our statutes have not caught up to make sure there's a framework, right, that has been looked at. Okay, we have time for one last question, I think. Yes, ma'am? My name's Supriya. I'm a student at GW, and I have to say I'm relatively new to the field. But I was reading the other day about a former NSA task force that evaluated the security of different government agencies, and I believe it was dissolved because people weren't making those corrections. And I know that's quite familiar in the public and private sector too: you make the suggestions, you don't see the changes, and they still experience the problems. So how do you get individuals to really care about their security without scaring them? Can I jump in? Absolutely. I'm going to point to three different statistics: one from 2012, one from 2014, and one from 2016.
Same question, looking at small, mid-sized, and large companies. The statistic varied across those three studies from 85 to 90 percent, and it was 90 percent in 2016, so it went up: that share of all breaches and attacks could have been prevented with simple cyber hygiene, controls and processes put in place. So your question is a real, legitimate question. I mean, a basic cyber breach or hack, that is not what happened to Sony, right? It's not what happened to Target, it's not what happened to Downey. Those are high-end. But small businesses today, if they have a breach, the vast majority don't survive six months. So it's wiping out our small business community. How do you do it? It's really going to require, I think, a cultural change. It's exactly what you said, Terri. It's how we, the consumers and users of the digital economy or the digital space, look at it. We are not used to demanding that our data be protected. We are used to it being open. I mean, just think about it when you open up your Google, your Gmail, from Firefox or from Safari today. There are huge amounts of malware out there that could actually attack you, and we don't think about that. We don't think about what we put out there. So I don't really have an answer, other than that it's going to require a cultural change in the way we think about it. And no matter how many cyber training programs are out there and have been implemented, and I work with a couple of companies that actually provide them, they're really not making an impact. We need to think about it that way. Please. I have an answer. Yeah, so the bad news is the security trainings don't work. At least not the kind of security trainings where you parachute somebody in, have them talk to people about security for an hour, and then leave. Right. Or even if you have them talk to people for, say, three days, that doesn't work. All you can really hope to do is make incremental changes.
So there are a couple of different tactics that you can take. The first is to make security the default, to make it transparent. One of the best examples of this has been our efforts to encrypt the web. About 50 percent of all web traffic right now is encrypted, and that definitely was not the case five years ago. We still have a long way to go, because 50 percent is kind of embarrassing. But it used to be that if you could just sit on the network, you could sniff nearly everything that was going on, and that is kind of mortifying to people today, or at least I hope it is. The other tactic is a little bit of education, possibly not going so far as training. The majority of security breaches really do happen either because people have not patched their software or because they have fallen for a phishing attack, and it is fairly easy to fix both of these problems. It's something that really doesn't get enough attention, because everybody likes to think of security breaches as this sort of Mission Impossible, Tom-Cruise-coming-down-from-the-ceiling kind of situation. But really it's a phishing email saying reset your Gmail password. And a lot of security people have either given up on trying to educate people about phishing or think that this is a solved problem: 2FA, so, done, hands up. And neither of these is the case. So to wrap up, because I know we're running over the time that we've had: I think it's two things with the government. It is digital literacy that starts in elementary school and goes on up, right, across the population. You can't fly people in to do training and change culture and minds. And the next thing in government is accountability. You can give us money and you can take it away, and there needs to be accountability, over time, for reaching those basic benchmarks. You can do that in government. Any last seconds, and then we'll wrap up. I think you do have to scare people.
And the point is that you have to scare them a little bit, but not too much. They have to be scared enough to take action, but not so scared that they say it's all hopeless. And security is ultimately a tax. That is, it's something that you pay. Everybody wants convenience, and if you want security, you're going to have to give up some of that. So that is an unavoidable trade-off, and you can manage it, but you have to start thinking about what things are more important. What are you willing to put the effort into? Nobody here has passwords for the fun of it. Okay, Bobbie, come on. I agree, to a point. I agree to a point. But one of the things we've got to do is engage more disciplines in the conversation, right? Because you can scare people, but people still smoke. Maybe a whole lot fewer people smoke today than used to, but people still smoke. And so we have to bring in social scientists and cognitive behaviorists. We have to make usability a key feature of the products that we use so that people are using them correctly and smartly. There are lots of pieces of this; it's an interdisciplinary problem. It's our society, it's our economy, it's all of us. And security is not a destination, right? Patching is not something you do once. No, but economic success is. Okay, well, listen, what an amazing panel. Please, a round of applause. Thank you so much.