Thank you for coming out to CSIS and our shiny new building, although it's not so new anymore; it's about a year old. Still shiny, though. This event is something we've been really looking forward to, and I'm happy to do it because the roots of this go back to August of 2012, with the failure to pass legislation that would have made a single agency responsible for cybersecurity. I think the model was very corporate, you know, it would be like a CIO or a CISO, you'd have an uber-agency, and one of the things we learned is that that was the wrong model. So there's been a new approach to cybersecurity, one that gives the sector-specific agencies and the independent regulatory agencies a much greater role. I think this is going to be the wave of the future; it's a new model for cybersecurity, and it offers the possibility of working really well. We had a previous event with FireEye, and I think Orly is here, that focused on the NIST framework, again building on what the executive order has done. The framework is not our focus today; instead, we've asked leading experts from key agencies to talk about what they're doing and where they see the challenges. So for us this was kind of a surprise: this was our dream team of speakers, and usually you don't get your dream team, so I was really happy when we got these people. Their full bios are on our website, but I will do some quick introductions. The format for today: we will have Julie Brill, whom everyone knows, commissioner of the FTC and the leading voice in Washington on some of these issues, particularly privacy and now security.
She'll open with remarks; unfortunately Julie has to leave a little early, so she'll make some remarks, then take Q&A from you, and then we're going to switch to the dream team panel: Valerie Abend from the Office of the Comptroller of the Currency (I'll give more of their bios when we get to the panel), Joseph McClelland from FERC, and David Simpson from the FCC, joined by Jake Olcott, who bears much of the culpability for the 2012 legislation. No, that's not fair. But we'll have Julie and then a panel. I'm looking forward to this discussion; I think it will cover something we really haven't talked about that much before. So I welcome our panelists and our speaker. Thank you for that nice introduction, and thank you to the center for inviting me to address you all this afternoon and for taking up this topic. It's really a pleasure to speak to a group of people who have such depth and breadth in security issues. We live in a networked world. We Americans depend on constant connections to work, relax, and toggle between the two. Communications networks synchronize our critical infrastructure, including our electricity, water, hospitals, buses, and transportation systems, and we're rapidly moving toward an Internet of Things, which will put everything from our washers and dryers to our cars online. These developments hold promises small and great, from allowing us to save a few extra steps when we need to turn off the lights to using our resources more efficiently. All of these connections bring risks along with benefits. Over the past year, it seems that we haven't gone more than a few days without hearing about a major security breach involving consumers' financial data or other sensitive information. Verizon's latest data breach investigations report records nearly 1,400 breaches in 2013 alone. Retailers, hospitals, and universities have all been targets, and federal agencies have taken their hits as well.
The scale of breaches has kept pace with Moore's law, and at the same time we're putting more and more sensitive information online. This means that the stakes in the security game are continuously increasing. Consumers expect companies to protect their information. Data security protections are increasingly like keeping the lights on: consumers might not notice when they work, but they sure notice when they fail. Data security is one of our top consumer protection priorities at the Federal Trade Commission. In our enforcement actions and policy initiatives, we focus on the harms that consumers may suffer when companies fail to keep information secure. Unauthorized access to data puts consumers at risk of fraud, identity theft, and even physical harm. Data can reveal information about our health conditions, financial status, or other sensitive traits. Security is also an essential element of maintaining consumers' privacy, which is another top consumer protection priority at the Federal Trade Commission. I'd like to convey two main messages about our data security enforcement program at the Federal Trade Commission. First, we enforce a flexible standard of reasonable security. Second, the Federal Trade Commission is the only federal agency with the authority to enforce such a standard across broad swaths of the U.S. economy. Our reasonable security standard adapts to rapid changes in both technology and security threats, allowing us to apply it to older technologies as well as technologies that are just emerging. Let me take a moment to put the FTC's data security enforcement program in the context of other recent governmental efforts. Over the past few years, other governmental experts have turned their attention to answering difficult questions about the legal, economic, political, and military aspects of cybersecurity.
The Obama administration has been active on this front, reaching important milestones with the executive order on critical infrastructure cybersecurity and NIST's framework for improving critical infrastructure cybersecurity. I applaud the administration's efforts and its use of an inclusive process to develop these policies. The core of the NIST framework is about risk assessment and mitigation. In this regard, it is fully consistent with the FTC's enforcement framework. One of the pillars of reasonable security practices that the FTC has established through our settlements in more than 50 data security cases is that assessing and addressing security risks must be a continuous process. There's no single right way to do these assessments. It depends on the volume and sensitivity of information the company holds, the cost of the tools that are available to address vulnerabilities and other factors. By identifying different risk management practices and defining different levels of implementation, the NIST framework takes a similar approach. Now let me turn to the legal authority that we have at the Federal Trade Commission to do the job that we believe we need to do. The main legal authority that the FTC uses in the data security space is Section 5 of the Federal Trade Commission Act, which gives us the ability to stop unfair or deceptive acts and practices in commerce. We first applied Section 5 to data security issues in 2002, back in the day when, to paraphrase Tom Friedman, 4G was a parking space, an app was something high school seniors sent to colleges, clouds were in the sky, Twitter was for the birds, and Skype was a typo. In the world of 2002, those were all true, and that is truly in the distant past, yet Section 5 remains a highly effective tool for protecting consumers' information. 
The FTC's data security enforcement actions initially focused on deception, recognizing that consumers' data was valuable to them, and potentially harmful if obtained by fraudsters, identity thieves, and other malicious actors. Companies began to promise consumers that they would keep this data secure. Those promises were and are material to consumers' choices about whether to use a product or service. After all, who would entrust their information to a company that doesn't protect it? When companies don't live up to their promises, the Federal Trade Commission may step in. From the beginning, our view has been that a promise to keep information secure has to be backed up by reasonable and appropriate processes and practices. Within a few years, it became clear that the FTC's ability to stop unfair practices under Section 5 would have its place alongside deception in our efforts to ensure reasonable security protections for consumer data. The key difference between unfairness and deception is that unfairness may be applicable even in the absence of a representation or omission in information presented to consumers. In 2005, we brought our first data security case under a pure unfairness theory, following a breach at BJ's that exposed the sensitive personal information of thousands of consumers. In the language of our unfairness standard, this company's data security practices caused or were likely to cause a substantial injury that consumers could not reasonably avoid and that was not outweighed by benefits to consumers or competition. These days, of course, it's not unusual to read about breaches that involve records about millions or even tens of millions of consumers. The scale of breaches has changed since 2005, but the legal principles we seek to enforce have not.
In our settlements and guidance, the Commission has outlined reasonable security practices while emphasizing that companies need to implement these practices in a way that is appropriate for their businesses. The practices that we look for are as follows. First, companies need to do a risk assessment. Companies should know what information they have, how it flows through their enterprise, what kind of access employees and third parties have to this information, and what vulnerabilities could compromise its confidentiality, integrity, or availability. Second, companies should minimize personal information about consumers. Limiting the consumer information that companies collect and retain to what is necessary to fulfill legitimate business needs will help reduce unnecessary security risks. Third, implement technical and physical safeguards. Security measures like firewalls, strong passwords, and limiting the circumstances under which sensitive personal information may be stored on laptops are important but not sufficient. Protecting information the old-fashioned way, by ensuring that backup tapes, CDs, external hard drives, USB thumb drives, and the like are locked up and securely destroyed when no longer needed, is a risk-reducing complement to security measures deployed on computers and networks. Fourth, train employees to handle personal information properly. And fifth, have a plan in place to respond to any security incidents that may occur. Now, this is not a standard of perfect security. FTC staff investigates hundreds of breaches, and so far we have brought 53 cases under Section 5. We tend to bring an action when we find systemic failures in a company's data security practices. So the fact that there's an isolated vulnerability in a product or service that a company offers, or even the fact that a company suffers a breach, does not necessarily mean that the FTC will come calling, let alone file a lawsuit.
Now, some of the FTC's actions are against companies that are themselves victims of hacking or other malicious attacks. But this does not and should not relieve companies of the need to provide reasonable security. After all, it's the company that decides what data to collect, how to use it, and when, if ever, to get rid of it. Holding companies accountable for their practices and the representations they make is entirely appropriate and consistent with how we apply Section 5 to other commercial activities. So let me talk a little bit about how we use Section 5 in some of the new data security challenges that we are facing. Consumers are obviously moving more and more of their activities to smartphones and connected devices. These phones and devices are producing an increasing amount of sensitive data, including user-generated health information. Our recent data security cases show that Section 5 is up to the task of protecting consumers in this rapidly changing environment. So let me focus on three areas that seem particularly salient in our new data-intensive economy, beginning with mobile. Mobile devices and apps provide convenience, entertainment, and a platform for us to connect to one another in new and exciting ways. But when apps fail to provide reasonable security, they can leave a broad range of sensitive personal information at risk. For example, earlier this year, the FTC brought an enforcement action against two popular apps, Credit Karma and Fandango. We alleged that these apps contained flawed implementations of the Secure Sockets Layer Protocol, which is a common means for encrypting data in transit. Specifically, we alleged that the Credit Karma and Fandango apps were susceptible to man-in-the-middle attacks, in which an imposter could pose as a legitimate data recipient and collect highly sensitive information from the consumer, including social security numbers, in the case of Credit Karma, and credit card information, in the case of Fandango. 
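A flawed SSL implementation of this kind typically comes down to disabled certificate validation. Here is a minimal, hypothetical Python sketch (not code from either app) of the difference between the secure default and the kind of override that leaves a connection open to a man-in-the-middle imposter:

```python
import ssl

# The secure default: a client context created this way verifies the
# server's certificate chain and checks that the certificate matches
# the expected hostname, which is what defeats an imposter posing as
# the legitimate data recipient.
secure_ctx = ssl.create_default_context()
assert secure_ctx.check_hostname is True
assert secure_ctx.verify_mode == ssl.CERT_REQUIRED

# The kind of override at issue: with validation disabled, the client
# will accept ANY certificate, including one presented by an attacker
# sitting between the app and the server.
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False       # stop matching cert to hostname
insecure_ctx.verify_mode = ssl.CERT_NONE  # stop verifying the cert at all
```

Note that disabling validation takes deliberate steps; the secure posture is what the platform provides by default.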
These companies were not tripped up by bad luck. Our complaints alleged that they overrode more secure default settings and failed to test adequately what would happen after they did so. The FTC also brought an action against the mobile app Snapchat, which allows consumers to send photos or videos that disappear after just a few seconds, or so Snapchat told its users. The part of the FTC complaint that seemed to draw the most attention was the allegation that, despite the company's representations, recipients were able to save snaps indefinitely using a few simple techniques. But we also alleged that the app exposed consumers' mobile phone numbers and left consumers vulnerable to being impersonated by other Snapchat users. Thus, the Snapchat case raises both significant privacy issues and reminds us that security, which includes controls to keep information confidential, is critical to effective privacy protections. As a group, I think these cases show that the FTC's framework for holding companies to a standard of reasonable data security readily applies in the mobile environment. So let me turn to the Internet of Things. While connected devices can provide innovative services, they must do so in a way that does not violate consumer privacy or leave personal information vulnerable to exposure. Some of the data coming from connected watches, appliances, clothes, and other everyday devices could reveal a lot about our health, activities in our home, or other highly sensitive aspects of our lives. Protecting this information from unauthorized access and disclosure is paramount. I am concerned that some of the lessons of the recent past aren't being applied in these new and exciting technologies. A recent study by HP, some of you may have read about this study, found that 90% of connected devices are collecting personal information, and a full 70% of them are transmitting this data without encryption. 
The first case we brought in the Internet of Things area was against TRENDnet, which makes Internet-connected video cameras. Our complaint alleged that TRENDnet cameras were vulnerable to having their feeds hijacked. And indeed, around 700 private video feeds, some of which included images of children and families going about their daily activities in their homes, were hacked and publicly posted as a result of the company's alleged lax security practices. As more and more devices become connected to the Internet, the potential for more information about the most intimate details of our lives to slip into the wrong hands grows, unless appropriate safeguards are put into place. Finally, let me focus a bit on health information. Our recent cases show that we at the FTC are serious about enforcing protections for sensitive information. There's broad agreement that information about consumers' health and medical conditions is sensitive, and that consumers suffer harm when this information is unexpectedly revealed. Companies that collect this information need to recognize its sensitivity and provide safeguards to match. In two recent cases, the FTC had reason to believe that companies failed to provide such safeguards. Last fall, we announced a settlement with Accretive Health in a case that stemmed from the theft of an unencrypted laptop from an employee's car. This one laptop contained 20 million pieces of health-related information about 23,000 patients. But the case wasn't just about a lost laptop. It was about the company's failure to adequately train employees, to limit the data contained on the laptops, and to implement reasonable technical security safeguards. And earlier this year, in another case, we announced a settlement with GMR Transcription Services, which used a contractor that left the door wide open to notes about medical exams and other highly sensitive medical information, allowing this information to be indexed by internet search engines.
So let me take a step back and talk a little bit about policy. Policy initiatives are another important aspect of our data security efforts at the FTC. Those of you who are familiar with our work know that we're pretty adept at identifying emerging challenges in many areas of consumer protection, and data security is no different. We recently held two public workshops that explored emerging data security issues. At our June 2013 workshop on mobile security, panelists from industry and academia took a comprehensive look at security in the mobile environment. The topics included identifying and closing software vulnerabilities during the development process, making devices harder to crack if they're lost or stolen, and making user interfaces to security features more consumer-friendly. This last point is critical. Just as privacy experts have recognized that interfaces for providing choice mechanisms need some rethinking in the mobile environment, so do the means for consumers to manage their security settings; they need to become more consumer-friendly too. Second, in November 2013, we held a full-day workshop on the Internet of Things. While some companies are taking a strong leadership role in securing the highly sensitive data from connected devices, many of the workshop's participants raised questions like those raised in the HP study I mentioned a few moments ago: questions about whether other companies are paying appropriate attention to securing the data from connected devices. Will companies that for decades have manufactured dumb appliances take the steps necessary to secure the vast amounts of personal information that their newly smart devices will generate? Will companies design their devices and services to provide appropriate levels of security, not only in isolation, but also as part of a highly complex and interconnected new ecosystem? These are the issues that the FTC is watching closely.
Finally, while the FTC's current enforcement authority and our capacity to develop policy recommendations and best practices in connection with new technologies all play a critical role in providing U.S. consumers with some assurance that companies will keep their information secure, I believe that we need more tools to protect consumers in this area. Along with my fellow commissioners, I believe that Congress should strengthen the FTC's data security authority by giving us new tools to address these issues. The Commission's unanimous recommendation to Congress includes a call for civil penalty authority, rulemaking authority, and jurisdiction over nonprofits. These elements would place the Commission in a stronger position to deter violations and protect consumers nationwide. Technology has changed dramatically since the early days of the FTC's privacy and data security enforcement. The FTC's general, flexible consumer protection authority has played an important role in stopping and remedying fraud, identity theft, and a broad array of privacy violations as these technological changes have been underway. We at the FTC cannot address every data security challenge that the United States faces, but we will strive to ensure that companies that collect information about consumers, whether in more traditional ways or through the mobile ecosystem, the Internet of Things, or other exciting new mechanisms, keep this data secure. Consumers expect and deserve no less. Thanks very much. We have time for a few questions. Sure. I didn't mention it in my prepared remarks, but we're thinking a lot about big data and what consumer protection, privacy, and frankly data security challenges it presents.
Just focusing on data security for a moment: the whole enterprise of big data is collecting and aggregating as much data as possible, even if you don't really know what purpose it will be put to in the future. Clearly there are some promises of great benefits to society in this enterprise, but it runs smack into data security problems, kind of like a Mack truck. To the extent that big data research and analytics is focused on aggregated, non-identifiable, non-linkable information about individuals, which I frankly think a lot of big data research projects can, should, and in fact do focus on, we have less of a problem. But to the extent that what we're looking at is linkable or actually identifiable information, or information that can be linked back to consumers, that's where we start to have not only the security problems I just mentioned but also some significant privacy problems. Consumers give their information for one purpose, whether it's looking up health conditions online, doing searches through apps, or whatever, and then that information gets repurposed for other uses, whether it's to target them for marketing or to make more significant decisions about them, such as whether they're too risky to do business with. Those are the types of things that we are thinking pretty deeply about at the FTC. We've issued a report around some of these issues and what we think some specific players in the big data ecosystem, the data brokers, should be doing, and we're calling on them to be much more transparent about their practices. We've called on Congress to enact legislation to ensure that consumers have appropriate tools to address some of those issues.
Big data has a tremendous amount of promise, as I said, to solve health problems, transportation problems, and energy usage, but I think if it's really going to thrive, we need to put into place appropriate consumer protection and privacy protections. Otherwise it won't: people won't trust it, and businesses won't engage in it. I've heard from lots of folks in the healthcare space, health researchers and others, who say they're in desperate need of guidance on this issue, because they're afraid to actually go out and start using it without knowing what the implications will be for law enforcement. Finally, let me just mention, and I realize this is just a kick-off question, but we held a workshop two days ago on the potential discriminatory impacts of big data, and it was a really interesting workshop. We had folks talking about the potential for big data to solve discriminatory problems. That is, we can analyze companies' practices, or governmental practices, or any practices, to determine whether vulnerable communities are being implicated and inappropriately affected. And yet data aggregation, whether intentionally or unintentionally, can also lead to potential harms. So it was a very in-depth discussion about the state of the law, the state of the science, and what companies are actually doing, and I think there'll be a lot more discussion about the potential discriminatory impacts of big data, which was something the White House report focused on as well. Hi, Julie. Hi. I certainly applaud all the efforts you're making to try to raise the bar, but I guess my question is, how do you determine where that bar is in terms of what is best practices? I'm just kind of curious. What is reasonable security? It's a matrix of issues.
There's not one answer to that. It depends on the nature of the information being collected, how sensitive it is, the size of the company, and the technological availability of tools to protect it, so a state-of-the-art kind of inquiry, and then we also very much focus on whether companies have procedures in place to deal with these issues. So it's not just what they do in any particular instance, but whether they have a program in place to be thinking about these issues and to address them as they come up, and I tried to talk about that in my talk. So those are some of the things that we look for when we're examining a particular data security problem. And just to be clear, as I mentioned, there are lots of companies that suffer breaches. There were almost 1,400 in 2013 alone. We don't investigate every single one, obviously. We've looked at hundreds of cases, and we've only brought 53. So we really are targeted and focused on those companies that are not living up to a reasonable standard. We recognize that there can be no perfect security. And the other thing that I mentioned during my talk, which I also think is quite important, is that you might deal with us even if you haven't suffered a breach. I think that was the case with respect to Credit Karma and Fandango. They hadn't actually experienced a breach, but they had a real problem in terms of what they had done with some of their protocols. So I wish I could tell you, here are the five things, and here are the only five things, and that's all you need to worry about, and you can move on with your life. But it is a constantly evolving area, right? The security threats change all the time, practices change all the time, what is reasonable changes all the time, and companies that are holding sensitive information, or lots of information about consumers, need to keep up with the current times. Great. Thank you. Thank you for a very enlightening presentation.
I appreciate it very much. I'm a private investor, so let me look at this from the private side for a moment. It seems to me that we have a lot of analogs in our current economy. Auditing financial information would be a good example, and the one I'm going to refer to. There is a big function in auditing financial information that, many years ago, before we got to that point, we did not have. It benefits the economy to have that in the private sector. There are standards that they develop. There are enforcement procedures that occur. Do you envision any form of your functionality translating itself into, if you would, a commercial equivalency, an auditor of the security of enterprises? Oh. As an investor, I would welcome it. Right. I want to make sure I understand your question. Are you asking me, do I see commercial implications for various players to help companies improve their data security practices? I think that's yes and no. I understand the very valuable role that I've seen NIST play with the framework they've created. I know the Department of Defense and many other government agencies are deep into this on the warfare side. You're doing a significant amount in terms of setting the boundaries and the framework for consumer protection. But obviously, as you yourself said, this job is bigger than any one agency. It's very, very diverse. And I think the analog that I look at, from my personal standpoint, is something like an independent auditing agency that manages the financial auditing function. I see. I look at it as an investor and say, I'd like to know the risks associated with this investment I'm making. Right. And as you pointed out, there's a significant number of them that are very difficult to discern. Right. And if you have to be the only source that's going to propagate that discovery process and make it public, I think it's going to be a slow-moving train. Yeah. I mean, are you asking, should there be another governmental agency?
No, I think I'd like to see a private enterprise. You know, it's a very interesting question, and I can't really predict the future perfectly, as much as I would like to. But I do think that there are lots and lots of players in the commercial space that are working to help companies improve their data security practices. Whether it will end up being a formal function, like the accounting function, would actually be pretty interesting. I haven't heard about it being formalized in that sense. Just recently I had a tour of one of the major semiconductor foundries on the West Coast. Yeah. And they themselves run a very intensive audit process across every level and every element of what they do and what they make. Right. Because they're embedding the core tool sets for the hackers to hack, so to speak. Right. And the Department of Defense, of course, with battleships and other elements, has a very large auditing of security capabilities. So some of our enterprises are so critical, such as the energy industry, for example, or the transportation industry, that I would think it would make sense at the national level, at the federal level, to put something in play that encourages, optimizes, and starts the journey towards creating a standard process. The NIST framework is a good process to start with, but it's a long way from a framework to an actual practice, as you know. Yeah, it's an interesting idea. It's an interesting idea. Thanks for raising that. And if I could just touch on, while we get the mic to the other questioner, who mentioned that we're the only ones in town focusing on this: just to make sure everybody realizes, there are obviously some other federal agencies that focus on sector-specific issues. So, for instance, HHS under HITECH deals with medical information along with us.
And the financial regulators will deal, under GLB, with various issues involving the financial sector. And don't forget the state attorneys general, you know, the 50-plus law enforcers at the state level. I come from that world, and they're very interested in security issues. The general laws on the books right now dealing with data security are the state laws that require notification of breaches. So there are some other players in addition to us, but we have definitely taken the lead in terms of the broader aspects of the economy. Hi. Hi. My name is Hans Holmer from Intelligent Decisions. My question really addresses whether the FTC scales. So, there are... Do they scale? Yes. There are breaches in the magnitude of thousands. Right. You can investigate hundreds. You can file suit in tens. And if you look at the data breach report from Verizon, I think it was in 2011, they wrote that 90-some-odd percent of hacks could easily have been prevented. Right. They go on for months. Usually the target is the last person to discover it. Wouldn't it be better to implement a market mechanism that says, if you lose data, by definition you are not doing what you should be doing, and you owe every consumer 350 bucks or something like that? I think you raise a really important point, and this I think is also a point that you were trying to raise. I am a big believer not only in governmental enforcement, and I think the way you framed it, do we scale, is an interesting one. We do the best we can with the resources we have, but we don't have infinite resources. And I think we also do a really good job at picking cases that send important messages to various segments of the ecosystem, cases raising important issues where we want to get the word out.
But there is a very important role for private industry and private players as well, to set up appropriate mechanisms and creative ideas. You know, it's time for chip and PIN, right? I mean, let's just get it done. It's not going to stop all breaches. It's not even going to stop all theft of credit card numbers, but it will do an awful lot to ratchet that down tremendously. And I know that there is discussion about getting that underway, whether it ends up being just chip without the PIN. I just think, in this day and age, the United States ought to be at the same level as our counterparts in Europe and elsewhere; we really ought to be a first-world country when it comes to credit card information. So there are clearly things that private industry can do to help with this process. Frankly, I think Visa and MasterCard play a critical role as well. When I was able to do frontline cases, which I can't do anymore because my job just doesn't give me the time, as much as I would love to because all these cases are really great, Visa and MasterCard played a really critical role in providing information to all segments of the law enforcement community. So they're another private actor that helps and can help with this issue. So, really good question. I think there is absolutely a role for industry to do a lot more. And let's start with chip and PIN today.