All right, let's get started. Good afternoon, everybody. I'm Kevin Bankston. I'm the director of the Open Technology Institute, which is the tech policy wing here at New America, dedicated to ensuring that all communities have access to an internet that is both open and secure, a mission that is particularly apt today. I'm here to offer some opening remarks, brief-ish, and welcome you to our special half-day event, How Encryption Saves Lives and Fuels the Economy, just the latest episode of that seemingly never-ending dramatic series we call the Crypto Debate. Many of you in this room or watching online may have seen, or even played a starring role in, the first season of this show back in the 90s, when we thought that a coalition of security experts, privacy advocates, and tech industry representatives had definitively fought back against the idea that the availability of strong encryption products should be restricted, or even that such products should be redesigned to include backdoors for government surveillance. By the end of that first season in 1999, this ragtag band of series protagonists had successfully argued that strictly limiting the export of strong encryption, or requiring backdoors, would not only fail to stop many bad guys from getting their hands on strong encryption, but would instead mostly just hurt the security of ordinary people and businesses while also hobbling the American tech economy. And we thought this series was over, like one of those single-season BBC shows. But then a few testing-the-waters standalone episodes came out at the beginning of this decade, as the FBI started warning that it was going dark because of the explosion of new digital technologies, including encryption, despite also having gained massive new sources of evidence thanks to those same digital technologies. 
However, season two of the crypto debate didn't really begin in earnest until 2014, when Apple perfected the full disk encryption on its iPhone such that no meaningful data could be read from the phone without the user's passcode. This kicked off nearly half a decade of new debate, mostly an uninspired retread of season one, to be honest, about encrypted Apple and Android smartphones and encrypted messaging apps like WhatsApp, iMessage, and Signal, and whether it was prudent or even possible to build some sort of secure exceptional access into such encrypted products for government investigators. The vast majority of security experts say that any such backdoor would pose profound risks to everyone's cybersecurity. The FBI says, y'all are smart, figure it out, or we'll convince Congress to make you figure it out. The cycle continues, and frankly the show has started to get a bit repetitive. That said, the past few episodes have launched some new storylines, unexpectedly calling into question exactly how large a problem encryption really is for law enforcement. First, in March, there was a Justice Department Inspector General report that concluded that the FBI had failed to adequately explore whether it could unlock the San Bernardino shooter's iPhone by other means before going to court and claiming that the only option was to force Apple to code a new way into the phone, a court fight it later backed out of when it finally did find a vendor with an unlocking tool that worked. Since then, we've seen a lot of stories about affordable tools for unlocking iPhones like GrayKey, which the FBI could buy for $30,000 and use to unlock an unlimited number of iPhones by exploiting existing vulnerabilities, which seems more sensible than legislatively demanding the intentional introduction of new vulnerabilities into everybody's cryptosystems. 
But then in May, the FBI had to admit that it had vastly inflated the number of locked phones it told Congress it couldn't open in 2017: it had been saying there were nearly 8,000 when, due to an accounting error, the number was actually closer to 1,200. Not their finest moment in the series. And finally, this summer there was a great report from the Center for Strategic and International Studies concluding, based on extensive interviews with law enforcement, that the biggest technical hurdle for investigations isn't the unavailability of unencrypted data, but a lack of technical knowledge and capacity to leverage all the data that is available to law enforcement, the low-hanging fruit, which is why the report is called Low-Hanging Fruit. Addressing that gap would seem a rather more sensible priority than demanding backdoors. So there is some reason to question the seriousness of the FBI's claims of going dark, but that's not what we're here to debate today. Even if it is just a few cases that are truly hindered by encryption, I will concede that sometimes it may be a matter of life or death. But it's also critically important to remember that having access to encryption can also be a life or death issue for the people using it, the vast majority of whom are not criminals or terrorists, but just ordinary people, or extraordinary people: dissidents fighting for human rights, journalists fighting for the truth, survivors of domestic abuse, small entrepreneurs and established industry giants trying to protect their trade secrets against espionage, and of course, as the past couple of years have reminded us, political candidates and political institutions trying to protect their digital assets from foreign threats. That's what we wanna focus on today, rather than repeating the same old technical arguments about the feasibility or infeasibility of secure encryption backdoors. 
We wanna focus on the people that encryption protects and who will be put at risk if we weaken it. So on our first panel, we'll hear the perspective of many of those ordinary and extraordinary people whose lives and liberties encryption helps to secure. On our second panel, we'll hear about how the crypto fight has gone global. This interminable crypto debate series in the US has now been adapted and exported to other foreign markets like the UK and France, with the Australian government in particular as a worrisomely enthusiastic audience. And on our third panel, we'll hear from industry representatives about how critical encryption is to protecting our privacy, our cybersecurity, and our competitive edge in the 21st century. Between panels, you'll hear a keynote from Robert Anderson, a former senior FBI official who worked on the FBI's San Bernardino case and is now speaking out about how strong encryption is critical to our economic and national security. He joins a long roster of former senior law enforcement and intelligence officials who've done the same, including former DNI, CIA, and DHS chiefs, and some international law enforcement and intelligence voices as well, and we've provided a convenient handout collecting a bunch of their quotes. Finally, we will close with a keynote conversation with Congressman Jim Himes of Connecticut, who is the current ranking member, and poised to soon become the chairman, of the NSA and Cybersecurity Subcommittee of the House Permanent Select Committee on Intelligence, and who will share his own views on the encryption debate. Hopefully, with thoughtful policymakers like the Congressman playing key roles in this story, we might even bring the seemingly never-ending series to its finale in a way that leaves all of us a little bit safer, at least until it's renewed for season three. 
So with that, I'd like to invite onto the stage the speakers for our first panel, The Faces Behind the Algorithms: the real people encryption protects. And please don't forget, you can follow and tweet the conversation online using the hashtag #CryptoSaves. Thank you very much. 

Well, hi everyone. My name is Neema Singh Guliani and I'm a senior legislative counsel with the ACLU, and I have the extreme pleasure of moderating today's panel. The issue of encryption is one that continues to be heavily debated. Some people argue that it's essential to free speech in a democratic society; others are more concerned that it poses an insurmountable threat to law enforcement. We're very lucky to have our panel today: they have experience studying and working directly with communities that are often targeted by surveillance, whether it be by abusive partners, by law enforcement, by repressive governments, or by others that may not always have the best intent. My hope is that our panel today will focus on these very real people who are impacted by surveillance and who will be the ones directly affected by any changing policies on encryption. I'll introduce them briefly. They all have extremely long and impressive bios that I hope you'll go online and read, but in the interest of time, I'll just give you their names and we'll dive straight into the discussion. We have Matt Mitchell, the director of digital safety and privacy at Tactical Technology Collective and the founder of Crypto Harlem. We have Assia Boundaoui, a journalist and the director and producer of The Feeling of Being Watched; Cynthia Wong, senior researcher at Human Rights Watch; and Cindy Southworth, the executive vice president and founder of the Safety Net Technology Project at the National Network to End Domestic Violence. So I hope you'll all join me in welcoming them today. 
I thought, to open the panel, I really wanted to ground this discussion in how surveillance works on the ground and how encryption operates in some of the communities that are most affected. So I wanted to throw out a question to all panelists and ask them to provide one concrete example of a case where encryption has either been critical, or where the failure to secure information has impacted someone they've worked with, studied, or read about, or a community they've interacted with. So I guess maybe we can go in order, or you guys can jump in. If you don't want to go first, it's okay. Let's do it, let's go. All right. 

Yeah, I mean, there are so many examples. Surveillance is not good for marginalized people, and encryption is seen as an antidote to that, but surveillance is not meted out evenly, and marginalized people are always seen as suspicious others. An example I can think of off the top of my head: in my neighborhood in Harlem, there is a huge amount of surveillance technology designed for public safety. 
It's kind of like a beta-test Petri dish of all different types of monitoring equipment, and young folks often get caught up in that. There's the case of one person, Jelani Henry, who spent 14 months in prison waiting for a day in court that never arrived: because of having a device, a phone, that's not encrypted, with a number tied to a bunch of other people; because of social media likes on other people's posts; because of social media history; because of contacts; because of text messages. That's all it takes for you to be part of a gang conspiracy charge, which is something that happens west coast to east coast in the inner city, where doors are kicked in, synchronized, at six or seven in the morning, and hundreds of young folks are just gathered up. Often they're not aware of their rights, and because they don't have this digital safety, they find themselves on the negative side of these things. The silver lining, if there is any, in this story is that Jelani said, I'm not gonna sign anything, I didn't do anything wrong, and he just sat there. A lot of people don't have that fortitude and that awareness, and they end up doing time in jail or prison. That's a very real case I can think of. 
I mean, just a general example from the making of the film. The Feeling of Being Watched is a personal investigation into decades of FBI surveillance in the community where I grew up on the southwest side of Chicago, a predominantly Palestinian American, Muslim American community. The film investigates the decades of surveillance that happened there, and I also go on this FOIA journey to actually get the records of the surveillance from the FBI. The film was a five-year-long process, and about two or three years in, we realized that we needed to do a much better job of protecting our data, our footage, our sources, all the people giving us information, et cetera. That was because about three years into the making of the film, we got an alert from Google telling us that government hackers had accessed our Google Drives and had breached our emails. That was the moment where we really thought very deeply: we had been making a film about surveillance, not thinking so much about us being under surveillance while making a film about surveillance. We realized that we needed to think much more deeply about security and what that meant, not just for us and the filmmaking team, but also for all of the people in the community who were now a part of the project and had talked to us on the record. We were worried about a few different things: about illegal surveillance from the government, about legal surveillance from the government, and also about potential subpoenas and what that would do to our footage. If all of our footage was subpoenaed, for example, what would happen? And so we went to a lot of new lengths, and one of the most important ones was encrypting all of our hard drives, and also moving the hard drives around to different physical locations so that they wouldn't all be in one place. 
We also put a set of our hard drives with our lawyers, so there was a set with the lawyers and we had a set. But it also just shifted how we were thinking about the actual process of documenting, a process of revealing the layers behind the surveillance that had been happening in the community for such a long time. One of the things we ended up doing when this happened, after we freaked out, was a whole workshop for our team about encryption, but then we went further and did it in the community too. The Freedom of the Press Foundation came all the way out: Harlo Holmes, a brilliant person at Freedom of the Press, came to Bridgeview, to the community, and did a crypto workshop in the neighborhood to say, these Google hacks have been happening, people should be aware of them, here's how you can protect yourself and encrypt your stuff. That was a really important process where we shifted and realized that encryption could really help us, be one of the barriers. But it's actually just one: encryption is one of the tools you can use to keep yourself safe under surveillance, but there are a lot of non-technical, non-digital tools that we also use in Bridgeview. A lot of it is about relationships, the relationships you have with your neighbors, and how people warn each other about things; communicating information non-digitally and offline is also really important. 

So, I'm the internet researcher at Human Rights Watch, but I also work very closely with my colleagues who focus on LGBT rights, and they have told me that there are over 70 countries around the world that currently criminalize consensual gay sex between adults. In four of those countries, that's actually a capital offense, so you can get the death sentence for it. But even in countries where those laws don't exist, being outed as LGBT can get you arrested, you can lose your job, you can get beaten up or even killed. 
We have actually spoken to, for example, activists in Eastern Europe who've had to flee their country because of these issues. Essentially, government nationalists and trolls have decided to basically weaponize homophobia. They will post videos about prominent pro-democracy reform activists, outing them as gay on social media in order to get them arrested and harassed. And again, this has directly affected their ability to fight for human rights and fight for democracy in their countries. There are gay people in all these countries, and they are simply trying to live their lives with dignity and respect. Encryption is actually really critical to that and to their safety. In a lot of countries, using encrypted messaging apps like WhatsApp or Telegram is one of the only ways LGBT people can connect with others in the community and communicate and socialize safely. It's also really important for groups that support LGBT communities, groups that provide health information, for example, or other kinds of services; they also have to communicate with the community safely or else put them at risk. We have also talked to digital security groups doing work in places like Nigeria, Eastern Europe, and elsewhere who are specifically focused on these extremely vulnerable communities, teaching them how to use secure tools and how to use encryption to protect their data in various ways so that they can protect themselves. But more broadly, I think we've seen various kinds of protest movements in Iran, in Hong Kong, in Russia really utilizing encryption more and more, especially since WhatsApp rolled out end-to-end encryption a few years ago, to mobilize, to push for reform, and to demand transparency from their governments. 
So there are multiple levels of encryption needs when it comes to victims of gender-based violence, domestic violence, sexual assault, and stalking, but I'm gonna look at it from two different angles. One is that for 45 years, we've promised people that if you come to our doors, if you call a rape crisis line, if you come to a domestic violence shelter or hotline, no one will know that you've experienced this horrific thing. And it impacts everyone: we serve CEOs who have been sexually assaulted or abused by their partners, we serve law enforcement, we serve politicians. This issue cuts across all demographics. And we know that if word gets out that someone is seeking help, they are not likely to come forward. We know there's a significant chilling effect. So 15 or more years ago, when I was just starting the Safety Net project back around 2003 or 2004, I got a call from a state-level government employee in California, and they said, we understand you're this tech person who does domestic violence stuff, can you help us? They said that a shelter had been broken into and a computer had been stolen, and that under the laws in California, they had an obligation to notify people whose information may have been compromised on that computer. They said, do you have any suggestions on how we safely reach the victims who were accessing services, who were on that computer? And I said, no, there is no way. You can't just call and say, hi, this is Cindy calling from the domestic violence shelter, is Mary there? You can't. You might get someone killed by calling their home and asking if somebody had received services from a domestic violence or stalking program. And encryption on that hard drive 15 years ago was just not a possibility. 
Today we are huge proponents of zero-knowledge encryption, so that the only people who can actually access the raw data are the owners of the data, the victims and the advocates serving them: not any intermediaries, not any web hosting providers, and definitely not a defense attorney who's trying to pick apart a sexual assault case by accessing records of privileged, confidential, incredibly sensitive therapy notes. And then the flip side is that not only are victims in every profession, offenders are in every profession. So of course they work for Apple and Google. They work for nonprofits. They work in law enforcement. Several years ago I was at a briefing with many of my law enforcement and prosecution friends, and we work extremely closely with the justice system, and they were talking about how they desperately needed a backdoor into phones to be able to solve crimes. And I said, but how do we keep safe the victims of domestic violence and stalking whose abusive partners work for those major tech giants? And they were flummoxed. They didn't have an answer, because there isn't an easy answer. So encryption really does make a difference in so many lives. 

So one of the things that I think came out in all of your comments is this idea that people change their behavior, right? When they're worried about being surveilled, whether they have evidence of that or not, it has a powerful effect both on what they do and on them psychologically. Assia and Cindy, I'm wondering if you both could talk about this. Assia, in your film you've talked about the effect of post-9/11 surveillance on your community. And Cindy, you've touched on the potential effects for victims of violence, on how they might react simply based on the idea that they might be surveilled. 
And I'm wondering if you could both talk about how that affects people psychologically, but also in terms of what they feel comfortable saying and doing. 

Yeah, that's a great question. I think a lot of times when we talk about surveillance and the effect of surveillance, people immediately jump to privacy. But our community is not actually worried about privacy; on the hierarchy of concerns, it's one of the lowest ones. Folks in our community are worried about deportation, they're worried about incarceration, they're worried about their livelihoods and their work being affected. These are the concerns communities of color specifically have when it comes to surveillance. It's not just about your privacy being breached. And so the effects are compounded, because it's an existential crisis. And by the way, the surveillance in our community started in the 90s, about a decade before 9/11: one of the largest domestic counter-terrorism surveillance investigations ever, and it was focused on my neighborhood. In a lot of ways you can trace a direct line between the tactics the FBI used in our community and the effects they had. The use of informants, for example, created massive distrust in the community. People no longer talked to each other about what was happening; people no longer felt safe going to the mosque. I think one of the most lasting effects was a real chilling effect on organizing in our community. You walk a different way when you're being surveilled, you talk a different way, you're afraid of identifying with groups that you think might open you up to even more surveillance. So folks in our community were reluctant to sign up and join the Muslim Student Association, for example, at their universities. And that's the real thing: I think it's so much harder to talk about what you don't do. 
And that's the real perniciousness of this kind of surveillance: it prevents you from doing certain things. And it's a lot harder to mark and track what you don't actually do: that you didn't participate in this, that you were too afraid to donate to these organizations, that you didn't do all of these things you would ordinarily have done had you not felt that you were under surveillance. That's really one of the most dangerous things, and it's so hard to document what people are not doing. And then on a psychological level, I would say the paranoia that we live with in our communities because of this is really heightened, and it brings up all these questions about mental health too. That's the kind of conversation we're really interested in having in our communities now: from a trauma perspective, what has this done? What kind of collective trauma has this surveillance created, and what does healing look like from something like this? The paranoia makes it hard for you to realize or understand what is really happening and what's not happening. A lot of that was actually my motivation to start filing these Freedom of Information Act requests, to find out really specifically what is happening and what is not happening. And we've started this movement in our communities to get people to file Freedom of Information Act requests for their own files, for their own families' files, because there's this sense that the government is an all-seeing, all-knowing eye. But the reality is they're looking at very specific things. When you look at your file, you might be surprised to see that they were only looking at your international travel, or only looking at this one thing. And from a psychological perspective, it's very important for people to know that it's not that all-seeing eye, that this is specifically what they were looking at and what they were surveilling, and that these communications were at risk. 
And it kind of turns the problem from being this massive thing that you can't really ever confront into something that you can confront. So it's also psychologically empowering for people to actually know what is in their files and what the records look like. That's been part of the process of trying to reverse those effects that have really changed our behavior in the community, collectively and individually. 

Victims of domestic violence and gender-based violence and stalking have a slightly different spin on the surveillance issue. They often wouldn't even call it surveilling; it's just part of the relationship. When it comes to stalking, it typically starts during the domestic violence, and I hate calling it a domestic violence relationship: the relationship is not violent, the abusive partner is. A controlling partner starts stalking and surveilling the victim early on in a relationship. Things might even be fairly rosy at that point. It may start with hundreds of calls or text messages just to check up on what you're doing, or let's turn on find-my-family apps and keep an eye on each other for safety. So it's all couched in fluffiness at the beginning, potentially. But as the relationship gets more dangerous and more violent, the partner becomes more and more controlling. Then it becomes: if you turn off location services, you will pay tonight. You will be harmed physically for denying me access to your location 24 hours a day. And when a victim is able to break away from that partner and is trying to rebuild his or her life, the big worry is: how do I keep my activities, who I'm talking to, my new job, my new location, my new apartment, my friends, the support group that I'm reaching out to, how do I keep that protected? They don't trust anything at that point, because every part of their life has been monitored by their abusive partner. 
And so one thing that we're very concerned about is that we don't want victims to be even more isolated. Isolation is a huge tactic of domestic violence. If we tell victims that the only way to be absolutely sure that your partner doesn't know what you're doing is to have no digital footprint, that's just not possible, and it's also incredibly isolating. It would mean a victim not using social media to reconnect with family and friends, access resources, learn about patterns of violence, learn about empowerment. All of those incredible resources require interaction that might be surveilled and might also lead the offender to where you're at. And it isn't just the victim; you have to think about what other people are doing, saying, and posting. Years ago, long before we were talking encryption, we found a victim of domestic violence whose abuser found her not because of her own phone records online, but because he knew her best friend and was able to create an online account for the best friend's phone records and see the new phone number. Of course, since they talked daily, there was a clear record showing the new number. So the abusive ex was able to track down his ex-wife by hacking into the best friend's documents and records. When it comes down to it, we need to make sure that there's strong encryption everywhere so that a friend of yours doesn't inadvertently compromise something without even knowing. And none of us should be required to walk around life saying, hi, my name's Cindy, and six months ago I was battered. We should have the privacy to choose to disclose when and where and to whom. If we're relying on you telling everybody you meet that you've been victimized as a way to increase your safety, that's putting an additional burden and a lot of shame and stigma on people. 
So it has been striking me, as we've been having this conversation and as I've had other conversations, that people see this issue very differently. And I think part of the reason is that surveillance and encryption don't affect all communities equally, right? It's often, you know, whether it's victims of violence, or communities that have historically been targeted by police or law enforcement, or human rights defenders around the world. Matt and Cindy in particular, I'm wondering if you can talk a little bit about what it means for these vulnerable communities to have this type of technology. What is the effect in terms of what they're able to do, the sense of empowerment, the ability to combat discrimination or human rights abuses, et cetera? 

Yeah, I think black and brown folks living anywhere in the United States are often seen as suspicious others, and you're not given the benefit of the doubt. No one's gonna ask you, can you be trusted, what's going on here, when they can surveil you and do other things to extract that information. And this is a relationship that black folks have had in the United States since we got here, right? In more recent times, in the 1960s and 70s, the surveillance and assassination of so many civil rights leaders, with the Church Committee and COINTELPRO, left a scar on the community. A lot of the trust that we'll be talking about with intimate partner violence, it's the same thing: that dynamic is gone. And with Crypto Harlem, going into the community, going through the inner city, speaking to people in Miami, in Harlem, in Detroit, in LA, and saying: it gives you a part of yourself back, knowing that you have the choice now. So I talk about these two outcomes that we all want, where everyone treats each other really well, crime doesn't exist or is at an all-time low, like we've got it all figured out and everyone's on their best behavior. 
And history teaches us there are two ways to get there. One is because you're being monitored and surveilled, and everything you do and say is being seen and recorded, and you have a credit score that's gonna affect you everywhere you go. The other is because you have the ability to close the shades, you have the ability to lock the door, but you open it up. You say, come on in, neighbor, let's sit down and talk. Encryption gives you that, because without encryption you don't have a door, you don't have shades, you don't have a window: you're completely exposed. And I think that in communities of color, when I speak at Crypto Harlem, I never get the I-have-nothing-to-hide thing. I get, thank you for being here, I can't wait to download that, what do I do next now? And it's really reaffirming. It's something that really supports you, because with marginalized people, your identity is what's turned against you, and being reconnected with your true self is very powerful. I'm lucky to witness that. 

I wanna, go ahead. Was that for me? Yeah. So we, Human Rights Watch, work in 90 countries around the world, and we have 60 offices, and increasingly our country researchers are actually from the countries they work in, and they live and work there full time. In each of these 90 countries, we rely heavily on a network of local NGOs and local journalists, and on the victims and witnesses we talk to, to document horrific human rights violations. But these are also people who are extremely at risk, as are our own staff in country, right? In some countries, if they are caught talking to us, that in itself could endanger them, could get them arrested or worse. And so encryption is really key to our work. Basically, if we can't protect the identity of our sources and protect the security of our communications and our data, we won't do the work, because we take that really seriously, and that's a huge responsibility. 
And for people who are in the country, it's not even an option, right? They need to do that work, because they need to make sure that they stop the human rights violations. And encryption actually helps us make sure that we can protect the sources from retaliation, protect our work, and basically allow human rights violations to be held to account. And it's one of the reasons why, in response to the Apple San Bernardino case a couple of years ago, the top UN human rights official actually issued a statement that said it is not an exaggeration to say that without encryption, lives would be endangered.

I take issue not with your question but with a common misunderstanding that marginalized communities are a small percentage of the population. When you look at who has privilege and who is targeted in some form or another: just one in three women worldwide will be physically assaulted by an intimate partner at some point in her lifetime. That is a staggering number, and that's globally. And then when you look at LGBT relationships, there's a similar rate of domestic violence and intimate violence and sexual violence. Then when you add in people that are discriminated against or targeted because of their faith, their religion, their race or ethnicity, it is a large chunk of the population. And we tend to have panels like this and people think, oh, that's a small subset of the general population across the globe. It's actually the majority of the population that is targeted in some way or another, whether it be by their gender, by their gender identity, or by their sexual orientation. But a lot of the privacy debate was discussing basically how upper-class, wealthy white guys could protect their porn surfing from the government.

Right, well, this panel doesn't look like that. No. We got that. It's refreshing.
And just to build on that, in certain countries we're not just talking about marginalized communities, and we're not just talking about activism. Surveillance in a place like Iran or China applies to everyone, and it plays a key role in maintaining social control, maintaining control by the ruling party. And so surveillance is meant to say: we know what you're doing, don't step out of line. It's not even activism; it's just regular people who might object to a government policy and don't feel like they can express that because of surveillance. And so I think encryption allows those freer conversations, especially in closed societies, among a much broader base of the society.

And I'm glad you asked that. One of the things that has been interesting to me is that historically the US government, the State Department in particular, has invested in encryption technology. And they've invested in it as a way of allowing citizens, human rights activists, and others to participate and communicate in areas where sometimes there may be concerns if you express yourself freely. And this has been seen as a tool to help develop democracy, to fight oppression, to fight potential dictators who would tamp down on free speech. And Cynthia, I wanted to know if you could maybe talk about this: if the US changes its encryption policy, and we're starting to hear that, right? We're starting to hear the FBI say, look, we need some type of backdoor. There are some members of Congress who have said similar things. What does that mean for the US globally? Does that mean that we're sort of stepping back from this commitment to human rights? Does that mean, even if we don't say that, that is what the ultimate effect will be for all these individuals around the world whose fights we say we wanna support?
Yeah, to go way back on it, I don't know the exact numbers here, but essentially in the last 10 years, Congress has appropriated, and the State Department and the Broadcasting Board of Governors, the BBG, have spent, I think, hundreds of millions of dollars to support internet freedom globally, right? And so this is through a wide range of things, but it also includes core support for the development and dissemination of secure tools, right? Secure messaging tools, ways to secure data, especially for activists who might be at particular risk, and with a particular focus on closed countries as well. And so I wouldn't say it's a change, the fact that the FBI is asking for these new powers, because they've been asking for quite some time. I think it's definitely a case of the left hand not knowing what the right hand is doing, right? So on the one hand, the US government has always been strongly committed to human rights and to internet freedom. It's been a huge priority outside of the US. And meanwhile, the FBI and maybe other law enforcement agencies are asking for things that actually undermine internet freedom inside of the US, right? And there's a huge disconnect there. And I would say there have been a couple of pieces out there that suggest, you know, maybe the internet freedom agenda in the US is dead, right? Like the State Department isn't prioritizing it. But I would also note that Congress actually increased funding for these issues in 2018. So I think it's still very much a core part of the US's values when it comes to foreign policy. And I think one of the most important things that needs to be brought into the domestic debate, and that local and federal law enforcement need to understand, is that what we ask for here is gonna have enormous implications outside of the US, right? Whatever we ask for here applies to global companies with global technologies that are being used everywhere around the world.
And so if you undermine encryption in Apple phones or in WhatsApp, every single other government is gonna want to demand the same. And once the companies have acquiesced to that and re-engineered their systems, there's not that much they can do to push back against those other requests. And so these domestic debates have enormous global consequences, and they're consequences for something that the US has always supported, which is human rights.

I think that brings us to one of the things that's been put forward: that having some type of encryption backdoor, having a way to access secure communication, is necessary. It's necessary for national security. It's necessary for general law enforcement. And I'm wondering if you all can provide your perspective on that. Matt, maybe from a numbers standpoint, we've seen some of these reports that talk about X number of crimes that would go unsolved. And Cindy, maybe from your perspective as well, because I'm sure you work with individuals who both desire secure communications but also might find themselves in a situation where they might seek to prosecute somebody who's committed a crime against them. And if you could both maybe talk about how you respond to people who say, look, we need this; if we don't have this, how will we be able to stop crime or protect national security?

Sure. There's a lot of different ways I can answer this, but I'll just say, on the street level, the local person level: in New York, the New York District Attorney, if you go to ManhattanDA.org, there's a whole section of reports on encryption and smartphones from 2015, 2016, 2017. And there was even a New York Times op-ed piece in like 2015, and it talks about an arms race.
Let's get militaristic: it's an arms race between the really naive but well-meaning privacy people, the digital rights folks, and then the brass-tacks, living-in-the-real-world, we-need-a-back-door-to-your-stuff folks. And it calls out Google and Apple. It kind of creates this line in the sand, and it's really silly, honestly, and I think it's a distraction. Encryption is just math. You can't break math, and to say we're gonna stop using certain math formulas just doesn't make sense. It's like banning a sentence, or verbs, or something. And in reality, yes, there's a frustration there. In the reports, you'll see an increasing number of smartphones that they couldn't get into. But that's like saying there's this device called a safe that we've heard some people have, and now there are combination safes, and it makes it very difficult to just walk in and take the papers away. Law enforcement is always gonna have many tools at their disposal to do these jobs. One of the great things that you get from default encryption and kill switches, the kill-switch-based technologies, is this: everyone has a smartphone, and smartphone crime used to be huge, like 3.1 million smartphones were getting stolen in like 2013 or 2015, something like that. Consumer Reports does a great job tracking this stuff. As you come down to modern times, that's not even a thing anymore. The street value of a smartphone is next to nothing because they're locked. If you see an iPhone and you take it from someone, there's no use for it. There's not a lot of domestic, regular, local resale value to it, and that comes from the strength of the encryption of these devices, which makes every individual who holds one of these things safer. I remember when you'd read stories about someone on public transit getting hit in the face so someone could take their device.
That was because you could reuse it, and now it's very personalized, and that only comes from the safeguard of knowing there's not a back door. As a hacker, I can tell you that we're gonna know the back door before you do, and there are plenty of examples of law enforcement, government agencies, et cetera, getting hacked to this day, whether by nation states or third parties, et cetera. So if this golden key exists — there's a really great thing you can read about keeping the key under the doormat, right, and what a really bad idea that is. So at the end of the day, any kind of default encryption makes us all safer on many levels that we're not even thinking about. And a persistent adversary is gonna find the door, know that it's there, and then hunt for the key before you could even figure out the rules around who has access to your golden key. So that's my perspective on it.

I find the discussion around solving crime to be a bit complicated, in part because we really struggle to get law enforcement to prioritize gender-based violence. And we actually got Department of Justice funding and we're creating an app where victims can log their own digital evidence. So every time there's a death threat against the victim, via text message, via social media, they can immediately preserve that evidence and have it forensically sound and not alterable, to be able to get it into court. Because we find so many in law enforcement say, I can't deal with that, it takes too much time, I don't have time to go get the evidence, I don't have time to review the evidence. And so unless they have a defendant who's willing to basically confess, they're not gonna take a case. And so we struggle to get law enforcement to take sexual assault and domestic violence and stalking cases. So for them then to say, well, we need this phone evidence or we can't solve anything — I have a moment of, hmm, well, that's interesting.
And I do understand: they're understaffed, they're overworked, they have way too many calls coming in. And when that happens, the messy cases, which are typically gender-based violence, are complicated, and you don't have the energy for it. It's much easier to solve a different type of crime that has fewer layers of complication. So there's that piece. But then also, I really believe that victims should have agency over what evidence is collected and how it's used. Take body-worn cameras, which are vital for so many communities: at the same time, what if we start using what a victim utters at the scene of a crime against her when she decides she doesn't wanna testify, because one night in jail and three months' probation is not gonna mean she's safe? And then everybody's mad at her for recanting, when in reality she just wants to stay alive and somehow pay her rent or mortgage. But if we then say, look, we've got this great evidence, we're gonna use it essentially against you, against your will, against your wishes, to get that conviction — to what end? Because you're not gonna have a house when it's over and you're not gonna have groceries when it's over, but we got our conviction today. So I'm really a proponent of encryption because it allows victims to control who has access to the evidence and when, and then the victim can choose to hand over the evidence or say, you know what? Three months' probation is not gonna keep me alive, so I'm not gonna participate in this prosecution.

So it's interesting, I think, in hearing some of what you both have said: a lot of people say, well, look, law enforcement is gonna be the only one who has access to this information. I'm putting aside the technical questions about whether that is even possible, whether that's true; that's a debate for maybe another day or another panel. But I'm wondering if you all can talk about why isn't that comforting to some people?
When I hear that, I'm like, I don't know if that totally relaxes me when I think about not being able to secure my phone or my email or my text messages. I'm just wondering, based on your experiences, what has communities' reaction to that been? Is that comforting? Do they say, oh, as long as it's only the police or the FBI or the government who can get this, that's okay, and that sort of allays the fears that might cause me to change how I talk or change how I act?

Well, no, actually, that's who we're afraid of the most. I mean, folks in the community are terrified. There's a relationship with not just the FBI but local police also, who are involved in a lot of these investigations, and so that relationship of distrust carries over. So certainly people don't feel safe; thinking just the FBI or the police will have access to the back end makes people even more afraid. And that's because of the historical relationship these agencies have had with our communities. I think the ability to encrypt, beyond being practically useful, is very psychologically useful also, because, and I think this is what I hear everybody saying, it gives folks a sense of agency. And so when we did this crypto workshop in the neighborhood, even more than the people practically learning it, there's really this sense of hopelessness. Even among mainstream America, when it comes to surveillance, people really think there's nothing they can do to stop it, that you are helpless, you are very small, the government is very large, and you just have no control in this situation, there's nothing you can do about it. And learning encryption is just the first step of people feeling like there actually are things that you can do.
You do have some tools at your hands, some resources that can make you safer, and that alone changes people's relationship with that surveillance — with being the one who is hyper-visible, who has all the information collected about them, while the government is totally invisible and is the watcher. This one-way gaze is sort of disrupted with encryption, because suddenly, like you said, you have a door that you can choose to close, there are windows. And so that relationship of being hyper-visible while they're invisible has changed a little bit, and that does something, I think, in terms of empowerment. So I think it's a very useful tool. But also, in our community, which is predominantly immigrants, it was a lot of teaching folks the language first. We had people at a really, really basic, like, 101 intro to this, but even that was useful, because people were empowered in this wonderful way.

Just to build on that, in many countries where we work, the police are part of the problem, right? The police are the main abusers. Just as there are victims everywhere, there are abusers everywhere, including in the police, in government, in the agencies that would be involved in approving these kinds of things. And this might sound obvious, but in a lot of these countries as well, courts are not actually independent; there are no real legal safeguards. The kinds of things that we can rely on here in the US — like a warrant requirement, an independent judiciary who can really scrutinize that warrant requirement, the ability to appeal, notice if you've been under surveillance — those things just don't exist in other countries. And so again, this is a global problem, and these are global platforms. Whatever precedent we set here will have effects elsewhere, in a very different political and legal context.
Yeah, we know abusers work in every field: they're in non-profits, they're activists, they're in law enforcement, they're in the military. And given that, I wanna make sure that every victim of domestic violence and stalking is equally safe, and not just those who are fortunate enough not to have a coder as an abusive partner. And so, sorry, breathe in.

Oh yeah, no, I was just gonna say, in my community, people have a ridiculously high incidence of contact with law enforcement, and it's not positive. So you'll find that you're worried that these are the people who are coming to get my data and they're gonna take my data away, but they're also the people who might take your friend's life away. There's a large amount of extrajudicial killings at the hands of law enforcement, police-involved shootings, where there is no trial, there's no accountability, and it's mostly in black communities where this is happening, right? So yeah, I'll let you in.

So I think another thing I certainly am very conscious of is how the current political dynamic, whether it's in the US or globally, can really affect people's perspective and how they would react to, let's say, a change in policy that made it more difficult for them to secure their communications. And I'm wondering if you all can speak to, post-9/11 for example, what did it mean for communities to be able to get the tools to sort of close the door, if you will, right? And what does it mean for communities now, in the US and around the world? What would a change in policy represent to them, and how might it affect the day-to-day as people increasingly engage in activism but also are increasingly on the receiving end of oppressive tactics?

So from the perspective of Bridgeview and my community, the way we see encryption is that it's one of the tools in a toolbox that we can use, reacting to surveillance that comes in many ways.
So in our community, physical surveillance is a real problem. We have cars parked on the block. Sometimes we have FBI agents knocking on doors after big events, national events that happen. And people name their Wi-Fi networks things like "FBI Surveillance Van 2" and "The NSA Is Watching This Network" — like half of the names of the networks are references to the fact that it is not a secure environment, right? We also have stingrays, a lot of stingrays: basically, these are fake phone towers that law enforcement uses, so when your phone pings one the way it would ping a real phone tower, it sends it certain information. We have dozens of unknown towers in the neighborhood, physical towers that are not registered to anyone that we know of. So there are all of these ways we deal with surveillance and confront it. And so encryption is one of those very important tools. People are very aware, just from the names of the networks, that their digital communications are most likely being watched. And so everyone uses WhatsApp knowing that it is encrypted. People use Signal knowing that's encrypted. And people are really aware of that. So we would be losing a really important tool in our toolbox, I think, if that's gone. And it would also make us feel like there are not that many tools left in that toolbox that people are really aware of — the average mom on the block knows how to use WhatsApp and knows that it's encrypted, right? So this is an accessible tool that would be lost, and I think that would be really dangerous. And there's definitely a relationship between the physical surveillance and the digital surveillance, and there are a lot of unforeseen consequences for when it's gone, because those two relate to each other a lot.

We're in the middle of actually trying to promote this, not only within the US but with our international sister Safety Net projects worldwide.
We really want people to be using encrypted communication vehicles if they're going to be talking to victims. More and more, victims don't want to reach out by phone and make a phone call; they want to text or they want to chat. However, we want to make sure that there aren't some skilled techies, whether they're 13 or 30, getting all that content. And the reality is, gender-based violence sounds really interesting. It's complicated. It looks like a reality TV show. So if somebody were to intercept a chat message with someone who's going through a really horrific time, that would be really interesting for somebody who's warped enough to post it on the internet. And we want to make sure that if victims want help, there is no wrong door. They can reach out by text. They can reach out by chat. They can reach out by phone. They can reach out in person. But if they use any of the electronic mediums, we want them using zero-knowledge encryption. And there's a platform out there that's doing that, and it's very affordable: it's Secure Connect. And we're saying, here's an option where, for $15 a month, the local rape crisis center can do secure zero-knowledge-encrypted text and chat with victims. And so we want to encourage that. So here we are on the nonprofit side, trying to get our entire community more focused on encryption, while there's this larger debate, really outside of our wheelhouse, talking about encryption from a whole different perspective.

I guess one theme that keeps coming up is the idea of trust here. And I think for activists, you have to trust that you are talking to the person you think you're talking to. And you have to trust that the technology that you're relying on to literally keep your life safe is actually going to be secure, right? And I think if the US government, or the UK, or Australia, which I think we'll hear about a little later, were to really push for backdoors in these tools —
Again, I think activists would lose that sense of trust. They wouldn't know whether or not their government had subverted the tools that they rely on every day, or whether some distant government out there, right, in the US or the UK, had subverted a chat tool that they rely on in their country. And that has an enormous impact on people's ability to organize, to trust their networks and the like.

Any final words before we go to audience questions? You were the only one who didn't get your last word. You know what, I love questions. Let's throw it back to the audience. Yes?

Okay, yeah. What's up, all? I am Balakrishnan Dasher, a professor of cybersecurity and privacy and information assurance at University of Maryland University College. So I generally buy many of these arguments for, you know, strong encryption for a variety of purposes. But let me say, I'm of two minds, and in fact, the title of this panel is about the real people encryption protects, so I'm gonna tell you a couple of instances where encryption actually failed, okay? Here is the first: the sister of my colleague recently died of an opioid overdose. For a while, about six to nine months, she got away from it. She had a big heart surgery. She was in her late 30s, and finally, somebody gave her the dose of opioids and she died. I think everybody in the family wished they knew who gave it to her, who contacted her. Everyone wishes they could only find out, you know, from her phone what messages were exchanged, presumably at least beforehand, okay? So I think that's the issue. The question is, should everybody get this kind of protection? If people have already been convicted before, should they even have a right to use strong encryption? So that's the one thing. The other one is WhatsApp, okay? WhatsApp is used very heavily in many countries.
And often WhatsApp is used in countries like India for a lot of false messages, propaganda, and so on and so forth. In fact, there was a case where a person was wrongly accused and then killed by a mob, okay? Because they said he killed a cow, that he stole a cow. In fact, he didn't steal a cow. Anyway. So the issue is, you know, where do you draw the line? Who should be given this? I think we could have multiple criteria. Everybody gets strong encryption, but some people, especially if they've been convicted once before, or even suspected, essentially shouldn't get strong encryption — like, WhatsApp should not automatically provide strong encryption to everybody. I'm not saying there should be a back door, because there's no back door technically, but I think we have to fix some of this. I mean, there are real people really hurt, okay? Especially, see, in the case of Facebook: Facebook are doing something about fake messages, because they can. With WhatsApp, there's no way.

It sounds like the question you're asking is, where do we draw the line, right? And should this technology be available to, you know, a bad actor, so to speak? And I'll let — yeah.

First of all, I want to say condolences to you and your colleague; that's very sad. Every person here knows of these and many more stories, which we're going to learn about, where people are negatively affected by encryption, or negatively affected by the social circumstances. I mean, as a hacker, I will tell you that there are people who can use encryption on the dark web to do horrible things. Right. And that's not a social thing; it's encryption that keeps them hidden, right? But people can use cars to rob banks, and people can use cars to take their families on vacations.
You know, when I teach at Crypto Harlem, some people are there like, look, how do I do this thing? And I'm teaching you literacy. If you want to read medicine and then go forth and find a cure for a disease, that's wonderful. And if you want to read comic books, or how to rob a bank, that's on you too. But encryption doesn't have anything to do with that. There are always going to be bad people who do bad things to others, and they're going to use many things, from cars to homes — homes have locks, and we don't ban locks. Laws protect all of us, but some people take advantage of those laws. But that's the beautiful thing of having rights in balance, right? It's this assumption of protecting the many. It would be naive to say that there will never be people harmed by this. I see it in my work, when I go and travel places and assist people, where encryption causes people to go to jail, or they're accused of all this criminality because they use encryption, et cetera. There are negative consequences, but I think the positives far outweigh them. And we just have to remember to hold those stories, know those names, and stay focused on those realities, because they're true.

As a social worker who's been dealing with the ills of society for 28 years, I try to keep the onus for a negative thing that happens on the person that actually pulls the trigger or kills the person. The technology was a tool that might stymie holding someone accountable. But if a victim is killed by her ex-husband, encryption did not kill her. Her ex-husband killed her, and the encryption may or may not make the investigation easier or harder. In my world, there's never just one way that an abuser or stalker is perpetrating their crime.
They're literally doing everything: there's spyware, there's physical surveillance, there's calling friends, there's mail. And 25 years ago, if we were solving this same crime of an overdose, there would be no phone evidence. It would entirely be going out, talking to people on the street, interviewing neighbors, talking to community people. I think the challenge is when we have law enforcement who think, that's a lot of time, it's a lot of work, and if we could just get this evidence, we could shortcut a lot of time and get to an answer sooner. So it's not at all to say that the encryption isn't related or isn't relevant. It's more that I try to hold that if somebody's dealing drugs, I want you to burn in hell in your own brain for that, not say, oh, the encryption is at fault, not me.

So go ahead, and then we'll go to the question in the back. Maybe the mic wants to start walking that way while Cynthia answers.

Yeah, just to pick up on a lot of these issues that you've identified: we've actually been documenting some of the mob violence that's happened in India and other countries where WhatsApp is the most popular communications platform. And my takeaway is, first, that the benefits far outweigh the negatives, but also that these are much broader social problems, right? The problem of the lack of digital literacy, the problem of lack of education generally speaking, the problem of people not questioning what they're reading and then going out and committing horrific crimes. Those are much broader social problems that need much broader social solutions than backdoors to encryption. Homophobia — it's a much bigger issue than encryption.

Moderator, for about eight years, I did a lot of anti-corruption blogging as a journalist, and I found out that I was under PRISM surveillance. I'd retained a lawyer who was actually torsion's lawyer, and because of that, I got thrown into this bizarre surveillance.
And when I discovered it, after I watched the Snowden movie, I felt raped. I felt really paranoid. I ended up taking medical marijuana for the stress, because I kept feeling like people knew every aspect of my life. Now, I got most of my sources through dead drops, but people kept thinking they could email me things. Somebody emailed me the names of all the rendition pilots, the pilots for the CIA rendition flights. And I never wanted that information. I never asked for that information. And I think that there need to be tools set up so you can't get stuff that you don't want. And then, on the issue with surveillance today: somebody recently put a spy cam in my home. It looked like a little outlet plug, a night-light USB thing. I mean, there are so many ways that people can be put under surveillance. And I don't worry about my government doing anything to harm me. Probably because I was under PRISM surveillance, they know I'm not doing anything wrong. But at the same time, as a human being, I feel like I've not been respected. And I'm grateful for the death threat information you mentioned, because I sent the Justice Department a lot of death threats, and they never did anything about them, even though they had the phone numbers, they had all the information. It was just that I was a little fish, and it wasn't that big of a deal to them. So I'm grateful for that software. But could you talk about privacy and surveillance today, when you can buy a spy cam at Costco and spy on your neighbors? This is an issue that needs to be addressed, because I just saw a Black Friday thing where you can get eight cameras to spy on people for like $99. So thank you.
Do one of you want to address that reality, that tools to surveil people are more widely available than perhaps they've been in the past, and what that means in terms of the policy around encryption and other tools to protect against that? Well, I think it touches on a lot of things. There's an economy and a business for spyware, which you talked about, and which is involved with a lot of intimate partner violence. It's not just commercial entities like Facebook. It's not just three-letter agencies. It could be your neighbor. It could be a person we see. There was a women's rights protest in South Korea against these small surveillance cameras. And, you know, as technology gets smaller and as these tools become readily available to any of us, I think, again, you can't solve technological problems with social answers, and you can't solve social problems with technical answers. And this just seems like more of a societal or social issue that we need to talk about more openly. So thanks for sharing your story. I would also add that I wish we had better language for these types of things because, you know, people always say, oh, we're surveilling ourselves on Facebook. And I think that's really the wrong language to use, even with consumer spyware, because surveillance is about control, and in both domestic violence situations and when we're talking about government surveillance, it's about control and what it does to groups of people. In my opinion, it's not really actually about crime solving.
It's about exerting social control on people using these classic mechanisms, and interpersonally, actually, those very violent dynamics. I think there are a lot of metaphors and analogies between the abusive relationship interpersonally and the abusive relationship between government and these communities, which I actually see in that framework. But this is not necessarily the same thing as you oversharing on social media, or even your neighbor maybe spying on you for, I don't know, real estate reasons. These are different things, and it's important to actually have different words for this kind of surveillance, whether it is spying or whatever it is, because it affects people differently depending on that intention. Hi, my name's Emma Coleman. I work in Public Interest Technology here at New America. And Asya, I saw your film yesterday. It was very beautiful, but I think one of the most striking moments is when the FBI stages that fake robbery to get into people's houses without warrants. How do you talk about these great lengths that the government has gone to to spy on your community, outside of your community, without being labeled paranoid or a conspiracy theorist? I mean, I say I'm paranoid like that's normal, because this is an effect. There's a cause and effect relationship between the investigations and the tactics the FBI used in my community for a long time and what we live with now. But I also say you can be paranoid and truthful; you can be paranoid and right at the same time. And actually, again, we don't have a word in English for justified paranoia. I'm sure there's a word in German for this, but there's not a word in English for being actually rationally sane, very sane, to be worried about this kind of stuff and to have it in the back of your mind. And so that's why I talk about legal and illegal surveillance. We think really deeply about both of those things in our community.
So legal surveillance being when you have a warrant and you have an active investigation open, and illegal surveillance being things like staging break-ins to people's houses to take information. Also, intelligence investigations and what those look like, because they have different rules applicable to them. And then I also say that there is really this gray area between what we know and what we don't know. When things happen in our community, when law enforcement comes into our community and there's a big event, there's a large investigative activity happening, there's something happening, and they don't ever bother to actually give an explanation to the community about what is happening. The night of this event you're talking about, they said it was a bank robbery, and then the next day they denied that any banks had actually been robbed. So when you do that, that furthers this distrust in communities. When you actually don't know, in fact, you're feeding the paranoia when you do things like that, when there are large investigations and you don't offer people explanations for what's happening. So I would say that actually the paranoia is not just an effect. It's not an accidental consequence of these investigations. I would say, arguably, it's one of the tactics that the FBI has used in communities of color for a very long time, and they did this with the Black Panthers in COINTELPRO explicitly. The Freedom of Information Act records that you can read say the purpose of this surveillance is to create so much paranoia within the Black Panthers that they are discredited in the mainstream media, that they're discredited among the general population, and that was an explicit tactic that was used. So I think that creating the paranoia is an explicit tactic.
The physical surveillance in our communities, where you can see the car parked there and the person in the car knows people can see the car parked there, is meant for you to see. And this is happening, actually, with a lot, I've heard this happen with Black Lives Matter activists also, who are photographed openly when they're taking certain actions. So these things are overt. And so I would say the paranoia, it's not so simple to just say that this was a consequence or an effect of what happened; it was also an explicit tactic that the FBI used within our communities, and this is why it's even more pernicious. So we're out of time. I thank you all for coming and sharing your perspectives and your experiences, and thank you all in the audience for the great questions. I'm a senior counsel here with the Open Technology Institute at New America. I just want to thank the previous panelists for an incredible discussion about the real people that encryption protects and the circumstances that many of them live in. We think of the encryption debate oftentimes as something that happens really only in the US, because we've been having this debate since the 1990s in one form or another, but the reality is that this is a debate that in recent years has truly gone global, and so including that part of the discussion I think is really important. So I'd like to invite our next panel up for a discussion of human rights in the international encryption debate. Sharon, Scarlett, and Nathan, please come on up. All right, thank you, Robin. I'm Sharon Bradford Franklin. I'm director of surveillance and cybersecurity policy here at the Open Technology Institute at New America, and I'm pleased to have the opportunity to serve as moderator for our conversation about the international debate and human rights.
As you know from Kevin Bankston's opening remarks, two things: one, we are not here to debate whether there should be a mandate for encryption backdoors or why that may or may not be needed. What we are here to talk about is what the threats are, and what threats are posed to the rights that can be protected by strong encryption, as you've just heard on the prior panel. And as you also know from Kevin's remarks and Robin's intro, that debate is happening not just here in the US but has also gone global, and in particular, we are seeing this debate happening, and threats being posed, in the United Kingdom and in Australia, and my two colleagues here are experts on both of those countries and the situation that is going on there. First, the UK enacted its Investigatory Powers Act in 2016, and in Australia, they are right now, like literally this week, in the middle of debating proposed legislation there, the Telecommunications and Other Legislation Amendment (Assistance and Access) Bill. And both of these, both the existing legislation in the UK and the bill being debated in Australia, create grave new powers that my colleagues here will discuss. Seated here with me are Scarlett Kim, who is a legal officer at Privacy International, and Nathan White, who is, I have his title here, here we go, Senior Legislative Manager at Access Now. And we're gonna try and have more of a conversation, a little bit back and forth up here, to talk about these situations. So I'm gonna start with Scarlett and ask, Scarlett, if you could start us off by telling us a bit about the Investigatory Powers Act in the UK, particularly because, as we understand, that's been a bit of a model for what they're doing now in Australia, and just focusing on what kinds of powers it gives the UK government. Sure, so I'll briefly just explain what the act is, and then I'll hone in on the powers that present the most direct threats to encryption.
So the Investigatory Powers Act is the statutory framework that lays out the UK surveillance powers and its passage can be directly traced actually to the Snowden revelations, which revealed that the UK intelligence agencies were engaging in a wide variety of surveillance activities whose legal basis was unclear. And that led to a series of wide-ranging inquiries and one of the conclusions of those inquiries was that the current, at that time, the current framework governing the UK surveillance powers was confusing and outdated and that led to the introduction of the Investigatory Powers Act in late 2015, which was then passed in late 2016. The act covers both domestic and foreign surveillance, which means that it covers the activities of both law enforcement bodies and intelligence agencies, although the distinction in the activities that those bodies respectively carry out is much blurrier than it is in the US. So as an example, in the UK, the security and intelligence agencies like MI6 and GCHQ can conduct direct surveillance of UK persons to prevent or investigate serious crime or to protect national security. And I think that's sort of an important distinction with the US. But for the purposes of the discussion, I thought I would focus on the powers within the act that authorize the government to compel companies to facilitate surveillance because those are the powers that present the most direct threats to encryption. And there are predominantly three powers under the act that are relevant here. The first is the power to compel companies to facilitate the implementation of a specific warrant for a range of powers, including interception and hacking. The second is a power called technical capability notices and the third is a power called national security notices. And I'm gonna focus on technical capability notices or TCNs for short because those were designed explicitly in part as a mechanism to allow the government to undermine encryption. 
So the definition of technical capability notice in the act is incredibly broad and vague. It's defined as a notice, quote, necessary for securing that the operator has the capability to provide any assistance which the operator may be required to provide in relation to any relevant authorization, unquote. That probably makes no sense. The idea essentially is to compel companies to build in a permanent capability which will help them facilitate the implementation of specific warrants in the future. And the IPA, the Investigatory Powers Act, further states that a TCN can impose any applicable obligations on an operator and applicable obligations aren't specifically defined. It's essentially an open-ended list. Although the act provides a list of examples which are also incredibly broad and vague. So the first example is obligations to provide facilities or services of a specified description. And that TCNs are designed in part to undermine encryption is very clear on the face of the legislation because one of the other examples in the list that the act provides is obligations relating to the removal of electronic protection applied to any communications or data. I'm gonna stop here in describing the TCN power. I just wanted to, I thought it'd be useful to explain why we think of TCNs as such an extraordinary expansion of the government's power. And there's predominantly two reasons. I think the first is that it's sort of exemplary of a fundamental transformation in the relationship between governments and companies when it comes to facilitating surveillance. So companies have traditionally played a role that we might consider as gatekeepers of information. And traditionally governments have long relied on company assistance to access information. So in the U.S. from installing pen registers and trap and trace devices to intercepting communications or even to accessing data stored on company servers. 
But with a power like TCNs, the government is essentially transforming companies from gatekeepers of information to gatekeepers of security. And the government perspective may still be that it's seeking assistance to access information, but what they're fundamentally asking companies to do is to substantively alter their systems and services in a way that undermines security. The second really extraordinary feature of TCNs is that they're not tied to a specific operational purpose. So it's not about facilitating access to information in an individual case, but about creating a permanent capability to ensure that a company can assist in future cases. And one way to draw out that distinction is to consider the Apple v. FBI example. So Apple v. FBI occurred within the context of the San Bernardino case. But to see what a TCN would do, imagine Apple v. FBI untethered to any specific case, just an order to compel Apple to write custom software to cripple security features, with the purpose of deploying that software in later cases. And from our perspective, that fundamentally alters the traditional balancing of equities in the surveillance context, where you typically weigh the intrusion on individual rights against a specific government purpose. Here the balancing is virtually impossible to undertake, because what you have is a need to weigh potential large-scale intrusions against a nebulous, hypothetical, future set of government purposes. So here in the US, we're very familiar with the debate being about a mandate to create encryption backdoors. So the notion is that a provider should maintain or create some kind of mechanism so they could assure that if law enforcement comes knocking at the door with an order, they could guarantee access for law enforcement, or even exceptional access just for law enforcement.
Now, the powers that you're talking about with the Investigatory Powers Act in the UK aren't directly a mandate to maintain backdoors. So could you describe in what ways you think these are equivalent powers, or how they can somehow use these powers to essentially do the same thing? Well, just to be clear, the way that the technical capability notice power is written, it could be used to authorize the government to compel backdoors. But recently the UK government has spoken publicly in various forums about their desire to potentially use TCNs to basically alter user features as an alternative to compelling companies to build backdoors. And just to give a concrete example, platforms offer various user features. Messaging apps, for example, offer users the option of adding a device to an account or adding users to a group conversation. And what the UK government has recently been proposing is that they can exploit these user features themselves as an alternative to backdoors, so that the government can essentially be, for example, invisibly added as an additional device on an account or an additional user in a conversation, and in that sense you're not actually directly undermining the use of the encryption on the device. From our perspective, we would find the use of this type of power problematic for a number of reasons. The first is that I think the government's focus on backdoors is overly narrow. Threats to encryption can take many different forms, even if it's not a backdoor per se. And I want to talk just a little bit about the Apple v. FBI scenario as an example, because that's sometimes described as compelling Apple to build an encryption backdoor, but that's not entirely accurate. iPhones permit individuals to automatically encrypt their devices by setting up a passcode, and the key to decrypt the data on the device is protected by another key that's derived by entering the passcode.
And what Apple did was it devised a series of safeguards to protect against brute-force attacks on the passcode. And the FBI sought to compel Apple to write custom software that disables those safeguards, not to disable the encryption per se. And I think the idea of exploiting user features may be somewhat analogous, because it's not about undermining the encryption itself; it's about altering how a service functions in order to make the use of encryption less effective. One thing to note is that I recognize that one distinction here is that this type of capability may not necessarily be as broadly exploitable by third parties, but another reason that we consider it so dangerous is that it would broadly undermine user trust in platforms that are used very widely. So one analogy might be drawn to, for example, the use of phishing by governments. Phishing preys on our confidence in communications from trusted, well-known third parties like banks or internet service providers, and when governments phish, they're contributing to an undermining of trust in those communications. And the undermining of user trust can have really serious ramifications if you think about the way in which digital technologies are so embedded in our lives. Undermining the platforms that we rely upon for our everyday but sensitive interactions can really inject a huge sense of uncertainty into our lives and really shatter our confidence in the infrastructure that we increasingly rely upon. And then it's also possible that undermining user trust can negatively impact user security. So just as an example, if users lack trust in a platform, they may not trust that platform's security notifications or updates. It's also possible that many users may simply migrate to other platforms that may unknowingly be less secure. Okay, thank you. So you brought up a couple of threads we're gonna come back to. Yeah, okay, definitely.
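The passcode design Scarlett describes, where a random data key encrypts the device and a key derived from the passcode protects that data key, can be sketched in a few lines. This is a toy illustration only: real iPhones entangle the passcode with hardware keys inside the Secure Enclave and use proper AES key wrapping, whereas this sketch assumes plain PBKDF2 and a simplified XOR wrap.

```python
import hashlib
import os

def derive_wrapping_key(passcode: str, salt: bytes) -> bytes:
    # A slow key-derivation function stretches the short passcode into a
    # 32-byte wrapping key; the iteration count is what makes guessing slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

def wrap_key(data_key: bytes, wrapping_key: bytes) -> bytes:
    # Simplified "wrap" via XOR for illustration; real systems use AES key wrap.
    return bytes(a ^ b for a, b in zip(data_key, wrapping_key))

unwrap_key = wrap_key  # XOR is its own inverse

# At setup: a random data key encrypts the disk; only its wrapped form is stored.
salt = os.urandom(16)
data_key = os.urandom(32)
stored_blob = wrap_key(data_key, derive_wrapping_key("123456", salt))

# At unlock: entering the correct passcode re-derives the wrapping key
# and recovers the data key; a wrong passcode yields garbage.
recovered = unwrap_key(stored_blob, derive_wrapping_key("123456", salt))
assert recovered == data_key
```

The point of the structure is that the data key never leaves the device in the clear, which is why the FBI's request targeted the brute-force safeguards (retry limits, delays, auto-wipe) around the passcode rather than the encryption itself.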
But let me bring in Nathan here. So I mentioned that in many ways the UK law has been the model for the bill that the Australian government is now considering. Can you give folks an overview of what that bill looks like and the threats that you see it posing to encryption and the rights that we've been talking about here today? Sure, so just very quickly, for people who don't know, Access Now is a worldwide organization. We primarily offer technical support for users at risk around the world: human rights defenders, journalists, things like that. We won't help you install a printer, so don't ask. One of the first things we do with all of the people we help is move them to encrypted channels, so that we can talk to them and us talking to them doesn't put them in greater danger. So we care very strongly about encryption, and we defend encryption worldwide. And so, you introduced me as an expert on Australia. I'm actually not an expert on Australia. I am often reminded how little I understand Australia. But because we pay attention to encryption threats around the world, and Australia is one of the greatest threats to encryption right now, I've spent the last year trying to understand Australia. So with that caveat: there is legislation before the Australian Parliament, literally right now, that is loosely based on the UK bill. I say loosely based because it is written in such a way that it is intentionally confusing. They've written this bill to say, encryption is a problem, we need to be able to get encrypted data, but we're never going to undermine encryption, and if you can do it in the UK, you can do it here, so we're not asking for any new authorities anyway. I can try to explain what the details of the bill are, but there's a lot of, well, they say this but then they actually say that, or they say this but then they mean that. The bill is mind-bogglingly broad.
It gives the government the authority to compel a user or a device manufacturer or a service provider to do any act or thing. Any act or thing. There are some limitations, where they say, well, we won't order anything that undermines universal standards, but we get to define what universal standards are. There's no pushback mechanism. If you want to go to a court, there are actually no courts where you can challenge us on this. Just, we won't do that, don't worry. So the bill is pretty scary, and I usually like to be optimistic, but this is one of those times where I'm actually very pessimistic. There was a terrorist attack in Melbourne, Australia last week, and the government responded with, this is a wake-up call, we need to short-circuit this committee process and we need to pass this bill by December 4th or we are all at risk. So the committee, literally right now, is deciding whether or not to cancel my appearance on Thursday night in order to just immediately pass the bill as written. Kind of scary. But it is loosely based on the UK bill, and it is structured around technical capability notices, which would allow the government to order a company to build in the infrastructure to be able to comply with a later order. It also has an entire section on government hacking, which has received almost no attention because the threat to encryption is so strong, but if we had more time to talk about section two, I would also be screaming the sky is falling about section two. So, focusing on section one, where they picked up on these technical capability notices: that bill includes a provision that we thought was fairly welcome, which says that communications providers, quote, must not be required to implement or build a systemic weakness or systemic vulnerability. And it also says that the government may not prevent providers, quote, from rectifying a systemic weakness or a systemic vulnerability.
So that sounds good; it sounds like no backdoors. But could you explain a little bit why that doesn't give the kind of comfort you would think, and what other powers there are that may or may not undermine that? Sure. So, systemic weakness is not defined anywhere in the bill, and it sounds good. If it's something that you're gonna build onto a phone, sure, that would apply to all phones, right? The government says, not so much, we get to decide what that is. In testimony, they have said specifically, with Apple versus FBI, if they were to go to Apple and say, we want you to build custom software that will remove protections from a single individual phone, that is not a systemic weakness, because we're only talking about a single individual phone. If Apple says, well, wait, if we build it for one phone, that will be applicable to all the phones, that's a systemic weakness, there is no process for Apple to challenge that. There is no court that they can go to and say, but wait, this actually is a systemic weakness, you don't understand the technology. It is entirely up to the Australian government, specifically the attorney general, to make that decision. There's no pushback in the bill whatsoever. They've also, in their testimony, specifically said that in these cases, if it only applies to a single user or a single phone, they would not consider that a systemic weakness, even if it is applicable widely. So take the example of group conversations: I send you a text message and it appears on your phone and your laptop. If the government says, we'll add another user, that's the government, and don't tell the user that we're now also listening, the Australian government would not consider that a systemic weakness. Even though, by forcing a company to be able to do that, they could do it for any number of users, and if there were a way for a third party to exploit that, it would be exploitable on any number of users.
So they've written in this whole section of no systemic weakness, no backdoors, but really that's kind of a talking point that doesn't actually have any useful protections. Okay, so both of you have alluded to this proposal, we've sometimes talked about it as the ghost user, where the government, whether it's GCHQ in the UK or the Australian government, demands that the company add it as an additional participant in your group chat, so it sees the messages, and then suppresses the notification, right? So if you could talk a little bit about why that is a concern, and how that amounts to, or doesn't amount to, something similar to an encryption backdoor, and leave it at that. Yeah, I mean, I think I would just reiterate the two points that I made before, which is that there are numerous ways that threats to encryption can emerge, and they don't necessarily have to involve intentionally weakening the encryption itself. It can involve weakening the mechanisms by which we use encryption, the features that make it, for example, more user friendly. And so our perspective is that the government's attempt to weaken those features that make encryption usable is essentially as much of a threat as targeting the encryption itself. One thing to note is that the UK government, in talking about these features, has talked about how the way encryption actually works in practice is imperfect because humans are imperfect. So one of the reasons why you have, for example, an add-a-device-to-an-account feature is because people often lose their phones, and therefore their ability to decrypt on the device the communications and data residing on the phone. And so there's something I think quite disturbing in the government saying, so what we're gonna do is basically exploit human fallibility. Encryption is basically not infallible because humans are not infallible, and therefore we're going to exploit that infallibility.
And I think that's something to remember, which is that companies have in many ways tried to design features that take into account the fact that humans are imperfect, and what the government is arguing is that, as an alternative to weakening the encryption itself, what would be preferable is to exploit the features that companies have expressly designed in order to make encryption a widespread and easy thing for us all to use. Do you want to add on? No. Okay. So I want to ask you both if you could flip this and look very specifically at the impact of these kinds of government powers on human rights: privacy and freedom of expression, and other human rights that you may identify that are particularly threatened by these kinds of government tools. So I guess I would start by saying that there are a number of international human rights experts and bodies that have explicitly recognized the role that encryption plays in enabling the exercise of both the right to privacy and the right to freedom of expression, and that have therefore also explicitly recognized that states essentially have an obligation to protect encryption as a critical aspect of their protection of those rights. I thought it might be useful to focus in particular on two reports by the UN Special Rapporteur on freedom of expression, one in 2015 and one that came out this year, because those are basically the most detailed articulations of how human rights law should apply to encryption. Since this is an American audience, I thought it might be helpful to explain what a UN Special Rapporteur is. It's an independent expert that's appointed by the UN Human Rights Council to report and advise on human rights from either a thematic or country-specific perspective. But more importantly, the Special Rapporteurs essentially provide guidance on the scope and content of various human rights as they're enshrined in international human rights instruments.
So I wanna highlight a number of aspects of the Special Rapporteur's reports that help flesh out a little bit more the relationship between encryption and the rights to privacy and freedom of expression. So first, the reports describe encryption's role in helping enable the exercise of these rights. They describe encryption as ensuring a zone of privacy so that communications can be exchanged securely, and as allowing people to freely seek, receive, and develop opinions and ideas. And also, importantly, the reports recognize that encryption is really critical to anonymity, and anonymity in and of itself has been recognized as critical to enabling the exercise of both the rights to privacy and freedom of expression. The linkage between anonymity and both of those rights has been recognized by international human rights courts such as the European Court of Human Rights, and has also been recognized by the US Supreme Court here as part of its First Amendment analysis. Because of the role encryption plays in enabling these rights, the Special Rapporteur has made clear that any state restrictions on encryption have to meet core human rights principles, which are typically framed as legality, necessity, and proportionality. And what's interesting is that the Special Rapporteur found, in a survey of countries around the world, that many states have been unable, even in general terms, to identify situations where restrictions on encryption are necessary to achieve a legitimate government purpose, and noted in particular that many governments downplay other investigative mechanisms at their disposal. And he also noted that even where states do identify an arguably legitimate government purpose, backdoors and other measures that have a broad deleterious effect on large groups of people are inherently disproportionate and are therefore presumptively unlawful under international human rights law.
I thought what might also be relevant to tie in, to the point about this idea of adding ghost users, is that the Special Rapporteur has identified that threats to encryption take many forms, and it's not just about intentionally weakening encryption through backdoors. So just as examples, he's highlighted criminalizing the use and dissemination of encryption tools, laws requiring the registration and approval of encryption tools, and also the use of hacking as an encryption workaround. The last thing I wanted to highlight about the Special Rapporteur's analysis of the relationship between encryption and the rights to privacy and freedom of expression is the role that companies play. So the Special Rapporteur has highlighted that companies play a critical role in safeguarding the rights to privacy and freedom of expression, including because of their role in promoting or, in some cases, compromising encryption. And here the Special Rapporteur relied heavily on the UN Guiding Principles on Business and Human Rights. Just one example: in applying those principles, he noted that companies should really be undertaking due diligence, including by assessing the role that encryption and other security enhancements can play in protecting privacy and freedom of expression. And he went so far as to suggest that the responsibility to safeguard these rights may even require certain companies, such as messaging services, to establish end-to-end encryption as a default setting in their products. That was incredibly thorough, and I agree with everything. I'll just add, because I don't think you said it, the Special Rapporteur's name is David Kaye. He's fantastic. Everyone should follow him, Google him, and read all of his reports. All right, anything to add that you'd emphasize? Because Access Now obviously focuses on the human rights frame in addressing the threats that these pieces of legislation pose. I don't want to be redundant. That was a very thorough answer.
I will just say there's one quote from David Kaye, I think from the 2015 report, that I always think of. He identifies encryption as a precondition to the rights of freedom of expression and privacy, which is a nice, easy way to think about it. One other area that folks have been focusing on with the threats posed by both of these pieces of legislation is provisions requiring a great deal of secrecy: a lot of nondisclosure requirements, and penalties for violating them, including imprisonment. So let's start with you this time, Nathan, if you could focus on some of the secrecy provisions at issue, particularly in the Australia bill, and the threats that those pose. Sure, so obviously there is a secrecy provision that if the government orders a company to do something, they are not allowed to reveal it. There is supposedly a court process by which they are allowed to challenge that gag order. However, there's nothing spelled out in it. The bill just says you can use the regular courts, but there's no guidance to a company on who you would be able to go to, or whether you have a safe harbor to actually hire lawyers or interview lawyers. It really seems like an area where they haven't fleshed out anything at all. Another reason not to pass this bill tomorrow. And then there's also an emphasis on users that has gotten quite a bit of attention, where if an individual user were not to comply, or were to tell someone that they were given an order, there are penalties of up to 10 years, which could be a lot, but depending on what the crime is, it's a little bit unknown. I've tried to get an answer in the Australian context on whether that is comparatively a lot compared to other legislation, and the law firms I've been discussing it with just say there hasn't been that kind of analysis done. But there are very strict secrecy provisions, and it's incredibly unclear how anybody would actually interact with them if they were given one of these orders.
And Scarlett, there are similar secrecy provisions in the Investigatory Powers Act, yes? So there are similar gagging provisions in the Investigatory Powers Act. Basically there are nondisclosure provisions attached to both warrants and technical capability notices. Interestingly, the nondisclosure provisions attached to warrants contain specified exemptions. So a company, for example, might be able to report to a relevant oversight body or an outside legal counsel that they've been served with an order to facilitate a warrant. But technical capability notices actually have an ironclad gagging provision, so there are no carve-outs, essentially. You have to request permission from the relevant minister (for law enforcement bodies, that would be the Home Secretary) for companies to disclose to any third party whatsoever. I wanted to add that there are a couple of other features of the UK system that essentially further shroud the use of TCNs in secrecy. The first is that there's no notification requirement for any surveillance warrant at all, and typically, where a warrant is facilitated by company assistance, you could imagine that notification of the issuance of the warrant might help provide some clues as to whether or not the surveillance itself had been facilitated by a technical capability notice potentially underlying that company assistance. The other feature of the UK system related to secrecy is that evidence derived from interception (and the definition of interception includes access to stored communications) is prohibited from being introduced in criminal proceedings. In the US system, that's commonly how individuals determine that surveillance was conducted, and that there were problematic issues with that surveillance: they see the evidence, and it gives them a clue as to how that evidence might have been derived. An ability to challenge it. Yeah, exactly, exactly.
Okay, so turning in a slightly different direction. Australia and the UK and the United States are three of the five countries forming the Five Eyes alliance. For folks in the audience who may not be familiar with the Five Eyes, the other two countries are Canada and New Zealand. This is an intelligence-sharing alliance that goes back to World War II, and in recent years, since I think 2013, these five countries have also engaged in something called a Five Country Ministerial to consider various policy issues. So my question for both of you is: do you think there's any significance to the fact that the biggest threats that we are seeing to encryption are coming from three of the five countries in the Five Eyes, and if so, what lessons do you draw from that? Well, in fact, yes I do. So as you mentioned, the ministerial meeting of the Five Eyes, for the last several years, I don't know how long they've been meeting, but for the last several years, they've been putting out reports saying what they met about. Really interesting to people like me who've been following it. The last two years, they've said their number one issue was how they were going to collectively deal with encryption, and last year, before they went into the ministerial meeting, the Attorney General in Australia said, we are going to solve the encryption debate this year and we're gonna put forth legislation. That's actually what got my attention, and why I started paying such close attention to Australia. They went into that meeting and said, how are you doing? What are you doing? How are we dealing with this? And Australia came out and said, we are the place to do this. They modeled their legislation after the UK act. But I think one of the reasons why they're so gung-ho on Australia is that there is no right to privacy in Australia. There's no bill of rights. As Americans, it sounds very odd to us, but the idea of individual privacy just does not exist in the same way in Australia.
And so I think that they looked at it and said, Australia's a really great place for us to get this legislation passed. We can show that tech companies are gonna be able to live with it. We'll get what we want, and then we'll be able to export that to the other Five Eyes countries. I don't really have anything to add to that except that I agree 100% that there is definitely coordination amongst the Five Eyes, not only between the intelligence agencies but the law enforcement agencies as well. And I think Australia, and to some degree the UK, is considered a sort of testing ground to see what kind of mechanisms the government can actually compel companies to implement. And once those mechanisms are in place, it makes it much easier for agencies in other countries to point to that as an example of what's acceptable. Okay, so we have time to take a few audience questions. Do we have a microphone somewhere? Okay, so two ground rules. First, please identify yourself and your affiliation, and second, please actually ask a question. So, this is Eric Wenger from Cisco. We actually filed testimony, and I appeared in front of the committee that's considering the bill that you're gonna be testifying in front of on Thursday. And I wanna see if you agree with this or not. I read the difference between the technical assistance notice and the technical capability notice as being that the assistance notice is more like implementation of a warrant in the US, and the capability notice is more like CALEA in its reach, except it would be as if we were delegating to the Attorney General the ability to decide what capabilities need to be included in the scope of CALEA. And then just by way of quick background to go along with that question, we filed comments on three areas relating to those capability notices. One was, we said, as you noted, that there's no definition of what a systemic weakness is.
The law says that the government can't demand the creation of a systemic weakness, but there's no definition of what that exactly means. We saw some testimony from the law enforcement authorities about how broad they think that definition would have to be. The second thing was, we said that there needs to be a mechanism whereby you can go into court in order to be able to challenge the determination by the government, because the Attorney General might make what it believes is a good-faith determination that something is not systemic, but we might believe that it is. And then finally, as you noted, Scarlett, similar to the UK law, we read this as mandating that you could not notify the public about the creation of the capability, which would fly in the face of policies that we have about documenting the existence of law enforcement capabilities. So I wanted to share that additional information, but then ask that question about whether or not you would relate the assistance notices and the capability notices essentially to the implementation of a warrant versus something that's more like a CALEA-type authority. So I'm gonna let Nathan take that question, but I do wanna also mention, since I talked about some of the proceedings, that our organizations up here, OTI and Privacy International and Access Now, have all participated in some coalition comments, filed in Australia along with a number of other companies, raising some of the concerns that Eric raised and some additional ones about secrecy and so forth. But Nathan, do you wanna take the question? Yeah, Eric's exactly right. So I went over this really quickly, but the first schedule of the Australia bill allows the government to do three things. One is to go to a company and ask for something, like, hey, do you have information? Can you give it to us? The second is to go to a company and order them to give something in their possession.
Now, there are things that are listed, but the things listed say, among other things. So your source code is in your possession. There's nothing in the bill that says they can't come over and say, give us your source code. Nothing says they can't say, give us your users, give us your metadata, give us your logs, give us anything. The third, the technical capability notice, only the Attorney General can order, but it can order a company to build or do anything. And that can be: we are later going to give you an order saying we want the encrypted data, and we want you to do something, whatever it is, whatever we tell you to do, in order to be able to comply with that later notice. It can also be things like, hey, put this little black box in your network and don't tell anybody that it was there. It could be, hey, let us play around in your LAN to see how your infrastructure works. Let us see what your infrastructure is so we can figure out how to get things without it. It is very broad. And I think for an American audience, sure, the assistance notice kind of makes sense: you have something in your possession, we have a warrant, and we're telling you that you have to give it to us. The capability notice, sure, it's like CALEA, in that we're going to say you must build things with the potential to be able to comply with later orders, but without the protections of CALEA. Do we have other questions in the audience? I'm just wondering whether you think, if the Australian bill does get passed and has as dire an impact as it appears it might, is there any possibility that a company, Apple or somebody, would say, look, we're not going to do this, you have to make a choice? Or is the Australian market just too big? Because on the one hand, the Australian market's really big. On the other hand, this is a test case for Australia but also for Apple, and it's kind of who backs down first at that point, I would think.
Yeah, that's a really important question, and I think it's one that's difficult to answer, in part because companies don't know. They haven't been there yet. Lots of companies threatened, oh, what are we going to do if they pass GDPR in Europe, but I don't want to draw that comparison out too far. I mean, you never really know until it happens. I happen to know that a bunch of companies are participating in different surveys right now to figure out, well, if this does pass, what are we going to do? What are we going to move? One thing to note: this would apply even if you don't have offices in Australia, if you have customers who are in Australia. So the impacts are really too opaque to be able to answer within 30 seconds, and I think a more interesting answer for the 30 seconds would be to say, in the UK context, this act has been passed for over a year. Do we have any reason to believe that the government has actually used this power in that time? Well, in terms of that last question, based on all the ways in which the use of TCNs is shrouded in secrecy, it's very difficult to know. One thing to note is that during the debate around the Investigatory Powers Act, a lot of major tech companies spoke out incredibly critically about many of the powers in the act, including technical capability notices, which, just as in Australia, essentially give the government the power to ask companies to build or do anything that will allow them to later facilitate a future warrant. My anecdotal reading of the situation is that a lot of companies are sort of in this position of waiting to see what a notice is actually gonna look like. And I think if you had, for example, a notice that would compel the building of a backdoor, you might actually see at that point a tech company with the resources to fight back against that, in a number of different avenues, legal and in sort of the public domain, actually doing that.
But I think the fact that there hasn't been a huge outcry post-bill about any specific notice (keeping in mind the gagging provisions; there would have to be some kind of fight over the gagging provisions first), the fact that there hasn't been an enormous outcry from companies about the power itself post-passage of the act, suggests that the government hasn't yet wielded that power in a way that's making companies incredibly nervous. But I don't actually know. And one other thing to mention: in the Australia context, one of the issues that we raised in our coalition comments is that there's right now no meaningful procedure for companies to challenge notices. So, short of pulling out, if there were any possibility for amendments, we'd want to try to create a robust procedure with an actual standard that the notices have to live up to, where if a company got a demand and said, actually, this is causing a systemic weakness, they could challenge it and push back against it. Let's ask Eric. Eric, are you guys gonna pull out? It's tough. I think all the companies are gonna say, well, if it passes, they're gonna have a club to threaten us with. Let's see if they ever actually use that club, and let's see how much it hurts, I guess. Okay, well, if you would join me in thanking our panelists for a great conversation. You were all so well prepared, I think. And we're now going to break for a 15-minute coffee break, but please be back here promptly by 2:15 for the first of our two keynotes. Folks, if everyone could just start moving towards their seats. So again, thanks, everyone, for making the time to come out here today. We've had a really fantastic discussion so far about the folks who are behind the algorithms, about how encryption really protects the communities that are most at risk of surveillance and the people whose physical safety and personal freedoms are really dependent on encryption, and we've also had a really great conversation about the state of the encryption debate abroad.
As we know, things in the UK and Australia are moving pretty quickly and changing a lot on this front. But we've also got a lot to talk about here at home. And so with that, I'd like to invite our first keynote speaker, Bob Anderson, a former FBI Executive Assistant Director for the Criminal, Cyber, Response, and Services Branch, overseeing all criminal and cyber investigations, to join us on the stage. So thank you very much. Thanks. Hey, good afternoon, everybody. Thanks for letting me come over and talk to you today on this. I think it's a very important and current topic, especially when we talk about encryption and cyber and how it affects all of us in our daily lives, and how the private sector and federal government are affected by this. What I'd like to do is talk to you guys for 15 or 20 minutes, and then I'd really like to get into an engaging two-way conversation with questions on this. I think one of the things that would help everybody here, at least from my perspective, is to kind of walk through, when it comes to encryption or cyber in my lifetime, where we've been and where we're going, specifically on the issues that we're talking about. Because I've got to tell you, it's been changing absolutely so much, especially in law enforcement, which I spent almost 31 years in before I went out to the private sector three years ago. But I think it's a very relevant conversation. And one thing that I think will surprise most people in this room: in the last couple of years, my perspective on encryption as it pertains to data inside the private sector has dramatically changed from when I was the number three or four guy in the FBI, in charge of 24, 25,000 people around the world. It really has. And I'll kind of walk you through how I got to where I'm at. And then if you've got something that you want to talk about before we get to that point, just jump in. I think it'd be a good conversation.
So look, as we all know, the rate at which cyber, artificial intelligence, data, bad guys, bad gals, how they're attacking us, all of that's changed dramatically. Several years ago, if any of us had talked about the old days, we would have meant 15, 20, 25 years ago. Now the old days of cyber are two years ago, just flat out two years ago; anything past that point, when it comes to data, encryption, AI, how bad guys and bad gals attack us around the world, has completely changed. So I got into law enforcement in 1986, and it was kind of simple. I was a Delaware state trooper at the time. There was no video in your car. There were no portables. There were no cell phones. There were no pagers. There was nothing. And when crime happened, or any type of theft of data or information, somebody bad had to touch you. They had to go to your house. They had to break into your car. They had to stop you on the street to try to get that information. You fast forward several years later, when I become an FBI agent, and then as I rise up through the ranks of the FBI, it's amazing how quickly, when we talk about data and cyber, things changed. And what do I mean by that? In the FBI, probably back even only about 2010, 2011, most of the conversations in the morning briefs of the director or the attorney general were not about cyber at all. The men and women that were in the organization at that time would realize that. And I think what people need to understand is that two-thirds of the FBI right now wasn't even in the organization during 9/11. It's changed that dramatically. It wasn't until around 2011 to 2013 that the conversations dramatically changed. And everything was about cyber, whether it was cyberterrorism, cyber counterintelligence, cyber espionage, and how do you protect the data, right? Hence, when we start talking about encryption and we talk about the private sector, how rapidly this has actually changed. I don't think a lot of people understand that.
I don't think a lot of people even know that. When I talk specifically about encryption or data, what really comes to mind is around 2013. My first real experience, outside of criminal and counterintelligence investigations, when we talk about major cyber encrypted data thefts, was Ed Snowden. I ran that investigation from the day it started. It surprised me that not one single solitary alarm bell went off in one of the most sophisticated technical organizations in the world. And I think that catapulted, quite frankly, for Bob Mueller at the time and then after that Jim Comey, how important cyber was becoming, not just as a mode of communication or data or encryption, or making sure our PHI and our PII were safe, both in the private sector and in the federal government, but how damaging it can be if somebody was able to get that. So when I talk about this stuff, I'm looking at it at that point from really the federal law enforcement side, right? But in 2015, something dramatically kind of put this on the front burner for all of us, I think. One of the last things that I was in charge of before I retired, and most of you remember this, in 2015, was the San Bernardino shooting at Christmastime. And that triggered, as was all over the news, a fairly extensive confrontation over trying to obtain information off an encrypted cell phone, or potentially multiple encrypted cell phones. And when I was sitting where I was sitting in the organization, worried about more attacks and worried that there could possibly be more damage to people around the United States, I was very focused, as were the director and other people and the attorney general, on trying to get that data. And as most of you remember from that time, when that was going on, we were met by huge resistance. And from where I stood then, I didn't really understand it.
You know, for 30-some years, we would go to a court, we would go to a federal judge or a state judge or whoever, you would be issued a subpoena or a grand jury subpoena, and then you would provide it to whoever, and you would get the information. This is one of the first times that I could ever remember, and especially in what later was a counterterrorism event, that we couldn't get the information, and I really didn't understand it. But my perspective since I went to the private sector in January of 2016 has dramatically changed, and let me tell you why. So when I went out to the private sector, I ran a global information security practice, and I'm doing the same thing at Mike Chertoff's company down the street now, at the Chertoff Group. And my team was responsible for remediating and stopping, in those three years, almost 2,000 breaches around the United States and abroad, with US companies, and US companies that were based not only in the United States but in other countries around the world. And the one thing that struck me immediately was the fiduciary responsibility of those companies that are being entrusted with information by their clients. I didn't care if it was healthcare, I didn't care if it was banking, I didn't care if it was just traditional PII, personally identifiable information. They were entrusted by those clients, whether it was a cell phone, whether it was a computer, whether it was an encrypted app, whether it was an encrypted chat room, under a contract that says, I'm gonna keep your data safe. And honestly, after all the breaches that I've been involved in in the last three years, I do think that opening backdoors into some of this technology would be worse for the people, the clients, that have employed these private sector businesses than it would be to somehow work through how we would get that information without needing that type of access.
And I can tell you that this is a completely 180-degree different view, for one reason: when I came out here and I started working with and representing and trying to help clients in the private sector protect the data of millions and millions and millions of clients around the world, it never occurred to me, not because I wasn't smart enough, I guess, but because I was looking at it through sort of a myopic glass, honestly. A lot of my former colleagues may or may not agree with me, but I really believe nowadays, when it comes to encryption and encrypted data, that there has to be a conversation. And I really believe we're kind of stuck, and this is my personal opinion, in a loop that started back at the end of 2015. If any of you had been where I was when that shooting was going on, and saw the pressure that was building, not only from out in the LA field office, but with everything else going on at the time in the intelligence community, even though it didn't come true, we were very worried about more attacks. When I step back and look at it three years later, I'm not so sure that we couldn't have gotten that information through some other avenue. And what do I mean by that? I'm talking specifically about the FBI, but the one thing I think the conversation needs to include, and I don't know if most people know this: in a lot of the countries that I visited in my prior position, and I went to 37 countries on behalf of the FBI and our government, most of those countries have one federal police force, or they have one central intelligence force. In the United States, there are over 7,000 or 8,000 police forces. There are 36 just in the District of Columbia. And when we talk about encryption, it's not as simple as saying, well, the FBI needed this, or the DEA needed this. You have to look at the totality of what's going on across the United States.
And I think one of the biggest things, quite frankly, that nobody's had the conversation about, and I truly believe the federal government, the FBI, the Department of Justice should be the leaders in this, is when you're talking to private sector businesses and you're saying, hey, listen, how do we work as a team without intruding on any of the clients whose data you're protecting? How do we look at educating the tens of thousands of law enforcement officers across the country in other ways that some of this data can potentially be harvested? And by that, I don't mean circumventing the rules. But what I'm here to tell you is, I can guarantee you that most of the cyber expertise across this country, as you start to get away from the major metropolitan police forces, LAPD, NYPD, Chicago, and from the FBI, ATF, Secret Service, DEA, those abilities are surely degraded. Not because the men and the women can't learn them, but because of the technology involved: information that you can obtain through social media, through other open sources, a lot of the information, quite frankly, that in my 30-some years we just relied on subpoena power for, or relied on a grand jury subpoena for, or went before a federal judge and asked for. And I honestly don't think that conversation has been had. And the one thing I'm positive of, I can tell you from my experience, is that this has never gone backwards, right? We're never going back to where we were two years ago, let alone 20 years ago, before any of us had to worry about any of this stuff, right? So the encryption debate will stay stuck until we start having a conversation, and I mean a legitimate conversation. And I'm gonna harp on some things that went wrong inside the FBI a few years ago, because we have to have legitimate, real conversations, and here's what I mean by that.
Some of you might have read or received or seen in the Washington Post and other newspapers, several years ago, that the FBI published statistics saying we couldn't get into almost 8,000 cell phones, right? But somebody was very astute and looked back at 2016 and said, well, wait a minute, you only couldn't get into 880, I think the number was, in 2016. How did it rise by almost 6,900 cell phones? Later on, at the end of 2017, the FBI came back, apologized, and said, hey, listen, that number was unbelievably inflated due to some mistakes; it was less than 1,000 cell phones. My point is this, and I'm not trying to badger the FBI, I love the US government obviously, and the FBI, but what I am trying to say is, those numbers need to be accurate. But the bigger portion of those numbers, in my opinion, is not how many cell phones or computers or apps or encrypted chat rooms you could not get into. I think the real number should be: okay, listen, out of all that stuff, how many of these items actually prohibited you from getting the information that you needed some other way? And I'm here to tell you, in my time in the government, and I spent a lot of time, you know, plus or minus depending on how you look at it, at Congress, at the Senate, and at the White House, a lot of time in the last two years both testifying and answering questions on a variety of different things that we were involved in, I've never heard that conversation. And I think that's a realistic conversation that we at least need to be trying to have, as people that are sharing this space and are gonna continue to share this space from here until eternity, because it's never gone backwards. I have seen a few district attorneys and district attorneys' offices across the country trying to get together to have conversations with private sector companies, to talk to them about where do we start?
A lot of that is, you know, we're gonna let you opt in; it's voluntary; you don't have to talk to us if you don't want to. I also think, though, quite frankly, that the conversation, when it comes to encryption, needs to be much more specific. And what I mean by that is, when we're talking to private sector companies in any of the areas that I've discussed in the last several minutes: what are the parts that, due to your fiduciary responsibilities to the clients that are employing you, are in any way, shape, or form off limits? And honestly, from a law enforcement perspective, I've never heard that in a conversation before, because, as I've told you, for 31 years, running some of the most complicated cases in the FBI by the time I retired, that was not something that ever came up, ever. It was always: when you had probable cause, when you had enough information to believe, you went to a federal judge, you went to a FISA court, you went to a grand jury. And as most of us have seen in all that has happened since San Bernardino, in most cases, nothing has happened. The information hasn't gone back and forth. There's been a stalemate. And unfortunately for us, the people that are living in this type of environment right now, there's been no way forward. There's been no clear way to delineate the lines on how we're gonna look at this as a country, as a nation, and as the intelligence services around the United States. And that's one of the things that I'd like to hear your comments on when we talk here in a little bit, but one of the things I'm telling you is that I've not seen that happen. The last thing that I will tell you when it comes to encryption and surveillance and all the other things that you've heard thrown around in the news media for the last several years is this:
I think what's really important to understand when it comes to that is, I don't think anybody, including the cutting-edge tech companies and electronics companies, is gonna have any clear idea of how far and how quickly this is gonna go. I don't mean that in a bad way, but I think when we have these conversations, some of them have to be in the theoretical. Why do I say that? I'll give you a perfect for-instance. Does anybody in this room know what a deepfake video is? Does anybody? There's a few people in the back. That's great. Real quick: deepfake videos are videos that, three years ago, you would clearly look at on TV or on YouTube and know that somebody had put Bob Anderson's face on another body and then, through different techniques, allowed that person to say whatever, to try to impersonate Bob Anderson. Because of the advancements in artificial intelligence, deepfake videos right now are, I think, one of the most significant threats out there, because you can't tell. The individuals will blink. They're trying to figure out if you can tell by blood pressure, through lasers on the skin, when you see the videos. These advances were not even thought of two or three years ago. Matter of fact, the briefings I used to get on it before I retired were like, look, this is nothing that's ever gonna be a worry. Now, I'm here to tell you that almost every intelligence organization in the world is looking at these things as a real threat. Because think of it: CEOs can deliver messages to billion-dollar companies. Is it really the CEO? Generals can be delivering messages to battlefield troops. Is it really the general? You get where I'm going with this. That was not around, I'm telling you, two and a half years ago.
So when we go to encryption, when we have these conversations, right, to try to figure out which way do we go, and how do we do this in a way that we protect the rights of Americans but at the same time protect the national interests of the country, I think some of these talks almost have to be in the theoretical. Because when it comes to encryption, I don't think we actually understand how far it can possibly go. The last thing I'll say, and you can look it up on Google, is there are thousands of encrypted apps out there, thousands. Whether it's encrypted text messages, encrypted email, right? So the point is this, and I think this is a significant point: these issues are only gonna get tougher on us, and I think we have to start the conversations now, especially between the government and the private sector, to try to figure out what's the way forward. We've got about nine or ten minutes left, because I just got the cue card, and I'd really like to get some of your opinions on this. Yes, ma'am, and then we'll go over here, sir. Hello, my question very quickly is about the OPM hack. A lot of people, like myself, got that letter. We believe that our data should have been encrypted. Friends of mine were worried that people were gonna find out that they had affairs or whatever, but it didn't happen, and that data was, maybe it was a router, however it happened. Can you just kind of give us a quick update of what's gone on with our data, and the government encrypting our personal data that they have collected? I can tell you, I ran the OPM hack. My data's all gone. 23 million of my friends, right? Like you, all of our data's gone. It was definitely hacked by a nation state. It's interesting that none of it's come up for sale, which should tell you they're at least looking at the data. Let me just give you my honest opinion of the United States government IT infrastructure.
And I'm not being sarcastic. The men and women are working tirelessly to protect our country, and I will tell you it's hopelessly out of date. One of the things that I lobbied for, and push for every chance I can, is that I think the Congress and the Senate really need to review the laws on how federal government agencies actually bid out contracts to receive updated equipment. In most cases, quite frankly, when you do these bids, and depending on the budget cycle and whether the government's shutting down or not, the money that you were given three years in advance for some update to an IT infrastructure (forget encryption, I just mean updating your email system) can't even pay for it. And that's a huge issue. And it goes back to the education issue I was talking about. You know, and I'm being bluntly honest up here, by the way: several times I tried to give information back to the FBI, and I was the number three guy in the FBI, on breaches and stuff that I would see on my cell phone every day with the clients we were handling, and I couldn't get it into the FBI system that easily, because the system was just not built for the way it has to respond nowadays. Now they're working on that, but going back to OPM, I will tell you, I think it's a long way out, and I'm not being sarcastic, I'm just being honest. Sir, you had a question back here too? Yeah, hi. I work in applied cryptography, so I've thought a lot about these issues. One of the things that I feel pretty strongly about is that if there is a backdoor in software, that will eventually be something that any sophisticated attacker can get access to. And so I wonder, is there a world where it's ever okay to give the FBI a backdoor, knowing that China and Russia and essentially every nation state will have access to it? So, two questions, I'll answer both. First one is, I totally agree with you. On backdoors, I think it opens up severe risk. Even besides nation-state attacks, right?
Just traditional criminal or hacktivist attacks, it's horrible nowadays. So it opens up severe risk. Nation state? Forget it. You've got somebody like China, Russia, some of these other countries, not so much North Korea. Iran and the banking industry, it's significant. When it comes to giving the FBI whatever, I honestly think that's an independent decision by whatever company or organization is doing it. But on the backdoor side, and that's really what's changed my opinion on a lot of this, I don't think it makes us safer. That's not to say there's no need to be able to communicate and try to help each other, especially when it comes to protection of our own country. But I definitely agree that the backdoor issue is a significant risk. Yes, ma'am? Yeah, you're the boss. Thank you, Sharon Bradford Franklin with OTI here. So you talked about your 180-degree change in views, and how, looking back, you think maybe you had somewhat of a myopic lens that you were looking at things with. Do you have any thoughts on how we can reach those who are still in the FBI and in law enforcement, to help them understand the perspective that you now share? Yeah, I think that's a great question. So one of the things that I think needs to happen, and I don't know which side it comes from, and I don't think it really matters, is that there are some major industry leaders on the phone side of the house, the computer side of the house, the app side of the house, the software side of the house, that I think need to extend at least an olive branch to start talking. And I mean at high levels, I'm not being sarcastic; it can't be at the mid-level. It needs to be at the director of the FBI, attorney general level, and it needs to be at very high levels on the boards of these companies.
Not even going into it thinking anything is gonna happen after the first five or 10 or 15 meetings, but I think there needs to be that initial communication. I would highly encourage either side to reach out. If you look at everything that's been going on politically and just globally, right, everybody's busy, everybody's got a lot of stuff going on, but I cannot think of an issue that will permeate our society, the federal government, and the USIC more than this issue. And like I said, it's gonna get more complex. And the only reason I know I'm right on this is that eight years ago, maybe seven years ago, and the cyber people in here will know this, if you got hit with a ransomware attack, you called the FBI, NSA, DOD, and chances are we could unlock your computers, chances are we could get your data back, and you wouldn't have to pay anybody. Nowadays, you can call in whoever you wanna call in, and it's just not happening. When I was in the FBI, I told you, don't ever pay a ransom. I guarantee you 99% of the clients that I've had in those cases pay ransoms. They just do, because they can't unlock their systems. They can't pay 100,000 employees. They can't. Going to the encryption thing, I see it going in that direction. So there are gonna be more engagements, and what I don't wanna see is another San Bernardino, because I lived it. And I saw the tension on both sides, at the highest levels of this government and at the private sector. And I wanna get there before then. And Katie, who's a partner of mine from the Chertoff Group, in the back, she's heard me say this in a million meetings, and I would say it to Chris right now, the director of the FBI, or anybody: there needs to be that initial dialogue, and we need to talk about how we're gonna make this work. Again, protecting the rights of the people whose data the companies have, and the contracts they've engaged in to protect their information. All the way in the back. Hi, Kevin Bankston from New America's OTI.
Thank you so much for being here, and for candidly talking about your change in perspective, because it's very rare, in this town in particular, for anyone to ever say, you know, I thought about it, and I've changed my mind, I was wrong, and here's why. So thank you. But my question: you, I think, correctly cited a key missing piece here, which is a discussion about how we fill the technical education gap for the tens of thousands of local law enforcement agencies, to teach them how to use alternative sources of data to fill their evidence gap. How can they take advantage of all these changes in technology, rather than simply being myopically focused on the encryption thing in particular? Do you have any particular thoughts on what the DOJ or Congress or anyone else can do to help make that situation better, and actually give local police the tools they need to catch the people? Great question. I think the last one we're gonna have time for, but it's a great question. I would highly advise, in any good way, the Congress or Senate: they need to be thinking way out. I briefed the Senate and House Intelligence Committees probably a couple hundred times in my career, right? Starting back with Rockefeller; that tells you how long ago that was. I really think they need to allocate funding for the training and assistance not only of federal and state and local law enforcement, but also to help the private sector, right? Monetarily provide some of this training, because honestly, I'm telling you the truth. 20 years ago, and if you look at my bio, I've arrested a lot of spies, I actually really don't like spies, so back then, right, when people were trying to do this type of training, it was much easier. Nowadays, this changes constantly. So no one's gonna be able to allocate for it in their budget, right?
In their little town and their little sheriff's office, or whatever, even in big federal agencies, the ability to keep that training current. And I really believe we need to look way out, instead of budgeting the way we budget, for a year or two years, maybe three years, and then pulling a lot of that money back. And the reason I say that is, in the meetings that I've been in up there, that's not the focus. And I'm not blaming anybody. The focus is usually the crisis of the day, right? That's why I was always up there. But when we talk about budgets, they kind of refer back to the traditional way they do these budgets. And I really think nowadays you can't think like that. That's part of the problem with this. When I say we need to be theoretical, thinking way out, that's not a real popular notion in law enforcement when it comes to long range planning. It's usually black and white over a period of time. So I think that really needs to be done. Listen, I really appreciate everybody sitting here and listening to me. I'm sorry, sir, I think, can I have one more? Are we done? We're done. She's telling me we're done. I'll talk to you afterwards. Thank you so much. Thank you, Bob, for your remarks. I think it's a really great way to lead into our last panel, which will be focusing on industry and the role that encryption plays in fostering innovation, protecting consumers' data and, of course, enabling companies to do the important work that they do. So with that, I'd like to invite our final panel onto the stage for that discussion. Do you have them turned on, or are we good? I mean, can you just flip the input? They're all on. They're all on. Yeah, just making sure. Good afternoon, everyone. I thought it was morning for a minute still. My name is Jack Gillum. I'm a reporter at ProPublica, where I cover a lot of issues regarding tech and algorithms and civil rights. This panel, I think, is going to be fascinating.
I hope it will be. Just because it gets beyond, I think, the area that we normally discuss in the usual encryption debate, which is really about communications and the government's prying eyes into your private thoughts. This is more about consumers and innovation and e-commerce, pretty much the things that a lot of ordinary folks interface with every day when they might not realize it. I'm just going to start from the end, down with Jeff. Everyone is going to introduce themselves, we're going to go into a bit of a free-wheeling discussion, and then we're going to save about 20 minutes for questions at the end. So be prepared for that. So I guess we'll start. Great, thanks very much. I'm Jeff Ratner, and I handle cyber policy issues at Apple. So I'm Eugene. I am part of the Android Security and Privacy Team at Google. I'm Kate. I work on privacy and security issues for Engine. We're a nonprofit based here and in San Francisco, and we advocate for pro-startup policies. My name is DeVroote Mitter. I'm the CEO and founder of ArmourText. We provide secure collaboration capabilities for critical infrastructure and other regulated industries. My name is Tom Gannon. I'm on the US Public Policy Team at MasterCard. Happy to be here. So I'm going to start with Tom, since he's the closest to me, on this area that seems, I think, to get little attention in the common discourse when we talk about encryption, and that is payments, the security of which, I guess, is the linchpin of your business. I'm just wondering if you can talk a little bit about encrypting payment data around the world, beyond even chip and PIN, and how that underpins the financial system, and in particular your work. Sure. So MasterCard, as all of you know, is a global payment network. We operate between issuers and acquiring banks: merchants' banks on the acquiring side, consumers' banks on the issuing side. We're essentially a B2B payment network. And so our customers are banks.
Security is a top concern of ours. It's something that we spend a lot of time and energy and resources on in the encryption space. All of you have probably seen, in the US, the evolution of physical card payments over the last few years, moving to chip cards. The chip on a payment card is an encrypted way in which people can use their cards at a point of sale, in an in-person environment. It's meant to encrypt the payment card data so that when you're using your card at a store, for example, rather than sending your 16-digit account number over, the chip sends a random one-time authorization code, essentially keeping the payment security information, or the PAN, with you and your bank. It's not a new technology necessarily; it's new to the US. It's been, I think, an evolution for the US to adopt chip over the last few years, and it continues to be an evolution both on the issuing and merchant side. In the online environment, we're continuing to invest in new technology and new security, things like tokenization, which is a form of technology that allows card-not-present transactions, online transactions, to be more secure than they are today, essentially taking your card data, your 16-digit account number, and removing it from the transaction so that cyber criminals can't steal your payment card information. That technology is being deployed more and more in the US and around the globe. Obviously encryption is a big part of that, because your card number is encrypted when a tokenized transaction is used with an online merchant, for example. But it's an evolution in the way in which payments are operating for consumers and merchants and issuers, and there are other innovations, I think, to come that will have the effect of making payment card data, consumer card data, useless to fraudsters and cyber criminals that are out there trying to steal information that they see as valuable, namely payment card information.
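The core idea of tokenization described above, replacing the PAN with a random surrogate so the merchant never handles the real card number, can be sketched as a toy. This is a minimal illustration with a hypothetical in-memory vault, not how a real payment network (which uses format-preserving tokens, cryptograms, and hardened infrastructure) implements it:

```python
import secrets

class TokenVault:
    """Toy token vault: maps a card PAN to a random, single-use token.

    In a real network the vault lives in hardened infrastructure on the
    issuer/network side; the merchant only ever sees the token.
    """

    def __init__(self):
        self._vault = {}  # token -> PAN

    def tokenize(self, pan: str) -> str:
        # The token is random, so it reveals nothing about the PAN.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the network/issuer side can map the token back for
        # authorization; popping it makes the token single-use.
        return self._vault.pop(token)

vault = TokenVault()
token = vault.tokenize("5555444433331111")
assert token != "5555444433331111"                  # merchant never sees the PAN
assert vault.detokenize(token) == "5555444433331111"  # issuer side recovers it once
```

A stolen token from a merchant breach is worthless to a fraudster: it maps to nothing outside the vault, and in this sketch it cannot even be replayed once consumed.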
So when we talk about tokenization, that's something I've always been personally fascinated with: the idea that if there's a way to make credit card skimmers obsolete, maybe that's one way we go about it. Gene, this gets into your realm a little bit with Android Pay, but beyond that, one thing I was really fascinated with was these other applications that really rely on encryption, whether they're medical devices or mobile driver's licenses. I just wanted to see if you could touch on that. Yeah, thank you for having me here. The lights, don't look in the lights. So taking a step back: when you think of encryption, Tom gave a good example of encryption in transit, and there's encryption at rest, but broader cryptography, especially in a mobile operating system and device, is used throughout, right? You boot your device, and there's an integrity check that makes sure that it's a trusted, known-good version of the operating system when it boots up. You type in your PIN, then data gets decrypted. Apps obviously use key material to communicate with backend resources and other people. Encryption's used throughout. Prior to this role, I spent time as a product manager on various products, and one of the products that I worked on the most was secure email, specifically for government use, and I always come back to the three use cases when it comes to cryptography. There's authentication: only you're allowed to access your data, and nobody else should be able to access that data. There's non-repudiation, which means you know that data came from a trusted source, and you can verify they are who they say they are. And then there's integrity of that data, message integrity, where you know that data hasn't been somehow tampered with in the process of being transmitted to you, right?
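Two of the three use cases just listed, authentication of the sender and integrity of the message, can be illustrated with a message authentication code. This is a minimal sketch using Python's standard library with a made-up shared key; note that non-repudiation, the third use case, requires an asymmetric signature rather than a MAC, because with a MAC both parties hold the same key:

```python
import hmac
import hashlib

# Shared secret between sender and receiver (in practice derived via a
# key exchange, never hard-coded like this hypothetical value).
key = b"shared-secret-key"
message = b"dose=2 units"

# Sender attaches a MAC: it proves the message came from a key holder
# (authentication) and that it wasn't altered in transit (integrity).
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver recomputes the MAC and compares in constant time.
assert hmac.compare_digest(
    tag, hmac.new(key, message, hashlib.sha256).hexdigest())

# A tampered message fails the check.
tampered = b"dose=20 units"
assert not hmac.compare_digest(
    tag, hmac.new(key, tampered, hashlib.sha256).hexdigest())
```

Because the receiver could have produced the same tag, a MAC alone cannot prove to a third party *who* sent the message; that is exactly the gap non-repudiation (via digital signatures) fills.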
So with those three things combined, when we start looking at what we can further do with cryptography and future use cases, like Tom was talking about, with these mobile devices it's an interesting space. One example, since I come from a public sector background: everybody talks about two-factor authentication, right? You have your smart card; that's the something you have, and the PIN for that card is the something you know. And so when you start looking at smartphones, they become really powerful devices, beyond just two factor. I look at smartphones as truly multi-factor Swiss army knives. To give you an example: you can store key material on the phone, right? So that's the something you have. You have a PIN to the device; that's something you know. And then you have all these other contextual layers that you can leverage to help figure out, at the right time, in the right place, whether to allow something. Whether it's where you came from, where you currently are, the health status of your device, whether it's in good standing, you know, all these things. And actually a lot of folks are starting to implement something like that with the right credentials, where, you know, the phone could be used for physical and logical network access, right? So these are kind of future use cases. A couple of other use cases are really interesting and coincide with our latest Android OS release, Pie. We introduced two really neat features.
One is StrongBox. So Android, and pretty much all OSes, have started relying on more hardware-backed security, relying on the actual hardware to provide cryptographic operations, and as a result of that, you know, the cryptographic material becomes much harder to exfiltrate. With this new API, StrongBox, we allow merchants, app developers, whoever, to start storing key material in discrete, tamper-resistant hardware, which makes it much harder to exfiltrate or pull off the device, right? The second piece that we've recently come up with is what we refer to as protected confirmation, and this feature is actually really cool. It's basically what we call a trusted user interface, and it's actually lower level, so it's not at the OS level, it's below that, in the trusted execution environment. And it's a really interesting use case, and this goes to the scenario of controlling medical devices. So what happens is, if you try to use an insulin pump, for example, and you try to say, I only want this much insulin, if some malicious application on the device were able to impersonate you, it would be able to send the wrong dose and could harm the person. With protected confirmation, and this is something we demoed at Google I/O, the way it works is, the insulin pump vendor has an application, they call an API, and the amount, the quantity that's gonna get put through the insulin pump, would be shown in this trusted user interface.
The user has to hard-press a button twice, and then that information is cryptographically signed, and the insulin pump, which is the relying party in this case, is able to verify that the intent was correct and that it wasn't something somebody tampered with, like a malicious application tampering with that information. At that point, it would send that to the insulin pump, and the dose would get applied to the person. And just to back up real quick: when we talk about an insulin pump requiring that sort of cryptographic trust, are we talking about, and I'm just envisioning here, a pump at somebody's house, or at a place where it needs to relay remotely to, say, a physician or somebody to review it, or... More of a Bluetooth thing; it's like diabetics that have it attached at all times, and they just need to apply it at that time. Sure. But the key is, it could be over any network medium at that point, because you could verify the dosage amount. Sure. And I think, just to piggyback on what you said, this gets into a larger and broader conversation that we were discussing backstage, about people using devices to authenticate into not just, you know, their account information on the device, but applications and devices running in their homes, and even critical infrastructure. And I think, you know, especially at Apple, we take encryption so seriously, and we wanna make sure that our users are the only ones that can get into their devices. And I think increasingly we're seeing the importance of that as the growth of IoT and the interconnectedness with our devices just grows. Well, and to build on that, I was gonna go to you next about the importance of, I guess, what you're saying: trust, right?
The idea that it's really beyond, you know, brand loyalty to a company like Apple: being able to trust that when I purchase an iPhone, or when I use iMessage, whatever application it might be, my information is protected. And so I just want you to talk a little about how trust, which seems maybe obvious, underpins a lot of what we're trying to do. Absolutely. I mean, it's critically important to Apple, and I think really to all of us, and to the public at large. Trust and security are going to be kind of the pillars of the future development of technology and the internet. And I think we're seeing that play out in debates around privacy here and in Europe: consumers need to trust companies that their information is gonna be protected, that it's gonna be private. And I think encryption is one of the only ways we know how to do that well. And, you know, as we're seeing this explosion of new tech on the market, I think the only way that will continue, and the only way consumers will trust these devices and allow them into their homes and into their lives, is if it is backed up by the best efforts of the companies represented here, and others, to protect that information. So I think there's an increasing recognition of that, and of how important that trust is, and really, I think, encryption just underlies that to a large extent. One key or critical part of that, though, is also getting tech companies to realize that they have to be very judicious in determining what they are ever going to store or transmit across the network. The encryption is there to protect it from your device to their network. Once it's there, there are still possibilities of leaks and breaches, and we've seen a lot of that take place recently. There's abuse of that trust. We have to start moving towards a world where we're saying, hey, do you really need to hold on to that? And do you need to have it in the first place? If not, don't.
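The protected confirmation flow described a moment ago (show the dose in a trusted UI, require a physical press, sign exactly what was displayed, verify at the relying party) can be sketched in a few lines. This is a rough illustration only: the real Android mechanism signs inside the trusted execution environment with an asymmetric, hardware-attested key, whereas here a shared-key MAC and hypothetical function names stand in for that machinery:

```python
import hmac
import hashlib

# Stand-in for a key provisioned to the device's tamper-resistant hardware;
# the insulin pump (relying party) holds the matching verification key.
DEVICE_KEY = b"provisioned-device-key"

def confirm_and_sign(prompt):
    # The trusted UI displays `prompt`; only after the user's physical
    # button press does the secure environment sign exactly what was shown.
    signature = hmac.new(DEVICE_KEY, prompt.encode(), hashlib.sha256).hexdigest()
    return prompt, signature

def relying_party_verify(prompt, signature):
    # The pump recomputes the MAC over the message it received; any
    # alteration after confirmation breaks verification.
    expected = hmac.new(DEVICE_KEY, prompt.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

prompt, sig = confirm_and_sign("Deliver 2 units of insulin")
assert relying_party_verify(prompt, sig)
# A malicious app changing the dose after confirmation fails verification:
assert not relying_party_verify("Deliver 20 units of insulin", sig)
```

The design point this captures is that the signature covers what the *user saw and confirmed*, not what an app on the main OS claims was confirmed, which is why it works over any network medium.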
I would agree. Well, that was actually where I was gonna go next. At ArmourText, your clients include defense and critical infrastructure, where these communication protections really have real-world effects; I mean, in terms of lives lost or saved. I'm just curious if you can talk a little bit about encryption beyond the prying eyes of a government, and our government at that, and more about foreign governments, foreign actors, hackers, who, once inside a pretty critical application, can do very serious, irreparable damage. So we provide secure collaboration capabilities that allow critical infrastructure and regulated industries to communicate internally and feel secure that their communications are protected both from external threats, adversaries that would be trying to listen in, as well as from potential insider threats among the administrators those communications may have passed through. We've seen examples of that take place. When it comes to lives lost, or some of those adversaries, you have, day by day, more and more examples, more and more warnings being delivered to critical infrastructure industries by places like the FBI and the DOJ and others, saying: the adversary is now on your network. They are listening in. And when we think about industries like the nuclear energy industry, electric utilities, places like hospitals or other critical infrastructure, if the adversaries are on the network and they've compromised privileged credentials and are able to listen in, they're able to do a handful of really harmful things. One, bring the network down. Two, bring down the infrastructure itself, which can lead to lives lost. Or three, gain critical intelligence and/or compromise IP that would be helpful to them. Some of our customers include places like defense manufacturers and government advisory services.
Very literally, they're communicating about technologies, or updates to technologies, that they don't want our adversaries in other countries to be able to reverse engineer. And so they're realizing more and more that they have to protect their internal communications, but also the communications with their suppliers, their vendors, their manufacturers, their partners, because that entire supply chain is intel that one of those adversaries, if they were able to compromise it, could use to affect technologies that have real-world applications: sniper rifles, or the optics used on sniper rifles, even binoculars and other systems that have military uses. Or government advisory services, when they're advising other countries on how to bolster their security, but also communicating about what may be problematic today. Where are you falling short? That insight or knowledge, up front, acts as a roadmap for a potential adversary. And so more and more companies and governments and others are starting to say, hey, we have to be very protective of this kind of communication, even if it's just DevOps chatter, because that is the roadmap for a hacker to get in. You mentioned a little bit about threats to hospital chains or large infrastructures being hacked, and I'm thinking of the MedStar hack here in Washington a few years ago. This is a big chain. It has a lot of hospitals, clinics, a big IT staff. This sort of naturally segues to Kate. There are companies, startups, that don't have that infrastructure even if they wanted it. I just wonder if you could talk a little bit about the effects of undermining encryption on those startups, especially when, in a startup environment, you have to deal with the fire of the day, or you're still working on, like, a first tranche of VC funding. The last thing you're worrying about is the people who are on your network. I'm just curious if you could talk about that a little bit. Yeah, sure.
So often when we talk about encryption, we talk about the Apples of the world and the good stuff they do, and that's great. I think that's obviously top of mind for most people when they think about this, but there are so many small companies and startups out there that rely on encryption that don't have the technical or legal resources that a company like Apple does. So if the government were to come to them asking them to undermine encryption, they don't necessarily have the budget to fight it out in court. They don't have the technical resources to hire engineers to build in a backdoor, and then hire more engineers to defend that backdoor, because the backdoor is not just for the government, obviously. If the government can get in, then any kind of malicious actor can. And so when we think about the encryption debate more broadly, it's important to remember that not everyone has the resources that these big companies have, but people certainly expect startups to take their security seriously. As a user of a startup, you don't want to think, oh, they're too small to protect my data. You wanna be pretty sure that they're keeping, whether it's your social media profiles or your photos or emails or your text messages or your financial data, whatever it is, you wanna make sure they're keeping it safe. And so trying to deprive big companies, but especially small companies, of a tool to do that is really short-sighted. And a startup is usually not in a position to bounce back the way a major company can if there's some kind of data breach. We've seen plenty of data breaches over the last couple of years, and most people still go to Target. We still use certain financial services. But if you're a startup and you don't have that name recognition, that kind of reputational hit can really take you down. And if you are operating on a bootstrap budget, that could be the end of your company.
So I think the effects of this debate are disproportionately large for startups, who are usually small and underfunded to begin with. I just wanted to go back for a second to the access and availability of encryption. It seems like a fairly basic conversation, but this is something that has always personally troubled me: there is a sort of technical knowledge debt, where, in other words, people who have the ability to understand encryption can make things better than people who don't. And I think there is a host of morality and inequity issues that come up with this. I'm just curious if we can talk a little bit about, whether it's product development on the Android side or more broadly on the startup side, what makes this type of technology more ubiquitous and safer, so that anybody, whether they have an advanced degree in computer science or they're just a person wanting to form a startup, can get started, and how that landscape has changed to make encryption more available and easier to access. I would say cryptography and encryption are a very specific skill set, much more above and beyond traditional computer programming. And so I think it's up to the OS providers to really democratize the capabilities. And I think Apple, Google, Microsoft, all the big guys, have done that. They have platform APIs for app developers to be able to do all these things, and they're continuously making those better, so that developers don't have to reinvent the wheel if they don't have the expertise, and they can rely on the capabilities that the OS is already providing. And they're future-proofing that to some degree, because these OS providers are continuously improving those capabilities and making them stronger and more impervious. But what we do need is a level of cooperation to start to occur between the larger companies and the startups, as well as the rest of the ecosystem.
The reality is, the reason end-to-end encryption needs to exist in a product like Signal or WhatsApp or even iMessage is because the earlier protocols we had, like SMS and MMS, failed to deliver a viable security model, right? The telecoms were involved in that. They helped implement those protocols, they helped maintain them, they helped determine what applications were on the handsets, and yet they failed to solve those problems for so long that it led to Apple implementing iMessage with end-to-end encryption. And it's Apple's iMessage end-to-end encryption that falls off the minute you're communicating with someone who's on Android. Android has it in certain places but not others. WhatsApp implements end-to-end encryption, but only when you're within the WhatsApp walled garden; same thing with Signal. We have to start developing a level of cooperation between the big boys that says, hey, guess what, we could build interoperable end-to-end encryption around things like this as well. And that's something that we need to start doing. Just to piggyback on their comments: I think one of the key features of good security, especially in the consumer device world, is that it's easy to use. And that's why I think it's so important that the platform providers are building it in behind the scenes as much as possible. A good example of that: I think before Touch ID, about half of iPhone users had numeric passcodes; the rest just didn't have any security. After Touch ID, that number very quickly jumped to about 90% of users. And Touch ID requires you to set a passcode. So you have that extra level of protection that a lot of people were forgoing, because we had made it easier. And I think that's a really important point here.
One thing that I would add is that good security, I agree, is transparent security. It also needs to be measurable security. And that's where taking a standards approach, or having somebody vet it, comes in. I think Apple publishes stats on Touch ID, and Google likewise; we published some stuff around that as well, around false acceptance rate and similar benchmarks that you need to attain to provide that measurable security. But it's that ubiquitous secure communication and collaboration where the cooperation is going to have to occur between all the different vendors, the larger vendors especially. When you're talking about consumer applications, as opposed to enterprise, it's a different story: in an enterprise you can mandate that all the people within your organization use technology X. But as consumers, we have to communicate with everyone and anyone in our network, and everyone and anyone in the network may have all different devices. You're never going to get that level of ubiquity and that ease of user experience, knowing for sure that yes, this communication was protected no matter which person in my phone book I was communicating with, until you start to use interoperable secure communications. I was just going to say, I find that statistic, Jeff, very remarkable, a 40 percentage point increase just with the advent of Touch ID. And that's sort of an area where it's not just about getting buy-in on encryption from the public, but getting buy-in on security in general. An ordinary user of an iPhone, in other words, might not know that his or her device is encrypted, but they know that they put a passcode on it so somebody else can't get into it, in a very rudimentary sense. I'm just curious, is it because it was cool, it was this new thing, hey, I can put my fingerprint in and open my phone, and it by extension required a passcode? Is that why you think you saw it?
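[Editor's note] The benchmarks mentioned above are simple ratios, sketched here for readers unfamiliar with the terms. The 1-in-50,000 figure below is purely illustrative of the kind of number vendors publish, not a claim about any particular product.

```python
def false_acceptance_rate(false_accepts: int, impostor_attempts: int) -> float:
    """FAR: fraction of impostor attempts wrongly accepted (lower is better)."""
    return false_accepts / impostor_attempts


def false_rejection_rate(false_rejects: int, genuine_attempts: int) -> float:
    """FRR: fraction of genuine attempts wrongly rejected (lower is better)."""
    return false_rejects / genuine_attempts


# Illustrative benchmark: 1 false accept across 50,000 impostor trials.
far = false_acceptance_rate(1, 50_000)   # 0.00002, i.e. 0.002%
frr = false_rejection_rate(3, 100)       # 0.03, i.e. 3% of real users retried
```

The two rates trade off against each other as a sensor's matching threshold is tuned, which is why a vetted, published benchmark matters: it makes the security measurable rather than a marketing claim.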
I think certainly the allure of it was some factor, but in a lot of ways it was just the feedback that Apple was getting: people were checking their screens sometimes between 40 and 80 times a day, and they didn't want to enter the passcode every time. So we came up with a way that they really didn't have to. Just making it easier to get into the device while still having that layer of security was probably the biggest factor in the increase. There certainly is, and we saw this going from Touch ID to Face ID, the allure of the new product and the new feature, but I think it really goes back to the ease of use and the willingness to then actually accept additional security. That's one of the few instances in which you get an improved user experience and better security at the same time, right? Typically, when we talk about security and user experience, one goes up and the other goes down; very rarely do they move in tandem, and these kinds of technologies are the few examples where that definitely takes place. I also think it was a good carrot-and-stick example, because once Touch ID was available, and the same with biometric unlock on Android, the OS is able to nudge the user into setting a passcode, politely saying you should really set a passcode, because it's easy enough now to unlock the phone most of the time with your biometrics. So it went hand in hand with that. And that also made things easier. I have to admit I have an iOS device, so that's sort of behind my question about Touch ID: it then seemed to extend pretty quickly to the financial world, where if I logged in with American Express or Bank of America or whatever provider, that's how I did it. It seemed like it created, as you were saying, this sort of ubiquitous security. It really made it easier for consumers.
On the financial side, and I know this is a little awkward since I'm standing right next to you, I'm wondering about tokenization. I'm going to ask a broad-brush question here: is that the future when it comes to making encrypted, secure financial transactions much better? Is there going to be a point where we just bypass chip-and-PIN altogether and go to this tokenization approach with a mobile device? Yeah, I think so. A PIN is a way to authenticate yourself, so it's one way to authenticate a transaction, but a PIN really only works in the physical world. It doesn't work as well in the online world, when you're going to buy something from an online merchant. It's also a static way to authenticate yourself. Biometrics, whether it's Touch ID, whether it's measuring your heartbeat, whether it's facial recognition or some other identifier unique to you, is, from a security standpoint, a much better way to authenticate that you are in fact the person making this purchase. It's much more secure than asking for the three-digit number on the back of your card. So whether it's tokenization or some other dynamic, more secure way to authenticate that you are in fact the person using the card to make the purchase, whether the card is in the device or physically with you, I think that's the direction we're going in. There's always a balancing act between security and the friction that people experience in making payments. Payments are not something someone gets up every day wanting to make; it's not something people necessarily get excited about.
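[Editor's note] Tokenization as described above can be sketched in a few lines. This is a hypothetical, simplified vault for illustration; real payment-network tokenization adds per-merchant and per-transaction restrictions, cryptograms, and hardware protections. The core idea is just that the merchant stores an opaque token with no mathematical relationship to the card number, and only the payment network can map it back.

```python
import secrets


class TokenVault:
    """Toy token vault: merchants hold an opaque token, never the card number (PAN)."""

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}  # token -> PAN, held only by the network

    def tokenize(self, pan: str) -> str:
        # Random token: revealing it tells an attacker nothing about the PAN,
        # and it can be revoked without reissuing the physical card.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the network side can resolve a token back to the real PAN.
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")   # merchant stores only `token`
assert token != "4111111111111111"            # breach of the merchant leaks no PAN
assert vault.detokenize(token) == "4111111111111111"
```

This is why a stolen merchant database of tokens is far less valuable than a database of card numbers: the tokens are dynamic and scoped, unlike the static three-digit code on the back of the card.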
And so part of the balance we have to strike is: how do we create a secure environment for people to make purchases, but at the same time take away some of the friction that exists in doing that everyday task? It's a bit of a balancing act, but I think security has to be there, and top of mind, in whatever is deployed. It also has to work for the consumer. Not every consumer is going to want to authenticate themselves in the same way, for example. So having multiple ways to securely authenticate you in making a purchase, whether in the physical world or online, is important because it gives people options. When it comes to this discussion, there's always legislation looming in the background. And I know that, Jeff, you and I had talked briefly about these assistance and access bills in Australia and in the UK. Given how important encryption is to e-commerce and to pretty much every facet of our digital lives, I'm just curious how those arguments can sway, or not sway, legislators who are trying to create what some crypto folks say is impossible: good security alongside weakened encryption. Sure. The bill that's before parliament in Australia now contains quite sweeping new authorities for the law enforcement and intelligence services in Australia. We've submitted public comments, on our own and in conjunction with others, raising some serious concerns that these powers could be used to undermine security and encryption, and really weaken the user experience, the security, and the trust of all users in Australia and all over the world.
And in Australia right now there seems to be quite a rush to pass this legislation. We've pointed out some of its potential ill effects, and we are hopeful that the government will take a step back and think through some of the long-term implications of this type of law, which would allow them to force providers to make changes to their products in myriad ways. I think we don't even know the full extent yet of how the authority would be interpreted, but it's clearly quite sweeping. I don't know if anyone else has been tracking this, but we're certainly concerned. Yeah, I'll say from our perspective, obviously it's complicated enough having to deal with one country's rules, but when you add in different authorities in different countries, and users who are subject to varying rules, a global landscape of dramatically different, and in different ways privacy-invasive and security-destroying, measures is just something that most small companies can't even imagine being able to comply with. So it's especially damaging if you're a startup: maybe you have a legal team of two, and those lawyers are not experts in handling access and assistance requests from the Australian government or whoever it is. It's hard for everyone, but it's especially hard for small companies. Yeah, as a startup founder, you oftentimes don't even have a legal team of two. You have a shared lawyer at some large firm doing some work for you, and you don't have a $3 million legal set-aside in your VC financing just to deal with adjudicating every single request that comes from every new municipality or government or other authority saying, hey, guess what, you're asked to do the following, please comply right now, you have 48 hours.
Sometimes it's technically feasible from an engineering perspective, but we certainly don't have the kind of financing that a Google or an Apple does to sit down and say, yeah, well, we're going to tie you up for the next five months over this issue; this is not happening. So I was going to ask this, and this is a question for folks who may not be steeped in this: when it comes to an access bill, say in Australia or wherever, where you're asked to provide a decrypted message, let's say it's a case where you as the company don't hold the encryption keys, you don't hold any of the key material, and there's basically no physical way you can unlock it. What happens? Do you just not do business in the country? If you're a startup, do you just decide that we're not going to make this part of our market? Where does the rubber hit the road with something like that? That's an excellent question. We have not been confronted with that situation. We have not built any backdoors or decrypted communications; we've held the line pretty firmly on that. I can imagine it's very different for smaller companies and startups than it is for Google or Apple, and I'll let some of my colleagues speak to that, but we have not been confronted with that stark choice. Yeah, and I think it's almost easier to just say, okay, we won't go to that country, and limit your expansion plans. We saw this somewhat with GDPR in Europe, and that's probably a different debate, but it's sometimes easier to say, we can't play by those rules, so we'll just limit our user base to countries that have rules we can play by. Right. I'm just curious what you do if, say, Australia says you can't sell an iOS device, you can't bring it into the country. I'm just fascinated by how you actually get that to work.
We have about 20 minutes left. Oh, we have until four, that's right. I wanted to go back a little bit. We talked about these access bills in Australia and the UK, and I promise I was not going to go back to the old encryption debate. But when it comes to the area of trust, we had talked a little bit about the ubiquity of encryption, and there seem to be two sides to the way the public sees it. They look at encryption, especially if they're not steeped in it, as: this must be something the bad guys use, they're trying to hide communications, they're trying to keep it from the police. Meanwhile, they don't bat an eye when it comes to, say, giving their credit card information to Amazon to buy a toaster for a gift or something. They think those are two different things. I'm just wondering, is the ubiquity of encryption going to lead to better education among the public about why this is so important, or is it the other way around? So that maybe laws are not written in such a, I guess, cryptophobic fashion. Whoever wants to tackle that. I would say, from my perspective: at previous companies we built products that met certain requirements that, at the time, the OS providers couldn't meet; we were able to meet them. So the analogy there is, even if the OS providers don't provide the proper security that supports everybody, there's always going to be a vendor out there that will meet that requirement, and they could be set up in a country where those laws don't apply; it's up to the person importing that technology. Somebody is always going to solve the problem of protecting the data end to end. So it might as well be the OS providers, because that democratizes it for everybody.
Yeah, and I think the ubiquity has sparked a demand. As I mentioned before, we've seen that in the concerns over privacy that have really exploded recently, and once the platforms and other providers are able to deliver these services in a relatively easy-to-use fashion, consumers are going to continue to demand them. If companies in the US or elsewhere are not able to provide that, other companies will fill the void; encryption is math, and encryption is available all over the world. I haven't checked this recently, but I think almost two thirds of encrypted messaging apps are developed overseas, and so I think it's very important that our companies be able to continue to compete in this area and to offer the best security available. And I do think that as encryption becomes more commonplace, there's less and less a sense that it's something for the bad guys. This is Robin's point, so I'm just going to steal it: we've heard from domestic abuse groups who say, we want to protect victims of domestic abuse, and they're using encrypted apps to communicate and get to safety, and if you undermine encryption, you're undermining their encryption too. We work with plenty of companies who have no intention of providing services to the bad guys; they provide services to plenty of good guys, or neutral guys even, and they want to make sure that their security is protected. And Jeff's point about competition from abroad is especially true for the big tech companies, but a small tech company can't compete if another country comes in with a better, more secure service. If the user is interested in protecting themselves, they're going to go for that more secure service, and that means a startup here whose security is purposely undermined by the government is at a disadvantage.
So, but I think your original question was about whether the ubiquity of encryption, its day-to-day use cases, and its implementation by the operating systems or the device manufacturers will lead to a world in which our legislation becomes less cryptophobic, right? And I might have a slightly different view on this than some of you. I actually think it won't, because it's not doing that today. We have more ubiquitous end-to-end encrypted communications capabilities through every single major provider of an operating system or hardware today than we had five years ago, a decade ago, or two decades ago, and yet the push from law enforcement is still there. In the last national election here in the United States, both presidential candidates from the two primary parties were pushing the line: hey, we want encryption with backdoors, it's technically feasible, we'll get Silicon Valley on our side, we will make this happen. That was still there, and the reason it was still there is because enough people who supported those candidates, and the people talking about this kind of legislation, supported their views, right? If it were so wholly unpopular that congressman X or congressman Y or congresswoman Z was going to face a backlash from their constituency, because encryption was so important to their constituents that they were willing to say, hey, we're willing to forgo whatever law enforcement or other arguments there might be, those congresspeople would drop that argument almost entirely. So I'm a cynic when it comes to Congress. They don't do anything out of the morality of it; typically they do what will work for their voters. So, one, I'm not sure this issue resonates at the ballot box.
I think we've seen past Congresses attempt to pass legislation, or at least consider it, and those efforts haven't succeeded. I think there was a distinction in the question: will the ubiquity impact legislation, or will it impact consumer demand? Those two things obviously relate, but I think as consumers get used to having this security available, that demand will increase, and it will become something that shouldn't be taken away or weakened. Clearly that will affect the appetite for legislation, but in terms of evaluating the effect of the availability of encryption on voters' choices, I think that's harder to parse out. I think the appetite for encryption is not going to weaken for consumers. They're going to want a better, more protected iPhone, a better, more protected Android device. But will they then turn around and say, hey, guess what, you as a legislator, if you're proposing encryption legislation that might weaken any of this stuff, I'm going to specifically turn that against you? No, because legislators are going to keep trying to sell it exactly the way they are in Australia, which is: hey, we can do this without causing any weakness, we're going to follow the known standards, we're going to follow X, Y, and Z. It's broad language to get it past the voter, and I don't think it's going to get turned away by the voter at that point, right? They're not going to say, ah, I now understand encryption so well, because it's ubiquitous, that I'm going to demand you make a law that actually makes sense. I just don't see that happening. I think there's still going to be this disconnect, unfortunately, because while we're making encryption ubiquitous, we're not necessarily improving the education around how these technologies actually work, or where their fundamental failings would be, whether it's the human or some other implementation issue.
We're not really educating around that yet. But I'm not sure we've seen that watershed moment yet. I think if Congress were to schedule a vote for next week, and I went home and told my mom, by the way, if they approve this, you're more likely to have your iPhone stolen because it'll be easier to break into, I don't know if my mom would call her representative, but I think that is an argument. We haven't had that vote scheduled yet, and I think it's hard to rally folks about something that's maybe two years away. Which isn't to say, yes, we should be educating, and I think we are educating to the extent that we can, and we can always do more, but there hasn't been that real test yet. I think it also just hasn't hit the broader masses yet. There was somebody who had a proposal for a potential way of doing this, and it quickly got peer reviewed and drew a lot of negative feedback, right? But that's a niche audience reading that information; it's not something that gets publicized to the average consumer. So it just hasn't hit them yet. They haven't found a way to personally relate and realize what they would be losing as a result of something like that. And that comes back to education, right? In both those cases, it's the education that's going to make the difference. And that's why I don't think the ubiquity of the technology itself is going to solve this challenge; it's really going to be the education aspect that we all have to start taking on. And look, I've been able to rile up ordinary people about the Freedom of Information Act, so I feel like things are possible. Anything's possible.
And on the serious side, the reason we were able to do that, or I was personally, in sort of my own fervor, was that you were able to draw a line: not the large policy issue about government records, but basically boiling it down to, they have information that you paid for, and you should have a right to access it. It's a very clear thing. In the same way, from what I remember covering at the AP right after the Snowden disclosures: up until that point, the parts of the Bill of Rights that people were most familiar with, speaking of ubiquity, were the First Amendment and the Second Amendment, maybe a couple more. After Snowden, you could probably do a Google Trends search on the keyword "Fourth Amendment" and see it spike. People who had nothing to do with policy, who didn't even live in Washington, started talking about the Fourth Amendment, because the Snowden story was everywhere and there was this huge discussion about what the government, down to the local police force, has a right to look at. I'm not saying it takes these big disclosures to push that forward, but you're right, this isn't going to be a dinner-table issue with mom. But if you frame it as: okay, all of your personal information, and by the way, mom, you just got your identity stolen a year ago, maybe this is going to make that problem worse. And I'm not saying dumbing it down, but things are complicated, people have a lot to worry about beyond what we do, and making it understandable and digestible is maybe the way we go at it. That actually, in my own personal life, is something that paid off quite a bit. Way back when, when I was trying to learn forensics on older operating systems, my wife had an old phone, and I showed her how to recover some data, and she kind of freaked out a little bit.
And then one day I showed her how I was able to man-in-the-middle the traffic on her phone. I got into her Facebook account and posted "I love my husband," and she replied, "I've been hacked." But the relevance of that story is that the personal impact made her very security-conscious from that point on: she had a complex password on her phone, she was very careful about being on guest Wi-Fi somewhere. So it takes that personal factor, how does this impact me and my own personal life, to make people appreciate what they could potentially lose in the future. Sure. And by extension, if I can add my own observation from the Associated Press after Snowden: we very quickly became attuned, and I pushed for this even before then, to the importance of using encryption, or even just general operational security, when it comes to journalists dealing with sources. Journalists, whether in Washington or dealing with a local police official who's trying to leak something about corruption, primarily deal in source information. And this is not to disparage anybody, but getting very accomplished reporters, even Washington reporters, to not direct-dial somebody at work, where, by the way, their own phone records can be subpoenaed, and to use some form of encryption, something like Signal, which was nascent at the time, or iMessage beyond SMS, was really foreign to them. It was just something that required more work, and it's like, I have enough to do, I have my own system. Now, fast forward a few years, I don't know of a single journalist in Washington who doesn't use Signal. It's in their signatures. Yeah, right. It's on my business card, in fact. Well, it had been, and then they were like, we're not putting that on anymore.
But at least it's in my email signature. So things change, but even among what should be the most security-conscious of people, these are journalists. The classic thing we like to say, which is true, is that we will never reveal the source, because integrity and trust are important to what we do. But getting them to go that extra mile, it's not that they were stubborn as a mule, it's just that they didn't really know how to process it all. Some of them were like, well, what do I do? I just cover politics. And it's like, well, you can cover politics, but then two years later you get leaked the Romney 47% video that by all accounts derailed his 2012 candidacy, and someone is surely going to be asking about the source of that, or trying to find it out. And that started with the relationship you built three years before. We have a few more minutes before we get to questions, but I'm just curious, Kay, if we can talk about startups: what are the biggest issues that you see when it comes to information security, or IT security problems, for groups that may not have a large budget to take care of these things? Sure. I think the top-of-mind biggest problem right now is making sure that there aren't state-by-state laws that require different things, which is obviously on the regulatory side. I could talk for a whole hour and fifteen minutes about the California privacy bill, but specifically on this issue, it has a provision such that if your customer data isn't redacted or encrypted and it leaks, you could face statutory damages that could rack up very quickly to be incredibly detrimental to a company that's just getting off the ground. It's almost effectively creating a safe harbor if you encrypt your data.
And so if you're a startup in California, you've got to be encrypting your data, basically, at this point. And then to hear from the federal side, or from state and local law enforcement, "but we don't want you to encrypt your data," I'm not sure how a company that's just getting off the ground can navigate that. There are already so many time and budget costs associated with making sure your users' data is protected, especially in an age where consumers are more aware. Granted, my mom doesn't really know that her iPhone is as secure as it is, but she knows that she's had to replace her credit card two times or whatever it is. I do think there is a push to be more conscientious of those concerns. You want to make sure that your users feel safe working with you, and for most small companies, the most effective way to do that is to encrypt their data. That's a pretty accessible way; it's not too expensive. And it's something most companies choose to do if they're being smart about protecting their consumers. So to face, on the one end, you've got to do this thing, but if you do this thing, the government might come after you: I'm not sure how you can navigate that in a way that isn't incredibly costly and time-consuming. To be fair, part of the problem, though, has to do with the fact that the majority of the business models at these tech companies are data-based, and so ongoing access to that data by the company itself is critical to being able to generate revenue in the future, right? In some way or fashion, either data mining and/or selling that data, or doing something else with it. So you can encrypt the data and put safeguards in place for encryption in transit and encryption at rest. We all remember what happened with, was it Anthem, the big healthcare breach that took place?
And who's going to argue that they weren't HIPAA compliant? Who's going to argue that their systems weren't encrypted at rest and in transit, or that they weren't applying appropriate levels of database encryption for bulk data? More than likely, I'll bet everything I've got that they did that part correctly. Someone still had access at the back end, because they still needed access to the data, and in a healthcare or insurance context, that makes sense. But for a lot of these companies out in the Valley, given the services they provide, except for the fact that they don't want to have to develop a more robust business model, they don't really need access to that data, because they're not processing it in any way that adds value to you as an end user. It's about their business model on the back end. So if instead they were able to start moving away from that "we give everything away for free, but we'll use you as our product on the back end" business model, we'd start to move much further into the realm of: actually, we don't want access to your data, we're keeping it encrypted in such a way that we can't data mine it in the future. And if we can't data mine it, we can't be subpoenaed for it, we can't be hacked for it. We can move away from those problems. That requires a shift on the business model side. I think there is more diversity on the business model side. I do think this idea that all Silicon Valley startups are simply serving you very specific ads because they know everything about your life is prevalent, and that's certainly the narrative that reigns on the Hill.
But I mean, there are plenty of valid reasons to collect data and keep data that maybe aren't critical to the company, but it's like, yes, it's easier to verify your identity this way than that way, and so maybe we decide that we wanna collect this one piece of data instead of these three, and maybe this one piece is more sensitive, but it's easier, and it's easier for the customer, and it's easier for storage. Which isn't to say that, yes, data minimization is great; making sure that companies don't keep or use stuff they don't need totally makes sense. I just think it's hard to apply a one-size-fits-all solution here, because every startup can be so different, and their data infrastructure can be so different, and their business model can be so different, and what they need can be so different. So it's hard to say, like, yes, everyone should just do this and it'd be fixed, because it varies wildly, company to company.

So I promised I would leave some time for questions; we have about 20 minutes left until four o'clock. Do we have a microphone to send around, or is it over there? If you can, just speak up. Yeah, sorry, there you are, I'm sorry, I didn't see you.

I wanna note that in 1994 we enacted CALEA, the Communications Assistance for Law Enforcement Act. That was in response to the FBI worrying about new technical innovations thwarting their ability to access communications, and if you look at that and compare that to San Bernardino, then there's certainly a progression where, perhaps because CALEA was imposed on publicly regulated entities, it was easier to pass, but San Bernardino did not result in a new law being enacted. And I'm wondering if you view that as a progression in the way American society thinks about these issues, or if that's just a lucky artifact of the times.
And if I can add on, whoever wants to answer that: is it because the FBI was able to use Cellebrite successfully in that case that it obviated the need for new legislation too? I don't wanna piggyback too much, but.

I think that's orthogonal, because if it were a modern iPhone they wouldn't have been able to do that.

That's true, right. You know, that's an excellent question. I don't have a good answer about the progression from CALEA through San Bernardino. I think obviously the issues are a lot more complex now. There is a greater proliferation of devices; back then you were talking mostly landlines, and I think we've moved quite far away from that. And so the issues are far more complex, and I think they implicate a much larger amount of personal data and security. And so I think that certainly plays a big factor in why governments have had a hard time approaching this issue.

Hi, Sharon Bradford Franklin with OTI here. So obviously companies like Apple have marketed your products based on strong encryption; it's very much a part of your marketing platform. But I'm curious, particularly for startups or others represented, maybe even in the financial world, to what extent do you see strong encryption and "we protect your data" as a part of a business strategy and a marketing campaign to consumers?

We absolutely see that. I think for a lot of small companies, privacy and security measures are a comparative advantage. This doesn't hold up as much in the device space; I don't think that there are a ton of device startups that we work with. But certainly we have seen a lot of online platform competitors come in and say, we're not gonna sell your data, or we're not gonna mine your data, or we're gonna make sure that you're not seeing ads that creep you out. We're gonna make sure that your data is protected.
So it is, absolutely. I think there's like a virtuous cycle here, where folks are more interested in being protected, and so companies are more interested in providing protective technologies, and so folks are more interested in being protected, and it's kind of ratcheting up. Obviously we're not in a perfect world yet, and not everyone has secure messaging or secure email or whatever it is, but I do think that there's kind of an upward trajectory for both.

For us, it's absolutely critical to our entire business. Our business would not exist if we couldn't point to the fact that most of your online cloud-based collaboration capabilities leave you wholly vulnerable to data mining, to subpoenas that bypass your legal team, and/or to bulk hacks, because they're becoming the highest-ROI targets in the cloud. Especially for critical infrastructure and regulated industries, they're increasingly aware of that, and so we go and talk to them about the fact that our true end-to-end encryption is user- plus device-specific, and here are the nuances of what that enables. Here's how that allows you to preserve, if not improve, the user experience around security, while at the same time moving the security bar so much higher up the ladder that it becomes the compelling reason upon which our customers buy. If it wasn't for that security argument, we would have zero customers. If it wasn't for the failings of the other side of the house, where they produce really cool technologies that work for other industries that don't necessarily need that level of security, we wouldn't have something to benchmark against and say, ah, we can give you that level of cool capability set, but in such a way that you're still protected as an industry. That's our entire marketing campaign.

Yeah, I would just jump on that. I would say that in our space, it's very important. We've all probably experienced data breaches.
We've all experienced numerous card reissues when there's a data breach out there. In the payment ecosystem, you're collectively only as secure as the weakest link, and so security, I think, is incumbent upon everyone in the chain, to continue to sort of elevate their game when it comes to security. In terms of MasterCard, our transactional data is largely anonymized, aggregate data. We collect four pieces of information when you use your MasterCard: the merchant's name, the total amount of the transaction, where the merchant's located, and your 16-digit account number to process the transaction. So none of that, in sum or in individual parts, ties back to who you are, where you live, what you bought, how many items you bought. And so as a practice, we only capture the minimum amount of information we need to make the card work, whether you're at a store or whether you're online. And so that's our practice, and I think it's a deliberate practice. But yeah, security is always top of mind, and I think increasingly, as we see more and more instances of hacks or data breaches, it becomes even more important.

One of the things we're seeing in the startup space, especially among health tech startups recently, is them wanting to say, hey, you know what, we recognize that we may not hold PII or PHI today, but two years from now we will. And so their DevOps chatter today is a roadmap into how to break into their system two years down the road, when they do have something of value. And suddenly we're seeing adoption take place in industries like that, where they're starting to say, hey, let's actually start protecting our DevOps chatter today, beyond just the critical systems that we may have to protect or minimize data on in the future; even for what we do have otherwise, we want to start protecting that as well.
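The data-minimization practice the payments panelist describes, keeping only the four fields needed to process a transaction, can be sketched like this. The field names and function are illustrative, not an actual payment-network schema.

```python
# The four pieces of information the panelist names; the field names
# here are illustrative, not a real MasterCard schema.
REQUIRED_FIELDS = {"merchant_name", "transaction_amount",
                   "merchant_location", "account_number"}

def minimize(transaction: dict) -> dict:
    """Drop everything except the fields needed to process the payment."""
    return {k: v for k, v in transaction.items() if k in REQUIRED_FIELDS}

record = {
    "merchant_name": "Corner Cafe",
    "transaction_amount": "12.40",
    "merchant_location": "Washington, DC",
    "account_number": "5555444433332222",
    "cardholder_name": "A. Shopper",   # never needed, so never stored
    "items": ["coffee", "bagel"],      # ditto
}
print(minimize(record))
```

The point of the pattern is that data you never retain cannot later be subpoenaed, breached, or mined; the filter runs at ingestion, before anything touches storage.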
And so again, that security-first marketing campaign has a role to play in a significant portion of the marketplace in the US, right? Something like 30% of all workers actually work in highly regulated or critical infrastructure industries. And so there's a significant market to go after if you can put that security message forward first.

I would just like to add that on the Android side, also, you know, we've had full-disk encryption and then file-based encryption. We're continuously adding new features and security-related functionality, you know, leveraging more of the hardware; even the backups are using lock-screen-based hardware encryption in the cloud at this point. So I mean, I think encryption is table stakes for everybody, right?

So, I think this gentleman right here. Hi, I know a lot of you want to use the fact that you can provide the utmost security as a marketing tool, to create trust and all those things. So, I mean, that makes a lot of sense, but that does not mean all the meta-information also has to be encrypted or protected. For example, you know, the data that passes through might be encrypted, but how about, for example, for a call, who called whom? Or, for example, on a cell phone, the location, okay? So all that information you can still provide to law enforcement, so that there'll be less need for providing a backdoor, you see what I'm saying? Essentially, you can still help legal authorities with the other information, which I normally call meta-information, that you're not obligated to protect. Am I correct?

So I think I can answer your question. I mean, Tim Cook has said that when we have information, we do provide it to law enforcement pursuant to appropriate process. And I think we understand the importance of working with law enforcement. They've got an important job, and I think we are looking for ways to be good citizens. And so a couple of months ago, we actually announced two or three new initiatives.
A dedicated training team; an online training tool for law enforcement that we're gonna set up; and also a dedicated portal, which will help make it easier for law enforcement to make lawful requests. And so I agree with you. I mean, when there is data that's available pursuant to lawful process, we will share that, and I think we have a responsibility to do that.

Okay, so that's what I'm saying. There's a lot of meta-information that you're not legally promising, or contractually promising, to your client won't be released if there is a court order; for example, where the client was when they were traveling.

So it depends on, I think you're asking about what we contract for. I mean, so, right, it depends on the different services that people avail themselves of, what data is available. We try to be very transparent about that. And we actually publish law enforcement guidance, which has kind of two purposes. One, it tells customers what data is available if they use certain services. And the other purpose is to help law enforcement understand how to go about requesting data that may be available. And so I think you raise an important point, and we certainly understand that we have a responsibility to work with law enforcement. All right, yeah, next.

Hi there. These days I've been hearing about the concept of side-channel attacks, which are, to my limited understanding, attacks that can exfiltrate data based on electromagnetic emissions, audio, power consumption, or other signals. If these types of attacks can exfiltrate data and bypass encryption, what are some of the security solutions that you're seeing, especially in the areas of critical infrastructure and finance, to this type of attack? Or is this just something that's largely theoretical and not happening very often?

I could say, from a mobile device perspective, that's one of the things I mentioned earlier.
So with Android Pie, we released the ability to generate and store key material in tamper-resistant discrete hardware, which is more resistant to these side-channel attacks or timing attacks and so forth. So again, it falls upon the platform guys, the OS providers, to make it easy for the app community to be able to leverage all this functionality to further secure their data, right? Their application data on the device, or use that same storage for key material for data in transit as well.

Hi, so the conversation earlier about fingerprint authentication reminded me of the fact that under the law, with a warrant, the government can force you to open your phone with your fingerprint but cannot force you to give them your password. And so along the lines of what we were just talking about, and I know it's a little bit astray from what we're talking about, but I think it goes to the heart of sort of what's gonna happen going forward with tech and encryption and legal requirements: did you think about that when you were deciding whether to implement that tech? Did you think about advising people? Because as we think about tailoring, you could actually tell people, you're actually narrowing your constitutional ability to resist being compelled by law enforcement to give over information or not. And you can sort of tailor your own comfort level with both of those things by choosing this form of authentication versus that form of authentication. And does that tell us something, or do you have an overarching view towards what your company's interests are in protecting this information? In other words, do you have any interests? Are you interested in helping people avoid unwillingly being compelled to give their information to government? Or does that interest only come into play when it represents cyber insecurity more broadly?

So I think, we don't make security upgrades to frustrate law enforcement, we just don't do that.
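At the software layer, one classic mitigation for the timing attacks mentioned a moment ago is constant-time comparison of secrets: a naive byte-by-byte check returns early at the first mismatch, so an attacker who can time many attempts learns the secret incrementally. A minimal Python sketch, with illustrative function names:

```python
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    # Returns at the first mismatching byte, so response time leaks
    # how long the matching prefix of a guess is.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of where the
    # first difference occurs, removing the timing signal.
    return hmac.compare_digest(a, b)
```

Both functions return the same answers; the difference is only in how long they take to say "no", which is exactly the signal a timing side channel exploits. Hardware-backed key storage, as described above, addresses the physical side channels (power, electromagnetic) that software alone cannot.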
When we find that there are weaknesses that are being exploited, it's our responsibility to the customer to patch those and to upgrade them. So I think when we were looking at things like Touch ID and Face ID, we were looking at the security benefits. We're not doing an analysis of the constitutional trade-offs that those may implicate in edge cases. I think we're looking at getting our users from the 40% who were putting any type of passcode security on their devices to 90% and over. And so that's just it for us. I don't know if you want to speak to that as well.

I'll say, from the Android side, with biometrics introduced, of course it's up to the user whether they want to use it or not. And there's also a feature where, if they want to temporarily clear biometrics, they can do that. And one of the things that we try to do is, you know, being an open-source platform, we kind of try to really be transparent about what features we implement and how they work. So I think we talked about this feature at I/O. We published a blog post about it. It's been picked up by the media. You know, maybe we could do a better job of creating FAQs about it as well, but it's something that is available for users.

That's a good point. The same functionality, with the ability to take advantage of these technologies or not. Yeah, very back. Yeah.

Real quick. I was asked recently to go to a conference and ask questions about Afghanistan and taxpayer spending. And I was told to get the iWatch 4 and the new iPhone before I started asking questions about the issue. And I wanted to ask about the security, because I know that the authorities are aware of the SOS feature. So you obviously interacted with them when it came to the safety or the security controls. I guess my watch calls 911 if I fall. So you did work, I guess, harmoniously with law enforcement on these devices.
Can you talk about that and how you created this SOS function?

So unfortunately, my expertise is not on the SOS function. But I mean, essentially, 911 is a fairly well-known and public number, and getting the device to make that call under certain circumstances doesn't require us to work secretly with any government agency or law enforcement. So, well, good. I hope you don't need it. But, no, I mean, I think that's probably the best answer I have for your question.

Just one more question, the gentleman here in the back.

Thank you. There was some discussion about the possible response if the US considered some kind of legislation to pry open cryptographic features. Does anybody on the panel know what happened in the UK when that was passed, and what the discussion is in Australia, and why it would necessarily be different in the United States, especially in the wake of a possible terrorist incident or criminal incident?

Happy to take it. I think what we've seen in the UK, and what the debate right now in Australia is about, is that they were enacting laws to get to the starting point, to give themselves the authority to compel providers to take actions that, in our opinion, could undermine the security of our users. We have not, at least from Apple's perspective, we have not seen in the UK an attempt to enforce that and actually to compel Apple to make changes to the product that would weaken security. And so I think it's still very much a concern, as we've said publicly in Australia, that we think it's ill-advised to enact these kinds of sweeping surveillance statutes that could be used in a number of different concerning ways. But I don't know when we will face that make-or-break decision where the government actually moves to compel us to make changes to our product in the UK or in Australia. We did see that, obviously, previously here.
And we argued successfully that it was inappropriate for the government to require us to make changes to our products that would undermine user security.

Well, I think that's about all the time we have. I just wanted to say thank you to the panelists, and to everyone for showing up and asking questions. So thank you.

And we're very happy and honored to be able to welcome today Congressman Himes. Congressman Himes has been, and we can take seats, I'm terribly awkward with introductions. Congressman Himes has been representing Connecticut's fourth district for the last decade. He spent eight years on the House Permanent Select Committee on Intelligence, and the last four years as ranking member of the NSA and Cybersecurity Subcommittee on that committee. So thank you so much for joining us today. Thank you. We're so pleased to have you.

And so I imagine in eight years on the House Intelligence Committee, you've seen quite a lot. Among the things that you've seen has been an increased focus on cybersecurity. Every year, the intelligence community provides a worldwide threat assessment, and cybersecurity seems to inch higher and higher each year. Can you talk a little bit about the intersection of cybersecurity and national security, and sort of where you're coming from with that and what worries you?

Sure, sure. And thank you, thanks for the opportunity, New America, for pulling this together. And we'll have a chat, and if we have time, happy to take questions from the crowd here. Yeah, so coincidentally, the 10 years I've been doing this corresponds pretty closely to a very substantial shift in thinking within the federal government around cybersecurity. So I think I've actually only been on the Intelligence Committee for six years, but I was on the Homeland Security Committee prior to that.
And when I was a freshman, senior people in government were just beginning to cotton on to the threat associated with cybersecurity, to the government, to, obviously, the national security agencies of the world, all the letter agencies. But when I arrived 10 years ago, there was just not a high level of understanding, a high level of commitment, to really hardening the federal government's systems and data storage. That changed pretty dramatically in that 10-year period, where, as you say, the heads of all of the security agencies now always emphasize this as something really important to the federal government, but obviously as a real vulnerability to the country. And there's been a concomitant uptick in resources devoted. Now there are no heads of agencies who, just like in the private sector, don't worry about this issue. And of course, we get the fairly discouraging but regular drumbeat of data walking out of our most secure facilities, whether it's CIA or NSA. And so there is a very high level of urgency around this. Coincidentally, I went on the Intelligence Committee literally three weeks before Snowden did his thing. And so here I am, sort of just trying to figure out what NSA stands for, and all of a sudden, the world is on fire. And again, discouragingly, we've really seen that while I think the intelligence apparatus is a lot more focused on this than it used to be, they have come to realize that data protection is a really hard problem to solve. And sadly, I don't think they've fully solved it.

Yeah, and I think one of the things that we've been discussing a lot throughout the event today is how encryption is really one of the best tools that we have to protect data.
So much so that in previous years, former Secretary of Defense Ash Carter, former NSA Director Admiral Rogers, and even more recently William Evanina, who's the current director of the National Counterintelligence and Security Center, have raised real concerns about the prospect of an exceptional access mechanism, an encryption backdoor, in technologies. Can you talk a little bit about what concerns there are for national security when it comes to introducing an encryption backdoor?

Well, let's start with what I can tell you with some certainty, which is that right now, there's really not much legislative momentum to speak of around the idea of a backdoor. Yes, Senator Feinstein and Senator Burr, I think, have not given up on their efforts, now over a couple of years. But even if they come out with something that they call a bill, I think it probably has difficulty getting any traction whatsoever. And that, of course, reflects the substantive concerns around privacy, first and foremost. The left and the right often meet on privacy issues; you saw this with the Section 215 and the 702 debates. But there's also, I think, a growing realization that if you deliberately create vulnerabilities, nobody is safe. Not just the good guys but the bad guys, who in many instances are less constrained and just as talented as the good guys, could exploit those vulnerabilities. And, of course, the reason you hear an Admiral Rogers take the approach that he did is that, while there's a limit to how far this discussion can go, it will not surprise you to know that not all of the government's sensitive data and not all of the government's sensitive transmissions occur over proprietary networks or hardened networks, right? We rely on encryption to keep our very sensitive national security information safe. And so if you're an Admiral Rogers, you both appreciate the need, the absolute life-or-death need, to keep secrets within the government through strong encryption.
You also, perhaps even more than somebody like Jim Comey, appreciate the offensive capabilities that we have and that we want. And so you're gonna have a slightly different perspective than if you're a Jim Comey who really wants to break into that iPhone in San Bernardino.

Yeah, well, and so I think this brings us to some previous statements that you've made about encryption. The idea that encryption is a fact of life, which is, I think, something that the FBI and other state and local law enforcement have run into and seek solutions for, but at the same time you've said that it's not a problem that's gonna go away simply because Congress passes a bill. What is your perspective on that? Is this because encryption has gone so global and it's just available to anyone who can access the internet? Where are you coming from when you think about that?

Yeah, so again, I don't think there's much danger that Congress is just going to pass a bill, and most of that resistance to the idea of any sort of extraordinary access or backdoor is probably driven by privacy concerns, and deeply felt privacy concerns, on both left and right. The idea that there should be a deliberate vulnerability so that only the good guys can get into my medical records just doesn't sit well with most people. And I think as time goes by, it sits well with fewer people. Now, this is a representative body I serve in, right? I mean, emotions matter. So if all of a sudden there is a San Bernardino times 100, you saw what happened with the Patriot Act after 9/11, all of a sudden all bets are off. So again, if you saw an event where the FBI could credibly claim it might have been preventable, which I think they've had a hard time claiming to date, sentiment could change, and your representatives' sentiment could change as well. But right now there's really not much appetite for anything along those lines. But then you get into the sort of more technical concerns that people can intuitively understand.
Again, if you legally require American technology companies to provide extraordinary access to the US government, well, what about the Chinese government? This is the classic problem of who's the good guy that gets extraordinary access. I'm pretty sure that's not a position any of the technology companies want to be put in. It's a no-win for them, as Google and others will tell you as they try to do business in China specifically. And then again, we've alluded to this: if the vulnerability is there, the bad guys are also pretty good at what they do, and that, I think, just makes people nervous. Then there's a legal question too. I mean, this came up with Apple in the San Bernardino case. It would be a fairly novel thing legally, as Apple argued in San Bernardino, for the government to require the construction of a product that was deliberately designed to pick a lock, if you will. That's certainly not where CALEA is, and it's kind of a novel legal concept. And so I think you'd have a real struggle with that as a concept too, for the lawyers to work out. And I'm not a lawyer, and I'm certainly not an expert on Fourth Amendment law, but that's a novel concept, that the government can command somebody to affirmatively build a product that not only serves the government's needs but arguably damages the strength and marketability of the company's other products. So there's a lot here. Again, you intuitively understand the sentiment in the San Bernardino situation, although, again, that was a post facto rather than pre-event situation. But I think when you really get into the guts of either the technology, the operational questions, or the law, it becomes very, very complicated.

And so when we think about the law in the US, it doesn't seem like there's a lot of momentum.
I think one of the things that we are facing internationally is that you have countries like Australia and the UK that are starting to think through legislation. In the case of the UK, the Investigatory Powers Act has already passed, and so there's a question about how that gets implemented. In the case of Australia, there is a bill that literally this week is being considered that, if interpreted aggressively by their government, could allow those respective governments to compel US tech companies to build their products in certain ways. Do you worry about the effect this could have on US and global cybersecurity? What is the kind of interplay that the international encryption debate has with this debate here in the US?

Yeah, it's a great question, and we've got representatives of industry here. Australia is scary; in some ways China is scary, right? I mean, just the two or three orders of magnitude difference in market sizes. And of course you've got a spectrum of cultures around privacy. The UK has never been particularly aggressive on the privacy of its citizens. Germany, I would argue, is on the other end of the spectrum; they're pretty sensitive about anything that gives the government access to citizens' private information. So I guess that is what it is. There are market forces; China will do what they do. It will be very uncomfortable, I know it is uncomfortable, for the tech companies. Australia probably a little less so. But at the end of the day, again, the reason I feel like this is a little bit of a silly debate: how are you going to keep, let's just use an example, how are you gonna keep an Australian citizen from using WhatsApp? I mean, I assume there's probably some technical way to do that, and there's probably a workaround to that technical way of doing that.
So you get into a world where, and I'm no expert in what Australia is talking about, but there's probably a world where, in some ways, it's a scarier world, because if Australia somehow succeeds in ending strong encryption in Australia, two groups of people are still gonna use strong encryption in Australia. One is gonna be the sort of hardcore privacy people. The other is gonna be the bad guys. And so again, I just sort of wish we could move beyond this idea of backdoors, and you should have Senator Feinstein here to articulate the opposing point of view, because I've got a not-universally-held point of view. But let's ask our law enforcement and intelligence agencies to do what they get paid billions and billions for, to use their billions and billions of dollars of resources to keep up with the technology, not to try to stop it. Historically speaking, trying to stop technology is always a losing bet. So I want the NSA, and by the way, I'm sure this is not a universally held sentiment in this room, but I want the NSA to really think hard about how to crack encryption, because I want them to do that. But that doesn't necessarily mean the answer, of course, is to create a technical vulnerability.

Yeah, so I think we definitely agree with you on that. And it's certainly not a universally held point of view, but I do think that the debate increasingly moves toward how much information is protected by encryption, how many different types of people need it. Earlier at this event, we had human rights activists, journalists, and a representative from the National Network to End Domestic Violence talking about how critical encryption is to protecting all of their different constituencies. In addition to that, I think one of the things that we've seen that relates to the importance of encryption from a national security perspective is how much data is at risk now in our elections.
You're about to assume the chairmanship of the NSA and Cybersecurity Subcommittee, and I imagine that the Intelligence Committee in general and your subcommittee in particular will be very focused on meddling by foreign nations in our elections and disinformation and things like that. Can you talk about the role that encryption may play in protecting our elections going forward?

At one level, that's an easy question to answer, in the sense that, technologically speaking, we kind of know what the standard should be for our election systems. They're highly distributed; that's a good thing. There are still 12 or 13 states that don't have paper backup for optical readers; that's a bad thing, they should change that. You've heard enough about these machines breaking down that you sort of think to yourself, gosh, they should make better machines. And stuff like voter registration records should be protected and encrypted. That's not hard to say. The harder part of this problem, which is a bit beyond the scope of this conversation, is that to meddle in an election, the dumb way to do it is part of the way the Russians did it. It's dumb to break into the DNC, or into John Podesta's email. You're gonna get caught; we're gonna see it; you'll get caught. The smart way to do it is the other thing they did, of course, which was to try to use our very open media environment, our social media environment, to pull at threads and try to widen rifts and throw racist red meat right out there and find those cracks in the American body politic and aggravate those cracks. That's a much harder problem to solve, because we have a free society in which people can say what they want. So, again, at some level, it's a pretty easy answer: our electoral apparatus ought to be strong.
But the more pernicious stuff is really when somebody decides they have an interest in lighting fires under a particular targeted population by playing to their prejudices or angers or grievances. Yeah. So I think one of the things that, hopefully, people will take account of going forward is the importance of security mechanisms like encryption and multi-factor authentication in protecting against those kinds of attacks, so the attackers don't get the data that underlies all the thread pulling, right? So they don't have those emails to use as ammunition in our electoral processes. I'm just gonna ask one last question before we kick it to the audience for some Q&A. Generally, when you're looking at the next Congress and the many issues that you're gonna undertake and conduct oversight of, and this goes beyond encryption, what do you anticipate being the priorities going forward, in your subcommittee but also in the House Intelligence Committee more generally? A lot of the tough issues got resolved in the last couple of years to some people's satisfaction, actually to my satisfaction; on a lot of things I tend to be in a pragmatic, fairly moderate place on this stuff. I thought the USA Freedom Act was a good compromise. I'm glad the NSA is not storing our metadata anymore, and I'm glad that they'll have access to it through a warrant-like process to the providers. 702, again, I was more or less happy with the way that came out. It wasn't perfect, but good; we're sort of past those big issues, which were hard because, like on a lot of issues, the American public doesn't spend a lot of time getting into the intricacies of 702. And so people have very strong emotions that are often based on imperfect understandings of what's really going on.
By the way, that's true of members of Congress too, but it is what it is. So I think some of the hard stuff has been dealt with. I would say that we're gonna continue to invest in our agencies in ways that will make them incrementally better at attacking other people's networks. A lot of us, most of us, all of us, will continue to work hard to make sure that that technology does not get turned on the American people and is used carefully internationally. That's a big deal. There are some really interesting things out there technologically, like AI and quantum computing, that could change the environment in fundamental ways that may require a rethink. And of course there's always the possibility of another, as I said, sort of San Bernardino times 10, one that spends 10 weeks in the public imagination rather than two, which could change the weather around these questions. But I do think that for the Democratic majority in the House, election security will be much more of a priority. My Republican friends were a little constrained from really going hard on election security because it pointed uncomfortably at the whole question of why, and did Russia help Donald Trump? So they were a little bit constrained by that. We're not constrained by that. So I do think we'll be much more aggressive in the area of election security. It'll be interesting to watch this debate. I don't know who's here from the social media organizations, but there's gonna be continued pressure on the technology platforms, on Twitter, on Facebook, to police themselves, and candidly that will be uncomfortable. I've struggled personally, and again this is a little beyond the scope of this conversation, but, you know, Alex Jones, this monster who denies that Sandy Hook happened, my house is 25 miles from Sandy Hook Elementary School. Nobody gets angrier at what Alex Jones says than I do.
But I watch with some discomfort the position that Twitter is put in: you've got to take them off the platform. And my discomfort stems from the fact that I sure don't want the government controlling the information I have access to, and even less do I want Jack and his buddies controlling it, so you see my problem here. And I don't have a good answer for that. Well, I sort of do; it's just not one that's satisfying to my constituents: hey, you're citizens of the United States, and that comes with some responsibilities, including being a good and thoughtful and critical consumer of information. Please, God, don't look to the government or to Jack or to Mark to curate the information you get. Now, there's a line, right? Incitements to violence, et cetera. But that's going to continue to be a debate that will probably be uncomfortable for the tech industry, and even for those of us who would like to see more of the solution lie on the shoulders of American citizens, to be critical consumers of information as opposed to being protected from stuff that's uncomfortable. Yeah, I think that's really important. And I think we've got an audience who's eager to ask questions, so I'll open it up to you, in the middle of the room. If you could just state your name and your affiliation, and instead of offering statements, be sure to ask questions. I will offer a statement. No, I'm kidding, I'm not. I'm Justin Lynch, I'm a journalist with Defense News and the Military Times, and I'm also a fellow at New America. Congressman, thanks for coming. The Trump administration's cybersecurity strategy, released three months ago, talked about, and I quote, modernizing electronic surveillance and computer crime laws, and it talked about working with Congress to do that.
I'm wondering if they've discussed this with you, if you know what they're talking about, and what you think they mean by modernizing electronic surveillance and computer crime laws. Thank you. I don't know what they mean by that, and that has not been an area of particular engagement with the Congress. Those kinds of questions would probably live primarily in the Judiciary Committee, and to some extent in the Intelligence Committee. My committee sadly spent the last two years, under the soon-to-be ex-chairman, trying to find exculpatory and confusing data around the President, so we were sort of out of the game on a lot of this stuff. But no, to answer your question directly: no, I don't know what they mean. Let me tell you what I hope they mean. There are lawyers in the room who understand this a lot better than I do, but we have a bizarre patchwork of privacy protections, and that should be modernized and reformed. As you know, an awful lot of the law is built up around the old twisted-pair, wireline technology, and that doesn't really work for the IP world. Somebody smart, which probably rules out much of Congress, but somebody smart needs to think about the fact that we want to treat information in a more uniform way with respect to privacy, as opposed to basing protections on what kind of wire it happens to travel over. And so there are relics. It turns out, and I'll get this wrong, but emails that have been held for more than 180 days are all of a sudden subject to less Fourth Amendment protection than emails held 179 days. Why? So, again, I'm not answering your question, because I don't know the answer to your question, but I do think there is a lot of room to be a lot more thoughtful about how we protect data based not on the technology over which it travels, but on what it is: my medical records versus my shoe preferences.
To me, that feels like a more logical way to think about what kind of protection should be afforded, rather than whether it's in an IP world or a twisted-pair world. Hi, Congressman, I'm Alan Wheeler with the Chertoff Group. My question is basically about your thinking on the vulnerability equities debate. As you alluded to, the NSA needs to look for vulnerabilities in platforms and in encryption in order to do its work. But as we discovered somewhat recently, the NSA doesn't necessarily keep hold of all its tools as well as it perhaps should, which has led some in the industry to call for some level of knowledge sharing about those vulnerabilities. I was curious what your thoughts were on the balancing act there and the correct approach, and whether there needs to be a more codified standard, while obviously acknowledging the limitations of intelligence conversations. Thanks. Yeah, great question, and I'm not sure I have the right answer. When the NSA discovers a flaw in somebody's operating system, do they keep it and exploit it, or do they let people know? Industry will always be on one side of that, and the security apparatus will probably always be on the other. Given the nature of my job and my appreciation for the things that really help us in our national security, I probably have a bias toward: let's have those guys up at Fort Meade really working hard to find the vulnerabilities, and then, when there is a sense that a vulnerability is no longer a particularly valuable tool, let's share it in a way that fixes it. It is kind of a Machiavellian way to think about it, but we happen to be better at this stuff than most people, so that works if you're better than others at it. That's not necessarily an ethical structure, but it's a practical structure if your job is to keep people safe. So I would probably have a bias in that direction.
I also think it's incumbent on organizations like the NSA to keep policymakers like me, and folks within the executive branch and elsewhere, really up to date on advances in this area. And when I say in this area, I don't just mean breaking into networks and code breaking, but, much more broadly, technology that could revolutionize the field. I talked about AI and quantum computing earlier. Those two things feel to me like they might be, and I'm sure this is a tired metaphor, akin to nuclear weapons, in the sense that they change the way you think about security and encryption. So again, I wish I knew exactly what the right answer is. I probably have a bias toward making sure that they find those vulnerabilities and exploit them right up to the time that it is no longer a proprietary advantage for the country and its national security to do so. I think we have time for one or two more questions. Hi there, I'm Sean Lyngaas, a reporter with CyberScoop. Thanks for being here, Congressman. I'm wondering how, with the new Congress, you might set the tone in terms of cybersecurity, at least in policy signals, specifically around encryption. You might have seen that the so-called crypto wars, the discussion over whether law enforcement should have backdoor access, were kind of renewed when the US and its Five Eyes partners released a statement a few months ago expressing concern about the proliferation of encryption. Given those signals from the administration, I'm wondering if Congress sees a need to step in and at least put some legislation out there, to have a conversation about the importance of encryption and that sort of thing. Thank you. What kind of legislation are you alluding to? Well, there's been legislation in the past. Representative Lieu has put out a bill preempting states from tackling this issue, and there have been policy ideas in that realm.
Anything that affirms the positive value of encryption, given that there have been mixed signals from administrations stretching back years, from both parties. Yeah, I'd be surprised if you saw legislation per se advancing the cause of encryption. I think, rather, there will be a sea change in the House. Jerry Nadler is a very different guy than his Republican counterpart. Adam Schiff is a very different guy than Devin Nunes. Both of them are your sort of classic left-leaning liberals who will be much more concerned with privacy protections, I think, than their Republican counterparts were. That doesn't necessarily mean that legislation will move, but I do think it means that it is that much less likely that a Feinstein-Burr effort is gonna gather any momentum at all. This is something of personal interest to me: I do think there will be a renewed emphasis on trying to jumpstart the establishment of international norms, in whatever form that might take. That is not something that's been prioritized by the White House or the Republicans. So I do think a lot of us will be very interested in saying, hey, there are common interests between us and our worst adversaries, whoever you wanna put in that blank, the Iranians. We all have an interest in going after rogue non-state actors who have offensive capabilities, just to name one area of common interest. So I think it's gonna be more a defensive play against the whole concept of access, with a higher premium put on privacy concerns, and probably an emphasis on working with the world to come up with a better set of articulated norms around cybersecurity. One last question. Yes, in the back. Hello, I have a question about, well, this is mostly about digital security, but what about health security? Is there an overlap? Like 25% of the economy is healthcare, and there's gonna be a lot more telehealth.
Do you think there's a way to get a public option in there, and maybe use some of the money for encryption upgrades as well as health insurance upgrades? Just wanted to sneak that question in. Let me address that very quickly. Here's my best guess on healthcare. You're gonna see a major effort to shore up the Affordable Care Act. You're gonna hear a lot of talk about a public option and an optional buy-in to Medicare. Single payer, Medicare for All, is a little bit contentious within the Democratic Caucus, so I don't think that's gonna come steaming right out of the gate. I am pretty sure that getting healthcare right, at least in the near term, is not gonna generate revenue. Having lived through the Affordable Care Act wars, it's going to require more money to better cover more people, so I don't see it as a likely revenue source. There is a tangentially related, really important thing, which is the protection of our healthcare records, particularly in this day and age of just remarkable ability to read the genome and predict diseases that individuals might be susceptible to. HIPAA was always critical, and in this world the genome may tell us a lot more about ourselves than we can even imagine. I think we really need to be focused on this. Everybody focuses on their credit cards, and that's good, but I don't know about you guys; as interested as I am, I have not done 23andMe, because, to me, that feels like information that is at the very core of our privacy interests, and it probably would be well worthwhile to take another look at compliance and security around our medical records.
Yeah, and for what it's worth, I haven't done 23andMe either, for that exact reason. And I do think this ends on a really good note, which is that when you think of the kinds of information we're protecting with encryption, our healthcare records, our DNA records, our family trees, this can be some of the most personal information we have, and I think encryption is a critical tool for protecting those data, just as it is for protecting our communications, trade secrets, and information that's absolutely essential to national security and to the agencies that you oversee in your role on the House Intelligence Committee. So thank you so much for taking the time to speak with us today. We really appreciate you coming out here and having this conversation with us. Thank you. Thank you so much. And I just invite you to go out and enjoy your phone in the atrium. Thank you everyone for coming.