So let's officially start and let me welcome you to the 2020 Ethics Village. It's always lovely to see you, Privacy and Civil Liberties Oversight Board member Travis LeBlanc. That's a mouthful. Given that we are, as everyone knows, in the midst of a worldwide pandemic, we're doing this virtually this year. We would obviously love to see you in person and be having this conversation live in Vegas, but we will make do with technology and see where we go. So let me give a brief introduction of who you are and then we'll jump right into questions. Travis LeBlanc is a board member of the Privacy and Civil Liberties Oversight Board, and before he was confirmed to what is known as the PCLOB, the acronym for the board, he was appointed by the US Department of Commerce and the European Commission to serve as an arbitrator under the EU-US Privacy Shield Framework. Because of the way that the PCLOB is set up, only the chair of the PCLOB is a full-time employee, so to speak, which means that everybody else generally has other full-time jobs, which you do. You're a partner at Cooley, where you're on the management team of Cooley's litigation department, and you've also served as vice chair of Cooley's cyber, data and privacy practice. Travis is a leading authority on cybersecurity, data privacy, telecommunications and the regulation of emerging and disruptive technologies. Now, before all of that, Travis was chief of the FCC's, the Federal Communications Commission's, Enforcement Bureau during the Obama administration, where he spearheaded hundreds of enforcement actions involving consumer issues such as false advertising and the Telephone Consumer Protection Act, unfair competition, regulatory compliance and fraud, waste and abuse of government programs.
And of note, especially for this conference, he brought the first data security action by the FCC, and he led enforcement actions resulting in fines together totaling over $100 million, which I think was a record for the FCC's Enforcement Bureau. And before all of that, Travis was a senior advisor to former California Attorney General Kamala D. Harris and a special assistant attorney general of California, where he oversaw California's complex litigation and policy in areas such as technology regulation, high-tech crime, cybersecurity, privacy, intellectual property, antitrust and telecommunications. Indeed, he created the privacy unit that is responsible for enforcement of California's privacy laws, and created California's eCrime unit, which has the primary mission of investigating and prosecuting multi-jurisdictional criminal organizations, networks and groups that perpetrate identity theft crimes, use an electronic device or network to facilitate a crime, or commit a crime targeting an electronic device, network or intellectual property. And before all of that, he served during the Obama administration as an attorney advisor in the US Department of Justice's Office of Legal Counsel, which advises the president, attorney general and executive branch agencies on the constitutionality and legality of the programs and activities of the US government. So welcome. I'm first gonna focus a bit on your current role as a member of the Privacy and Civil Liberties Oversight Board and, for those watching who may not be familiar with what the PCLOB is and does, ask whether you could please give us a little background about the board and the projects that the board has been working on of late. Well, thank you, Stephanie, for that very kind introduction. I also wanna thank the Ethics Village for having me join you all virtually this year at DEF CON.
I was in Vegas last August and was around DEF CON as it was taking place, and got to see a lot of good friends who I will miss not being able to see, at least in person, this year. Listening to you talk about my background, Stephanie, you definitely made me feel like an old man because there are a lot of years there going back, but that's okay. That is A-okay. Maybe it's just that you've done so many things in a short period of time. One could look at it that way. That's where all my hair went. That's the explanation for it. Look, I'm thrilled to be here and to share with folks some of the work that we're doing at the PCLOB. The Privacy and Civil Liberties Oversight Board is an independent agency within the executive branch. It was created out of a recommendation from the 9/11 Commission. As you all may recall, one of the main challenges that the US faced around 9/11 was the lack of intelligence sharing between the intelligence community and law enforcement. That resulted in a host of new authorities that built in a number of surveillance tools that could be used by the United States government to help prevent another terrorist attack in the future. Well, along with those surveillance authorities was a desire to ensure that the privacy and civil liberties of US persons were being balanced against the use of those authorities to protect the nation from terrorism. The Board is run by five members who are all appointed by the President and confirmed by the United States Senate. Three are always members of the party of the President. So currently the majority of the Board, three members, are Republicans, and two are Democrats. I'm one of the Democratic members of the PCLOB. As Stephanie mentioned, the chairman is full-time, but the other four members are all part-time, which means we balance our work at the Board with employment outside of the government. In my instance, I'm a partner at a law firm, Cooley.
And I have been on the Board since last year. We are in the process of reviewing several projects, but our jurisdiction primarily extends to the programs and activities that are meant to protect the nation against terrorism. Much of the work that we do is classified. We all have security clearances at the highest level and are not prevented from gaining access to any information because it is classified. Our work generally breaks into two buckets. One bucket is advice and the other bucket is oversight. With respect to advice, we are generally in the capacity of being asked by a component of the intelligence community about a program or activity that it is engaging in, or is seeking to engage in, and about the extent to which the program could better protect or does protect the privacy and civil liberties of U.S. persons. Those advice projects we typically do not publicize, in part because they're often classified, but also in part because we like to incentivize the intelligence community to seek out our counsel and have a little bit of independent advice being provided to them about ways that they can improve the privacy and civil liberties protections of the programs and activities of those components. The second bucket is our oversight projects. This is where we often will elect to look at particular programs or activities of the government and decide which ones are worthy of having the PCLOB's independent oversight. We have several of those ongoing right now, but we've just recently finished up a review of the National Security Agency's program to collect call detail records, which you'll hear me refer to as CDRs. Essentially, the NSA for many years operated a program where it collected large-scale metadata about telephone calls, primarily calls made to numbers outside of the United States, but there were ways in which it could catch calls within the United States.
We found in our report that this program was ineffective, that it was needlessly intrusive into the privacy and civil liberties of US persons, and that it was extremely expensive. In the few years that it operated, since 2015, the program cost about $100 million. And for that $100 million, where they collected a very, very, very large number of details about telephone calls, the NSA produced 15 intelligence reports. Of those 15 intelligence reports, the Federal Bureau of Investigation, the FBI, concluded that only two had information the FBI was unaware of. So for years of collecting a massive number of calls by US persons, there were only 15 reports that were generated from this. It was literally looking for a needle in a haystack to try and find something. There were also numerous compliance incidents and data integrity concerns with the program as well. So much so that a little over a year ago, the NSA decided on its own that it should shut the program down. We certainly concurred in that decision and put out a report to improve transparency into how this program had operated. In my view, it was the digital equivalent of the bridge to nowhere. The NSA has deleted all of the records associated with it and the authority for the program has now expired. In addition to the CDR project, we have several ongoing projects that we're working on right now. A few examples: we're looking at the use of open source intelligence by the Federal Bureau of Investigation, essentially gaining access to commercially available databases, or how they might mine social media or other open source materials for investigatory purposes. We are looking at the terrorist watchlists that are out there and the standards around who gets on them, how you get on them, how you get off them, and what protections are in place to protect privacy and civil liberties.
We're looking at a Department of Treasury program, the Terrorist Finance Tracking Program, as well. And then we are looking at a program operated by DHS, the Department of Homeland Security, involving biometrics and the use of biometrics in airports. As many of you may be aware, the Department of Homeland Security uses a series of biometric collection tools in airports, everything from fingerprints to facial recognition technology. Our project definitely has a huge focus on facial recognition technology and its use by the Transportation Security Administration, by Customs and Border Protection, and by airlines and others. We're examining what is right now mostly a pilot program, but one that DHS certainly anticipates rolling out long-term at all entry and exit points in the United States going forward. So that, Stephanie, that's a lot of information about some of the work that we're doing. I definitely did not share everything, but hopefully that answers the initial question. Absolutely, you've given us a lot to talk about. So let me just dig a little further into two specific things that you mentioned. The report that the PCLOB produced on the Call Detail Records program — this of course isn't the first time the PCLOB, under different members, I should say, looked at the Call Detail Records program. One of the things that caught my eye about the current board's evaluation of this program is you all were really able to shed light on the cost of this program versus the efficacy, the results. And I think you really illuminated for perhaps the federal government and certainly the public that one of the things that needs to be considered is whether we're really getting a bang for our buck with these kinds of programs. So were you surprised at how expensive this program was versus kind of the limited end product that it produced? Oh, I definitely was surprised about the disparity.
I mean, when I saw 15 reports, and I know everything that had to go into operating this program, I knew it was extremely expensive. As you may be aware, under the statutory authority that was available to the NSA, they had the ability to pay telecommunications carriers to actually participate in the program. And so that is an expensive endeavor as well. So it was very important to me that we understand the cost of the program, because as we are evaluating the impact on privacy and civil liberties, we ought to compare that to the effectiveness of the program and also the cost of the program. A hundred million dollars to intrude into people's privacy and civil liberties and get almost nothing from it is not really worth it, in terms of advising Congress about whether or not the program should be extended or advising the executive branch about whether or not it should be ended. So I do believe that it is critical that we have an understanding of just how much these programs cost so that we can really weigh the benefit of the programs against the expenses, not only to privacy and civil liberties, but also to the public fisc. So in agreeing with you on that point, I should have said at the beginning that my name is Stephanie Pell. I'm a professor at West Point, but anything that comes out of my mouth, although this interview is about your views, anything that happens to come out of my mouth are my views and do not represent the views of the Army, West Point, or the federal government at large. So anyway, I agree with you. So moving on, you mentioned an ongoing project where the PCLOB is looking at DHS's use of biometric technologies in airports.
I'd like to broaden the biometric technology discussion a little further with you and recall the story that many of us have read in the Washington Post about a man named Robert Williams, an African-American man who was arrested at his home in front of his children and held for a night in jail due to a misidentification by a facial recognition algorithm. In other words, he was arrested based on this misidentification for a crime he did not commit and was held in jail overnight. And I would say that, as horrible as this was, we unfortunately shouldn't be terribly surprised, as we've known for some time that facial recognition technologies disproportionately misidentify people of color and women. So how would you address the disparate impact that facial recognition technologies have on communities of color? And more broadly, what concerns do you have about how the U.S. government collects and uses information produced by biometric technologies? Yeah, we all know and have known for a very long time that facial recognition technology has a propensity to misidentify people of color and women. There are numerous studies on this. In 2018, the MIT Media Lab produced a study finding that facial recognition algorithms misidentified people of color, and particularly women of color, at far higher rates. Last year, NIST, which is a federal government agency, put out a report looking at nearly 200 facial recognition algorithms and finding evidence of bias along lines of skin color. There was bound to be a time, if not many times that have happened, but one that we now know of, where a person of color would be misidentified by a facial recognition algorithm. I refer to these as race breaches. This is essentially a breach that is taking place in the algorithm.
I think that going forward, if the government is going to rely upon facial recognition technology, it has a responsibility, a duty, to ensure that people of color and women are not disproportionately misidentified or adversely impacted by these particular technologies. And part of that is making sure that before any facial recognition technology is deployed, the government or any public authority has given real strong consideration to the impact that this particular algorithm will have, in this particular use case, on communities of color. And that means really beginning to think about not just whether this particular program impacts privacy and civil liberties, but recognizing when that impact falls disproportionately on a particular community, and ensuring that the government is taking mitigating steps, or taking any steps, to remediate that disproportionate impact. It's about making sure, when we are thinking about the deployment of a technology like facial recognition technology, that we're thinking about equality. And that from the beginning, we are building in place protections that are designed to ensure that these programs do not operate in a manner that disproportionately impacts people of color. I call this equality by design. Many people may be familiar with the concept of privacy by design. That is building privacy into a new product so that it respects privacy from the beginning and doesn't collect any more information, for example, than it needs to. Well, we ought to do the same thing around civil rights concerns. We ought to build equality into the very design of the product from the beginning so that ultimately it minimizes any impact that it's gonna have on communities. I don't know that at the end of the day that's ultimately going to solve the problem of bias in facial recognition algorithms.
We've seen a number of corporations over the last two months begin to pull away from their facial recognition products that were being sold or marketed to law enforcement. IBM, for example, has stepped back from it, as have Microsoft and Amazon, all making announcements. And we've seen several cities and states now begin to look at whether to ban the use of facial recognition technology by law enforcement. San Francisco's done it, Boston has done it. In fact, the Massachusetts legislature looks primed to ban its use by law enforcement statewide. And I think that there's no doubt that one of the primary factors motivating these companies as well as these governments to limit the use of facial recognition technology is that they haven't gotten the issue of bias along racial lines correct. And as long as that's there, we have to be mindful about rolling this technology out so that we don't end up in a world where we now have digital biometric technologies that essentially are discriminating against people of color in the same way that so many people have experienced for decades or centuries in this country. We don't want a world where one segment of the population typically has to go through a secondary inspection or additional questions, or is presumed guilty, just by their image. That is the world that we're trying to avoid right now. And the last thing we wanna do is hard-code that into the 21st century by beginning to deploy new technologies for the government to use that could ultimately disproportionately impact and be adverse to people of color. So you are really calling for, it's not a new discussion, but a broadening of the discussion: not just privacy by design and looking at the privacy impacts of government use of certain technologies, but a specific focus on how the use of those technologies could have a disparate impact on marginalized communities or communities of color.
Because if not, we risk using technology to further facilitate racism at scale. That's exactly right. I mean, these algorithms are created by humans. These humans may have biases, or the way that the algorithm is trained could be biased; those are easy examples. But what I am saying is we shouldn't just think about the impact of a program on this amorphous concept of privacy that really focuses on the privacy of the majority. When there is a disproportionate impact on vulnerable communities, or a disproportionate impact on historically disenfranchised or disadvantaged communities, we ought to be looking at that program and scrutinizing it even more. And I think that's what's missing right now. What's missing right now is we often just think about privacy writ large. We don't actually think about how the privacy of the minority is being disparaged by a particular program. The last thing we wanna do is have a world where the privacy and civil liberties of the majority are protected while the privacy and civil liberties of the minority are disregarded or otherwise disproportionately impacted. We don't want privacy to be a luxury good for anyone. Right, and at times it's had a tendency to be that way, unfortunately. So I'd like to now talk with you about a letter you wrote, I believe last month, because we're still in July, to the acting secretary of DHS in your official capacity as a PCLOB board member. You wrote to express your concerns and to raise questions about a DHS program requiring all air travelers to submit to mandatory DHS-administered temperature checks or thermal imaging before boarding a commercial airline. What were some of the concerns that you expressed in your letter to the acting secretary of DHS, and have you received a response? My understanding also is that some members of Congress have now taken up this cause and concern with you. Yeah, I have several concerns about this program.
The plan, as I understand it, is for the TSA to administer temperature checks at security checkpoints in airports. My understanding is that this has been requested by various airlines, in part because they don't wanna administer the checks themselves and because they don't wanna pay for the temperature checks that would be administered. I have strong concerns about the efficacy of temperature or fever checks at detecting whether someone is infected with COVID-19. As you may know, there are many reasons why someone would have a fever. COVID-19 is one reason, but there are a host of other reasons that range from an illness to a condition, a heart condition, for example, or maybe the person had just been running through the airport to get from the check-in desk to their flight because they're late. I'm also equally concerned that if someone wanted to suppress their temperature or their fever, that's also easy to do by taking aspirin or ibuprofen, for example, or just putting a little ice on your head for a little while before you walk up to the checkpoint. Thirdly, it's not apparent to me that TSA agents are trained in asking questions or administering public health examinations. I have no idea how TSA would plan to train these agents, whose mission is supposed to be the security of air travel, not necessarily the public health of air travel. I'm not saying that there aren't public health experts in the government that do have the training to do this. Namely, the Centers for Disease Control and Prevention, the CDC, would have this expertise, but the CDC has already publicly stated, or at least information has been made publicly available, that they don't wanna participate in this program because it's not effective. So there are concerns about efficacy, there are concerns about the authority that TSA has to administer this program, and there are also concerns about the government's collection of this information. What are they collecting?
What are they going to do with this information? What database is it going to go into? And how is this going to impact travel? Are they going to ban passengers from traveling for 14 days? Are they gonna create a no-fly list or add you to the no-fly list? What happens if you come with a doctor's note that actually says, I just got tested yesterday and I don't have COVID-19, even though I have a fever? Or more importantly, who's going to reimburse the individual for the plane ticket that they won't be able to use? Are they gonna be guaranteed refunds by TSA? Are they gonna be guaranteed a seat in 14 days? What if they're going for their cancer treatment? Are they gonna be denied the right to travel to their doctor? Or are they going to be told to just go home and stay at home and wait? It just seems that there are so many questions that haven't been answered, not only about what they're collecting and what they're doing with it, but also about how this is gonna impact the traveling public. I also have concerns about the disproportionate impact that this program may have on people of color. The CDC has put out several reports on how COVID-19 is disproportionately impacting people of color. African-Americans are more likely to die from COVID-19 than other groups. African-Americans, Latinos and Native Americans are more likely to be hospitalized than other groups. If we assume that COVID-19 disproportionately impacts people of color, then we should assume that a program that is designed to root out COVID-19 is going to ultimately end up disproportionately impacting people of color. And it is extremely troubling to me that the right to travel, and travel by air, when in many instances there isn't an adequate substitute available to those folks, that they would somehow be denied that right. And that could impact your job. It could impact your health.
It could impact your family or personal situation in ways that are, I think, potentially deeply troubling. And so I'm hopeful that DHS will rethink this program. Several members of Congress on both sides of the aisle, Republicans and Democrats, also seem to be concerned not only about the efficacy but about the impact of this program on privacy and civil liberties. And hopefully TSA and DHS will come to their senses and realize that they shouldn't be in the business of administering temperature checks in airports. And if they do want to actually increase the safety of air travel with respect to COVID-19, there are other measures that numerous experts have reported on that airlines, for example, could take to better secure air travel and to better ensure that it's a healthy experience for all passengers. Well, I look forward to what, if any, responses are forthcoming. So as you mentioned before, the PCLOB has broad discretion and authority to conduct oversight and provide advice to the US government with respect to the implementation of executive branch policies, procedures, regulations and information sharing practices relating to efforts to protect the nation from terrorism, in order to ensure that privacy and civil liberties are protected. This authority therefore covers the activities of both law enforcement and intelligence agencies. And given the scope of the board's discretion within the context of terrorism programs, if you alone could determine the priorities for the PCLOB, what projects would you add to its current slate? A very good question. There are so many issues within our jurisdiction for us to look at that the challenge we really do face is balancing the limited resources we have against the projects that really do stand to benefit from oversight by the PCLOB. Among the topics that I would prioritize would be stingrays.
Many folks may be aware of the existence of cell-site simulator technology that would allow someone to intercept telephone calls as well as text messages if they operate a stingray near your location. I think it's important for us to examine the use of stingrays by the intelligence community in the efforts to protect the nation against terrorism. I also think that there'd be some value in the PCLOB looking at when components of the US government decide to use hacking tools, and beginning to think about what the standards are around those: who they can be used against, what approvals need to be put in place, and how they can be secured to ensure that they don't get into the hands of an adversary, which unfortunately we've seen happen more than once. Thirdly, I'd wanna look at a full review of the Foreign Intelligence Surveillance Act, FISA. I think there are a lot of issues there that we could explore, everything from the use of foreign intelligence surveillance information in criminal prosecutions, to the approvals for those authorities by courts, whether intelligence courts or traditional Article III federal courts, your normal US district court, to looking at the amicus program that operates under FISA. I think there's just a lot of work around FISA for the board to do, and that itself could be full-time. I'd also wanna look at domestic terrorism issues. Typically the board in the past has spent almost all of its energy, if not all of it, focused on foreign terrorism, yet we know that the number one threat to US persons is actually domestic terrorism. This has been the case for centuries. This isn't a new evolution. And in fact, every year it's a greater threat, but we don't spend a lot of time really looking at the privacy and civil liberties impacts around domestic terrorism.
And then, finally, I'd wanna look at the sharing of national security information with non-national security agencies, such as Immigration and Customs Enforcement, really beginning to think about the extent to which national security authorities are being used for non-national security purposes. And I think that's an important task for us to take on, because more and more government databases are being created that are intended to integrate and consolidate information from a lot of sources. And once data is in there, it's very hard to get out. It's sticky data once it's in there. So I think it's important for us to get a greater understanding of a lot of these intelligence sharing authorities and tools that are being used. Well, that is certainly a worthwhile and very expansive list of projects. I will just take moderator's privilege and say perhaps that is an argument for more resources for the Privacy and Civil Liberties Oversight Board. So I'll piggyback that comment to the next question and say, given the privacy and civil liberties concerns that spring from law enforcement use of new and emerging technologies, and you've talked about a number of examples — facial recognition, other biometric technologies, big data analytics, stingrays — in a counter-terrorism or terrorism-focused context, could you see an important role for the PCLOB to play beyond the specific frame of privacy and civil liberties in the context of protecting our nation from terrorism? And I guess what I'm asking is, should Congress expand the PCLOB's jurisdiction to oversee and address privacy and civil liberties concerns that reach beyond the nation's counter-terrorism efforts? Well, look, far be it from me to decide for Congress what the jurisdiction of our agency should be. That's a question for Congress, and they regularly consider it. In my view, privacy and civil liberties concerns have only increased over the last decade or two.
And those concerns about government or even corporate surveillance are such that there needs to be some agency that is looking into those issues. One easy example of that is everything that's happening right now with COVID-19 and the pandemic. We have a lot of debate right now about contact tracing, and a lot of debate about what authorities are needed for public health agencies to collect data about each of us, not only to help find a vaccine, but also to be able to go back and contact individuals that someone who's infected may have been in touch with in the prior 14 days. There are a lot of expansive authorities that are being used by government agencies because we are in a pandemic — quarantine, for example. I mean, imagine the impact of a quarantine on privacy and civil liberties. Yet we don't actually have an agency in the federal government that's tasked with looking at the privacy and civil liberties impacts of all the activities around the pandemic. What agency is going to look at that, for example? And we don't have agencies that are devoted to independently looking at the privacy and civil liberties impacts of the general activities of the Federal Bureau of Investigation, for example. So at least in my view, I think it's high time for the United States to have two things. One is basic federal privacy legislation; we need it. And number two is an agency, or more than one, you know, choose the number of agencies, committed to the task of protecting the privacy and civil liberties of the US public. And that may be in the corporate context, but it also, frankly, should be in government, making sure that there is some independent check out there on the expansive use of surveillance authorities, whether or not they involve terrorism. So switching gears a bit.
As we mentioned at the beginning, in addition to your role as a P-CLOB board member, you're a partner at the Cooley law firm with a very interesting privacy and cybersecurity practice. And of course, as a practicing lawyer, not everything you do in that role is public, and in addition, due to obligations to your clients, there are limitations on what you can discuss publicly. However, there are two interesting cases involving the Computer Fraud and Abuse Act that you've been litigating that have received some press coverage, and to my knowledge, these are the first two cases of their kind. The first case is the Broidy case, where on behalf of Mr. Broidy, you sued the country of Qatar and a number of individuals for their role in hacking and leaking Mr. Broidy's email. As one news outlet reported, quote, this is a case about a hostile intelligence operation undertaken by a foreign nation against American citizens who have spoken out against that country's support for terrorism and who have entered into significant business relationships relating to defense and counter-terrorism with a rival nation. The other case involves a lawsuit by WhatsApp against the NSO Group, an Israel-based spyware firm, alleging that NSO's Pegasus spyware was, among other things, used to hack the phones of 1,400 users between April 2019 and May 2019. There are very interesting aspects of both of these cases, but let me ask you, because these cases illustrate how foreign governments are increasingly attacking individuals, hacking their phones and networks: what do you think we should do about this problem?

There is no doubt that numerous nation-states are targeting U.S. persons and corporations in hacking campaigns. Sometimes they are doing this to gain access to embarrassing information about an individual and then to release it.
Other times they're using it to gain access to the intellectual property of a company, which they will exploit either by giving it to a competitor or releasing it. Other times they are seeking access to engage in a disinformation campaign. All the while, no matter how much a company or individual invests in cybersecurity protections and safeguards, many of these nation-states have almost unlimited resources and ability to continue to target that entity. Even when you have the best security, they are likely at some point to find a way in. That way in may not be through the CEO; it may be through the CEO's executive assistant, or the CISO's spouse may be a way in. These are sophisticated campaigns being waged against Americans. And when they are brought to the attention of law enforcement, law enforcement is limited in what it can do. Yes, it's possible to indict a Chinese national or two or three, and someday those people may be brought to justice. But today that's not going to stop the company that's been impacted from having a lot of trouble as a result of the hack. And unfortunately, litigation too has its limitations, due to a law called the Foreign Sovereign Immunities Act, which essentially gives any foreign government immunity from litigation in the United States, with a couple of exceptions. It is now time for the United States to really rethink the Foreign Sovereign Immunities Act and the extent to which it ought to be available as a defense to foreign nation-states that hack US civilians and US corporations. Unless the government really steps up to protect us, we are all limited in our ability to respond. There has been some legislative movement on the Hill looking at this particular issue in Washington, DC.
I'm hopeful that Congress will be able to address this issue, because it's not just that the privacy of US persons is being put in jeopardy; there are hundreds of millions, if not billions, of dollars also being put in jeopardy as a result of the ability of foreign nation-states, rogue nations as well, to target US persons and companies for cyber attacks.

This is a complicated problem to solve. And I imagine, in addition to what you have mentioned, we're going to have to consider how this reflects on US policy as well, and how we engage and use our capabilities with respect to other foreign nations. So I will not elaborate beyond that, but I want to very much thank you for your time and for giving us some deeper insight into the work of the Privacy and Civil Liberties Oversight Board. I hope to continue this conversation with you in person, maybe in Vegas next year. Please be well.

Thank you so much, Stephanie. It was great to see you, a wonderful conversation, and I too hope we can continue this conversation next year in Vegas. Take care.