Good morning, West Coast, and good afternoon, East Coast. I'm Daniel Sargent, Associate Professor at the Goldman School of Public Policy at UC Berkeley. I'm excited to be here today, in partnership with the Center for Security and Politics and the Center for Long-Term Cybersecurity, to host and moderate a discussion about the risks and opportunities of emergent technologies with two leading national security experts and lifelong public servants: former Secretary of Homeland Security Janet Napolitano and Senator Mark Warner of Virginia. Our event today will be an hour-long fireside chat followed by 30 minutes of audience Q&A. Please use the Google Form linked in the event description to submit your questions. Please allow me to introduce our distinguished guests for today's discussion. Senator Mark Warner was elected U.S. Senator from Virginia in November 2008 and reelected to a third term in November 2020. He serves on the Senate Finance, Banking, Budget, and Rules Committees, as well as the Select Committee on Intelligence, where he is the chairman. From 2002 to 2006, he served as governor of Virginia. The first in his family to graduate from college, Mark Warner spent 20 years as a successful technology and business leader in Virginia before entering public office. An early investor in the cellular telephone business, he co-founded the company that became Nextel and invested in hundreds of startup companies that created tens of thousands of jobs. Janet Napolitano is a professor of public policy at the Goldman School and directs the Center for Security and Politics at UC Berkeley. Previously, Janet served as the 20th president of the University of California and, before that, as Secretary of Homeland Security under President Obama. She is a former two-term governor of Arizona, a former attorney general of Arizona, and a former U.S. attorney for the District of Arizona. Professor Napolitano, I am pleased to turn this conversation over to you.
Well, thanks, Professor Sargent, and good afternoon, Senator Warner. It's great to see you. Nice to see you, Madam Secretary. I hope you had a good Thanksgiving, and I know you have a very busy schedule over the next few weeks, so we really appreciate you taking this time to be with us and share some of your thoughts about new technologies, emergent technology, cybersecurity, and other issues as well. So let's just dive in if we can: what do you view as the key emergent technologies that the United States needs to be preparing for? Is it AI, quantum computing, anything else? What are your thoughts there? Well, Janet, one, it's great to see you, and I appreciate this opportunity; and Professor Sargent, thank you for the introduction. As you know, though I didn't say it, I was fortunate enough to get involved in an emerging technology back in the early 80s, the beginnings of the wireless industry. Literally an hour ago I finished a Zoom with all of my interns, and to see the looks on their faces when I said cell phone technology was a cutting-edge emerging technology in the early 80s; they were kind of baffled by that, because it seems so old school at this point. So I'm not sure I can fully answer the question of what the emerging technologies are. I know from the intelligence standpoint, we look at artificial intelligence, we're looking at quantum computing, we're looking at hypersonics, we're looking at supply chain issues around things like semiconductor chip shortages, both legacy chips and cutting-edge next generation. But I think about last week, or two weeks ago, when our committee had a fascinating brief on what has taken place in both biotech and bioengineering.
And there was a great analogy made by a professor from Stanford and a startup, well, not so much a startup anymore, a company in life sciences that's done quite well, making the analogy that biotech and bioengineering are now where computing, ones and zeros, was in 1975. And if we think back to 1975, there was a lot on the horizon I'm not sure we could have fully predicted. So I would put that list out there, but I think we have to include life sciences. I think we have to include the challenges and opportunities, especially from a national security standpoint, of overhead satellites and all that is happening there. And I think we also have to recognize that no matter what definition we put around emerging today, we have to be nimble enough to amend it, so that when the next thing comes out of Berkeley or Stanford or MIT or Virginia Tech, we're flexible enough in our thinking to move as technology moves. Yeah, you used to have a pretty funny line you used in speeches about Nextel. Do you wanna share that with the audience? I would always point out that, having been involved in Nextel and co-founded it, I was the only politician who said, even when I was speaking, leave your cell phone on, because if it goes off, other people hear an annoying sound; I hear cha-ching, cha-ching. And I was talking about that joke last night, and I realized even making that joke today, Janet, would not go over well with most Generation Z kids, because they'd say, well, who uses a cell phone to call somebody on a voice call anymore? Yeah, time goes very quickly, both in tech and biotech, which I'm very interested that you raised. What do you think are some of the risks that we already know of associated with new technologies? I'm reading the new Henry Kissinger, and it's not just Henry Kissinger, it's also Eric Schmidt and another author, on AI at this point.
And obviously I think AI poses some of the most dramatic issues: the ability for machines to outperform humans; the ability for machines not only to outperform, but to take on a decision-making process that may be independent of humans; the idea that we would turn choices, whether defense systems or others, over to machines; machines where there may be algorithmic bias, too often built in by people who look like you and me as opposed to people who look like the rest of the world. How we sort through all that, I'm not sure we know at this point. Let's just start with AI for a moment. If we look back now to the late 90s and the 1996 Telecom Act, which put in place things like Section 230 and gave social media platforms the total freedom to break things with total impunity, I think I would have had a more fervent debate. I would have come at it as a policymaker saying, maybe we should have put at least some initial guardrails in place, or have a trigger that says 10 years from now, we're gonna put some guardrails in place, because we weren't sure, as we were thinking about social media platforms, about both the good, but also, I would argue, the dark underbelly of social media. And in terms of now putting in place some of those rules and regulations, it's been, frankly, one of Congress's greatest failings that we've done nothing. Zippo. California obviously has moved further ahead on privacy, but on other guardrails, we've not been able to put any in place around what has become a fairly mature technology.
When we think about AI, which makes the issues around social media platforms look puny in comparison, the idea that we're going to allow this set of technologies, machine learning, AI, and the fact is we don't even have full definitions around each of these, to evolve on its own, kind of in the wild, without any guardrails, and then think that 10 years from now we're gonna come back and put guardrails in place, that worries me. Now, if you were to say, all right, what guardrails would you put in place, I don't feel confident enough to answer that by any means. But I would hope that we would try. When I was in London recently at Google DeepMind, there were some fascinating things going on there. They're moving ahead on a series of areas that, at least on first blush, could benefit humanity. I wish there were an equal set of academics, ethicists, lawmakers, and policymakers thinking about how we ought to put some guardrails around what's clearly going to be something that fundamentally changes all of our lives. And I just worry at times that with some of these technologies, when we wake up to the reality of it, it may be too late. Take the life sciences brief, where the CEO was talking about the idea around nitrogen, I believe it was nitrogen, the need for it in fertilizer, and how much money we spend trying to create these fertilizers, whereas soybeans have the ability to do this process literally through the root structure. I had thought biotech overpromised and underdelivered for years, but literally they're talking now about grafting that kind of biological engineering onto corn or wheat or other products. And obviously, environmentally, it's the right thing.
You know, it sounds like all these wonderful upsides, but for every upside you can think of in bioengineering, there's also the chance of a dramatic downside. So how do we sort this out in a world where, frankly, our policymaking in Washington has gotten extraordinarily slow, as you well know? And, I won't keep going on on this question, but it's an important one: one of the things that has really concerned me over the last few years has been the fact that America, and I would almost argue the West writ large, has retreated from a lot of the standard-setting entities where these new technologies are developed on an international basis. I saw this as a former telecom guy on 5G, where frankly we were asleep at the switch, and China, and when I say China, my beef is not with the Chinese people, it's with the PRC and Xi Jinping's policies, kind of flooded the zone and took over the standard-setting entities around 5G. That has implications well beyond the technology standard. So how do we think about each of these emerging technologies in a more holistic way, and not simply be chasing the technology, or chasing the venture capital that's making money off the technology? Yeah, I think that's right. And so I take it you would recommend that the administration lead an effort to re-engage internationally with standard setting for these new technologies? The short answer is yes. And when we think about standard setting, it's more than just, again, as a telecom guy, which frequencies and the basic technology nuts and bolts. I think we implicitly build our biases in, and when those biases are coming from democracies, we build in the notion of some level of transparency. We build in the notion that maybe there should not be total governmental control.
We build in the notion that if a call is going from Berkeley to Buenos Aires, it maybe shouldn't be routed through Shanghai, the way Huawei equipment would route it. And I do think there is a chance here to re-engage an alliance of the willing amongst democracies around the world, one that is not and should not be American-only driven. And I think there is a real willingness inside the administration. I had a long talk with Secretary Blinken about this recently, where they are putting in place both an emerging technologies division at the State Department and a focus on these standard-setting bodies. In the past, we would send governmental people, and private industry would also come to these standard-setting bodies, and generally speaking the West drove the agenda. I don't want to focus entirely on Huawei and 5G, but this was a case where government stepped back, particularly under Trump, and on top of that the private sector was not sending as many experts to the associations and entities setting the standards, and China flooded the zone. So I think this is one of a variety of areas where there could actually be a reemergence of American leadership in combination with our allies. Yeah. Senator, to what extent does the ability of Congress to be agile and nimble and proactive, as opposed to reactive, require the interaction of both parties? And what do you see as some of the issues with the kind of partisanship we see out of DC now? Or, if I might add, are these issues ones that members of both parties are willing to engage in? Well, I'll give you the good news and the bad news. And obviously, in your time both as Homeland Security Secretary and as governor, you always led with that effort of trying to be bipartisan. And the value of bipartisanship is not that the resulting solution set is necessarily better.
But it does mean that, particularly in our country, when we go from one team controlling power to the other, we don't re-litigate the issue again. Whereas if we do it with only one team, it's constantly being re-litigated, which I don't think makes much sense. I do think on technology issues there's still more agreement. One of the great frustrations I had, coming from the Intelligence Committee and being among the first to expose some of the manipulation done by the Russians using Facebook and other platforms, is that we still have not been able to move at all in that area. We've not been able to move on privacy. Again, I point out California has moved on privacy, and obviously the Europeans have as well, and we're ceding that traditional leadership. So that would be the bad news. No Section 230 reform, even though Facebook, in my daily Politico brief, says they are in favor of Section 230 reform; we've not been able to reach consensus. Yet I'll point out two areas where I think we are finding agreement. One is the need for America to up its game in the global and domestic production of semiconductor chips, given the shortages there. I mean, the CHIPS Act, which is part of what's called USICA, a broader-based research bill, got 68 votes in the Senate. It's crazy to me the House has not taken it up, but that is an area with bipartisan support and some big money: $52 billion on semiconductors and $2 billion on 5G and O-RAN, open radio access network. I'd also say the defense bill is up right now, and one of the amendments we're trying to get on it, which I think will get 75 votes, is around an area where you have a lot of expertise, cybersecurity. As you know, we have no mandatory reporting requirement for cybersecurity incidents.
So thank God SolarWinds reported and Colonial Pipeline reported, but neither of those entities even needed to tell CISA, which is the new entity, I know you supported it, that's ultimately been created to provide that non-regulatory, domestic cybersecurity expertise; there was no requirement to report. So we've worked to put in a reporting requirement with appropriate indemnification and privacy protections, because you've got to not only tell the government, the government's then got to share it with other folks in the private sector. Those are two examples of where there is still bipartisanship, and I take more than a little bit of pride in this: the Intelligence Committee, which I'm proud to be chairman of, kind of views itself as the de facto technology committee, because there is no committee on technology in the Senate, as you know, nor in the House for that matter. The House has got a Science and Technology Committee, but it doesn't have as broad a scope as some of the things I think we're looking at. We stay bipartisan on these issues, from AI and quantum to concerns about hypersonics. So there's some good and some bad coming out of this, but it is an area where, in the past, America would have already exerted its leadership, and our failure to do so, I think, has cost us. Yeah, I think that's right. USICA is an interesting bill that was pushed, as you said, by bipartisan members of the Senate. I think Senator Schumer played a leadership role in getting it through the Senate. What all is in USICA besides the CHIPS Act? And I wanna return to the CHIPS Act in a moment. USICA has basically $52 billion for chips and $2 billion for 5G and the next generation beyond 5G, called open radio access network. Then there is roughly $150 to $200 billion that's not appropriated, so it's simply authorized.
And a lot of the dispute in that part has been over a dramatic plus-up of the National Science Foundation. There are other areas, but one of the battles became: should we simply plus up the National Science Foundation, or should we plus up the Department of Energy and our national labs as well? Again, you guys in California are blessed with great ones; we've got at least one in Virginia. We reached some accommodation between NSF and DOE, and frankly it was not my debate; I'm not sure whether we hit the right split there, but I think that could be resolved. A lot of this USICA battle has been, I think, twofold: less a partisan battle and more a traditional Senate versus House one. The Senate did all this, with Todd Young, a Republican senator, and Chuck Schumer taking the lead on one piece of it, and John Cornyn and I taking the lead on the CHIPS piece, and the House felt they were left behind. So some of the traditional House versus Senate exchanges. And I think there is also an understandable concern that this be viewed as a pro-America research and development bill and not as an anti-China bill. One of the things I'm very sensitive to, or at least try to be, is recognizing that when we talk about a rising China, we make clear who our beef is with, and that some of this anti-China rhetoric not become a tool for anti-Asian-American or anti-Chinese-American discrimination, which I know we have seen; there have been stories recently where sometimes I think the FBI and others have gone a bit too far. Yeah. So, and I need to disclose for the audience, I'm on an advisory committee for Intel, which is a semiconductor manufacturer, the largest in the United States and indeed the world. But I think one of the things we've noticed is that semiconductors are pinch points in the supply chain.
If you don't have enough semiconductors, you can't produce enough cars; you can't produce enough of just about anything, because almost anything one uses has a semiconductor chip or chips in it. And this leads to the issues with China: the bulk of semiconductors are actually manufactured at foundries in Taiwan. And that raises the issue of China, Taiwan, and the United States. Are we prepared, and have we thought this through? What are your thoughts there? Well, this is an area where you probably have more expertise than I, but I've really tried to go to school on the industry over the last couple of years. Interestingly enough, as you know, this has been a boom-and-bust area for some time. We have two fabs in Virginia, one that's expanding and one that's actually been shut down for a number of years because it went through this boom-and-bust period. Matter of fact, pre-COVID there were even some concerns about oversupply in the chip industry. What happened with COVID was that we had this dramatic cutback in capacity, and at the same time consumers moved towards buying more electronic products as they operated from home. That's where the chip manufacturers moved their business, and legacy industries like the autos got left behind, one of the reasons why we still have some of those auto plants sitting idle. So I think we do have to recognize there is a boom-and-bust cycle, number one. Number two, we have to recognize, as you know from Intel, that not all chips are made the same. There are cutting-edge chips, legacy chips, memory chips, and there are places in this equation, such as packaging and the machining and other areas, where we and our allies, some of our allies like the Netherlands and elsewhere, are still doing pretty well. You point out appropriately, though, that Taiwan has a lot of that cutting-edge manufacturing.
American manufacturing's share of overall chip supply has gone from roughly 40% to about 12%. The PRC itself has gone from 12% and is on a path to 25 to 30%. So you have the national security concerns around Taiwan. You have China itself making huge investments, with estimates of $150 billion worth of capital investment. And you also have countries like South Korea talking about somewhere between $65 and $130 billion of investment, Japan talking about multi-billion-dollar investments, and Taiwan continuing to do well. So there is a question here, on a national security basis, of whether we need some level of domestic manufacturing capacity in our country. The consensus I've come to is that unless we provide some additional subsidy dollars, there will not be new fabrication facilities built in America, because these fab plants run between $8 and $15 billion, other nations are willing to subsidize $2 to $3 billion per fab, the plants take a lot of land and a lot of water, and the industry experts say we're about 20 to 25% more expensive here in America. I think we have to make these investments from a national security standpoint, for maintaining a domestic supply chain, and for keeping some of the innovation here; I still think having these long-term facilities in America makes a great deal of sense. Twenty years ago this would have been called industrial policy, but when we see other nations, particularly China, make these kinds of huge multi-billion-dollar investments, not just in chips but in a series of other areas, I think we in the United States, and by implication the West, need to pony up as well. We need to put our money where our mouth is.
And I've found a lot of my Republican colleagues, and I think about John Cornyn or John Thune or a number of other senators on the Intelligence Committee, for whom this is a pretty dramatic change, acknowledging that the market alone is not gonna solve this. If we leave this to the market alone, there won't be additional fabrication facilities built in this country. We'll lose our domestic production capacity. And we may still keep some of the innovation, but the innovation oftentimes goes with the fabs too. So I think this makes sense. And this is why USICA, or the CHIPS bill, is so important: getting it right and having a process for that $52 billion, about $12 billion on research and roughly $40 billion that could help subsidize seven to ten new fabs built over a number of years here in this country. We gotta make sure we put appropriate controls in place, because what we're doing in chips we may have to do in artificial intelligence, and we are already doing in quantum computing. There may be a series of other emerging technology areas where we have to make these kinds of large-scale investments, not just on the research basis but on the development basis as well. Right, so I think thought needs to go into what the critical elements of emergent technologies are where the United States, from a security standpoint, needs to put some of its public dollars to remain competitive with the world. And that's a very different approach to government spending than we've seen before, where we're actually putting in significant dollars to support one private industry. And I get this. Matter of fact, over the Thanksgiving break I read a fairly good critique of the CHIPS bill saying maybe we're putting the money in the wrong place, and I would push back against some of that.
But I think maintaining domestic chip fabrication facilities, and the basic research, and investing in some of these other cutting-edge technologies is, I would argue, more important than adding an extra plane or ship or tank, because I think most of the competition in the 21st century is going to be around who wins the technology evolutions, not who builds the most traditional military hardware. Although, as you well know, even the most sophisticated military hardware has got to have those bleeding-edge chips as an absolutely critical component. Oh, for sure. And so you can't easily segregate between hardware that you need on the military side and what we need simply to supply domestic production of consumer goods and other material. Well, let me just add one thing here, Janet. One of the reasons I think there's been part of this evolution is that COVID showed that the just-in-time global supply chain model we've all gotten used to has its limits, and that at the end of the day it may be worth an extra few cents on a chip, or an extra two cents on some PPE, to make sure there is a domestic, or if not entirely domestic then allied, supply chain. Because, again, my friend Debbie Stabenow, who argued very strongly with me in one part of the battle to make sure at least some of this went into legacy chips, which I was not 100% sure of at the beginning, would point to the number of auto plants sitting idle in Michigan right now because they don't have access to chips; all the tanks in the world aren't gonna fix that. Yep, yep, that is for sure. And turning to USICA, Mark, what do you hear on the House side? Is it gonna be stuck there forever? Is it gonna move? I think there is a recognition; there was an announcement right before Thanksgiving that we would have a conference on this bill. I'm not sure what the House is going to do. What's the House bill?
There were some minor bills that were kind of in this neighborhood, but nothing of this scale. So it is, in my mind, one of the highest priorities that we need to make happen. I think Secretary Raimondo, the Commerce Secretary, has been great from the administration in pushing this and advocating for it. I wish the White House had been as active in pushing for this. Again, this was a case of a bill that was passed in July, and, to put on my partisan hat for a moment, coming out of some of the challenges around Afghanistan, if we'd sent a strong signal by passing USICA that we were stepping up our game, not only on chips but on investment in emerging technology, I think that would have been a great win for the president and a great signal to the rest of the world. And it's been more than a bit frustrating; other than this kind of infighting over who gets credit, House versus Senate, there's not been a lot of clarity. So I'm hoping to be on that conference committee, and I'm ready and anxious to get to yes, the sooner the better. Yeah, yeah, agreed. We've talked a little bit about cybersecurity, but let's turn to that directly if we could. When I started as secretary in 2009, our chief threat stream still involved aviation security, and I spent maybe 10% of my time on cyber. By the time I left, four and a half years later, I was spending a good 40 to 50% of my time on cyber. I mean, it was exploding. Our hope is that we don't end up in an actual kinetic war with China, but we can anticipate that we're in a battle, in a way, in the so-called gray zone where cyber is concerned. And, you know, how do you see our interactions with the Chinese in this area, but also the Russians, the Iranians, and others that are active in the cybersecurity realm?
Well, I think the current Homeland Security Secretary is probably spending 60 or 70% of his time on cyber, along with CISA, which, again, is the independent agency that's been set up since. It took us too long, not due to you by any means, but it took us too long to stand up that domestic, non-regulatory capability, the fire department you call when the fire gets lit: call CISA so they can help with a public-private response. Huge, huge issue. I mean, China: during your tenure with President Obama, I think you did send a strong signal to China, and for a while they cut back on some of the intellectual property theft, but it is estimated that China steals $300 to $500 billion a year of IP from us and from around the world. That's a lot of dollars that, if you don't have to invest them as a nation-state, you can simply acquire; you can steal that through cyber, though it's not entirely cyber, it's also through joint ventures and other things. And let me also say, I think a lot of American and other businesses have turned a blind eye to the Chinese government's bad behavior because of their fear that they would lose the Chinese market, and consequently have made compromises they wouldn't make anywhere else. But that intellectual property theft is where China has done most of its activity. Russia has done more traditional information exfiltration, the way SolarWinds was, but also, Russia and its quasi-agents, people who may work for the GRU during the day and as cyber criminals at night, have been much more ransomware-based in their attacks. But I'd like to step back for a minute: in SolarWinds, the bad guys, the Russians in this case, as it has been attributed, got into 18,000 companies.
Luckily they just exfiltrated information, but if that had been a complete denial of service and shutdown of those 18,000 companies, our whole economy could have come crashing to a halt. So this is an area where we are still vulnerable. And there was a fascinating story I saw in the public press over the last three or four days that shows where this kind of cyber activity, still not kinetic, but conflict, may be headed. That is the case, and I've not gotten independent intel on this, this is just from public domain stories, of the recent back and forth between Iran and Israel, where the Iranians thought that the Israelis were shutting down their gas stations, which drove up the cost of their gasoline and caused complete disruption for a couple of weeks in people's ability to get gasoline in Iran. And the Israelis have then said the Iranians broke into the biggest gay dating website in Israel and disclosed a million and a half Israelis' private information. So this is different from a ransomware attack, this is different from stealing intellectual property, this is different from traditional spying, but this may be where cyber conflict is headed: you're not bombing someone, you may not be violating the rules of war, but you are definitely affecting a domestic population's lifestyle. So whether it's number one, two, or three, if you ask what, as chairman of the Intelligence Committee, keeps me up at night, cyber is definitely one of the top three. Yeah, I have to say, I'm glad to hear that, but I'm also not glad to hear it, because we wish it weren't such a present risk, but it certainly is.
You know, one of the areas of risk associated with new technologies is the risk to our democracy itself, and the role of social media in being sort of an accelerant on the flame of extremism on both sides, though primarily, in the US recently, we've seen it on the right-wing side. What do you think Congress can do, or should think about, there? Well, we ought to do some version of what California and the Europeans with GDPR have done and put in place some basic privacy rules, number one. Number two, and these come before the potential of breakup, which I've not moved to yet but am open to if we don't make some progress: I think we ought to import some of the ideas from telecom. As I mentioned earlier, I was a telecom guy. It used to be really hard to move from one telephone company to another for long distance, so we created number portability. I think we need data portability and interoperability, so that if you get tired of a certain platform, you can easily migrate with all your data to NewCo and still talk or communicate with people who remain on the previous platform. So, data portability. Next, and this is something I know the California legislature looked at but wasn't able to get across the line, and it probably would be too much for the American Congress to grapple with, there's the idea that Facebook and Google and Twitter and so forth are free. You and I both know they're not free. Their model is simply based on sucking information out of us and then monetizing it. I think there's nothing implicitly or morally wrong with that necessarily, but I think we ought to become informed consumers. So I'm a big believer that there ought to be some requirement that these platform companies share with their consumers, or their products as the case may be, how much the data they're sucking out of us is actually worth. So, some level of data visibility law.
And then I finally think we do need to take on what I referenced earlier, Section 230, which back in the late 90s, when this legislation was passed, basically put in place complete impunity and a complete legal liability shield against any of the content on these platforms. Maybe that was right in the late 90s. I'm not sure, 25 years later, it still makes sense. And again, even the large platform companies like Facebook say they're willing to make changes there. And we have made certain changes, I mean, on child pornography, bomb making. I've got legislation with Amy Klobuchar and Mazie Hirono, what we call the Safe Tech Act, that would say let's at least enforce certain things that are already illegal when social media is used to do them. So if you're committing a civil rights violation in business and you're doing that over social media, there ought to be some liability. If you had an Alien Tort claim, which is basically what arose when the Myanmar government was using Facebook as a platform to encourage people to go out and murder the Rohingya, there should be some liability there. If you have levels of cyber bullying that are illegal in certain other areas, maybe they should be illegal as well on social media. The ability to enforce injunctive relief: I mean, there was this horrible case of somebody on, I think, the Grindr site, where somebody manipulated him, said he was another person, had his life basically ruined, and there was no ability to get injunctive relief to try to prevent what the platform didn't even deny was happening, because they said Section 230 protects us. 
And then also, I think the First Amendment obviously needs to be preserved, and you have a right to say stupid stuff, though whether you have the right to have it amplified a billion times remains to be seen. But I don't think there should be that same kind of First Amendment protection if a platform is receiving benefits from paid advertising. I mean, there are prohibitions on television and radio and other mediums: if you are selling a faulty product or a pyramid scam, you can be gone after. There is no such prohibition on social media. So our Safe Tech Act, I think, preserves the First Amendment but gets at some of this. It's not full content moderation, but it simply does enforce some of the laws that are already out there now on social media. There's also been, and we're still looking, we've had some good conversations with Republicans to get bipartisan support, and there is a companion bill already in the House. There's been another approach, which has gotten some bipartisan support, that I've also been looking at, which looks at, again, easy to say, hard to put in place, the algorithmic biases that may be at work if the algorithm is sending you disproportionately to someplace. And I don't fully understand how they're shaping this, because it's easy to say, but if it sends you to an illegal site, by definition, that is a little harder to sort through. But even this week it's been suggested that the House Energy and Commerce Committee, where probably this will more likely arise, is gonna have a hearing on this whole universe around Section 230. That, I hope, would move. I've gone on way too long on this answer. 
I wish we could have moved some of these other areas like data portability, data valuations. And I've got bipartisan legislation on what's called dark patterns, in which, again, I think you know and this audience probably knows, but for those who don't: when something comes up on a site, you have no basic ability to opt out. You get the big flashing light here to sign up, and you've got to go through three different pages to find a place to say, no, I don't want this. So that's technically called dark pattern usage, and that ought to be prohibited as well. Right, so what you're saying is that there ought to be some guardrails, I'll use that word, going to the business model that the platforms use, as opposed to government necessarily itself regulating the content. Yeah, I don't think we're gonna get to content regulation, because of the First Amendment and because I don't think in any bipartisan way you can get there. And I also don't know if I want to. I disagree with some of my friends on the right who say these social media companies have an implicit anti-conservative bias. I think their bias is actually to make money. Their bias is to make money. And if you look at who the top 10 posters on Facebook on a daily basis are, most of your audience has probably not heard of seven of them, because they're far right-wing bloggers and posters. So yes, I think we obviously have to respect our First Amendment, and I think there are ways to respect that First Amendment but still put some appropriate guardrails in place. You know, and one last comment, because I've been following this abroad as well. 
I know this is a hard subject to grapple with, because even when you look at content moderation in countries that don't have a First Amendment, you know, after the great tragedy in New Zealand, the mosque shooting, and some of the activities in France, the Charlie Hebdo shootings, and manipulation of social media in the UK and elsewhere, none of these other countries, or the Western democracies at least, have totally sorted this out. I actually think the British are gonna come out with some legislation that they think will get fully vetted early next year. They may be one of the first, but even for countries that don't have First Amendment protections, this is not an easy needle to thread. And that's one of the reasons why I think there's so much attraction to, and why I've not ruled out, the other idea: if we can't put some guardrails in place, or if we can't add some more pro-competition notions like data portability and data valuation, then, you know, some of my colleagues have said we need to look at full breakup, and I'm not taking that off the table. Okay, good to hear. So, 230 is still an active topic of conversation with your colleagues. It is, and how we can be in a situation, after the whistleblower from Facebook testified, what was it, a month or two ago, with absolutely damning comments about how that platform, in her case, I think some of the most powerful testimony was about manipulating young women around eating disorders and other issues, how we can say that status quo is acceptable is just beyond me. Yeah, yeah, totally agree. You've mentioned several times working with the other side of the aisle, and you were one of the lead negotiators on the so-called Bipartisan Infrastructure Act that the president recently signed. How did you get that done? And, you know, I'm gonna ask about the reconciliation bill, but first of all, how did you get the bipartisan bill? 
Give me the nice one first. Yeah, here's the slow pitch over the center. Part of it was, you know, and the mainstream media's attention span is pretty short, which is not exactly a news flash, but people are saying, how did this group come together? Well, most of this group, at least eight of the 10 of us, had worked very closely together with then-Secretary Mnuchin on the last COVID relief bill that took place in December of 2020, the so-called 908 bill, where we had $908 billion of relief. So many of these, my Republican friends like Susan Collins and Lisa Murkowski and Bill Cassidy and Mitt Romney and Rob Portman, we had worked with in the past. So this group kind of came about, with Rob Portman and Kyrsten Sinema, who were not in the 908 group, working with us. We had a preexisting working relationship, and so there was trust. And it was still hard sausage making, it took a long, long time to get there. And, you know, the fact that we got 19 Republicans, and kudos to my Republican colleagues who took enormous amounts of grief for working with us, but it's kind of hard to deny that, for a nation like ours that hadn't made a meaningful investment in infrastructure in 50 years, this wasn't good policy. And not just roads and bridges, but things like resiliency, things like broadband deployment, $65 billion, things like, frankly, even the energy component, a lot of transition to a smarter grid, to investment in electric vehicle infrastructure, electric buses. It was, I think, a good piece of work. And again, putting on my partisan hat for a second, I thought it was completely stupid that Democrats in the House would not, once we passed that in the middle of July, go ahead and pass it then, to give the president a big win and the country a big win. Yeah. 
And there was a certain irony that it literally got signed about a week after our election in Virginia, and wearing my partisan hat again, where we lost the governorship. And you lost the whole ballot, right? Yeah, we lost the whole ballot, and partially due to the fact that we had this big win that we could have gone and talked to people in Virginia about, and the gubernatorial candidate could have said, listen, here's what we can do about this road, or here's what we can do about this broadband. That would have been a tangible item. So, you know, I'm proud of the work. I think it will be significant for our country in a host of areas. I do think one of the things, and you as a governor will get this more than most of my colleagues, we all know as governors that passing the bill is just the first step; what matters is how it gets implemented. And with this level of new spending, in areas where you're either creating new programs entirely or pumping up historic numbers in things like roads and bridges, we need the best oversight team possible. I think the president's taken a good step with our mutual friend, Mitch Landrieu, but I think there ought to be a whole team of people on implementation. Yeah, yeah. So turning to the next bill, the reconciliation bill, which is now in your chamber in the Senate, again, kind of give me how you see the lay of the land there. Well, I'm actually pretty optimistic. You know, I'm not going to put a date certain on it, but at roughly $1.75 trillion, and I was prepared to do more than that, because I think, again, it's over a 10-year frame. You know, the inflationary pressures we're feeling right now are partially due to supply chain, but to the degree they're due to government spending, it's because of the $5 trillion, yes, $5 trillion, that we spent under both Trump and Biden on COVID relief. 
But I think, you know, if we talk about what's in it that is pro-growth: we know we need more people back in the workforce, disproportionately women. Well, childcare and guaranteed preschool are two pretty good places to start. You know, there was a piece this morning that I saw in the news that one out of every five Americans is a caregiver at one level or another, whether it's for kids or for aging parents; providing some support, particularly for aging parents and the disabled, I think makes a lot of sense. Taking on climate change in a meaningful way: I would do a carbon tax, but even without that, there are hundreds of billions of dollars of incentives. And, you know, I say this as the father of a daughter who is a type one diabetic: at least trying to bring down or put a cap on the cost of a drug like insulin makes some sense to me. I think, and I guess I've been partially guilty of this as well, we've spent the last four months talking about top-line numbers, and most Americans don't have the foggiest idea what's in this legislation. The component parts are popular. So I'm trying to talk about what's in it. If I could have directed it a little bit more, I would probably have tried to do less, for a longer period of time, rather than the whole wish list. Because again, I think you and I both know from our time as governors, and you being more in the belly of the beast of the federal government than even me, that the record of the federal government, under any president, in implementing a whole lot of new programs simultaneously has been mixed, to say the least. Yeah, yeah. And, you know, one always worries when you see a program that is only funded for a year or two years, given how the political lay of the land could be changing, et cetera, et cetera, regardless of the merits of the program. 
So neither party has clean hands on creating fiscal cliffs, whether it's tax cuts that expire at some point or starting new programs. And I think, again, if I could wave a partial magic wand, I would have made it fewer programs. And the only other thing I'd say on this second half, and I'm not sure we can shift the battleship, is that a lot of the initiatives, while extraordinarily important, like climate change and a lot of the other kind of social initiatives in this plan, feel like the list was put together pre-COVID. And if there was a major change I would make in this legislation, it would be thinking through the fact that many Americans are, post-COVID, rethinking what their work-life balance ought to be, and how we invest in human capital and treat that investment, from a tax accounting and reporting standpoint, at least as well as we treat things like research and development and tangible goods. That would have been the area I wish we had spent some more time on. We're about at the end of our part of the session, but let me just ask one concluding question. If you could step back a moment: the United States has been the world's leading economy because we have led in technology and innovation for years. And our universities have been talent magnets from around the world, actually. What do you think the United States needs to do, and what Congress needs to do, to sustain that position as the number one kind of innovation center for the globe? Well, let me step back and say, I think we've seen that without American leadership, candidly, the rest of the world founders a little bit. And we saw that when President Trump so dramatically tried to basically take America out of that leadership role. And I think, waiting for the Europeans or the Japanese or any other countries to take on these macro risks on their own without American leadership, the world suffered. 
I think democracy suffered over the last four years. But to get this right, we need to make these kinds of investments like you see here, but we also need to make sure that we continue to be an attraction for top talent from around the world, through immigration reform, and make sure that our challenges, for example, with China are focused on the challenge of the Communist Party and don't turn into anti-Asian or anti-Chinese political propaganda. I think there are countries like Australia that seem to have managed that particular component, how you deal with the Chinese diaspora, better than America. I think there are things we can learn there. So let's keep investing in our universities. Let's go ahead and realize we are gonna have to get into at least the area of quasi-industrial policy to stay competitive with China in many of these areas. And let's make sure we continue to be the place where the best talent from around the world wants to come study and live their lives. There you go. Thank you so much, Senator. It's been a wonderful conversation. I know we have some questions coming in from the audience. I'm gonna turn it over to Professor Sargent. Terrific. And I'm going to seize the opportunity to lead off with a question of my own, if you don't mind. Earlier in your conversation, I think, Senator Warner, you reflected that you wish there were a set of academics, theorists, sort of policymakers thinking about how to manage the consequences of artificial intelligence for security, for society. And in response to that observation, I would like to ask you both to reflect upon the role that universities can play in managing threats emanating from emerging technologies. Now, I'm mindful as a historian of the role that the service academies historically have played anticipating and responding to disruptive technological change, from the advent of battleships to the rise of air power. 
Do you think that civilian universities, which are after all key generators of technological disruption, could be more proactive in anticipating security risks and social risks that emanate from disruptive technologies, and in participating in the development of solutions? Janet, you wanna go first? You want me to go first? You go first. All right, well, that's a great question, and obviously to a mostly academic audience, I'm gonna say yes. But with a couple of caveats. One, there's an interesting idea that Senator Kirsten Gillibrand and Senator Ben Sasse have that I'm very intrigued with. I wanna see it fleshed out a little more: creating, in a sense, the equivalent of a cybersecurity academy, maybe not guaranteeing military service, but recognizing, again, that this is gonna be an ongoing threat, and asking how you train people and how we kind of move people in and out of government around cybersecurity. And this also begs the question of a kind of nerdy issue, security clearance reform, so you can get people in and out of government on an easier basis. On the issue of AI and the others, I absolutely think the academic community is critically important. I do think, though, it needs to be married in some format so that it gets the recognition that it deserves. I am sure that at Berkeley, and probably at 50 different institutions around the country, not just Berkeley, there are academics or even working groups looking at AI and its implications for ethics and policy. But how that information filters to policymakers, and how it's done in collaboration with the very investors who are making these decisions, the private capital, and some folks in the government, that's where I think we could still make some improvement. But the basic premise is absolutely dead on. Well, I have to agree with my friend, the distinguished senator from Virginia. 
And I think what we need to develop is a better bridge between the academy and the policy-making world, and the political world, quite frankly. And I think if we can do that, one of the advantages to the political and policy-making world is access to those in the academy who are thinking not just of today's technology, but who can see around the corner and see what's in development, what's the next thing, so that we can become, as a country, more nimble, agile, and proactive. And perhaps, for example, on AI, where we know there are technical, legal, ethical, and moral questions, there's the idea of forming some sort of independent commission with kind of one goal: in four months, give us your best recommendations on how we handle AI. And I think academicians would leap at that opportunity. And I should simply add to the secretary's comment that we did have Eric Schmidt and Bob Work do a pretty good paper on AI that had some of this multidisciplinary character, but it needs to be ongoing. And, just so it doesn't sound like I'm totally playing to the audience here, I would challenge the academic world, because I feel, and I see this in my own state with our institutions, that because the sausage making has gotten so ugly, and because in certain areas we've looked so inept, and then you've had the epitome of anti-science, anti-academic leadership under the former president, there's been a lot of academia that's basically said, you know, we're not going to mess with policymakers and politics. And I think that is a horribly wrong decision, for both the kind of intellectual rigor that we need in debates and the ideas. As Janet said, the ability of people who can see around the corner a little bit, we need you more than ever. And it's going to be messy. Yeah. 
And, you know, one of the things that Eric Schmidt and others have spoken about is the need to have a technologically sophisticated workforce in the government. And that's one of the things we've been trying to think through at Berkeley: how do we support the development of that kind of pipeline? And by in the government, I mean both in the interagency and the federal executive agencies, but also in the staff for the Senate and the House. Because, as you know, so much of the prep work gets done by staff. Yeah. And that's again where the kind of nitty-gritty issue comes in, and I've been working on this a lot, and we even made some progress under Trump, getting security clearance reform done so that people can move in and out from academia into the government, and for that matter, back into the private sector. I see on a regular basis good staff, I'm sitting with one right here, who get stolen away by industry or get stolen away by academia, because the sausage-making process, at least recently, has been pretty damn messy. Yeah. We gotta have this ability to come back and forth. For sure. For sure. So let me pick up, in response, sort of this metaphor of bridge building, which you've both deployed, I think, to describe the relationship between academia and government. And I would be really interested to hear you reflect upon whether academics might be more proactive in getting engaged with Congress, as distinct from the executive agencies. Often officials in the executive agencies are really fixated on operational, tactical, sort of day-to-day problems. Is Congress a venue in which academics might be more constructively engaged with longer-term, more strategic-level challenges? I'll start again, since I'm here. The short answer, again, is yes. And I think about the intelligence committee: we have technical advisory groups where we've put academics. 
And, you know, I've been chair for only about 10 or 11 months, but we need to use those more often. And there's a little bit of a challenge, when the Congress swings radically from one end to the other and back, in having continuity, because it takes time to build these relationships, particularly with academics, into the congressional sausage-making process. You know, I don't think any academic is gonna come and feel fully utilized on a short-term basis. There has to be some level of trusted relationship, and that goes again to what Secretary Napolitano said: we gotta do this not just at the member level, but you gotta have those trusted staff people who can continue to build those relationships. But I think that's an area we have underutilized. For that matter, we've even underutilized resources outside of academia. As Janet knows, there's a whole other industry in Washington of very smart people in the think tanks, and at least on the Democratic side, our ability to use even the brains that are 15 minutes down the road has been pretty poor over the last few years. Yeah, and I think, Daniel, one of the challenges is, because of how Congress is involved in sausage making, and it's a big place, you've got 535 members, you've got all the staff, et cetera, it's knowing how to plug in and where to plug in. And, you know, working out some sort of process by which Congress knows where to go, and in the reverse, those in academia know where to go. Great, so let me introduce some audience questions. There's a lot of audience interest in the topic of artificial intelligence. So I'm going to just read a question verbatim. An audience member asks: artificial intelligence is a fast-emerging technology that straddles industries from national defense to healthcare. When will Congress propose definitive federal legislation to create a regulatory body, or even a new department, to manage this new technology? It's a big question, but the topic is so important that it seems worth pondering. 
Janet, I answered first the last two times. You take that one first. You know, my read is that there will need to be some sort of crisis regarding AI before Congress actually acts. But I do think there will be a need for some sort of regulatory approach to AI, and I think we actually would benefit if it were beginning now. But Congress moves when there's a crisis. And I would agree. And I think, you know, that's kind of where we started the conversation. I am of the view that trying to fix this, or put guardrails in place after the quote-unquote AI industry is fully stood up, would be a mistake. So how can we get ahead of it? Or at least, how can we stand up some regulatory entity that would at least be looking at it? But we don't even have a good definition at this point. And, you know, as somebody who's spent a bunch of time on this, I can sort through a little bit of the differences between big data, machine learning, and AI, where one begins and the other ends. You know, I've read Kai-Fu Lee's book about the challenge of the China-US competition around AI, fascinating. But I'm not sure most policymakers can. We need academia, combined with the emerging AI industry, to help us at least get the definitions right, so we can figure out where that regulatory, or at least advisory, group ought to be. Right. And although I recommend that we grab hold of AI now, there are risks involved: incorrect technological assumptions, too much technological specificity, omissions, unintended consequences. All of those go into developing a regulatory approach to a new technology. But we have to recognize those risks, and I think the benefit of grabbing hold now nonetheless outweighs those kinds of risks. Yeah. 
Let me improvise a follow-up question, sort of to that terrific audience question, which is to ask you to ponder whether you think some degree of international cooperation is going to be necessary to deal with threats emanating from this really broad world of artificial intelligence. If we think specifically about, say, the battlefield use of artificial intelligence, there might be precedents in the Geneva Conventions for international treaty rules to limit the deployment of AI on the battlefield, but achieving such progress really requires international cooperation. Do you think there's a realistic prospect of governments coming together to tame the disruptive effects of AI as a military technology? Well, let me start. I think that's a big question. And look at it historically: I'm somebody who's advocated that we ought to have some international norms around cybersecurity, so that you might have a lower attribution requirement, if someone is raiding a hospital system, or breaking in or deploying ransomware, to strike back. But my understanding, at least, is that when the rest of the world, including the Chinese and the Russians, were talking at the UN in the late 90s about international standards around cybersecurity, it was actually America that was reluctant, because we felt we were so far ahead that we didn't want to enter into any kind of international normative entity. Number one. Number two, I do think we need this internationally, because I keep coming back to the CCP. I think we've seen, you know, China, where the Chinese Communist Party has access not only to all the government data, but to the Baidu data, the WeChat data, the Alibaba data, and they have already created an Orwellian surveillance state with social credit scores. 
If that model of AI becomes the dominant one, I think we should all be extraordinarily concerned. So again, this goes back to the point I've tried to make, and I think the administration gets it, but they need to put some real emphasis behind this: there needs to be a coalition of the willing. And it ought to not just be the Five Eyes or NATO, but it ought to include South Korea, Japan, Australia, Taiwan, Singapore, India, Israel. There ought to be this coalition of the willing around, particularly as you said, Daniel, AI. I totally agree. And I think that we should work to establish that kind of coalition of the willing regardless of the CCP. If we wait to see if the CCP will come to the table, it won't happen. But there are many nations around the world, I think, who would be in a coalition of the willing, and amassing that and creating that critical mass would be a good thing. And let me just add one thing on there. I think that may mean, and this'll be part of my responsibility and others' in government, that just because we may have a short-term advantage in some subsets of AI doesn't mean we should walk away from that international order, because the value of doing this collaboration in the long run outweighs whatever short-term leads we have now. I'd argue the same thing about the race on quantum: whoever gets to quantum first can break through, you know, all defenses in the cyber world. But I think we need to think about this in concert with others. And the history of nuclear weapons technology provides sort of powerful corroboration for that point. But let me segue to an audience question about China's authoritarian abuse of artificial intelligence technologies. 
An audience member is curious to know: what lessons can we learn from China's domestic application of AI and machine learning to abuse human rights in Xinjiang, where extensive and highly advanced surveillance systems monitor and automate aspects of forced labor camps? As we think about the problem of how to regulate AI and machine learning domestically in the United States, I think the question is asking us to contemplate: should we see China as a sort of cautionary example? And if so, what do we do about it? Well, using China as a cautionary example, I think, establishes the need for the United States and a coalition of the willing to take on these AI-related issues, because I think the countries that would be in such a coalition don't want to see the Chinese model, the CCP model, as the template. And I would simply add to that that it's not just in Xinjiang. Think about what's happened to the people of Hong Kong, where, by one estimate, 80% of the people took part in some level of protest. And you've heard barely a peep from businesses. I know that in Hong Kong, they feel like they now have to play by exactly the same rules as they would in Shanghai and Beijing, because of this massive ability to use not only governmental data, but access to private data. I mean, the fact that when the CCP changed the laws in China and explicitly said, in 2016, that every company's first obligation is not to its shareholders but to the CCP, that means all that data goes to the state. And it ought to be a forewarning to, again, this coalition of the willing that we need to act together. Agreed. So, we've focused mostly in this conversation on the security threats resulting from emerging technologies. 
But I wonder if we might take the last five minutes to reflect upon the disruptive effects for society writ large, as several audience questions are inviting us to do. So one question that an audience member has posed is: how do we begin to grapple with the consequences of artificial intelligence for distributive equity in society? AI technologies are poised to create new forms of inequality, perhaps greater than any that we've had to grapple with in the past. Is dealing with the social consequences, the economic consequences, of technological innovation within the purview of lawmakers, or is this a problem that is simply too big to address? I'll start on that one. Let me analogize, and I think Secretary Napolitano tried to do this as well when she was governor, when I was governor; she may have been luckier than me. I always think back 25 years to the promises of an interconnected world. The promise of an interconnected world was that you could build it anywhere. And that should have been potentially hugely empowering to rural America. But we ended up showing that we could build it in Beijing and Shanghai and Mumbai; we didn't do a very good job in Martinsville, Virginia, or Roanoke, Virginia, or smaller communities, and probably Janet has got some in Arizona. So I think we do have to be aware of the socioeconomic impact. I think we have to get educated. One of the things we both talked about at the beginning of our conversation was, you know, algorithmic biases that we may not even be conscious of. And I do think Congress has to sort through how we get ahead of that. One of the things I've been working on a lot in the post-COVID era is racial wealth gap issues and access to capital, which, again, doesn't solve every systemic racism problem. 
But if we can have fair access to capital, one of the things that's come out of COVID is that we are seeing entrepreneurship among people of color in this country at an unprecedented level. A progressive government that wants American capitalism at its best ought to be encouraging that, and again, access to capital. But I'm slightly not answering your question, Daniel; I think it's a smart one, but I don't know the answer. Luckily, Secretary Napolitano's got a great, concise answer that's gonna lay out how we figure that all out on a social equity basis. Well, it is a big question, and the subtext of the question is whether this is a concern that Congress ought to take into account as we move forward with AI or other emergent or disruptive technologies. I think Congress ought to be aware of what some of those consequences could be. Once it has that awareness, Congress should look for ways to mitigate them and identify areas of the country that are particularly impacted, positively or negatively. And then there's a fundamental decision to be made: is this something Congress needs to involve itself in, or should we just allow nature to take its course? Let me just add one other thing to that: think how much fairer our economy might be right now if we'd said in the late 90s, and I know this is a little radical, not only to our telecom providers but to some of our platform companies, which have seen probably the greatest accumulation of wealth in American history in the shortest period of time, that there was a prerequisite, not dissimilar to what happened with electricity or water in the 1930s, of complete broadband buildout at equal levels of speed at an accessible price. Maybe that promise of connectivity and the ability to build it anywhere could have been realized to a much greater degree. So we ought to learn from that. Right.
Well, and on that, we now have the new infrastructure bill, which puts some serious dollars into that effort, which I think is one of the best parts of the bill. Amen. Great. Janet, do you think we have time for one last question? I do. Great, so let me give you one last question. We've talked a great deal today about the role that government has to play in defending against the risks and perils that result from emerging technologies. A turn of phrase that has been deployed repeatedly is industrial policy, which I think is a really powerful way to signal the necessity of an elevated level of government action to promote security in the face of disruptive technological change. But the question that troubles me as I think about the regulatory future is whether we can move forward in a way that engages US international partners and allies and does not accept the necessity of treating all foreign competition equally, with the effect that securing our society against the disruptive effects of novel technologies results in our country becoming more insular, more detached from the world than it has been in the past. So is there a way to reconcile the need for a more active government role with the international commitments that have defined US foreign policy since the 1940s? Well, I'm gonna let you have the last word, Janet, so I'll take a first crack at that one. I do think America did that in a way post-World War II, and maybe we will never again have the disparity of wealth we had at that point. And we did it even though we helped rebuild Europe and rebuild Japan; it didn't come at a cost to American industry, which flourished as well.
I think we're gonna have to be a little more willing to share the benefits. The CHIPS bill, which is important to help us domestically in semiconductors, should not be viewed as anti-South Korean or anti-Japanese or anti-Dutch, or even anti-Taiwan for that matter. So how we do this in a way that shares some of the upside is gonna be hard. And then we also have to be on guard: I use the term industrial policy lightly, because its history of picking the wrong horse is pretty broad. But I do think we're now in a world where other nation states are making the kind of investments that we used to make. The Soviet Union was never an economic competitor; it was a military and ideological competitor. But when we have the CCP in China, with an economy soon to be equal to or greater than ours, making those kinds of investments, I think we have to respond in conjunction with that alliance of the willing, and that means sharing some of the upside; not all of it has to be located in America. At the end of the day, I think with our system's distribution of capital, and if we're still able to attract the best talent, we're still gonna benefit, net-net, but it can't be that we win and Europe loses, or our friends in Asia lose. It can't be a binary choice. So I'll just follow up on that, and I'm gonna use COVID vaccines as an example. To the extent that we can support vaccine distribution around the world, the United States will benefit; it will help protect the United States from the next pandemic. And as the Senator and I were discussing earlier, developing a coalition of the willing on new emergent technologies like AI will mean that we necessarily have to share more, but in the end, I think the calculus will be that we will benefit more. And that's the way I think the country should go. Senator, we are at the end of our time. You have been more than generous with your time this morning.
It's been a wonderful conversation. We appreciate what you do in Washington and the contributions you're making. And on behalf of UC Berkeley, the Center for Security and Politics, and the Center for Long-Term Cyber Security, I just wanna extend my thanks to you and to Professor Sargent. It's been a wonderful program. Thank you. Professor Sargent, thank you. And Secretary Napolitano, it's always great, and your center at Berkeley is one of those entities that I've benefited from interacting with and need to do more with. So thank you for the opportunity as well. You bet. All right. Have a good day, everybody. Thanks.