Welcome. For those of you who don't know me, my name is Candice Rondo. I am the senior director for the Planetary Politics Initiative here, and I'm also the director for the Future Frontlines Program at New America. So good to see you all. I know many of you have had a lot of travel, a lot of long journeys, and I appreciate the distances you've covered. I'm looking forward to our conversations today. I'm guessing that some of you, when you got the email from us about a Planetary Politics conference, were thinking that this is some sort of geeky Star Trek conference. I can assure you it is not, because we are all about Star Wars here. And it's not really about space, or at least not yet. Planetary Politics is really a call to action and a conversation about how global power is changing. It's about the fact that our climate emergency, industrial transformation, and digitization are changing how we think about power, about sovereignty, and about citizenship. We're really seeing a paradigm shift today, as I think you all know, in terms of how our institutions respond to that set of changes. The UN, the World Bank, all these big institutions that we formed way back after World War II are not as responsive as we need them to be to these challenges. They're not really fit for purpose anymore, for a variety of reasons. One of the biggest is that power no longer resides just in the West. It's not held only by France or the United States, or even by China and Russia. Increasingly, countries like Brazil, India, and South Africa matter in the conversation about global power and geopolitics. Planetary politics is about acknowledging that fact. It is about acknowledging that the way we think about the future needs to change. Most importantly, we need to think about how concentrations of power and wealth are shutting people out of opportunity. And that's why we're here today: to talk about that specifically in the digital domain.
You all know that digital technology is increasingly centralized, and it's actually more in the hands of companies than of people. Increasingly, there's also a lot of competition and contentiousness around this. But there are opportunities for collaboration too. So today and tomorrow, we're going to talk to you and hear from you about how you're thinking about internet shutdowns, disinformation, runaway chatbots, all of the things that are scaring us about our dystopian future, but also intriguing us about our potential utopian future and the opportunities out there. If all this sounds really serious, it's because it is. But we want you to be playful and imaginative over the next two days, because that's really what it's going to take for us to change the trajectory of our future together. So we're thrilled to have you all here at the symposium. And before I turn it over, I just want to give a shout-out to our team: Lillian Corral and Gordon LaForge, Patricia Groover, Melissa Sallick-Verick. All of you have been very instrumental in helping us pull this event together, and we're very grateful for that today. So let me turn it over to Lillian Corral, our senior director for the Open Technology Institute.

Thanks, Candice. I'll be short, but I'll give you a little bit of context about our work here at New America, especially around technology, and then set us up for the rest of the day and tomorrow. As many of you know, since our founding in 1999, technology has always been part of New America's mission and lifeblood. We carry out research, policy, advocacy, and field-building work that advances a vision of technology that is open, secure, and equitable, a technology that really works in the public interest. I'm really proud to represent a new orientation of our work that's bringing New America's technology programs together into closer alignment.
And I want to make sure I acknowledge many of those programs here in the room. As Candice mentioned, the Open Technology Institute is one of them. But we also have our Ranking Digital Rights partners, the Digital Impact and Governance Initiative, the Public Interest Technology University Network, the Future Frontlines Program, Share the Mic in Cyber, and the Planetary Politics Initiative that's hosting this event. Over the next two days, your work will help us advance the Digital Futures Agenda. It will inform a major report and a set of recommendations that the team will be advancing through the UN and other multilateral and multi-stakeholder processes, beginning this year with the 2023 UNGA. The starting point is to map out the key areas of contestation and harm so that we can then address the bigger goal, which is to develop solutions for a principle-based global governance of digital technology, right? Though we are clear-eyed about the harms of digital technology, we are committed to a positive vision of tech: that it can and should be a force for well-being, for human security, and for sustainable development. You are truly a global group. As Candice alluded, many of you have traveled from as far as South Korea, Australia, and South Africa to be here with us, and we're thrilled that you're here, because we hope that you might actually be part of a global community of practice that can contribute to shaping a new era of global digital tech governance. This event, I want to mention, is co-sponsored with our partners at the University of Denver, Princeton, and Arizona State University. And some of you here are in universities that are also part of our public interest tech network. I only mention that to say that we aim to continue growing our networks and partnerships, and we want to encourage you to recommend others who should be involved in this effort.
We want to make sure that everyone has a seat at the table, whether you're from the North or the South, from all around the world. We think it's highly critical that all voices be represented in this process. I also want to recognize our partners at the Ford Foundation, who have supported our work in countless ways and invested in this effort from the very beginning. We would not be able to do this important work without their support. And lastly, Candice mentioned the team. It is a big team, but undoubtedly, much of this is really Candice's own work. So without further ado, I'll turn it back to her, where she and this great opening panel will give us a backdrop for the work together over the next two days. It's titled Taming the Digital Frontier: Tech Governance in a Fractured World. Candice.

Thank you very much. So we've got a couple of panelists here who really kick butt when it comes to thinking about policy, technology, and the digital wilds we find ourselves in. Let me just quickly introduce them. I don't know if I can read your full bio, because gosh. Jennifer Bachus is the Principal Deputy Assistant Secretary for the Bureau of Cyberspace and Digital Policy. Previously, she served as chargé d'affaires and Deputy Chief of Mission at the U.S. Embassy in Prague, Czech Republic. Fancy. Prior to that, she served as Office Director for Central Europe at the U.S. Department of State, Special Assistant in the Office of the Under Secretary for Economic Growth, Energy, and the Environment, and Deputy Chief of Mission at the U.S. Embassy in Pristina, Kosovo. Wow, this is really incredible. It goes on for a while: 25 years of this. I'm going to get to the good parts. Most importantly, for those of you who are part of the Tufts Mafia here, she is a graduate of the Fletcher School of Law and Diplomacy. She has an MA from the College of Europe in Bruges, Belgium, my favorite, and a BA from Brown University.
And Umu Lee comes to us as the Senior Advisor for Technology and Ecosystem Security at the White House. That's also a big deal. Umu was a staff fellow at the Berkman Klein Center, working on the Assembly disinformation program. Prior to joining the Berkman Klein Center at Harvard, Umu worked for four years as an aide to Senate Democratic leader Charles Schumer, and in that capacity advised on national security policy, international affairs, and defense policy. Umu also worked extensively on legislation to fund the federal government, including defense authorization and appropriations legislation and budget agreements. Umu holds an MSc from the London School of Economics, where she studied global politics and international security. Help. How are we going to get this conversation started? This is so impressive. Really great to have you here. Lots of questions. I just want to open it up, because I think we're going to talk a little bit about AI technology, about cyberspace security, all the things that you two are grappling with on a daily basis while working for the US government. Let me start with a basic opening question for you, Jennifer. Looking at the global landscape and digital tech issues, what are the areas that are ripe for engagement between the US and its international partners? We talk a lot about contention. But where can we actually see the seeds of collaboration?

Thanks for the question. I'm going to step back for a second and take 30 seconds to explain what my bureau does. It's the newest bureau of the State Department. And what I think is interesting, and it will inform how I answer the question, is that we're responsible for the national security, digital economy, and human rights elements of cyberspace and digital policy.
So if you think of that as the totality, the global situation in which we find ourselves, everything from cyber attacks to the digital ecosystem to questions around surveillance technologies, you get a sense of what we're working on, and of all the areas in which working on these issues in concert and collaboration with partners around the world will ultimately lead to success or failure for our vision of an open, interoperable, secure, and reliable internet. So the first thing I would talk about is the UN system. I think we just have to get to the elephant in the room. For better or worse, the UN system is what it is. A lot of countries around the world love it. I think there are very good things about the UN system and very good conversations that we can have there. On the cybersecurity front, we negotiated the norms of responsible state behavior in cyberspace, which were adopted by every single member of the UN. Essentially, they say what you should be doing and what you shouldn't be doing, particularly when it comes to cyber attacks. What is the framework that we all agree on? What's off limits? And we've seen, sadly and repeatedly, that there are a number of states who violate those limits. And then we build coalitions to discuss and to call out the bad behavior. This is critically important, because if you don't put limits on what's allowable in cyberspace, you open the door to the proliferation of really negative technologies. And now we're looking at taking that to the next step. A resolution was passed on what's called the Program of Action, and we're working with partners from around the globe to figure out the next step in what's called the UN open-ended working group on cyberspace. So on the cybersecurity front, the UN is clearly critically important in setting the major stage.
The second thing I would look at is, of course, the election of Doreen Bogdan-Martin to head the International Telecommunication Union. It was critically important to us and to partners around the globe to have somebody in charge of the organization who shares our vision for the internet, as well as the importance of connecting the world. She was the head of ITU-D, which is the development arm, so she comes at it with the idea that we need to be doing more to get the unconnected connected, to work together. She is also the first female head of the ITU, and on an all-female panel, I think we can appreciate how important that is. She champions girls in ICT and other efforts to get everybody onto the world stage. And then finally, I would just touch for a second on the issues around digital freedom. There are a number of different things that we've worked on in this space. I'd particularly like to highlight the US executive order to ban the misuse of surveillance technology, which was announced just before the Summit for Democracy. And as part of the Summit for Democracy, we also built a coalition, one we hope will continue to grow, which says that it's not just us who believe surveillance technology should not be misused, but many partners around the world who see this as a real challenge that we need to face together and against which governments need to take a stance. So that's a sort of broad layout.

Small things. Just a few little small things. But relatedly, you deal with some similar issues, though in some ways there's a bigger palette for you. You have different slices of digital governance. I don't want to begin with the hard questions, but I do want to ask you to touch a little bit on what Jennifer said about the challenges with UN institutions, or just globally: what are we looking at in terms of norms? Where do you sit in this conversation? Where does your office sit in this conversation?

Yeah, yeah, yeah.
Well, first, I'm glad to be here. I'm looking forward to the rest of the day. As a matter of framing, the Office of the National Cyber Director is the newest component in the Executive Office of the President. Its main charge is, first, to unify the federal government's voice on cybersecurity, and second, to act as the lead advisor to the president on matters of cybersecurity policy and strategy. So that is our broad frame. Our North Star for thinking about these issues is the National Cybersecurity Strategy, or the NCS for short. This is a strategy we just released last month, in March. I'll do a quick overview, and you'll see that this strategy answers a lot of these questions pretty well and pretty incisively. First, we know that our adversaries are increasingly using cyberspace as a means to further their own aims and objectives. We also know that cyberspace is now being used as a venue for waging geopolitical conflict. And so, working from that understanding of the landscape, we knew that the National Cybersecurity Strategy had to do two things. First, we had to address those threats head on. But second, we had to establish an affirmative vision for what we know cyberspace can be and do and enable in our lives, because that's the point. The point of cyberspace, the point of technology, and the point of security is to assure the prosperity of people. So over five pillars, the National Cybersecurity Strategy, again, the NCS, does those two things in tandem. Our first pillar talks about the protection of critical infrastructure. Our second pillar talks about disrupting cybercrime, in this context transnational cybercrime, with a particular focus on ransomware actors. The third and fourth pillars are really aligned to how we shape market forces to invest in long-term defensibility and resilience. And by way of definition for us, resilience is the idea that we know we're going to be attacked.
When these attacks happen, the recovery and the response should be swift and seamless. Defensibility is the idea that the total ecosystem should be tipped in favor of defenders, so that our systems, our data, our infrastructure, and our assets are secured by default, and we're not chasing security on the back end. And so over these five pillars, the National Cybersecurity Strategy really does establish this affirmative vision, right? To touch on a little bit of what Jennifer talked about, pillar five of the National Cybersecurity Strategy focuses on how we forge partnerships with international partners across the global north and the global south to establish this broad vision, but also to more evenly distribute its benefits and democratize access to it. So in that pillar we talk about things like responsible state behavior in cyberspace, shoring up global supply chains, and assuring that the internet is free, fair, open, global, and interoperable. And across the other four pillars, which are not necessarily international in name but are international in function, there is a lot of opportunity for us to call in our international partners to collaborate with us. On pillar four, for example, we talk about securing the technical foundations of the internet, which is a necessary condition for the fair, free, global, interoperable internet we talk about in pillar five.

So, I mean, I hear a lot of state in both of what you're saying, and that's your job, because you work for a nation state. But I'm curious, and I think a lot of folks in the audience today will be curious, to hear a little bit more about how the US government sees not just how states interact, right? I mean, there are companies that do things, there are other actors that do things, that are not necessarily working for states.
And in the doing of those things, they create a lot of vulnerabilities for people who are not always as connected, right? There's a big digital divide between the global south and the global north. We all know about that. We also know that there's a huge gender divide, that women are locked out in many ways from getting access to digital tools. There's a challenge with digital literacy. You name it, women face it, right? So I want to ask you both this question, and maybe I'll start with you, Jennifer. What is the US government doing to address this gap between the global north and the global south? How do you see the opportunities there? What are the challenges that you see?

Well, I would start by saying that the United States really strongly believes in the foundation of the multi-stakeholder approach to internet governance. It's one of the reasons why my boss went to the Internet Governance Forum, which is the only multi-stakeholder international grouping that comes together to discuss these issues. There's of course a role, I would say, for governments to play in connecting the unconnected. We have done this around the world. Our bureau actually has foreign assistance funds, so we have this thing called the Digital Connectivity and Cybersecurity Partnership, through which we work to connect the unconnected. We also work through the ITU and other bodies. But the private sector also plays a role in this. They're running undersea cables around the world. They're creating data centers around the world. They're training people around the world. I'm not saying we can only look to the private sector to play this role, but the private sector has to be a partner and not an enemy in these elements. There have to be open conversations about what they're doing, why they're doing it, and essentially what opportunities and vulnerabilities they're creating by doing so.
But there are amazing stories out there of the private sector essentially creating, through low-earth-orbit satellites, connectivity to hospitals that otherwise didn't have it. And that's something the US and other governments are never going to be able to do. I would also highlight the important role of civil society on a whole range of issues. First of all, defending the open, interoperable, secure, and reliable internet. On this question of internet shutdowns, we're tracking it, but we get most of our data from civil society. On this question of how you increase digital literacy, sometimes it happens through grants from the US government, but sometimes through the generosity of people around the world. They're going out and teaching people basic literacy skills. They're giving women basic skills to go out and join the workforce. So there's also partnering for us. In the United States, we are big believers that it's not just the government that does this; there's the private sector, there's civil society, et cetera. And together with our partners in like-minded countries, we work closely with the European Union, we work closely with Japan and Korea, actually with many people in this room. I think it's a community that's coming together to make sure that everyone can benefit from the opportunities that digital technologies provide.

So where does your office fit into this in terms of the global south, global north divide and access?

Yeah, yeah, yeah. So I want to pick up where Jennifer left off, which is that in the Office of the National Cyber Director, we really think about cybersecurity and technology as a people-first endeavor, right? And so that means, by definition, that when we wrote the National Cybersecurity Strategy, a lot of thinking went into how we bridge this divide.
And so I think the National Cybersecurity Strategy addresses this question in two ways, because the big goals are to democratize access to the benefits of the digital ecosystem and innovation, but also to make sure that all of us are lifted up. First, the National Cybersecurity Strategy, and again I'll just use NCS, it's a mouthful, talks about building out the capacity of our partners, especially in the global south, to absorb and action aid, intelligence, resources, and expertise as they work to fend off aggressors in cyberspace. And I know that in terms of international capacity building, there are a number of efforts already ongoing across the federal government, including from the Department of Defense, of course from the State Department, and from USAID. One of the other really important areas for bridging this divide for ONCD is to ensure that we build our capacity to provide crisis response assistance to our partners across the global south, right? In this era, we take as a given that there are going to be attacks from common criminals and from adversarial nation states, and so bridging this divide is really about increasing capacity in the global north and the global south on both of those things. I want to preview something which I think answers this question really well. Over the course of the next year or so, the Office of the National Cyber Director is going to release what we're calling right now a cyber workforce, education, and skills strategy. This strategy borrows from the NCS; it's both a derivative of and a complement to it. A foundational assumption it establishes is that everyone should have access to basic cyber skills, right? Whether you're in a technical cyber field or whether you're a layperson, this is the digital economy we live in. This is the reality of today. So everyone should have access to these skills to participate in this economy.
The strategy is also going to do a number of other things. We're going to talk about how we build out international standards for foundational cyber skills. We will also have some focus on international capacity building, because to achieve the broad goals we've laid out in the NCS, people have to be empowered to meet that challenge. And so that combination of things, cyber capacity building in the international realm, bolstering our capacity to perform crisis assistance and bolstering the capacity of our partners to receive that assistance, in addition to the actions we lay out in the forthcoming strategy, I think leaves ONCD poised to really meet this challenge and help close this divide over the next couple of years.

So I'm glad you mentioned workforce. My brain was kind of churning, and I'm going to do something that is a little off script. Just don't get scared. Over the last couple of days you may have been hearing that the screenwriters' guild has been on strike for a number of different reasons. One of the big reasons, of course, is concern about AI, how large language models may start replacing some of the writers, and in fact are starting to change that particular economy for writers, producers, and the entertainment industry at large. I don't want you to comment on that, because I would never do that to you. But I am curious, because you mentioned workforce: you can roll out a strategy now, today, in 2023, but in about a year's time there's a real risk that AI will have accelerated and advanced so far that some of the premises and principles you're outlining may no longer apply. What are the ways in which you anticipate that? And I want to direct this question to you too, because of course workforce matters for the US, but obviously not everybody can be a prompt engineer. So what's the view on that?
How do you forecast for that potential risk?

So one of the first things I'd say is that when we write strategies, they're designed to be enduring. We are writing a strategy not just for the threats of today, but also for what we can anticipate tomorrow, or 10 or 20 years down the line. And in this moment we can anticipate that artificial intelligence is going to become a significant factor in workforce and all of the issues we're talking about today. At ONCD we think about AI in particular in terms of three components: the data, the compute, and the algorithm. And our role, I think, is really to make sure, as we were talking about backstage, that this technology is as safe as it can be, that we mitigate all of the potential harms we can foresee today, but also that we remain adaptable to the harms we're not able to foresee today. And so in terms of how AI might impact or change the workforce of the future, I think one of the real challenges will be to use AI alongside some of the actions we'll outline in the workforce strategy, to ensure that we are capitalizing on its benefits and mitigating its potential harms at the same time.

What's your view on that? I mean, we've got a lot of risks ahead of us when it comes to AI and the transformation of the workplace generally. We've already seen that with the pandemic, but obviously we're on shaky ground right now. How do you see it?

Well, I think we are clearly in a moment of great change, and I think we can either embrace the moment and figure out the regulation or the bill of rights or the principles or the frameworks that can be the guardrails on AI, or we can hide our heads and pretend it doesn't exist. I think hiding our heads and pretending it doesn't exist is probably a bad plan, because other countries are going to continue to invest in it.
We've seen it from China; we've seen it from other countries. So I think it's important to look at what's already been done on AI. I'd mention two very specific things. The first is the Blueprint for an AI Bill of Rights, I can't remember the final title they came up with, but I think that was ultimately it, because it's not an AI bill of rights, let's be clear; it's not actual legislation or regulation. Essentially it says, here are some foundational principles that companies should be looking at, that we in the US government should be looking at, that governments around the world should be looking at, as we look to harness this innovation without introducing all of the risks. So I think that's a really important one. The second is the NIST framework. I will admit I'm a diplomat, I'm not a techie, but the more I get to know about what NIST does, the more amazing it is. It's very geeky and very in the weeds, so generally they don't go out talking about it internationally, but I kind of think they should. The NIST framework on cybersecurity is a base element of what we talk about with foreign partners, and I think the framework on AI will be similar, because they spent the time to think, as actual scientific and technical people, about what we should be looking at. What are the guardrails? What should you be considering when you're going into AI? So it's not that nothing has been done in this realm. I think sometimes people get the impression, because ChatGPT evolved so quickly, that this came as a total surprise. I don't think it came as a surprise. I was out in Silicon Valley last summer talking to people working on AI who were talking about some of the fringe cases that were worrying them, things they didn't necessarily want their engineers to do.
So some of it honestly comes down to, you know, this is where the private sector does play a role. If you're the head of an organization that is implementing AI, I think you also have a responsibility to tell the engineers, that's a little too far, maybe don't do that thing. But sometimes there's this idea that the government needs to come in and create the rules because, I don't know, the CEO doesn't have enough power? I think of course they have enough power. What's really impressive is that lots of people are thinking about these issues. Particularly, I know there are references to academia here: there are more tech policy centers today than I can probably count, which makes me think there's a realization around the United States that this is an area we need to get into. We can't just build the technologies; we need to build the people who are creating the policy, who can bring those two elements together. And then we need to continue to educate people like me, so we go and talk to these tech policy groups. Educating Congress, I think, is incredibly important, and I know there are a lot of efforts to bring congressional staffers into academia to think about not just the harms but the benefits of AI, and how we can balance those. But ultimately, I think again, this is one of those things where it's going to take everybody, and it's going to take a like-minded coalition of countries from around the globe. One of the great things that we announced this past December, through the Organisation for Economic Co-operation and Development, the OECD, was a global forum on technology, with the idea of trying to continue to anticipate the technologies of tomorrow and use the OECD, which is maybe not the biggest tent but a bigger tent than some, to get like-minded developed countries around a table to figure out: what do we think, and how should we be standardizing, regulating, and collaborating on these technologies?
So you're going right into the question that I really want to ask, which I think is probably on many people's minds. Look, who remembers when cell phones looked like curling irons? I totally remember that. And the first computer I had was a Vic 800 or Vic 1600, which had a little cassette player you just stick in. Yeah, that's how old I am. Obviously a lot of things have changed, but one of the biggest changes, one of the inflection points of this decade, I'd say, is the whole idea of tech diplomacy itself. When cell phones were curling irons and computers were cassette-tape-run things, we were not talking about tech diplomacy. It was mostly, how do I work this thing? So tell me a little bit about what you've seen in the evolution of tech diplomacy, and where the US sits, where other players sit in this realm. And also, are we missing something by the fact that the US doesn't really have a tech diplomacy presence in Silicon Valley? Do we need to do something different? Are there models out there, you don't have to name names, that you see could be borrowed from in the tech diplomacy realm?

So I would say what's interesting is that we've sort of done tech diplomacy for a long time; we just did it in a lot of different, scattershot ways. The office that we absorbed that did ITU work, they've been doing ITU for, I don't know, as long as the ITU has existed, probably. They've been showing up to the ITU; they've been negotiating in the ITU. We've been doing that. We also have been doing cyber diplomacy. This issue of the norms of responsible state behavior in cyberspace has existed in the State Department for over a decade. But the idea of bringing all the strands together and making it a bureau, which is a very bureaucratic thing to talk about, really, like what does it mean that you're now a bureau? What does that give you? Why is that an advantage versus, like, being a special envoy? This is very bureaucratic speak.
So I will skip that one. Happy to discuss if anyone actually cares, but I'm gonna assume most people don't. So I think there have been these strands. The idea is that we should be weaving the strands together. We should be telling a narrative that makes sense not just in Washington but around the world, and we should be training our diplomats to do this. My boss talks a lot about this, and I think it is critically important. We had a cyber course and we had a digital course, and we brought them together into a Cyber and Digital Economy Officers course. So we're training diplomats around the world on: what do these things mean? What is the ITU? Why should you care about 5G? What's a low earth orbit satellite, and what is that actually gonna do? I mean, diplomats, we don't necessarily know these things. So it's just giving people the basis so that they can go out and talk to governments around the world about these issues, as countries around the world are trying to figure out: what's the best way to bring connectivity to their people? What are the benefits? What are the costs? What are the vulnerabilities if you only sign up for Huawei equipment? You need people that can actually talk about this in ways that make sense. There's a whole other strand of discussion. Again, to go to the question of, let's say you have a cyber attack. Well, we do wanna respond to the crisis. How do you do so? But also, maybe we should build a coalition, if it's a state-sponsored cyber attack, so that we call out who did it. But you have to understand what is a norm and what's not a norm, and how you can discuss these things. So we're trying to train the State Department workforce. And I made this argument internally yesterday, because one of the other goals my boss has is that every chief of mission from the career service have some experience in this skill set when they're chosen to be a chief of mission.
So yesterday, the people who make these decisions said, so what does that look like? And I said, look, as somebody who's been doing this work for 25 years, I think everybody should be able to say something that they've done in this realm, because there are these little bits that have been scattered about. So now we're just trying to bring them all together into a comprehensive vision that people can understand and collaborate on. What's interesting to me is that this is not just a question for the United States. It's a question for diplomats around the world. Most of what diplomats wanna talk to me about these days is: how are we organized? How are we training our diplomats? What are we doing next? What is this new skill code that the State Department created? And that's a whole other bureaucratic thing I won't get into. But what was the basis through which you did it? We do this in one-on-one discussions. We have a really great group of people in town this week through something we're calling the Global Emerging Leaders in International Cyberspace Policy. They're here from around the world, they're all diplomats, and they are trying to figure out how they can understand this and how they can be evangelists for it in their own governments. Because we can't do it alone. It's great if I understand the norms, but if I'm going in to a counterpart in Costa Rica who has no idea what I'm talking about, ultimately it's just not gonna work very well. So I think, again, big tent is better. Silicon Valley, I would say, in my opinion, was something that briefly existed in the State Department. We have very, very limited resources in our bureau right now, so I'm not personally planning to send somebody out to Silicon Valley full time. I kind of need everybody here. But we go out there all the time. My boss and a couple of other people were out last week for RSA.
Other people will be out in a couple of weeks for a tech ambassadors' retreat. I was out over the summer. Our digital freedom person was out in the fall. So we have a constant stream of visits, from the experts to the experts. Also, most of the tech sector comes to Washington, D.C. They check in with us; they ask us how we're doing. And we have this great thing called the internet. So we do lots and lots of calls over, name your platform: Zoom, Teams, Webex, I seem to have all of them. We're checking in constantly, through various means, with people out in Silicon Valley. So I think we've got a good connection right now. Physical presence is not as important as making sure that we're getting the policy right. But the Bay Area is so beautiful. Look, let me tell you, we've been lobbied by more than one diplomat who would like us to reopen a presence in Silicon Valley. I would just like to say, I think it's because they want to live in Silicon Valley. But you're from the Bay Area. I am, yeah. And so you must have had some exposure very early on to all of this stuff, growing up there. It was in the swirl. You mentioned something, though, back there about your responsibilities with quantum. Is quantum part of your portfolio? It is, it's part of my portfolio, yes. Okay, but was it part of your conversation growing up in Oakland? Not really. Not necessarily, no. So what are you thinking about it now? How do you see, from your perspective, where you sit, the challenge with quantum? How is that discussed in the strategy? Yeah. So to give a little bit of context on my role and my alignment within the Office of the National Cyber Director, I am aligned to a team called Technology and Ecosystem Security. Our team really has a focus on emerging technology writ large. That includes quantum and AI, but it also includes data security, privacy, and emerging technologies broadly.
I think about quantum much like I think about AI, right? We are in a moment with a lot of promise, and a promise for a lot of change as well. With technologies like quantum, like AI, and even like hypersonics and other emerging technologies, we are at the precipice of really stepping into a new reality in the digital ecosystem. When I was growing up in Oakland, and I also went to college in the Bay Area, I really had a focus on traditional national security back then: force structure, border security, military, national security policy. And it's funny, over time, as I've moved away from the Bay Area, I've started to have a more interdisciplinary focus. So I think that as we look out to what the next 10 or 20 years are going to look like, a lot of us are going to have backgrounds that start to look a lot like mine and a lot like Jennifer's, which are interdisciplinary by nature, right? It's the traditional national security mixed with tech, or mixed with healthcare, or mixed with energy, or mixed with something else, that gives you a full understanding of the issue set, in such a way that you can have conversations with poli-sci majors, you can have conversations with technologists, you can have the DC-to-Silicon Valley conversation, and you're sort of a bridge and a translator between these two worlds. Yeah, so this is a really interesting point, actually, because I think we've discussed this on our team over the last couple of years: the changing nature of international relations just as a field, IR, right? Which is kind of the other industry here in Washington. And for years and years and years, really until maybe the last 10 years, the conversation was mostly, how big is your army, right? What kind of GDP do you have? And if you can't say we are in the top five, then there's no conversation to be had, right?
But obviously it's now more about what your country or your corporation has achieved in the realm of quantum or AI, et cetera. That's kind of resetting the stage, basically. So I wanna talk a little bit about the field of IR itself, and this is a little bit academic-y, so you'll forgive me, but this is where my mind goes sometimes. Thinking about this interdisciplinary moment that you just pointed out: 10 years from now, what will we be saying about what people need to know in IR? What do you need to be a graduate of international relations, and a successful one? You're both in the realm of diplomacy, policy, national security. And I say this mainly because I know there are some people who watch this, maybe not the folks in this room, who are very accomplished, but other people who are watching, especially at our universities, who are thinking: could that be me up on that stage one day? What would you say about the ways in which we need to think differently about IR today, vis-a-vis digital and how it fits into things? What should people be preparing themselves for? What skills do they need? Yeah, yeah. So when I was getting my education, IR was still a very heavily theoretical field. We thought about the realist tradition, these general buzzwords that you learn when you take IR classes, and you come up thinking about, where do I fit in that framework? But you quickly come to understand that the world doesn't necessarily operate in frameworks all the time, right? There is more dynamism than that. So I hope that over the next 10 years, and really over the next 10 or 20 or 30, as I really start to get a lot older, yeah, yeah, that we start to think about this in a truly interdisciplinary way, right? International relations is part national security. It's heavily part technology at this point. It is also about people. It's also about having a heart for people.
It's also about really understanding how to leverage all of your training, all of the theoretical background that you've amassed, to figure out how to solve problems in the real world, right? It's really going to be about taking the theory out of the field a little bit, taking yourself out of the theory, and really finding a place for it in the real world. Thinking back to when I was in college, or even when I was at LSE, I took classes on concepts around responsible state behavior, but I really had to start thinking over time about what that might look like in cyberspace. We have norms around allowable state behavior in wartime and in peacetime. There's an idea that we don't go out and pollute the water supplies of civilians. The idea now has to be that we can't use cyberspace and we can't use digital tools to do that same thing, right? And so we have to take a broad view of everything and think about how you want to show up, what you want to do, and what you want to accomplish, because there are a lot of challenges to meet in this moment. What are everybody's thoughts about that? So as a diplomat and not a technologist, what I would say is you can't be scared by the technology. It's so easy for people to throw up their hands and say, this doesn't make any sense to me. I don't understand blockchain. I don't understand crypto. I don't understand quantum. Name your technology you don't understand, and then say, okay, therefore somebody else is gonna deal with that. Well, no, somebody else is not gonna deal with it. We all have to have at least a basic understanding to be able to have good conversations. And then you get to the baseline of your knowledge and you go to the experts, but you need to at least open up the conversation. And this is actually what diplomats do all the time. I'm constantly going in and delivering démarches around the world.
Not so much in this job, but in my last jobs, where I quickly reached the bottom of what I knew on the subject. Really, all the time. I'd have to go in and deliver my points, and then they'd ask me a lot of very technical questions on a variety of topics, not even necessarily these. And what I would have to say is: that's a really good question, let me get back to you on that one, or let me put you in contact with our experts. And that's the other thing we talk a lot about in our new bureau: we're the back office, the customer service, for these people out in the field who quickly get to the bottom of their understanding of norms of responsible state behavior, or AI, or whatever it is that's coming up in their conversations, algorithmic amplification, all of these things. They get to the bottom, and then they send us an email and say: okay, so X country is asking about this, and what are our thoughts on it? How can we respond? How can we engage in this conversation better? In some ways, when you're out in the field, you can't be an expert in everything. I talk about it as layers of knowledge. You show up in a new country and, I should say, you got through part of my background, but there's a whole part of my background that was in random countries; I had nothing to do with Europe for a really long time. So I never knew anything when I showed up in those countries. The first week, I knew this much. The second week, I knew this much, and it just kept growing, until after three years I felt like I really understood the country, and then of course I left and started all over again. So having done that, I had the humility, when I showed up in this job, to say: okay, I'm not an expert in the topics, but you know what? I'll get to the point where I can have an intelligent conversation about this.
And I think that's really what it is: you have to be curious. In international relations, you have to be curious about the world. You have to be curious about what's going on in people's lives. What do they want to make their lives better? What is it that really matters to them? And then you meet them where they're at, as far as you can go, and then you throw the lifeline back to Washington and say, can I have some help here? I think that's pretty much what it is that we do. Okay, diplomacy 101. All right, so I imagine there are some questions in the audience. We're also gonna hear some questions from folks on the live stream. Let me throw it open. Okay, now I know I'm supposed to pick my boss first, but I'm gonna pick you, go ahead. Do we have a mic running here? There we go. Thank you. Thank you very much. My name is Jerry John, coming from the Kwame Nkrumah University of Science and Technology, Ghana. I'm very happy about the interventions that you have mentioned with reference to how to deal with the whole issue of AI, generative AI, and all that. What I realize is that there is a lot of conversation and a lot of action with reference to regulation. So you talk about the EU, you talk about UNESCO, the UN and all that, coming up with various regulations that help guide the way we should use artificial intelligence responsibly. I recognize that regulations are good, but the implementation is equally important, because there have been several examples of coming up with regulations that people then flout, and there is difficulty with reference to how to deal with that. My thinking is that we should be equally involved in educating people with reference to how to use these technologies responsibly. And what I haven't heard much of has to do with interventions as far as education is concerned. So that is the first point.
The second point has to do with bridging the gap, and that borders on cybersecurity and all that. The world is a global village. The reality is that if you are safe at one end and the other end is not safe, that becomes a vulnerability that can be exploited. To that extent, how are we bridging the gap between the global north and the global south with reference to interventions being made to ensure that people in that area are equally as safe as the others? Then my last point has to do with incentives as far as education is concerned. My thinking is, and I'm coming from an educational background, that when it comes to institutions that are mounting programs and developing modules on the responsible use of technology, there must be some resources made available. That becomes a way of ensuring that people are being educated. I belong to the school of thought where people's hearts need to be touched, so that it's not so much an external influence forcing people to do the right thing; once you have been able to touch their hearts through education, they naturally will do the right thing. That is the assumption, anyway. Thank you. So, just to paraphrase a little bit and not steal all your words: education, what are we doing? How are we shaping that opportunity? Which is essentially also to shape the ethics on some level, right? Which we were discussing backstage a little bit. But also this question of implementation goes hand in hand with that, because so what? You've got a bunch of rules, but people don't really know how they apply to them. They don't have that kind of ethical compass that's imparted through education. Let me go to you, Umu, and then I'll go to you, Jennifer. Yeah, yeah, yeah. So on implementation, I hate to keep bringing up the national cyber strategy, but it answers all of these questions so, so well.
So in the summer of 2023, which is just a couple of months away, we will have a public document released which is going to be called the NCSIP, the National Cybersecurity Strategy Implementation Plan. That plan is really going to detail how we will go about implementing the strategy, how we will go about coalescing the right partnerships across the stakeholder set, including with international partners, to make sure, as incisively as we can, that we bring the issues to life and off of the paper, right? That is the point. On the topic of generative AI and education, this is something that we're really grappling with as we are drafting the National Cyber Workforce and Education Strategy. Education really is the most important basic intervention, right? The education around responsible use of technology, for students and for people who are new to using the technology in general, but also for the educators, right? We are all learning this technology together, at once. It's brand new. And so I think one of the first things we'll have to think about is how we educate the educators. How do we establish the right guardrails, norms, and educational paradigms to make sure that when we educate students, our education is informed by our values, but is also rooted in the reality of the technology today and as it will evolve in the future? We can't know how it's going to evolve in the future, and that's why remaining adaptable, not only in the implementation but also as we develop education and other interventions, really is key. So I imagine that's pretty difficult in terms of just diplomacy. Essentially you're going to try to push out these different values and ideas. Maybe they're going to be received well in terms of education, maybe they're not. What's the challenge there?
Well, I would actually pull together your education question and your implementation, sort of what I would consider foreign assistance, question, right? I was actually in the Dominican Republic last week for a meeting, and it wasn't just with the Dominicans. This was the topic we actually discussed: in some countries they're teaching children cybersecurity like they teach them how to cross the road, right? Don't use the same password, use multi-factor authentication, all these things. They're teaching it, this is Israel, right? They're teaching it very young. I met with them, and they said, we're teaching this at the same age we teach children how to cross the road. I said, okay, well, are you teaching them how to identify disinformation? Are you teaching them critical thinking skills at the same time? Because in my opinion, that needs to go hand in hand. How you comport yourself on the internet is a combination of factors which is either going to lead you to success or down a rabbit hole, in multiple ways, right? So I think there is this question around building modules for educators, particularly in developing countries, but maybe also a little bit in the United States; we're not so good on cyber hygiene in some cases, right? I mean, I tell my children not to use the same password on everything. I'm not sure they listen to me, but I tried, right? So I think it's: how do you inject this into the curriculum at a base level, where you're talking about this whole range of issues? I think the US is actually doing pretty well on the critical thinking skills, because my kids tell me what they're studying, and they say that they're given a picture and they're supposed to find one source: what did they learn from that one source? And then they look at five sources: what did they learn by looking at five sources? Right?
They're teaching research today in a completely different way than I did research when I was in middle school and high school. So I think there is some realization that this needs to happen. There are lessons that can be applied, and this is the advantage of having foreign assistance money: we are looking at how we plug in as the United States and how we can provide training, frameworks, et cetera. We have a growing foreign assistance budget, and I hope it will continue to grow, to answer what's really, I would say, a cry from around the world for more support on these issues, which we do think are critically important. As you said, we 100% believe that if Ghana's cybersecurity is not strong, it means the United States' cybersecurity is not strong. And so we need to work together. Anne-Marie. Anne-Marie Slaughter, CEO of New America. It's just a joy to see all of you here after a lot of hard work on the part of a lot of people. And it's wonderful to hear about this bureau. I led a tech del, a technology delegation of all-women tech leaders, to Liberia and Sierra Leone in 2011. There were two from Google, one from Twitter, somebody from the Cell Phone Alliance, all women and all focused on tech. And that was really new for the State Department. That's a decade ago. I know well the bureaucratics of this, but frankly, having that bureau means this is now part of the world, right? Digital space is part of the world and represented, and to see it at the White House. So there is real progress. My question is: how does the U.S. lead differently in this space? How is it not the great U.S., the great tech power, preaching to the rest of the world, aiding the rest of the world? Just following up on Dean Caponeo's question, I was just part of a high-level advisory board for the U.N. Secretary General, and we talked about these issues, and the African Union has a well-developed digital strategy.
I mean, in many ways they really are well ahead of other parts of the world, and of course that's also true if you look at Singapore, parts of Southeast Asia, Latin America. So cyberspace is much flatter. As Candice said, power is measured differently here. How can we be not the top-down leaders but the learners as well as the teachers, and the receivers as well as the givers? That's what I wanna know how you're thinking about. Great question. You wanna jump in, Jennifer? Well, I think first of all, it is sort of basic diplomacy: going out, talking to people about what's on their mind, what they're seeing, what their concerns are, what's next, how they're looking at regulation, standardization, principles, whatever it is they're bringing to the question. I think that's critically important, and again, it's trying to figure out what the best fora are to do it in. There was a lot of discussion around the creation of a T-12 or T-14, or I don't even know what T number it was. The reason you haven't seen one is because essentially the State Department said, we think this is a really bad idea. And we think it's a bad idea because if you create a club, who's in the club? Who's out of the club? How did you choose the club? Is that club right for five years from now? And I say this having gone to G20 meetings where it's awkward at the moment, it's really awkward. So is that the right number that you picked today? Maybe all of a sudden there's gonna be a huge startup from some country you didn't predict. So that's why, essentially, rather than creating a new grouping where you had the winners and the losers and who's out of the tent, we said no. What we wanna do, we're doing work in the OECD, but the Global Forum on Technology is not just the OECD. People from outside the OECD can join the conversation.
We want them to share values with us. Because you spoke, it reminded me of the great work Anna's done on the Freedom Online Coalition, which is another, different big tent discussion about how you work on these issues together. The United States took the chairship of the Freedom Online Coalition for the first time ever this year, and there are amazing countries that you would be surprised to hear are members of the Freedom Online Coalition, where you have these people getting together to talk about: how are we facing the threats of the internet? What are we looking at for the opportunities of digital technologies? And how can we do it in a group of like-minded countries? One of the things we're actually trying to do with the Freedom Online Coalition chairship this year is continue to grow that tent, because we think the tent could be bigger. There are a lot of countries who have yet to join who we think would be really useful partners in this. So I could probably name five other formats that we operate in, but just to say: we look for a variety of opportunities to engage on issues as they make sense, and to try to do it in a way that has some humility, some respect, and some free flow of ideas, so that ultimately we come out with better policy. The other thing I would say, although it's maybe not the best example: the program of action that I started by talking about, the US did not support. We were actually against it for a really long time. We only signed on at the last minute; it was not our initiative. But now we love it, we're supporting it, we're doing it, right? So it's not always about the US coming in with the best idea. There are lots of other countries that come to us and say, we wanna do this thing, right? And then we go, okay, well, maybe this or that. I see the signal to stop, so I'm gonna stop talking.
All right, well, listen. I just wanna note here that the era of kind of club-based diplomacy may be ending, right? That's one big takeaway, I think, from this conversation. Yes, the US is important, but obviously there are lots of other players out there, and we've gotta move away from that old-school club-based diplomacy if we're gonna get any further on the digital domain and better governance and norm setting. I'm afraid we are out of time, but I really wanna thank you, Jennifer Bacchus and Lee, for joining us today. Give them a round of applause. Thank you.