Arati Prabhakar, the director of DARPA, we're here to talk about the future of war. And if you're talking about the future, you're talking about DARPA when you cover the Pentagon. I'll be very brief about her remarkable record because you can read all about it, other than to say a couple of highlights. President Clinton appointed her director of the National Institute of Standards and Technology, which I think everyone in this room would understand is a fundamental building block of national security in this country, besides the other things it does. She served in many private-sector venture capital capacities in Silicon Valley. And when I look at her CV, the thing that always impresses me the most is she has a doctorate in applied physics from Caltech, which is pretty darn cool. But I think you once said that's just the name of the degree; that's what they give you there. That's what happens. So it seems to me that when we want to look at the future of military technology and what role it plays in future warfare, we do talk about the notion that the future is already here. And your agency must be in a fascinating position right now, because you spend billions on advanced research. And yet we are in this era of ISIS, of fighting what essentially is an adversary that perhaps exhibits traits of thinking of the 8th century, though I suspect there was a lot more humanity in the 8th century than these people have. And yet they do have some very advanced capabilities in social media. So talk to us about that. ISIS is going low tech, except for social media, while the Pentagon still focuses on the high tech. Information, information warfare. That's a great place to start. Let me just say, first of all, I'm really glad to be here, because I think this conference is asking these very challenging questions about how the world is changing and how technology is changing.
I really applaud that focus and appreciate the chance to participate. So there's an 8th century component in ISIS, and there's a 21st century component with their use of social media. And I love the way you frame the question, because the 8th century part of it is horrific, but it's actually their 21st century tools that are the scalable part. And in many regards, I think that's the piece that needs addressing in order to deal with the whole situation. And in fact, I think we're beginning now to have tools and techniques to start dealing with that. If you think about it, ISIS in essence is using the same infrastructure that we use for all the good things the internet has brought us in terms of connectivity, commerce, and changing the way we interact with family and community. They're using that for their nefarious purposes. Today at DARPA, just as an example, we are developing some of the tools and technologies that allow us to start seeing patterns of interconnection in that vastness of the internet. A program we've been working on called Memex, for example, started by developing a way to understand linkages among websites, first in the domain of sex trafficking. That early work very rapidly led to the ability to see, for example, the same phone numbers popping up over and over again across websites. And very quickly we were able to give law enforcement a tool that, rather than doing a single-threaded search through just the small portion of the web that is indexed by Google or Bing, lets them do domain-specific deep web search. That's what Memex is all about. In the human trafficking world, that is now leading to indictments and convictions. Very satisfying to see. But those tools, of course, can be used for many other purposes. And today we're starting to help with the fight against ISIS, using those same kinds of tools. One example.
Well, I think everybody here probably has exactly the same question I have. What can you do? What's the payoff here? Where are you going with this program in terms of ISIS? Yeah, and that work is just beginning. And because it's live and we're in a wartime situation, that's not going to be an area that we can talk about in a lot of detail. But look at how it played out in the case of law enforcement and human trafficking, just to use that as an example. We started working with law enforcement in the Dallas, Texas region, who were looking for sex trafficking patterns and networks. We looked at Backpage ads in the Dallas region, and from that we were able to build a very quick assessment of where the same phone numbers kept showing up on multiple websites. Again, if you were looking across thousands and thousands of ads manually, you wouldn't have seen it. But we were able to scoop up these high-value phone numbers and hand them to law enforcement. And our law enforcement colleagues were sort of taken aback, I think initially, by how rich that data set was. Many of those numbers tied to criminal violations that they already knew about through conventional law enforcement means. More interesting from a national security point of view, they found that some of those phone numbers linked to fund transfers in the region around North Korea. And that started them on the trail of looking for a trafficking network. So that's the kind of work that is now being picked up by law enforcement and is starting to help put people behind bars for human trafficking and sex trafficking. But you can imagine how that might give you a way to see how the ISIS global community that's spreading like a cancer is using that infrastructure similarly. So everyone talks about information warfare, dominance in the information space. What are DARPA's priorities right now?
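The cross-site pattern she describes, the same phone number surfacing across many ads, is at its core a grouping-and-thresholding problem. As a rough illustration only (the function names, the normalization rule, the threshold, and the data here are all invented; Memex's actual tooling is far more sophisticated than this), a minimal sketch might look like:

```python
from collections import defaultdict
import re

def normalize_phone(raw):
    """Strip formatting so '(214) 555-0101' and '214.555.0101' compare equal."""
    digits = re.sub(r"\D", "", raw)
    return digits[-10:] if len(digits) >= 10 else digits

def cross_site_numbers(ads, min_sites=2):
    """Given (site, phone) pairs scraped from ads, return phones that appear
    on at least `min_sites` distinct sites, mapped to those site sets."""
    sites_by_phone = defaultdict(set)
    for site, phone in ads:
        sites_by_phone[normalize_phone(phone)].add(site)
    return {p: s for p, s in sites_by_phone.items() if len(s) >= min_sites}

ads = [
    ("site-a.example", "(214) 555-0101"),
    ("site-b.example", "214.555.0101"),
    ("site-c.example", "214-555-0101"),
    ("site-a.example", "972 555 0199"),
]
hits = cross_site_numbers(ads)  # the 214 number is flagged, seen on 3 sites
```

The interesting part in practice is the scale: the same grouping idea applied across millions of deep-web pages, where no human could ever spot the repetition manually.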
What are you looking for from industry, from universities? What are the leading edge multipliers that you want to see? That's what we think about all the time. We are looking for ways to scale our cybersecurity capabilities faster than the growth of information and the growth of the threat that comes with it. We're looking for foundationally better ideas than patch and pray, which is pretty much all we have today in cybersecurity. And then similarly, in the big data realm, our aim is to build the kinds of tools that will enable end users to deal with that vastness of data, with this huge data explosion. So rather than drowning in it, can we actually start seeing these kinds of very valuable patterns? So cybersecurity and big data, those are the two major objectives. But really, unlike operational units, our job is to find techniques and tools that are going to scale faster than the explosion that's happening. And those are, I know you've talked about it before, techniques and tools that are useful to a soldier or marine or sailor out in the field. To the end user, absolutely. Not just cool stuff to have cool stuff. Right, we're a technology agency. So we want to show capabilities that are possible and start getting them out into the world, absolutely. Tell us what Plan X is. Plan X is a program designed to give anyone dealing with the cyber world a way to understand what's happening in cyberspace and to be able to run exercises and to plan maneuvers, whether it's for cybersecurity or for cyber warfare activities. I think part of what makes cyber a very challenging arena is that it is inherently abstract. And we struggle to map it to the physical domains that we understand. Plan X starts giving anyone who's involved in that a way to grapple with it.
So one example is work that we're doing that will transition, we believe, to the Army, that will give soldiers on a foot patrol, as part of a squad going through a community, a way to visualize what's happening in the local cyber environment, to understand which Wi-Fi spots they're walking by. Maybe they're going to walk through an urban environment in which they sense a particular Wi-Fi node. They're in a peacekeeping setting, trying to make sure that they eliminate bad actors or people who've been involved in IEDs, and they know that that Wi-Fi node has been implicated in a prior act of violence against US troops. We want to give them a way to see the local cyber environment. And then all the way at the other end of the spectrum, at the command level, as you're trying to understand a major military operation: again, seeing the cyber environment, being able to assess who's coming at you and how to respond. And that doesn't exist today. That doesn't exist today? No, it does not exist today. We have very highly trained, highly expert cyber warriors who are navigating in this dark, complex space. We want to start making tools that can be used by, in this case, warfighters at all levels. It sounds to me also like that could be something that achieves another one of your goals, perhaps of transitioning it to private industry, to companies. That is our goal. I won't say Sony Pictures. Well, but this is a really key point. But they might have known.
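The squad-level picture she describes, matching the Wi-Fi nodes a patrol senses against nodes previously implicated in incidents, reduces at its simplest to a watchlist lookup over a radio scan. A toy sketch, with every identifier and data value invented for illustration (the real Plan X visualization work is far richer than a list lookup):

```python
def flag_wifi_threats(scan, watchlist):
    """Annotate a local Wi-Fi scan with watchlist hits, strongest signal first.
    `scan` is a list of (bssid, ssid, rssi_dbm) tuples from the radio;
    `watchlist` is a set of BSSIDs previously implicated in incidents."""
    hits = [ap for ap in scan if ap[0].lower() in watchlist]
    # Higher (less negative) RSSI means the node is closer to the patrol.
    return sorted(hits, key=lambda ap: ap[2], reverse=True)

scan = [
    ("aa:bb:cc:00:00:01", "cafe-guest", -48),
    ("aa:bb:cc:00:00:02", "home-net", -71),
    ("aa:bb:cc:00:00:03", "open-ap", -60),
]
watchlist = {"aa:bb:cc:00:00:01", "aa:bb:cc:00:00:03"}
flagged = flag_wifi_threats(scan, watchlist)
```

Sorting by signal strength stands in for the "visualize the local environment" idea: the node most likely to be nearby surfaces first.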
Sometimes, actually, I'd say for a significant part of our portfolio, the best way to make the impact that we seek on national security, that's our mission, is to make sure that the technology gets adopted by the commercial sector, either because we need to achieve cybersecurity throughout our economy in order to protect our nation, or because the commercialization of technologies, robotics is a great example but there are many, is going to be absolutely necessary before the technology becomes ripe enough for DoD to use for military needs. So, and start thinking about some questions, because in about five or six minutes we're going to turn it over to you for questions. We started off with the 8th century enemy that's also using 21st century technology, forcing us, as you say, to hold two ideas in our head at the same time, yet the Pentagon and Defense Department are still spending billions on very high-tech, very complex, single-generation weapons like the F-35. So what's going on? Right. So what's up with that? I think the starting point is to really understand the national security environment that we're dealing in today. My first tour at DARPA was in the Cold War, and at that time the model was that you worked against this one monolithic existential threat, and everything else just took a backseat, and you didn't really think about it too much. We don't really have the luxury of dealing with just one kind of national security threat today. So ISIS is a today issue. Ebola was a very current issue. Those kinds of crises are going to flare up and be part of the national security environment. I wish I could imagine a time when that weren't going to be true, but I think we're going to need to be dealing with that throughout my lifetime and probably my kids' lifetimes.
Those chronic threats and how we deal with them as technology changes what those kinds of actors are able to do, that's part of the national security landscape, but it's not the whole story, because we know as well that nation states around the world are changing their military positions, their military capabilities, and with those shifts come also the concern about an acute national security threat in the future that we want to deter and defeat if that becomes necessary. So the challenge for the department I see is really this very wide spectrum of threats. Now the question about how you deal with a peer adversary in the future, how you deter those kinds of conflicts, that is going to need very sophisticated high-end technology, but you are absolutely right. If we do those as point solutions, the equation's just not going to solve. And so at DARPA what we think about is how do we prepare for that environment? How do we get ourselves to a place where we are able to deter and defeat if necessary against a very technologically enabled peer adversary? But how do we do it in a way that isn't just more of the same from the past? And you'll see those kinds of ideas about rethinking complex military systems throughout our portfolio. So you're not ready to write off Russia? It's hard to write off Russia given what's been going on lately. Well, in terms of technology, you know, there's still people readily say, oh, they're just out there using the same stuff they've had for decades. Well, I think we see all technologies blended with new methods. The creativity exhibited by different kinds of threats around the world is not limited to ISIS, right? We're seeing it in lots of different places. Let me shift gears for just the last couple of things before we turn it over to the audience. You do some fascinating work in the biology area. I know you've worked on advanced prosthetics for wounded warriors. I'd like you to talk about that for a minute. 
And some of the work you're doing in really leading edge advanced vaccine research. Yeah, maybe I'll start with that. Infectious disease, to me, is in this category of chronic crises that will continue to flare up. And we hope we're just coming through this last round with Ebola. But we shouldn't relax, because I think our future is going to have those kinds of challenges. The objective of the DARPA program in this area is to completely collapse the amount of time that it takes for us to contain that kind of flare-up of a new infectious disease. If you think about what happened with H1N1 a few years ago or Ebola today, the peak of infections preceded the time that an effective vaccine was available. So in the case of H1N1, we had this huge surge of cases in the United States, and shortly after it started turning over naturally, we actually had a technological solution. So our program aims to collapse that time and nip these infections in the bud. That takes a number of advances, some on the diagnostic side, which we're pursuing, some in terms of better-acting vaccines. But there's a new element that we're introducing which has to do with the notion of building a firebreak, a way to provide immediate protection. Anyone who's had a vaccine remembers that there's a period of time, usually weeks, before it becomes effective. So this firebreak notion is a therapy that would provide immediate but temporary protection. And so you can imagine a case where, as infection spreads, we would be able to identify the friends and the hospital workers who are in touch with an infected population, give them this short-term therapy, creating a firebreak, and that then allows time to give a vaccine to a broader population. And together we think all of that can be a way not just to pull in the timelines a week or two, but to collapse that timeline and try to nip these things in the bud. Doing that is gonna require amazing advances in biology.
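The firebreak logic she describes can be illustrated with a toy SIR-style outbreak model. Everything here is invented for illustration, the parameters, the population, and the simplifying assumption that the short-term therapy's protection lasts through the whole outbreak; the point is only the qualitative one she makes: protecting contacts early shrinks the epidemic far more than shaving a week or two off the vaccine timeline.

```python
def outbreak_size(pop=10_000, i0=10, beta=0.3, gamma=0.1,
                  firebreak_day=None, protect_frac=0.5, days=300):
    """Discrete-time SIR model. Optionally, on `firebreak_day`, a fraction of
    the remaining susceptibles (the 'contacts') gets the firebreak therapy,
    modeled here, for simplicity, as protection lasting the whole outbreak.
    Returns the total number ever infected."""
    s, i, r = float(pop - i0), float(i0), 0.0
    for day in range(days):
        if day == firebreak_day:
            s -= s * protect_frac          # contacts pulled out of the susceptible pool
        new_inf = beta * s * i / pop       # new infections this day
        new_rec = gamma * i                # recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return r                               # epidemic has burned out by `days`

baseline = outbreak_size()                 # no intervention
with_firebreak = outbreak_size(firebreak_day=10)
```

With these made-up numbers the unmitigated outbreak infects most of the population, while an early firebreak cuts the total well below half, which is the "nip it in the bud" effect she's after.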
It's in a lot of ways, I think, the positive side of a lot of the synthetic biology conversation that was happening here two sessions ago, because this is the kind of thing that our ability to engineer biology is letting us approach now. And on prosthetics? Prosthetics, so a lot of our work in understanding the human brain began with a driver centered on restoring function. It began with one of our amazing program managers, an Army doctor who came back from theater convinced that we had to find a better upper-limb prosthetic for our wounded warriors. He developed this very sophisticated prosthetic arm with many more degrees of freedom than the simple hook that we've had for decades. But he's also a neuro-intensivist and a neuroscientist, so he also did the research that came to understand a lot more about neural signaling from the motor cortex. Those two branches of that program came together a couple of years ago when we had our first human trials. One example is a woman named Jan, who had been quadriplegic for a number of years and volunteered to have surgery to place two small probes on the surface of her brain, on the motor cortex. Her neural signals are directly picked up, and surprisingly rapidly she was able to just think and directly control this very sophisticated prosthetic arm. So she thinks, and she can shake your hand or offer you a stack of cookies, sort of amazing functionality for someone who's been paralyzed for this long. And it's so moving to see what an impact it has on people to be able to even experiment with a technology like that from the perspective of restoration. But of course in doing that work we've also opened this door: we can now see a future where we can free the brain from the limitations of the human body. And I think we can all imagine amazing good things and amazing potentially bad things on the other side of that door.
Give us a couple of examples on both sides. Well, so Jan tolerated her implants very well and we were able to extend her period of time with them. So we started experimenting. Initially the implant Jan had was on her left motor cortex, designed to operate a right arm, as you learn in neuroscience 101. Johns Hopkins Applied Physics Lab built the arm, and they built a left arm as well. So we said, well, let's see what happens if Jan tries to control two arms from the part of her brain that's only supposed to control one arm. Well, it looks like Jan has, surprisingly, some independent functionality of both arms from this one spot. Then Jan decided that she wanted to try flying a Joint Strike Fighter simulator. So Jan got to fly in the simulator, and instead of thinking about controlling a joystick, which is what our ace pilots do, right, when they're driving this thing, Jan's thinking about controlling the airplane directly. And in fact, for someone who's never flown and is not a pilot in real life, she's in there flying that simulator directly from her brain. So you can start to see some things. Maybe a few years before the Air Force does that. I think we're a long way from any of those becoming real. But again, we've opened this door. You can see some amazing positive things. But of course, you know, we're talking about crossing some very important ethical boundaries when you start talking about people being able to connect their brains to the rest of the world in this very different way. So we think it's an important time to think about what those next steps in research are going to be and to engage a bigger community.
Maybe I should just take a minute and say: in this area, in privacy issues regarding data, in the synthetic biology area, over and over again, because of our mission, which is to pursue these frontiers for national security, we frequently find that we are pushing into new areas of technology that we know are going to raise very important issues of ethics, morality, and social concern. And we believe our job is twofold. Number one, our core mission is to understand and pursue and grapple with these technologies for our country, and to understand what the possibilities are before anyone else does. That is a core function and a core role for us in the context of national security. So we don't want to only go play in the safe places. That would be a violation of our mission. But with the pursuit of these advanced technologies comes a deep responsibility not to craft the answers for society, because I don't want to live in a society where a bunch of technologists in national security tell you the answers to those questions. I think it's so vital that we engage, and we are engaging in each of those areas, with a broader community of people who can help us think about how we might approach research, but also to whom we can show this future that technology is making possible, so that they can be part of helping us, the societal us, think about where this can go. So I wanted to mention that, because I think it's very much in keeping with the theme of this conference. And I think it's very constructive. I am quite certain that there must be some questions out there, and I don't know how we're doing this. Are there microphones in the room? All right, so. And of course, obviously, please say your name, so Dr. Prabhakar knows who she's talking to. Joe Marks from Politico.
In the DARPA budget request for fiscal 16, there seems to be a shift in cyber priorities, a little bit less of a focus on things like Plan X and combating cyber attacks, and more on surviving through cyber attacks, which you also hear about elsewhere in DOD, and I was wondering if you could talk about that, things like EdgeCT, I think it's called, and those programs. Yeah, when you look at DARPA's budget as it's submitted through the president's budget, what I think you're seeing is a very natural consequence of the fact that we're a projects agency, and so at any moment in time at DARPA, some projects are wrapping up and winding down, and others are starting up. The two examples you gave are both within a larger and much longer-term commitment to this transformative thinking in cybersecurity that I described. But yeah, Plan X started a few years ago, so its budget is, I don't actually remember exactly what the numbers are, you probably know better than I do since you just looked at it. And then we are doing some work starting up now on exactly this idea of how you operate through and then survive and reconstruct after a devastating attack. We'll have you continue to, that gentleman back there perhaps, and then we'll move over to the left side of the room. Thank you for that presentation. I'm Nicholas Berry from Foreign Policy Forum. When our website writes critically, especially about two governments, our computers are attacked and even our server gets attacked and shut down, and we have geeks on retainer. There's not much we can do about that, is there? Well, I don't know your specific story, but the frustration that you are expressing, that is the cybersecurity realm that we live in today. And Barbara, you mentioned Sony, but yeah, everyone is dealing with these continuing attacks. Our sites are constantly under attack, you see it as well.
And today I think the best answer I have for anyone that's dealing with that is: get up to date, do patch and pray, because that is the best you can do, but at least let's do that. And actually it's sort of alarming, when you look at cyber attacks, how frequently you find that in fact things weren't patched, things weren't updated. So there is actual value in just staying as current as you possibly can. But again, over time we wanna develop some tools that will get us a little bit better foundation for cybersecurity. Let me just give you one example. We have a new DARPA challenge that's underway today. It's called the Cyber Grand Challenge. The notion there is that, if you go to DEF CON every year, there's a capture-the-flag competition where human teams compete to try to keep their networks up and deliver on their missions while they're all attacking each other and fighting off attacks. And these amazingly capable teams fight for this honor every year. We're building a league of their own for machines to play capture the flag, because we want to start developing automated systems for cybersecurity, in the hope, indeed the expectation, that they will be able to scale and to operate at machine speed, which is what's gonna be necessary given that the attacks are coming at machine speed. If you think about the attacks that are being driven at machine speed and you think about humans typing as fast and furiously as they can, you know we're just dead, right? So we're gonna have to find a way to get machines to scale and keep up with that. So I think the Cyber Grand Challenge over time is gonna lead to the kinds of tools that I hope will make your pain go away. So how many people in this room have had their email, their Twitter, their social media hacked? I can't be the only one. Come on, yeah, exactly. Have you had yours hacked?
Oh, a few months ago, the entire Pentagon press corps got hacked by the same visitor. You were all loved equally, okay. I think we had a question back here, the gentleman, yes, thank you. Good afternoon, Tom Risen. I'm the technology reporter at US News and World Report. Thanks for all the great work you do at DARPA. I love robots. And yeah, you mentioned ethics and you also mentioned autonomous capability. How are you tackling that capability? There's talk of drones or cybersecurity programs that can make some of their own decisions so that they can tackle threats faster than a human operator. But you also mentioned a process for ethics. How are you dealing with the ethics of that? Because a drone has missiles attached to it and it could make some of its own decisions. So that's obviously a future ethics concern. How are you thinking about that? I think this issue of autonomy and how we want to use those advanced capabilities is core, and as you point out, it really does span many, many different domains and types of technologies. The way we're trying to think about this and grapple with it is to start by thinking not just about what the machines do but about how humans and machines accomplish tasks together. And if you think about it that way, let me break the term autonomy down. One dimension of it is the technical capabilities of systems. And in fact, because of all the underlying technologies, we know that those continue to get better. The second part of the equation is the autonomy piece, which is really about rules of engagement. Those are human choices in the context of warfighting. They're warfighter choices about what degree of technological capability is used under what circumstances. And especially when you get to practical warfighting situations, I see a lot of warfighters and future warfighters in the room.
My experience is that those are the people who are the least interested in losing control over the decision-making process, because that is the central responsibility of our warfighters in conflict. So again, we try to frame it in terms of understanding the advance of technological capability, and thinking through where human decision-making and control fit into how those technologies are used. So for example, with the Cyber Grand Challenge, if that leads to autonomous cyber defense capabilities, there's still gonna be a human decision about what class of behavior it's gonna be used against and what machines it's deployed on. And I think that generalizes to things like the use of higher technological capability in weapons systems as well. So a human decision is always somewhere in the chain? Fundamentally, conflict is about those human decisions, and over many generations we have increased the technological capability that humans control and decide over. I think that will continue, but you just wanna keep your eye on where that human decision sits. Some more questions, please. This gentleman right here. And then the gentleman in the back by the door, please. Thank you again. Appreciate your work. I'm the deputy at the cyber center at the Naval Academy. And one of the big issues we're talking about is the intersection of unmanned systems and cyber insecurity. And one of the arguments made was, well, if we have a cyber attack on advanced military systems, then we can go full auto and we don't have to worry about the downlink, which then brings up the ethical questions of what the machine will do. But it was interesting, I was talking with former Secretary of the Navy Danzig, who's written a recent paper on this. And he doesn't seem to be so sure about the supply chain, because people argued, well, these unmanned systems will have a secure supply chain, people know everything that's in there.
And he made the observation that the number of transistors being built per second in the world today translates into potential Trojans per second. Can we really even know what's going into these many, many components? So I wonder if you could comment on: might cyber insecurity trump or delay this move to autonomy and unmanned or paired systems? Yeah, I think there are a handful of very interesting topics embedded in the comments you made. Let me just tease out a couple. One is about the supply chain. We live today in a world in which, I actually think, with the DOD's use of semiconductor components, we've put ourselves on a dead-end path. Today we use a technology base that is American, it's IBM, but it's in a semiconductor fabrication facility that IBM is in the process of trying to sell to a company that is still located in the United States but is owned by Abu Dhabi. And so I think the whole model on which our notion of controlling the supply chain for semiconductor components has been based, and those components often are the critical pieces here, all of that I think is gonna get swept away. Something different is gonna happen. And I actually think that's a good thing, because by focusing on trust, we've achieved trust through that model, but the price we have paid is that we move much more slowly and we use much older technology. And relative to where commercial industry is, we've been at a disadvantage, and therefore we've been at a disadvantage versus adversarial threats. So at DARPA we think it's a great time to reset on that question of the trusted supply chain for electronics. We're working on some radically new approaches to that problem that will allow us to use the leading-edge capability of what is now a fundamentally global technology, no longer controlled by the United States. We wanna tap that and still end up with trusted systems, and we're working on a new paradigm to achieve that.
That's number one. And then number two, this notion of the cyber dimension of security for embedded systems: that's a major issue for DoD, but it also comes up as we talk about what's happening commercially with the internet of things. When I hear internet of things, the marketers are all imagining all these wonderful gadgets. What I see is this exploding attack surface, and I think it's gonna be a pretty unpleasant experience unless we figure out how to make those embedded systems secure. But there's actually some very good progress on a DARPA program in that area. It's a program called HACMS. It takes formal methods and starts scaling them so that operating systems for embedded platforms can be made provably correct for specified security properties. And we're just starting to demonstrate unhackable small drones. Some of that is starting to move over into the automotive industry. Again, that's part of the attack surface that is available to attackers today. So again, another piece of this notion of foundational cybersecurity that elevates our capabilities across the spectrum. I believe there's a gentleman way in the back and then the gentleman standing up. Let's see if we can get both of you in. And then the gentleman sitting right in front of him when this gentleman's done. Thank you, go ahead. Thank you. Good afternoon, ma'am, Ben Hernandez. So for decades, the military has been able to plan around space as a sanctuary for our intelligence and communications, which I imagine our near-peer competitors are not too happy about. Are there threats emerging to our space capabilities, and how is DARPA meeting those challenges? Yeah, I'm really glad you asked about space, because that's another domain that has changed dramatically. Our methodologies and the way that we deal with the space domain today are still sort of premised on an environment in which we were the only ones acting up there.
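HACMS-scale formal methods rely on theorem provers and verified toolchains, but the flavor of "provably correct for a specified security property" can be shown in miniature with exhaustive state-space exploration. This toy model checker (the device protocol, the events, and the property are all invented for illustration and have nothing to do with HACMS's actual artifacts) explores every reachable state of a tiny command interface and confirms a secret can never leak without authentication; a deliberately buggy variant with a leftover debug command is caught:

```python
from collections import deque

def step_safe(state, event):
    """Transition function for a toy device. State is (mode, leaked).
    Returns the next state, or None when the event is rejected."""
    mode, leaked = state
    if event == "login_ok" and mode == "locked":
        return ("authed", leaked)
    if event == "logout" and mode == "authed":
        return ("locked", leaked)
    if event == "read_secret" and mode == "authed":
        return (mode, leaked)              # authorized read: nothing leaks
    return None                            # all other events rejected

def step_buggy(state, event):
    """Same interface, plus a leftover debug command that dumps the secret
    without checking authentication."""
    if event == "debug_dump":
        return (state[0], True)
    return step_safe(state, event)

EVENTS = ("login_ok", "logout", "read_secret", "debug_dump")

def property_holds(step_fn, start=("locked", False)):
    """Exhaustively explore all reachable states; the security property holds
    iff no reachable state has its `leaked` flag set."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state[1]:                       # a reachable state leaked the secret
            return False
        for event in EVENTS:
            nxt = step_fn(state, event)
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True
```

The real thing operates on operating-system code rather than a four-line state machine, but the guarantee has the same shape: the property is checked against every behavior the system can exhibit, not just the ones a tester happened to try.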
And of course, two things have happened. One is that other nation-states are getting active on orbit. The other is that commercial space has just exploded in a really terrific way. So it's no longer just a sparsely occupied region around the world where we can hang out and do whatever we want. A number of things need to change for us to be able to maintain the incredibly critical space assets that we need for every aspect of war fighting. Unfortunately, there's not going to be a simple answer to this, but the solution, I believe, lies in everything from real-time space domain awareness to changing what's possible on orbit and changing the cost structure of what we do on orbit, but also changing launch.

Just to use that as one example: one of the promising things in space, especially in the commercial space world, is that we're starting to see amazing capabilities on very small satellites, driven by microelectronics and software. But even those small sats today still rely on launching out of Canaveral or Vandenberg, so you still have something like a 24-month delay before you can actually get on orbit. We want to break that. One of our programs, ALASA, the Airborne Launch Assist Space Access program, is designed to let a fighter aircraft take a rocket with a satellite up to a high enough altitude that it can then boost from there to low earth orbit, and, net of all of that, deploy a satellite within 24 hours from the time you're ready to go, do it for a million dollars for a hundred pounds to LEO, and do it from any runway in the world, which I think would be game-changing for space. But that's just one of the things we're going to need to be able to answer your question.

And let's make it our last question as we wrap up: the gentleman in the back there, please.

Hi, Zach Biggs with Jane's. I wanted to ask you about risk. In particular, DARPA's known for taking on greater risk; it's part of the purpose of the agency.
Pentagon official Al Shaffer has talked about his desire to get more risk into the research and development programs of the service labs and a lot of different areas in the Pentagon. He has also said that if those labs didn't start doing it, he thought there would be more money going to DARPA. Obviously, more money would be great for the agency, but do you envision a way to get more of those labs taking the kinds of risks that might be necessary for the leap-ahead technology being talked about, for instance, as part of the long-range research and development planning program?

You know, the reason we talk about risk is not because we love risk. We take risk because it is necessary to achieve high impact, and that's really what we need to be talking about. In fact, a conversation I have with my program managers all the time is: if you can achieve very high impact with zero risk, let's just go do that. But it turns out there aren't too many of those, and after you do them, you have to take risk if you want to reach for really transformative change and future opportunities.

In the context of the broader S&T community in the Defense Department, I think it's important to be clear about missions. DARPA was created in the wake of Sputnik out of a recognition that the science and tech base we were building in the Army, Navy, and Air Force was incredibly essential, but that we also needed a place that came into work daily to think about how to prevent that kind of technological surprise by living outside of the known requirements and the visible threats and opportunities. And that is still how our roles and responsibilities are divided up all these decades later. I think it's really important that we not lose sight of that, because in fact you don't want all of S&T doing what DARPA does. You need a lot of what the service S&T organizations do.
Without that work, we don't realize a lot of these advanced capabilities, and we certainly don't keep up with all the daily needs that our big, complex mission systems have. So I'm always about reaching for greater impact, but you don't want to make everyone look the same. You need an entire ecosystem that works, and in fact DARPA can't achieve what we need if the rest of that ecosystem isn't functional. So I think it's important to understand it in that context.

Well, we're just about out of time. I have to tell you, Arati, if half the things you've sketched out here came to be true, it would be not just a different world, but a different Pentagon press corps. We'd be covering a different world. She's given me about six things to think about, and I just have to beat Politico on deadline. Which is not easy to do. They're faster than me.

It's all about pace in my world, too. What can I tell you?

It is. It's about innovation. It's about speed. And maybe someday we'll have a pilot take us into space, he or she will take a plane up, launch a satellite completely with their mind, and there'll be no hands-on controls. Thank you. We have two seconds left, but let me just ask you, because I think we all want to know: is there anything we haven't asked you about? Is there some incredibly even more super cool thing that DARPA's doing that we can't even imagine? One last piece of super cool stuff?

Well, if you want cool, we could talk about cold atoms. I'll just finish with cold atoms. We're completely dependent on GPS for position, navigation, and timing today, and we know that having that single point of dependence is not a good strategy.
So among the many things we're doing to get beyond that GPS dependence, we're taking this beautiful cold-atom physics and starting to make the world's most accurate clocks and gyroscopes, not on big lab benches with PhDs, but in boxes that you can put onto ships, taking Nobel Prize-winning physics from 20 years ago and turning it into solutions to DoD problems.

Is that going to change the super expensive watches out there that we all might think about buying?

I don't think cold atoms are coming to your watch, but they are getting smaller. So I don't know. I think it would be uncomfortable on your wrist.

It's going to be a little bulky. All right, we'll leave it there. Thank you so much.

Thank you, Barbara.