Well, ladies and gentlemen, thank you all so much for being here this morning and welcome to this Lowy Institute conference in this beautiful venue: Frontier Rules: Emerging Technology and Grey Zone Challenges to the Rules-Based Order. I'm very grateful to the Defence Department's Strategic Policy Grants Program for sponsoring this activity. I'm Sam Roggeveen, Director of the Lowy Institute's International Security Program. I'd like to begin by acknowledging the traditional custodians of the land on which we're meeting, the Ngunnawal people, and pay respects to their elders past and present. I'd also like to recognize two Lowy Institute board members, the Honourable Penny Wensley and Sir Angus Houston. Thank you for being here. I'd also like to welcome our international keynote speaker, Dr Samir Saran, President of the Observer Research Foundation, whom I'll introduce more formally in a moment, and Dr Kori Schake, whom you heard from last night, from the American Enterprise Institute. And lastly, while I'm acknowledging people, I'd like to recognize my colleague Ben Scott. Ben is the director of the research project responsible for this conference, and he has led the Lowy Institute team's effort to bring this wonderful event together. Today was going to be Ben's big day, but COVID had other plans for him, so I'm proud to stand in for him this morning. It's worth adding that we did plan for this possibility, which is why all of us, Ben included, have been working from home for some time now, just to ensure that if one of us got sick, the rest wouldn't.

This conference is about how the rules-based order intersects with emerging technology and, in particular, what that means for the security of Australia. Emerging technology is of course only one of the challenges to the existing order. The rules-based order is buffeted from all sides, the war in Ukraine being just the latest challenge.
On its face, Ukraine looks like an old-fashioned war dominated by artillery and armour. But as I'm sure a number of our speakers will discuss today, beneath the surface we see the emergence of new technological trends. New technologies are emerging in cyberspace, in outer space and in lethal autonomous systems. International rules are developing alongside those changes, but not as quickly as the technology itself. And for countries like Australia, that creates a particular challenge. Australia has a strategic interest in defending and extending the rules-based order. Rules that constrain raw power benefit a middle power like Australia more than most, certainly more than great powers. So Australia supports the development of new norms to better regulate emerging technology and to regulate state behaviour in cyberspace and in outer space. And more broadly, of course, we're famously advocates for the rules-based order, at least among policy wonks in this town. It was a major talking point in 2016 when the Defence White Paper made 56 references to the rules-based order. On the other hand, Australia has to protect its interests in an environment of intensifying geopolitical competition. The 2020 Defence Strategic Update, a mini white paper if you like, made only three references to the rules-based order, but seven to "lethality" or "lethal". And last September's launch of AUKUS, with the nuclear submarine project at its centre, was the clearest indicator yet that a much harder and sharper edge has entered Australia's language and signalling about the emerging world order. There are many hard questions. Should Australia use so-called grey zone tactics? When should Australia engage in offensive cyber operations and in information war? How should Australia's acquisition, deployment and operation of lethal autonomous systems be guided by international law? How can Australia support the peaceful use of outer space and also prepare for strategic competition there?
Now it's possible, just possible, that we won't have complete answers to all those questions by the end of our day. But we hope that asking the hard questions will make it easier for policymakers to answer them when the time comes. A few more mundane matters before I continue. You should all have a conference program in front of you. A few highlights: the Chatham House rule applies to our three panel discussions, but our keynote speeches, including the Q&A, will be on the record. They'll be recorded and will appear on our YouTube channel tomorrow, including last night's address by Kori Schake. The bathrooms are through the doors at the back here and just down the hallway to your left. And finally, a reminder that for anyone who needs them, there are masks and rapid antigen tests available outside the door.

It's now my honour to introduce our opening keynote speaker, Dr Samir Saran, and there's no one better to get us started on this day than Dr Saran. He is the president of India's premier think tank, the Observer Research Foundation. He's also the curator of the Raisina Dialogue and the founder of CyFy, India's annual conference on cyber security and internet governance. Dr Saran has authored four books, the latest of which is The New World Disorder and the Indian Imperative, co-authored with Shashi Tharoor. He will speak to us for about 20 minutes, after which I'll ask a few questions of him before opening up to questions from the floor. So it's now my pleasure to formally welcome Dr Samir Saran.

Thank you. Thanks. So I actually want to give you a view from India, and it's going to be very different. It's going to be different from the enthusiastic, compelling interventions by Kori, her fondness for technology, her optimism around technology. And just to put it in perspective, please understand that India is changing based on technology. We are connecting more people to banks. We are giving them more opportunities.
More women are coming into the workforce. And our growth and our transition are going to happen in the fourth industrial revolution. So for us, technology is home. It's the base condition. Having said that, because it is so important for us, we need to start being a little cynical and a little careful in its deployment. And I'm going to give you a view from India, especially around the grey zone. And of all my terrible slides, let's start with a good one. This one. Yeah, let's start with a bright slide. Now, you know, I thought this would be a good way to get the morning going. 1899, that's the first image, this cute little monster at the top of the screen. It's from a magazine called The Verdict, in 1899. And this was basically the fear, the anxiety, the threat posed by the monopoly of Standard Oil over the American economy. That monster is Standard Oil, 1899. And of course, we all know that by 1913 we had dismembered Standard Oil. It had been broken up. It did not exist anymore. That was policy action. That was regulation. That was public opinion. That was politics. That was political leadership. You saw a threat, but it took you 14 years to dismember it. This is a more recent image, from The Economist in 2018. In 14 years, will we have dismembered each one of the companies mentioned on the screen? That's a question to ask. And 14 years is too late. The oil companies were only managing your subways, your cities, your heating, your transportation. Google, Facebook, Amazon decide who you date, what you eat, what you choose, how you vote, what you think. It is far more dangerous to allow them to have such unprecedented control of our daily lives. And yet, we are still seeing a euphoric approach to managing technology and their engagement with our lives. Now, let's just place this against a map. All these companies that are named are American companies, having great influence on all our lives.
If you're sitting in India, this is the biggest threat you see. Companies not answerable to Indian law, not answerable to the Indian Constitution, not working in the Indian context, yet having complete control over Indian thinking, Indian thought, Indian decisions, Indian consumption, Indian preferences, Indian partners and Indian attitudes. And if you have someone like Mr Soros, who puts a bounty on the heads of leaders around the world and says, I'm going to change them, then the same international corporations, transnational corporations, in partnership with those who put out such bounties, become suspect. And then you come to the Ukraine war. Transnational corporations who sell themselves as global platforms, as global enterprises working for global technology benefits for all, suddenly aligned with one side of the conflict and cancelled Russia. I'm not saying it's right or wrong, I'm just saying it happens. And you realize they are not transnational, they are not global, they are partisan. And if your entire society and economy is going to be based on riding on these international platforms, then you are on a very sticky wicket. And in that sense, the threat from unregulated technology behemoths, and their ability to game your regimes, your political systems and your outcomes, economic as well as social, clearly presents a danger to those sitting outside the geographies they originate from. More importantly, even within the geographies they originate from, there is the danger of non-state actors using these corporations and their instruments and their platforms for different purposes. I would argue that the polarization in the US, and a degree of polarization in India and in many other places, is as much the product of unfettered technology as of tribalism. In fact, tribalism and technology are a heady mix that is challenging stability, that is challenging the politics of our times.
And in some ways, all democratic societies around the world are becoming IKEA stores. Do it yourself. China and Russia have to do very little. We are doing it to ourselves. But certainly, they are going to use this moment and they're going to interject themselves into the debate. And that's the next slide. So, let me get the hang of this. The second point is, of course, China. I think we did speak about China last night; Kori did mention China in passing. And I think this is an important question to grapple with. We have not only the Chinese ability to participate on our platforms while denying us the ability to do the same: we are not able to go and criticize Xi Jinping on Baidu and Weibo and their platforms, yet they can come and interject themselves into our debates. But more importantly, unlike the Russians, who can game the Facebook ads and Twitter ads, the Chinese have their own platforms as well. They have their own products that are serving the purpose of the Communist Party of China. So they now have the ability not only to shape your debates by participating in them; you have the Wolf Warriors and the blue-tick Chinese spokespeople who are rather caustic in their comments on everything that happens. You also have Chinese technology platforms, TikTok and others, who are shaping public opinion and have the ability to shape behaviours as well. And in that sense, I think the question before all of us is how much longer we will need before we start circumscribing Chinese behaviour online, or at least on our platforms. Now, some countries have taken action against Chinese hardware. Some companies and some countries have taken action against Chinese apps. No one has taken action against Chinese influence operations online. And there is this rather smug belief that our democracies are immune and safe from such manipulation. The last decade tells us they are not.
So in some sense, the approach to China online will have to be a central question for the sanctity and the preservation of our regimes at home. When Michael introduced the conference last night and we spoke about grey zones, and certainly security and defence operations, I would break them into three parts. The first would be actions and instruments such as cyber attacks and lethal autonomous weapon systems, as Sam mentioned, along with certain other instruments being developed to perpetuate and fight wars: theft, data acquisition, DDoS attacks. We have a plethora of actions and instruments that all of us are aware of. Many of our agencies are preparing for them. Many are also partaking in them. So not only are we being attacked, but we are also reciprocating. And I think that's the first element of the grey zone: the state and non-state actor coercive interference that is happening in many of our countries. The second part is what I believe is the more insidious part of grey zone operations, which is influence operations. And that is also part of many playbooks around the world. If you were a military person, the will to fight, I think, is a term you regularly read in books on military doctrine. Now, if influence operations can deplete your will to fight, or can lead to political decisions that are inimical to your own interests, that itself becomes a grey zone operation. And increasingly, Chinese operations online are beginning to resemble that. They are building and shaping public opinion in different geographies. It may take a little longer in the US. It may be a little difficult in the Indian context, because we are anyway so diverse and different in our own approaches to key questions. But in smaller countries, in more homogeneous countries, the Chinese have actually achieved results. They have actually shaped political outcomes to their benefit. It has happened in South Asia.
It has happened in Africa. It has happened in my neighbourhood. And some of my neighbourhood countries are struggling because they reached a consensus that was dangerous for them. So there is Chinese tech, China tech, which I call red tech. You have big tech from the US, which is clearly unregulated and needs to be reined in. And you have red tech from China, which serves the single and primary purpose of the Communist Party of China, and which needs reining in. Now, when we say rules-based order, and I think that term was used a few times by Michael yesterday and by Kori as well, whose rules is a big question. I think there is this sanguine assumption that Michael's rules apply to the whole world, and that what Michael defends is what I want to defend. So I think the big question is this, and it's something very interesting. If some of you have time, there is a book I was mentioning to someone last night: The Media and Modernity by John B. Thompson. You must read this book. It's fascinating. He was actually writing about the broadcast media 100 years ago, but every argument of his holds true for the digital media as well. And I think this is something we all must read. He spoke about a few concepts which I think are important to understand when we think about a common aspiration towards a rules-based order. The first is that media has the ability to create synthetic histories. Someone was mentioning to me last night, Sam or someone else, that Australia doesn't have the baggage of history, doesn't have a complex history, so Australians are less likely to have a polarized society in the future. And I said, read this book. It will tell you how you can create the baggage of history. You can actually create synthetic histories; Thompson calls it mediated historicity. You imagine your histories based on what you've been fed through the media.
And when you've been fed through the media at a rapid rate all the time, chances are you will describe yourself in those terms. And in some sense, this particular slide tries to tell you that we are colliding. The world is colliding, courtesy of the digital media. The rules-based order was not meant for a world that collides. It was meant first to have sovereign boundaries. Sovereign boundaries were the first point of departure for a rules-based order. And sovereign boundaries meant that we had different constitutions, different treatments of freedom of expression, different rights, and different obligations of the state within each of those sovereign boundaries. But American tech seeks to homogenize many of those differences through tech design. The coders in Silicon Valley have homogenized many of our approaches to expression, to speech, to rights, to obligations, to duties. Now, what does that do? Multicultural societies, which typically had freedom of expression with reasonable restrictions so that you don't poke each other in the eye all the time, are today aflame, because we are now enjoying the absolutist freedom of expression of American platforms while living in particular localities. We were not meant to have absolutist expression. We reached a consensus in those geographies to behave in a certain manner. But as soon as you start colliding, these intrusions through technology, this interference through technology, breach sovereign borders. Weak sovereign borders mean a weak rules-based order. The idea that you can dispense with sovereignty to have one world is rather foolish. Unless you recognize sovereignties, you don't have a rules-based order. And technology till now has been breaching sovereignty. We've been interfering in friendly countries. We've been interfering in countries we don't like. Interference has become the norm, and I think it is time to step back and recognize that we were not the same.
And that's why we decided to have borders and create some of our own systems. If you were American, you would have adopted the American Constitution. You're not. You're Australian. So I think this becomes one of the big questions for us: in a world where ideas are going to define security in the future, how do we ensure that sovereignty survives in that particular space? It's difficult. How do we have digital sovereignty even as we have digital interconnectedness? And I think that becomes the big question for all of us. There is a technical fix, but businesses don't like those technical fixes. They want one market. And I think that's the other challenge: how do we create business models that don't undermine the whole notion of sovereignty, and therefore international law, and therefore the international order? It flows from there. I will conclude with a final thought. The third element of grey zones is technology as combatant. So, one: technology as an instrument, which we use to hurt other people's infrastructure, acquire assets, steal intellectual property, bring down facilities. Two: influencing people's minds and political decisions and outcomes, electing Donald Trump, or even just leaving behind the idea that we may have elected Donald Trump. That is enough. You don't have to actually elect him. You have delegitimized a national election just by planting the idea that somehow there would have been interference. But the final role of technology, perhaps not necessarily grey zone but within the grey zone, is technology deciding when to fight. And of course, there's a whole field of study of black-box algorithms and how they will eventually advise nations on the risks, on the threats, on the possibility of China crossing the border again and therefore India retaliating in advance. You are going to increasingly see machines shape our military decision-making.
We have already seen that technology has shaped our social behaviours, because we are all addicted to Twitter. We are now likely to be more rude, frivolous, short, sharp, out of context, because we know this works. We have seen how technology has changed social conduct. The idea that technology will not change military conduct should be dispelled. Increasingly, predictive technology is going to decide generals' actions. So a general using a particular algorithm is going to behave in a certain way based on his handheld device. And eventually, combatants are going to use predictive tech to decide whether they need to fight, go to war, make peace or create a stalemate. So increasingly, we are seeing that big tech will pervade political decision-making, military decision-making and battlefield decision-making. Drones and the like are just the tip of the iceberg. Lethal autonomous weapon systems, smart borders, submersibles: these are just the beginning of the digitalization of combat. In the coming decade, we are going to see a far greater proliferation of technology-led combat and technology-shaped combat. And my fear is that the political processes, the democratic processes, are going to be kept out of that combat role, because black-box algorithms are by definition opaque. So we are going to see perhaps even a degradation of democratic decision-making over the battlefield. And you will increasingly rely on machines in uniform to go to war or to make peace. And I think for democracies, that is dangerous as well. If you were to outsource your decision to fight a war to a coder in Silicon Valley and a testosterone-fuelled general, then you are in danger. And we have to reclaim that space as well. My final point: many of our assumptions, including around the rules-based order, Michael, Kori and others, were based on the real world. Many of our rights guaranteed under the constitution were not designed for the virtual.
For example, let's take again my favourite law, freedom of expression. We were given various shades of freedom of expression for our conduct in our daily lives in the real world. We could use a mic in an auditorium. We could not shout fire in a theatre. We were given laws that were meant for behaviour in the real world. Even in that absolutist right to expression in certain countries, you did not allow mobs of thousands and millions to crowd outside Michael's home and demand a higher salary. That was not permissible. Mob fury was not part of freedom of expression. In the digital world, mobs of millions are at your door all the time. And don't tell me you can switch off your phone. You can't, because your phone is your lifeline, your livelihood, your connection to your professional requirements, your connection to your family. And yet, we want to defend that expression. I hear it so many times: we must have absolute expression online. No, you should not. We must continue to strive to be good human beings, as we sought to do when we wrote our constitutions. And leaving mobs at someone's door was not part of that plan. And therefore, we have to rethink, perhaps rewrite, some of our fundamental assumptions, not only of the rules-based order, whose rules, but also of our own constitutional laws as we increasingly embrace the digital. So I'll leave it there. Thank you very much.

Samir, thank you. That was electrifying. And as is traditional at a conference, we are running behind time. So let me just make the magnanimous gesture here of yielding my time. If anyone has a truly pressing question that they want to ask Samir, rather than me kicking things off, put your hand up right away and we'll get started with the audience. John Blaxland is at the back of the room.

Samir, thank you. Really stimulating presentation. John Blaxland. I'm struck that you were really critical of the US-based multinationals, mindful that Google is headed by an ethnic Indian CEO.
And we know that Silicon Valley has its second India, if you like. I wonder, how do you reconcile those two things? Thanks.

I am not going into personalities here. I'm going into principles. I don't care who leads Google, but should Google have so much control over my life? I think that's the fundamental question. And again, if you were an American citizen and you saw this slide, you would probably say, oh wow, these are just Indians complaining. But that's the point: it was the same Americans who dismembered Standard Oil, because it controlled the subways and the energy systems. These companies do much more. It was not India that decided to break up Standard Oil. It was American political leadership, public opinion and society. It was the complete control over your lives that was unacceptable. And if Google is getting there, it's time we do something about it, irrespective of who the CEO is. But we take pride in the fact that we have so many Indians in Silicon Valley.

Samir, let me ask you about a practical example of what you spoke about with regard to conflict and the possibility of information technology having an undue influence on decisions about war and peace. What was actually the influence of the technologies you spoke about in relation to the border clashes we saw between India and China in the lead-up to 2020, both beforehand, in perhaps generating the animal spirits, if you will, and also in the aftermath, when India banned a lot of Chinese apps?

So I can give you an example. I've written about it, so I can share it, since this is not under the Chatham House rule. I can share one episode which I've already written about. This is something that was told to me by a member of the National Security Council. He mentioned to me that he couldn't attend a certain event I had planned a few weeks earlier, simply because there was a massive Chinese influence operation happening in India at that particular time.
This was during the Doklam clashes in 2017. India is basically a smartphone, handheld-device internet country. Literally everyone is connected through their phones. Most of them are Android phones. Half of them run a mobile OS with Chinese software. So China had, at that particular point in time, literally complete sway over India's handheld devices. And if you put the words India China clash, or Doklam, or any of the keywords into your search engine at that particular point in time, the first two or three pages were all Chinese news, fake news, synthetic news. It was for two purposes: one, to delegitimize Indian action; second, to deplete the will to fight. And I think that is important from a defence sector perspective. If you can demoralize, if you can defame and delegitimize, that's half the battle. And we saw it happening in 2017-18. So our banning of Chinese apps, and our banning of China's tech proliferation in the OTT layer, was thought through. It was not a knee-jerk reaction. The Chinese were gaming our public sphere. The Chinese are gaming your public sphere.

Yes, please. Just wait for the microphone, if you wouldn't mind.

You're right that often, when a company gets too big, we see a strong antitrust move happening. But obviously, one of the things happening now is that there are big tech firms in China and big tech firms in the US. And so one of the dynamics is also how they fight against each other. And it looks increasingly like they will not fight against each other in their own markets; the Chinese firms will serve the Chinese market and the US firms the US market. But there will be a clear fight in third markets. And so I'm curious about two different things.
The first is: what is the impact on the regulatory approach to things like breaking up big tech when you're not just thinking about big tech and its power over your citizens, but about how that big tech plays out against another big tech in a third set of countries? And then on the flip side, I'm curious, not just about India but about other third countries, how you think they're going to respond to the competition between Chinese and US tech?

So let's start with the second one. Recently, we had a bunch of folks over in Delhi for the Raisina Dialogue, lots of them from Africa. And we had a very fine technology policy person from Kenya. And I asked her: what do you do? How do you choose? Where do you go, China or the US? And actually, it was she who mentioned to me: listen, you saw what happened in Ukraine. All the American tech companies aligned behind the American position. And they tell us that the Chinese are serving Chinese interests. Hello, this is the same. So we will go where the deal is better, because both of them are serving their own interests, and we realize it. Both of them are eventually going to serve their own national positions. And we will have to choose what is best for us in terms of affordability and ease of use. So many of the emerging markets, which don't necessarily have the wherewithal of, say, India, are going to choose the best deal, because they don't have the market size to shape the behaviour of any of these corporations. Countries like India, or perhaps even the EU, the bigger middle powers (although I must admit that we are loath to be identified as a middle power; we are a big power in waiting), some of these geographies which have the muscle are hoping to change the behaviour of these companies and make them more accountable to domestic legislation. So Margrethe Vestager is my hero. She is taking on the cowboys on the west coast with gusto. I like it.
She is staring them down, and the EU is beginning to tame them. Australia, for example, is a regulation-obsessed country. And we love Australia for that reason. You are the pilot project that we eventually take our decisions after: you do your project, you mess it up, and then we get the right regulation for our country. So I think we are seeing some of these middle power countries, one, trying to shape behaviour through their own legislation and laws, and two, because of their muscle, exerting operational influence on these corporations. But the reorganization of the companies themselves will have to happen in their geographies of origin. And we are not empathetic to Chinese operations; we ban them. We are empathetic to the fact that if corporations from a friendly, largely open society like the US are operating in a certain way, we can work with them. We can talk to them. We can hold them to account, hopefully. If we fail, then many of us will have to start using the veto on their presence. And I think the new act that India is considering bringing in very soon, as mentioned by a Minister of State, seeks to circumscribe the operations of many of these big platforms. It's not to change the ethic of their operations, but to make them respond to the Indian Constitution and Indian law. Like I mentioned, we have to hold on to sovereignty even in the digital world. If you don't have sovereignty, there is no rules-based order.

Well, I'm not sure that referring to Australia as regulation-obsessed is a compliment, but it certainly is accurate. I did want to press you, though, on a possible tension between what you said earlier about the information age being, in a sense, post-sovereign, that we're entering a post-sovereign era, and what you just described, which, it seems to me, is that some of these tech companies are, in a sense, doing the work of sovereign states, of the big powers, in order to impose their preferred rules-based order on everybody else.

No.
So, you know, I was in a hurry, so I was already eating up slides. Oh, it's gone. Okay. I had a slide. But you're right. I had a neat slide which actually goes back to early college days: Maslow's pyramid, the hierarchy of needs. And if you look at Maslow's pyramid, the first two layers were largely provided by the state: the right to food, air, water, electricity, conditions for jobs, and your safety and your family's safety. The bottom bunch of Maslow's hierarchy was the prerogative of the state. Increasingly, that has become the domain of the private sector. Yet states and governments can be elected and rejected. You can fire them. You have no such way to hold boardrooms to account. So I think one of the big questions for all of us is how you create accountable boardrooms when they are responsible for your life as well as your livelihood. And it's both: they are now beginning to provide you both lifeline services and livelihood opportunities. With that kind of landscape, you will eventually have to reach a situation, and I say this as an extreme position, where perhaps the communities these companies serve will have to elect the boardroom. I think there might come a time when, for good governance, you need to elect the chairman of the board. So Mark Zuckerberg will only be there if he is elected by the people he serves on Facebook, so 300 or 400 or 800 million users, etc.

I don't expect to see that day, do you?

I've seen worse.

We have time. Lydia Khalil. Just wait for the microphone.

Thanks. Lydia Khalil with the Lowy Institute. Thanks so much, Samir, for your remarks. They were really thought-provoking. I'm wondering if I could ask you what you think the prospects are for international cooperation on the regulation of these platforms, for one.
And as a related question: there is a lot of focus on these big platforms, I think deservedly, but they are the very sexy end of what's going on. There are a lot of more mundane issues around technology regulation, internet standards and other technology standards, happening across the board, that we pay less attention to. I'm thinking here about things that are covered by the International Telecommunication Union and the UN. And I'm wondering if you see any prospect for, say, countries like India, and middle powers as well, to exert some influence on the regulation of the technology sector through those forums. So, like any good UN organisation, the ITU is going to struggle due to its composition. Of course, there are murmurs now of a Chinese takeover, of China using the ITU to promote Chinese standards, protocols, et cetera. But the fact is that before the Chinese took over, the Americans had been running it. So as a middle power, you have to choose the lesser evil. And clearly, as Kori would remember, I mentioned many years ago that even though America was a hegemon, it was an accountable hegemon. There was accountability you could hold them to: you could petition their Congress, you could stand outside the White House with a placard. Try doing that outside Xi Jinping's house; it's not going to be very pleasant. So you would prefer an accountable hegemon and an accountable power to an unaccountable one. But I don't think you're going to progress beyond a certain point, other than the standards and protocols, at the ITU. You're not going to create shaping regulation for the nature of businesses. That is not the job of the ITU. The ITU's job is to make sure that the handshaking remains firm and you are able to connect to each other all the time. And of course, certain countries and certain companies would want certain benefits out of those arrangements.
And that is the margin; that is the game at the margin that is played. But your question on international cooperation is valid. I don't think the incentives exist today for that international cooperation to happen. The companies like the world the way it is today; they pretty much have an unregulated open space for themselves. They pay lip service, and they create and fund organisations that pretend to be working with everyone to build an international consensus. That consensus is not going to come unless there is an incentive for the companies to come to the table. And that incentive is exactly what I mentioned: companies have to be told that either you perform or you perish. Perform as per our constitutions, perform as per our laws, or you are going to be disbanded. I don't think anyone has wielded the stick yet. In that economic principle of carrots and sticks, all of us have become so used to feeding carrots that we have forgotten we hold a stick. Governments around the world have forgotten governance. I think it is time now to really bring the stick into play, and then getting them to the table for partnerships or for a solution might be easier. But on your point about looking beyond big tech, you're right. Equally dangerous is the intrusion software that is now proliferating. All our phones are bugged. All our conversations are public. No one pays attention to these smaller companies with more exotic products that are doing greater damage to many of the rules we had agreed to live by. And they go under the radar. I think those need to be brought into focus as well. And I'm happy that Samantha Power mentioned at the last Democracy Summit the US government's intention to focus on this intrusive software that is now proliferating. We heard about a few of these companies last year, but there are many more that we have to worry about.
And another issue we need to worry about is the kind of money we are putting into technology, and for what purpose. If you read a few reports, today the largest amount of money in the domain of AI is being spent by the defence sector on creating the perfect killing machine. So you're building the perfect killing machine, and you are obviously going to commercialize many of those inventions into civilian appliances as well. Imagine your mixie, your kitchen blender, that was designed to kill someone. So many of those military-grade innovations are going to be used for commercial-scale operations as well. So I worry about so much money being spent on creating the perfect weapon. And the Japanese, of course, like to spend money on creating the perfect sex robot; they are spending a few billion dollars on that. So misogyny is the second industry being invested in. A few years ago I saw a box selling one of these companions, which said: she will remember your birthday, she will wish you good morning, she will bat her eyelids, she will quote you Shakespeare, and she will never say no. That was the tag on the box. Now you are investing in misogyny. You are investing in mayhem. That is where the largest investment in the future of our technology industry is going. So we can imagine the society we are investing in. Well, Samir, thank you. I'm afraid we are out of time. No, but listen, this was a great morning and I wanted it to be positive today. Yes. Look, if my phone is right here, if my phone is bugged, as you say it is, then I hope whoever is listening got a lot out of that. I know I did. I'm sure everybody in the room did. That was wonderful. Thank you so much. Could you please thank Samir? Thank you.