Good afternoon, everyone, and thank you so much for joining me and the ANU team here. This event, the Chancellor's Panel, is the first event we're holding at the beginning of Alumni Week for 2023. And to commence proceedings, I'm absolutely delighted to welcome Paul House, who will conduct the Welcome to Country. He's an integral part of the ANU family, so please welcome Paul House.

Manda Ngu, Wurrga Wurri, Chancellor Bishop. Thank you, Julie, for the kind introduction. And Yurya Murang, Pura Murambang, Maranya, good afternoon, everyone. Yuwandaidu Paul Girrawah House, my name is Paul Girrawah House. I was born here, in the centre of my ancestral country, at the old Canberra Hospital. God bless it. Anyone born in the old Canberra Hospital? One. Good, great to see one hospital alumnus here with me this evening. Injumarabala Dr Matilda Williams House. My respects to my mother, Dr Matilda Williams House. Because of her I can; because of all of our matriarchs we can. Yilin Galangbo, Giba Bangu, Woga Bo, Megai Bo, Didenil Bang Mai. Ladies and gentlemen, young men, young women, Chancellor Julie Bishop, distinguished guests. Nyari Injumali Ngambri, Gumao Waugulu Waulabulao Ngunnawal Nagara Gawiradri, Muji Gang, Yanangbu Jayandu. My respects to Ngambri, Gumao Waugulu Waulabulao Ngunnawal Nagara Gawiradri, Elders past and present. Nyari Injumarabu, Muji Gangu, Nurembanjigu, Nini Edadu. My respects to all people and Elders from all parts of the country. Ngambri, Waugulu Waulabulao Ngunnawal Maiin, Gawinbanya Nginuga, Nurembangu Dara. Ngambri and Ngunnawal people welcome you all to country. Nadu, Wuda Gabigi, Balabambo Gubu, Balagibangu Gubu. Goenguliala Dumbali, Murwai Marambu. We listen to our old people, our ancestors, our Elders, and they show us the straight, the correct path. Dullagangmuru, Bijerimuru, Goenguliala. Belangali now, Yamamali now, Waulamali now. They nurture us, they guide us, they protect us, our old people. Mambu Wara Naminya Gu, Wuda Gabinya, Wuda Daraigu, Winninggala Gubali Gu. Looking to see, listening to hear, and learning to understand. Yinja Malgiju, Yinja Marabu. A powerful word on country, philosophy and ideology. It means many good things: go slow, be patient, be polite, be gentle, and take responsibility. All those good nutrients. Yinja Marabala, Guji Gang, Gagam Marawala. Nunga Yalala Dain, Maiin. Respect is in the people, the government and the ANU embracing Voice, Treaty and Truth-telling in this country. Yinja Marabalawain, Giran Banana, Nuro Balabua, Gujiang Bing, Bunga Nara Nara. Respect is in the warmth of the campfire and the possum skin cloak that sheltered us all. Our Welcome to Country is always made in the spirit of peace and a desire for harmony for all people of modern Australia. And our main aim as local custodians is always to establish an atmosphere of mutual respect through the acknowledgement of our ancestors and the recognition of our rights that declare our special place in the pre- and post-history of the Canberra region. The name Canberra is derived from the name of our people and country from right here, gazetted on the 22nd of January, 1834, under the New South Wales colonial government. We have cared for Mother Earth since the dawn of time, and evidence of our occupation, our sovereignty, our ownership can be seen everywhere throughout the land. Our signature is in the land, not just our DNA. And taking care of country is important to us all. Yinja Marabalawain, Moran Maginia, Borodorei. Respect everything living and growing.
Look after the land and the rivers, and the land and the rivers will look after you. Yinja Marabalawain, Balabirida, Binna Binna and Yambri. Yawuluweonurambango. Respect is in the Canberra Creek and the breeze quietly moving through country. Goonguliala. Balawalawangadabu. Moranmadandabu. Mamayugurugambira. Respect is in the grinding stones and the carved trees made long ago on country. Goonguliala. Magamnawa. Wagadainanudangua. Bumiradaganda. Respect is in the soles of the feet of our dancers hitting Mother Earth. Magagiribiringa. It's also found in the journey of the Bogong moths in the mountains. It tastes very good. Marambangmalang. Noyagoyimalang. It's wonderful, it's fabulous to be here, Chancellor, to share this Welcome to Country. Just finally, the law of the land talks about yinja malgiju. Mayangalangbu. Yandu mayangalangdu. Yinja malgiju. Yinyogea. Give respect and honour to all people, and then people will respect you. Yinyogea. Yinyogea. Yinyogea. Then, right now, and later: the past, the present and the future. Sorry, Chancellor, one last thing. My mother was the first Indigenous Australian to be awarded a doctorate here from the ANU, in 2017. Because of her I can; we can; many of us can, on country. She was also the first Indigenous Australian to ever conduct a Welcome to Country for the opening of the Parliament of Australia, which was the 42nd opening of parliament, and also the 43rd and the 44th openings of parliament. And we were gifted that opportunity last year for the opening of the 47th. In 2006 my mother was awarded Canberra Citizen of the Year. And today is my mum's birthday, so happy birthday to my mother. She's not here, but thank you on her behalf. I'm going to play a quick song on the yidaki before I go.

Thank you, Paul House. And what a beautiful didge that is. Paul, thank you as always for your gracious and insightful Welcome to Country. Please extend our best wishes to your extraordinary mum, the amazing Matilda House. I too acknowledge and celebrate the traditional owners of the land on which we meet, the Ngunnawal and Ngambri people, and pay respects to Elders past and present. So please thank Paul House once more. Thank you, Paul.

Now, not only is this the beginning of Alumni Week for the Australian National University, we also held Open Day here today. So welcome to prospective students and parents who might be here; I hope you can also join the ANU family. It was a really exciting day on campus. I had a ball going to the different college presentations and meeting students and families who are considering a life here at ANU. And it's also a time to remember our extraordinary alumni; those who have been through this university will always be part of this university. We have 132,000 alumni around the world, and I think there were about 5,000 or 6,000 people on campus today, so there's obviously a great deal of interest in the Australian National University. And quite rightly: we are Australia's first and only national university, and we were set up with a national mission, a mandate to tackle the challenges facing our region and our country in the aftermath of the Second World War. So I feel we have a responsibility to focus on the issues that impact on Australia's national interest.
Whoever might be the government of the day, we have a responsibility to come up with answers, to have the debates, to have the discussions about national issues. Hence the fifth Chancellor's Panel this evening, where we are grappling with the topic of cyber security and its impact on our national interests. Now, we have gathered some extraordinarily talented people with huge experience in this area, and you are in for a treat. I will be moderating the discussion.

We have Suthagar Seevaratnam, who is the Chief Information Security Officer at ANU. Suthagar has an extraordinary background doing all sorts of things, including working in cyber at the Bureau of Meteorology and the Australian Signals Directorate. He is without doubt the kind of information security officer that allows the Chancellor to sleep well at night. We then have Olivia Shen. Olivia is an alumna of ANU and she's now back at ANU at the National Security College, but for about a decade she has been working in national security and foreign policy in various roles with the Australian government. Then Stella Solar, who is the Director of the National AI Centre at CSIRO. Stella has had, I've got to do it, a stellar career. Nobody's ever done that before, have they? With Microsoft and various IT distributors; she's been very involved with tech startups, and she brings a wealth of experience to this discussion. And then Professor Johanna Weaver. She is the founding Director of the Tech Policy Design Centre here at ANU, and she has vast experience negotiating cyber issues at the United Nations. She's worked within DFAT, and she's had many roles within the Australian government. So we have people with vast insights, perspectives and experience in this field. I'm going to take my spot in the middle, ask a few questions and get the debate going.

So let's dive right in. My first question is going to be to Johanna. Russia's invasion of Ukraine, twelve months on. This would have to be the first time that cyber operations have been used in armed conflict since Russia and China and the US and others actually agreed at the UN that international humanitarian law applies to cyber. So they were all on the same page on that day, at that moment. Drawing from your experience in negotiating this agreement with others at the UN, can you share with us how the laws and regulations being shaped now have been affected by what's happened in Ukraine? We now have, if you like, a living example of what was agreed at the UN.

Well, maybe let me start by painting a little bit of a picture of how that agreement came about. In March 2021, I got on a plane and headed to New York. This was in the middle of the pandemic, so it was very eerie flying at that time; arriving in New York, Grand Central Station is empty, there are very few cars on the street. And I met Putin's special advisor for ICT security in an Irish bar, because that's where we used to meet.

They served vodka in Irish bars?

No, that's why I went there, because they didn't serve vodka. It was actually a very strategic decision, because it only served three types of beer: light beer, dark beer and wheat beer. This particular ambassador is known for his voracious capacity to consume alcohol, and I knew that I would have to have drinks, so I specifically chose the Irish bar. But we digress. So, March 2021: we concluded those negotiations in May and had agreement with Russia, China, the US and 20 other countries that international humanitarian law applies in cyberspace.
That was endorsed by every single UN country in December of 2021, and that was a really momentous achievement. But the significance of that achievement, particularly in the context of the Russian invasion of Ukraine just three months later, was not something that we at all appreciated or anticipated at the time we were having those negotiations. I think there are three things that stand out to me from the application of international humanitarian law and that agreement.

Number one, it's pretty obvious that it doesn't matter if you agree the rules; there are going to be countries that will violate those rules. And if Russia is going to violate the most fundamental principle of international law, state respect for another country's sovereignty, then the expectation that they will not violate it in cyberspace is an unrealistic expectation. But nonetheless, we have agreed it, and it means we now have the ability to say: this is the standard that the international community has agreed, and we can hold countries to account.

The second thing that Ukraine has really shown is that cyber security issues are absolutely national security issues; they cannot be divorced from them. We heard some commentators at the start of the Ukraine conflict say, well, we haven't seen the flash-bang cyber Pearl Harbor incident that everyone was expecting, and in my view that's a total and utter misunderstanding of how cyber is used in conflict. It's like saying, all of a sudden we now have aeroplanes that can drop bombs, so we think we'll no longer have tanks or infantry battalions. Adding cyber warfare adds a domain; it doesn't mean things will operate exclusively in cyberspace, any more than we would have expected war to move entirely into the air. It's very clear that cyber warfare is now a fundamental part of modern conflict, and there are many instances and examples of that in Ukraine.

And the third thing, the example that stands out to me and the lesson that we learn from this: it's actually not just about offensive and defensive cyber operations, or how good your cyber security is. Cyber security is important, but equally, in the Ukrainian context, digital services have been just as important. The Ukrainian system of digital identity has been absolutely integral in allowing refugees who fled without papers, for example, to prove who they are, to receive payments and support, and to stay in communication. So the idea that this issue is purely focused within the military domain, purely focused on offensive and defensive cyber security, is really a fallacy. Cyber security is national security; digital government services are government services. And when we're talking about this, if we focus purely on the tech, or if we focus purely on the laws and the rules, we miss the picture. We actually need to focus on both the technology and the laws, and the context in which they're used. And that's really what we're trying to do at the Tech Policy Design Centre.

Thank you. Now let's segue from war to peace. Suthagar, cyber resilience. How can we democratize? Can I say that? How can we democratize cyber resilience so that all sectors of our economy can be prosperous and safe without undue dependence on government and expensive cyber services? Can you elaborate from your perspective?

Thanks, Julie.
One thing that's worth noting is that in the last five to ten years, cyber threats have become democratized. There's a really low barrier to entry for people wanting to do cybercrime. We're talking about things like ransomware as a service and hacking as a service, so even the most unsophisticated threat actors out there can now find services and technologies that will allow them to do cybercrime at scale. If there's one thing that's become democratized in cyber, it's the threat side. And it means that with very little effort and very little outlay, you can do sophisticated damage.

So on the other side of the equation, the defensive side, if you will, we have to do the same. We have to democratize the ability to defend. Now, most breaches in Australia happen to small and medium businesses, who don't have access to cyber talent in the same way a bank would, or even a university. And they don't necessarily have the same level of interaction with government; most small or medium businesses aren't going to ring up the Australian Cyber Security Centre. So there is a security divide beginning to form. And yet a lot of Australia's prosperity depends on this very sector: small and medium enterprise, mum and dad operations. Large companies like Medibank and Optus, as we've seen in the news, will survive, with their reputations damaged undoubtedly, but they'll survive. A small business incurring exactly the same fate, exactly the same type of attack, will most likely fold. So this is a major problem. We have to empower these small businesses to defend themselves. Government cannot scale to the millions of businesses that exist in Australia in that particular segment; government is a backstop. So it's really important to understand how we can improve their access to cyber talent, improve their ability to build culture, and make them more self-aware about the data that they possess. A lot of the services that you interact with, small and medium businesses, hold incredible amounts of personal information that they may not know how to secure. So we have to teach them.

I think it really comes down to education in the first instance. Government and industry both have to be able to teach people how to be safe when they run their businesses; it is just part of contemporary risk management. We can't rely solely on government and we can't rely solely on industry; it has to be a combination of both. There's also a role to play for suppliers. A lot of these businesses outsource their IT and they think they're safe. So they have an expectation, and they should have an expectation, that their IT providers are able to keep them safe. And I think that's where the law can help. It's no different to occupational health and safety; it's no different to any other form of risk management. There is an expectation that you should be safe. Alongside that are changes, already beginning to happen, around privacy legislation. We're seeing more and more effort placed on who is actually accountable for privacy, and the kind of penalties that exist if you don't look after people's private data.
Now, the thing that I really liked in the recent amendments coming from government is that one of the enforcement mechanisms is this: if you have prospered, if you have benefited from not looking after someone's private data, then part of the fine is proportionate to the benefit you accrued. That is an amazing shift in behaviour, and an amazing example of regulation changing behaviour, and I really applaud the government for starting to think about it in these terms.

How do you think the behaviours will change? Is it a deterrent or is it an encouragement? What's going to happen?

For some it will be a deterrent. For those who have monetised and benefited from a lack of privacy controls, it will be a deterrent, undoubtedly. But for most it will be more regulation that they don't know how to deal with, so for those people we need to help them, just as we do with every other aspect of consumer law. So I think for the vast majority it will be an incentive to get things right. And it's raising awareness that this is an issue that has to be taken seriously. It's just part of the health and the risk management of anyone doing business anywhere.

That's exactly right. Now, Stella, we can't have a discussion about cyber security without talking about artificial intelligence. We have chatbots and data deserts and Chinese facial recognition technology, and all of these advanced technologies cause us to reflect on the ethical issues, the questions raised by the global pursuit of technological development. So what are your views on the implications of AI technology, and how AI is changing the landscape of cyber security, indeed changing society, humanity potentially?

AI has really captured our imaginations over the last couple of months. There are many debates out there on how to define AI and what AI is, but two key characteristics really stand out for me that are also connected to how it has changed the landscape for cyber security. One is scale, and the other is persistence. Artificial intelligence can deal with scale incredibly well. It can deal with large volumes of data, make sense of the data, find patterns in that data. And that is incredibly valuable for us. We have challenges that are beyond what our minds and hands can solve alone; we need to solve health and scientific challenges and find new breakthroughs, and there is a data-centric complexity to that which we need tools like artificial intelligence to navigate and solve. At the same time, that scale element can scale some negative components. It can scale biases. It can scale positive actions, but also negative actions. So for me, scale is the operative characteristic that is important for us to factor in when it comes to artificial intelligence, because it can help us do positive things at scale and negative things at scale. And artificial intelligence is persistent: the technology doesn't get tired the way a human gets tired behind a computer. It doesn't miss an anomaly, a blip, something that is out of pattern. This persistence and scale can be used for positive things; they can also be used for negative things. In fact, our recent report about Australia's AI ecosystem momentum found that one of the key benefits AI has brought to organizations is increased security.
It talks about how artificial intelligence can do better network monitoring and application monitoring, and improve security. Here's the ironic thing: the same report also found that one of the key challenges artificial intelligence is bringing in is security. So it starts to make you realize that AI is only as good as we lead it. It's a tool in our hands. If there are positive intentions, it will have positive outcomes; if there are maleficent intentions, it will have negative outcomes. So it has really upped the ante. I wouldn't say that AI has contributed to one side or the other; more so, it has upped the ante on both sides. And I hope we can do more to up the positive elements.

One more thing it has really put a spotlight on is how much of a frontline we're all on as individuals in this cybersecurity domain. Artificial intelligence has enabled experiences where we have a tough time telling whether they're real or not. Suddenly, any kind of social engineering attack or cybersecurity attack on individuals is amplified, because we're not able to tell what's real and what's not. In the same way that email has spam filters, I'm almost wishing for an 'experience spam filter' to weed out what's real and what's not. I know that our teams at Data61 have been doing a lot of work on dark pattern detection, which is about looking through the web experience and being able to see: are you being manipulated by this ad, or are you being taken down non-secure pathways through this click? Which is so difficult for us to tell. So I really see that AI has upped the ante on both sides, and it has emphasized how much of a frontline each of us is on. And each of us, as well as being private citizens, is also part of organizations, and being on that frontline means each of us is also potentially a doorway into the organization and its cybersecurity considerations.

Now, Olivia, if I were going to study AI at ANU, I'd trot off to the College of Engineering, Computing and Cybernetics, wouldn't I? But I note that at the National Security College you've introduced a course on artificial intelligence. That tells me that you think governments and policymakers need to understand more about AI. And I get that, because I see this debate going on, the tension between those who think AI can be used for the betterment of humanity and those in the military-industrial complex who are delighting in ways to improve the lethality of military operations. So what are you teaching, and what do we need to learn?

Well, firstly a shout-out to my colleagues at the School of Cybernetics; they do have a great program if you have a year to spare. But for the National Security College, what we did was create a course that's specifically geared towards policymakers and people in the national security community. To wind this back a little, the way I came on this journey was that in 2019 I was lucky enough to win a Fulbright scholarship to look at artificial intelligence in the US, and specifically the ways that artificial intelligence was being used in national security settings. So I was looking at things like facial recognition, persistent surveillance, and predictive policing, where the AI supposedly tells you where a crime is going to happen before it even happens.
And also algorithmic tools that were deciding really fundamental security decisions, like whether you were released on bail and the length of sentence you would get for a crime you committed. So I did this research in the US and I came home utterly terrified, because I think it goes to what Stella said about the scale and the persistence of AI. As someone who has worked in national security for over a decade, I can completely understand the temptation and the potential to use these technologies to keep us safer. Imagine the intelligence analyst that never gets tired. Imagine the cop that never misses a pattern, that finds anomalies and patterns that a human brain cannot really comprehend or achieve at scale. These are all uses that are incredibly attractive to the national security community. And at the National Security College, our mission is all about developing the people, the ideas and the networks that keep Australia safe and make Australia safer into the future.

But what is security? Security, at its heart, is a freedom from fear. And freedom from fear in our society, I would like to think, also includes freedom from fear of your own government. Government is in an incredibly privileged, powerful position; government actually has a monopoly on the use of force. And yet we come into all of these conversations about the uses of AI in a national security field that is incredibly opaque. It often doesn't satisfy first principles for the use of AI around transparency or, in some cases, proportionality. So on these metrics, AI use in national security settings is really risky and really consequential, and it's incredibly important that government knows what it is doing. Because these tools, Stella talked about them being tools, are not inherently good, they're not inherently evil, and they're not inherently neutral either. AI takes on the values of the system that created it. And the system that creates it, frankly, doesn't quite know what it's doing a lot of the time. If anyone has been watching the Robodebt Royal Commission and following it closely, it's a pretty clear indictment of a system that doesn't understand the capabilities and the tools it's dealing with. Now, frankly, I wouldn't even call Robodebt AI, for a whole bunch of technical reasons I won't go into. But this was a program that was legally, ethically and mathematically flawed, and yet an entire system rolled it out, sustained it, justified it and continued to justify it in the name of security and budget and good governance, even when people were being severely harmed. And I saw that in a lot of my research in the US, right? I saw people with disproportionate amounts of power using these tools against people who had the least power: people who were most likely to be discriminated against in the data that built the AI, people who had no recourse or were too scared to seek recourse, people who were vulnerable, people who were marginalised. And I don't want that replicated here. So that's a long way of saying that what I set out to do at the National Security College was build a course targeted at policymakers in the national security space. We bring in speakers from academia, from industry, experts like Stella, to come to the course and cut through both the hype and the fear-mongering around AI, and we really hope we can help policymakers avoid some of the pitfalls that we've seen in other applications, both in Australia and overseas.
Thanks, Olivia. Now, I just want to think for a moment about Australia as a cyber-safe place to work and live, and also a place of cyber and AI innovation, cutting-edge innovation. So perhaps you, Stella, and then Johanna and Suthagar, please. How can we as a nation invest for long-term prosperity in terms of workforce capability and the accompanying governance and critical technologies? And think of it in terms of small to medium businesses, who we know are the most targeted sector when it comes to cyber crime and are so impacted by changes in privacy, reporting and the like. So how are we going to achieve that, Stella?

It's a big question: how are we going to achieve that? What we definitely see is that SMEs are being left behind on the technological journey. They generally don't have dedicated resources to look at cybersecurity or artificial intelligence; they don't have those teams, or investments at the same scale as the larger enterprises. So education is one layer, but another is how we make technology, how tech firms make technologies. A lot of technology requires custom work by highly skilled individuals to implement, and that is just going to be out of reach for so many SMEs in the short term. So I think there are a couple of different layers to it.

One additional point I really want to mention: to really uplift our capability in the technology landscape, we're going to need to think not of a band-aid fix for right now, but of a longer-term uplifting of our skills and capability throughout the education sector. And this starts very young. One of the programs I've been engaged with is the Day of AI, which is about equipping young students from years five to ten with artificial intelligence skills. This is so incredibly critical to do early on. It has just been expanded to years five and six, and why? Because we want to enable young students before they start getting divided off, with subject selections, into different experiences. I myself got into technology by accident; I was going to be a film composer. So I did not have technology exposure during my education. I landed in it by accident and then learned on the job. It was a very hard journey, but I did it. It should not be an accident. It should be a structured education approach that starts very early.

And the reason I also want to emphasize that it needs to start early: I'll talk right now about women specifically in the digital arena. There is a divide when it comes to digital skills, where women are not choosing as many STEM or digital technology courses through their careers. And that's actually a risk right now, because it means that women may become more vulnerable to cyber attacks. Digital and STEM are becoming critical life skills; just like financial literacy, digital literacy is a really critical life skill. It doesn't matter whether individuals want to go into technology or not. Digital is a critical life skill, so that individuals are not vulnerable in this increasingly technological world. And that's why we want to start incredibly early, before this subject divide happens and we miss out on enabling people to have thriving lives later on.

Suthagar, I'd be interested in your thoughts on what Stella's just said.

I couldn't agree more.

Yeah. Security literacy. You're nodding.
Yeah, security literacy starts at a very, very young age. And I'm going to give a little shout-out to my daughter, who's seven, over there in the audience. She said, I want to come and learn about cybersecurity. I want that for every single school-age child in Australia: to learn from the get-go how to be safe. We teach them how to be safe in the sun. We teach them to be safe swimmers. We teach them to cross the road safely. Well, in some respects it's more dangerous to open up your laptop, open up a browser and be on the internet. We have to teach our children. The future economy depends on it; our future resilience depends on it. The government has said we want to be resilient by 2030. Some of these people haven't entered the workforce yet; that's seven years from now, and they're still at school. We need to reach them, teach them and make sure that they are safe.

The other thing I would add, just to pick up on the thread around small and medium businesses: cyber is, at its core, from a pure security perspective, a risk management thing. It's often looked at through the lens of technology, and it really shouldn't be, all the time. It is risk management, like any other form of risk management, which means we should teach it as such, as part of everyday governance in a business, and we don't. If you're going to start a business, there should be something that tells you how to be cyber safe. The internet should not be something to be scared of, either for our children or for our business owners.

Olivia.

Could I just briefly add to that? Just this week I was in Singapore, leading a study tour for the NSC, and we met with the Singapore Cyber Security Agency. They're in fierce agreement with both of you, but they've had such an impressive whole-of-lifecycle approach to building up the cyber workforce. In the space of five years, they've increased their cyber workforce nationally by 5,000 people. They have 900 students exiting high school with really great baseline cyber security skills that will hopefully lead them to studying cyber security and computer science at university, working in the field and beyond. And they start really early, from primary school age, all the way through to cyber security professionals who are now the leaders of small, medium and large businesses and who are filtering that knowledge and expertise throughout their organizations, creating a really vibrant cyber talent ecosystem. And I think that's what we have to think about: not just, here are the segments that you target, but how do you facilitate that whole ecosystem?

And it goes to a critical point that Suthagar mentioned at the very beginning: it's about democratizing these skills. Not just for the small and medium enterprises, which we know are vulnerable, but for individual vulnerabilities. If we consider cyber as a risk management approach, then it can't be tech wizardry that someone else does, or something we devolve to some company or third party to do for us. It is a lifelong skill, but it's also about a lifelong ability to participate with agency in a cyber security world, because cyberspace is the base infrastructure of our lives now. And yet we've developed this sort of learned helplessness, where we don't know how our data's been collected, we don't know the cybersecurity of the devices we rely on daily.
And we don't want to navigate that complexity. But I think there's a role to be played by both government and the private sector in helping us navigate some of that complexity, so that those cyber skills can be democratized at the individual level as well.

Whenever I think of Singapore and its ability to introduce a national curriculum, I think of federation. We navigate the perspectives of states and territories and can't even come up with a common school starting age. But anyway, just saying. Johanna, how could Australia navigate what I think is a growing shift in the narrative, placing more security accountability on suppliers of software and services? We're seeing that narrative shifting in the EU and in the US; is that something we should be embracing here?

Well, look, if you look at the US's latest cybersecurity strategy, which just came out this month, they've really placed an emphasis on saying: we've tried voluntary cybersecurity guidelines, we've tried saying 'you should do this', and it hasn't worked. So they're moving very much to a model of 'you must implement this', with much more severe penalties, and placing that onus not on individuals but shifting it from the individual to the suppliers and the makers of the technology.

I think perhaps the best way to understand that shift is by analogy. Think about the car that you drive. When you buy that car, you know that it's made to particular safety standards. When you get in your car, you put on your seatbelt. You drive on a road that you know is built to particular specs and standards. And if you're living in a country like Australia, there are clear road rules to obey when you're driving your car. The cybersecurity equivalent, so far, has largely been 'put on your seatbelt'. The onus has been on the individual, whether an individual person or an individual business, and there has been no guidance about the safety of the vehicle or the road, and little guidance about what the rules of the road should actually be. This shift is basically saying: we need to put standards around the car, in this instance our experience online, the product that we're using, so that there are mandatory safety standards.

But it's not enough just to do that. My fear in this conversation, particularly when you look at the Australian government's cybersecurity strategy discussion paper, is that it's full of conversation around regulation. Which is great; I lead the Tech Policy Design Centre, we focus on regulation. But if we only focus on regulation, we still need people to be putting on their seatbelts, we still need people to be obeying the road rules. So I think it is a sign of the maturity of the discussion, but it's not the full answer.

Stella.

I'd like to expand that conversation a little. There is no one point in the chain of technology creation that can take the full accountability and responsibility.
As a technology or a solution moves from the technology vendor to the services suppliers, then connects with additional data sources, ultimately comes to the organisation implementing the use case, and then comes, maybe, to us as private citizens as experiences, there are so many additional things that get added onto it that the technology supplier all the way upstream cannot actually control the downstream. So we really need to think about how to empower each link in the chain to make responsible, secure choices throughout, because decisions and choices are happening throughout. In fact, we know from the national listening tour that we conducted that businesses want to do the right thing; there's just very little guidance on what that right thing is. That's one of the reasons we just launched the Responsible AI Network, a nationwide network to help demystify what those right things are when it comes to AI. What is the guidance? Law and standards and principles and governance and leadership and technology all need to come together to do the right thing, but there's just very little guidance out there. So we really want to invest in building easy, checklist-type approaches that SMEs can also leverage, all the way through the chain, so that everyone can be empowered to make responsible and safe choices.

Well, let's turn to governments. Governments are custodians of personal information that can be used to commit identity fraud and other types of crimes. And governments have a monopoly on certain information, so there's no question of giving consent for it to be held. So governments should lead by example. What about the security accountability of governments as suppliers of public goods? Are they holding themselves to the same standards that they want private companies to meet? Are they exceeding the security standards that they set for the private sector? Olivia's laughing, so I'm going to you first, Olivia.

Johanna and I have both worked in government, and we're just like, oh.

Explain the eye roll.

The eye roll is an obvious no, unfortunately. Look, on the cybersecurity front, we've had successive ANAO audits that demonstrate Australian government agencies are not meeting the baseline standards that they're asking companies in Australia to meet. And that's quite problematic. I'm not saying that's the case across the board, but it's very, very uneven, and some of these are really big agencies and departments that hold some of the most precious, sensitive data sets on you and me as citizens. So this is quite problematic just from the cybersecurity angle. It's good to see the ACSC and ASD helping government departments to lift their game, but I do think government needs to get its own house in order before it tells industry what standards to meet; there's a credibility problem there, right? And on AI it probably gets a little bit worse. In 2019, the government of the day released a set of AI ethics principles. Pretty high-level, but pretty easy to meet because they're high-level; they just set out the intentions for responsible use of artificial intelligence. They asked five of Australia's biggest companies to adopt and pilot those principles. And to date, I don't know of a single government agency that has signed on to adopt or pilot them. So a little bit of a problem there.
A little bit of a double standard. And I wonder how that's going to influence the next tranches of regulation that government is going to try to enforce or implement. Johanna, does the dispersed nature of modern work, where you've got employees working remotely, and we've seen it during COVID, at least for part of the time now, increase the challenge for governments? They've got pockets of older software systems that remain vulnerable, and now people working from home with home computers that might not meet stringent security standards. Is that adding to the challenge?

Look, I think government does have unique challenges when it comes to cybersecurity, in part, ironically, because a lot of the systems that they're using are legacy systems, because they want to have particularly secure systems. So there is a circular irony to it. People working from home does increase the challenges, but I think the biggest problem in government is actually a lack of knowledge and skills about these issues. That's why my centre, next quarter, will be piloting a training course designed to help uplift capability across the public service, because we have a real challenge in being able to understand, identify and ask the right questions. For me, that's the most alarming thing out of Robodebt: the questions that were not asked. The lawyers that gave the clearance didn't ask the questions. And it goes to this idea that technology is still something that sits over here, that isn't a core part of every single public servant's job, that isn't core to every graduate of ANU. This is not something that sits off to the side. We actually need to give everybody the capability to be asking and answering these questions.

Well, Suthagar, that kind of brings me to a question, and I've got my ANU Chancellor hat on when I ask you this. How are we going to enliven and grow investment in cyber research and development in Australia, so that Australia can be a net exporter of talent and know-how and innovation, rather than, as I believe we are today, a net consumer of cyber innovation, heavily reliant on services?

So, Chancellor, one of the biggest challenges Australia has, and it builds on Olivia's point earlier: we are a digital economy, and yet we're not necessarily producing a digital workforce, nor are we necessarily building the innovation that will drive the next wave of the economy. We generally buy things from overseas. And that is a problem, because every time we do that, we forgo the opportunity to build something for ourselves. Now, we're at the national university, and one of the core elements of our mission is nation-building. Part of that has to be digital innovation, security innovation, building cyber talent, building new ways and new approaches and new thinking so that our economy can prosper. I'm being shamelessly parochial and shamelessly Australian when I say this, but it has been so long, certainly from my perspective in government and outside of it, watching us just accept the standards and accept the software from elsewhere. If we want to change that, we've got to change the way we look at ourselves: that we are a nation of innovators, and we will invest in that innovation. We need government to open the doors in some respects. We need to encourage startups. Most startups in Australia, particularly in information security or cyber security, fail.
That is not the case elsewhere. There is an insatiable appetite in the US and Europe for security startups. Here, you'd be lucky if two out of ten survived the first five years. There's something clearly wrong in the way that we are supporting startups, small businesses, cyber talent and growing innovation. And I think places like ANU have a very special role in that, because we are incubators of innovation. I take to heart the Vice-Chancellor's challenge about doing spin-outs, billion-dollar spin-outs. Universities, whether it's us in particular, with our national mission, or other universities right across Australia, have, I think, both a mandate and an absolute need to grow research, to show the world how much better it can be done.

We've probably got one question left for all of you, so let me muse for a moment. I recall discussions when I was in government about the tensions over the issue of encryption. The government's cyber security strategy would always say we need robust encryption for individuals and organisations to protect their information, and then in the next sentence it would warn that cyber criminals are using encryption to evade law enforcement. You know, that inherent tension. So what changes in regulation might be useful for improving long-term cyber resilience, balancing long-term economic prosperity? Can we balance compliance with natural corporate mechanisms for accountability and risk management? I'm going to start with you, Stella, then Johanna, then Suthagar, and wrap up with Olivia. Over to you.

I'll take the artificial intelligence lens on this. One of the biggest challenges we experience when we meet with industry is that as soon as we mention artificial intelligence, the conversation defaults into ethics and principles, and it ignores that there's this whole legal system that still applies to AI as well. So our objective is to create greater clarity that the law still applies to AI. There is a coming wave of standards, starting from this year, that I think is going to be very surprising for industry, like the AI management system standard. It's going to catch a lot of industry by surprise, so really thinking about AI governance is important right now, so that we're not caught out. Only then do we consider principles, once law and standards have been addressed. And then, thinking about a robust governance model for AI within an organization: it's not something to be siloed off to IT or research and development, which is what we generally see; it's something to connect to the core governance models of the organization. In fact, we're running a set of really intriguing pilots right now to connect AI into the ESG governance model, because ESG is an all-of-company governance approach, and many of the topics addressed through ESG are also the topics that come up with AI. So: think of it as integrated into the governance of the organization; recognize first that responsible and safe AI starts with the law, so that we dispel the myth that we have to default into principles; and then really lean into leadership. As I mentioned at the start, AI is only as good as we lead it. That is proving to be incredibly true. It's only as good as the maturity and the skills of the organization, the robust governance models. And that's why it's so important for organizations to implement governance right now.
So I'll take it quickly, one from a business perspective and one from a government perspective. The business one is much easier. If I had a magic wand, the regulatory change that would have the biggest impact on cybersecurity would be making it clear and unambiguous that directors have an obligation to ensure their company has strong cybersecurity protections in place, and to have public reporting about that. I think that's the single biggest change that would make the biggest difference, because it would drive change from the top, from the people who control the budgets as well.

Within government, and going to the specific question you raised around encryption, how we have encryption driving innovation while also serving the security perspective: this is a really structural issue, in the way our government departments are structured but also in the way our ministers look at these issues. Take three pieces of regulatory reform that will be really crucial in Australia over the next parliamentary term: one is privacy, one is the cybersecurity strategy, and the other is around digital identity. Those three processes are run through three different departments, report to different ministers, run through different committees within the public service, and when they get to cabinet, they'll go through different cabinet procedures. Some will go through the National Security Committee of Cabinet; others will go through full cabinet. Why does this matter? Because if you have something going through the National Security Committee of Cabinet, through a security stream, you're not properly considering those other impacts. Yes, you may have someone who gets seconded into the room full of national security folks, but they're the outsider in the room. Whereas if we make a structural change to the way we structure those committees, both within the public service and at the ministerial level, then you start to have a more equal balance of power, so you can have an informed debate about those issues rather than one that's siloed, with tech tacked on. And incidentally, we've just released a report, Cultivating Coordination, on exactly that issue.

Good luck with that. Right, Suthagar, your thoughts?

I might take this from the private sector perspective. What government can do, Johanna has covered incredibly well, but there's no substitute for building a good security culture inside your organization in order to make the right decisions, enforce the right governance, and make the right choices around technology and people. If you don't build that culture, none of this is going to work. No matter what the regulation says, no matter what the best-practice compliance handbook says, if those decisions aren't being made because your culture doesn't align to your values, and those values don't measure and respect security, then nothing's going to change. For almost every high-profile breach that you've seen in the media, I can sit there and point at a cultural failing. Had that particular organization, whether it was the CEO or the board, paid a little bit more attention to the advice, I bet you that breach might not have happened, or at least might not have been as severe. So my big push here is around building security culture inside corporate Australia, because I think that will make a world of difference.
I can't imagine who you were referring to. Moving on. Olivia, thank you.

The National Security College recently had Chris Inglis, the US National Cyber Director, come to visit, and he has talked a lot about creating a new cyber social contract: cyber that is for social good, in other words. In Chris's view, we've spent too long allowing cyber risk to be borne by the people least capable of bearing that risk: small businesses, for example, or individuals. In order to flip that around, we do need some regulation at the top, the kind of regulation we see on critical infrastructure, for example, which has really brought the conversation about securing critical infrastructure from attack into the boardrooms, assigning that responsibility really, really clearly, because frankly there was a long period of drift where it just wasn't being taken seriously enough. So that is one element where we can adopt learnings from other types of regulation. But Chris also talks about government being clearer about the threats and more transparent about the risks we actually have to deal with, and communicating that clearly at the community level, the business level, and the whole-of-society level. Because I keep coming back to this idea that cybersecurity, cyber skills, has to be really individualized and democratized. It is the base infrastructure of our lives. It is something we need to take ownership of, and apply our own risk calculus to. I absolutely agree with the points about corporate culture, but at an individual level, I don't want to feel disempowered in the cyber future. I want to be contributing; I want to build a better cyber future, and I want to be part of that conversation and part of that mission. There's an onus on us to really take that on board as individuals as well. So when we talk about national sovereign capability, what about our individual cyber capability, and how do we build that? Because if we achieve that, then the national agenda will be much easier to manage.

We don't have time for any more questions, but I was going to ask each of you to give one takeout. What would you like this audience to leave here remembering? Stella, I'm going to start with you. Just one sentence, no more. What's your takeout?

Digital skills.

Good. Johanna?

Technology policy is relevant for every single person in this room.

Suthagar?

Digital skills for our children.

Liv?

Roses are red, violets are blue. If the product is free, the product's probably you.

Oh, now that's got me thinking. Ladies and gentlemen, please thank the Chancellor's Panel this evening: Stella from CSIRO, Johanna from the Tech Policy Design Centre at ANU, Suthagar, our first Chief Information Security Officer, and Olivia from the National Security College. Please give them a round of applause. Thank you. Now, we could have gone on all evening, but I have very strict instructions that I have to bid you farewell. I wish you a safe trip home. Please remain connected with ANU, be part of the ANU community, hopefully the ANU family, and we'll see you here at least next year for the next Chancellor's Panel, but hopefully in the meantime much sooner than that. Have a very good evening. Thank you.