Greetings, friends at Cyber Initiative, Tokyo. I'm Audrey Tang, Taiwan's digital minister in charge of social innovation. I'm really happy to be here virtually to share with you some thoughts around digital transformation, based on the questions that I have received. In the face of new lifestyles and other social changes, what did you as a minister consider necessary, and what measures did you take? What measures have already been effective, which have not been as effective as expected, and which are likely to be effective in the future? Well, I believe, as the name of the conference shows, taking initiative is the most important thing. And that means that each and every citizen needs to be empowered not just with digital access, like broadband as a human right, but also with the know-how, the competence, not just literacy, of how to remix and reuse digital technologies and components. For example, as you know, last February we began the mask rationing program. But it was the civic technologists in the g0v ("gov zero") community who came up with the brilliant idea of visualizing the real-time inventory, the real-time stock of masks at all the pharmacies, so that people do not have to queue in vain. This was not a relationship of procurement, but rather reverse procurement: the specification, the norm, was set by the people, the social sector, and we from the state just provided the real-time open API, updated every 30 seconds as people queued in line. So this means that innovation is open, and people who are not that used to interactive maps have exactly the same numbers and know-how to create, for example, interactive chatbots or voice assistants for people with visual impairments, and so on. And this proved to be very effective: more than 100 different applications started in the first seven days of the mask rationing program, which resulted in three-quarters of our population getting access to masks and wearing them in record time.
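The idea of many front ends sharing one open data feed can be sketched in a few lines. This is only an illustration: the endpoint shape and field names below are hypothetical, not the actual government API, but any map, chatbot, or voice assistant would consume the same numbers the same way.

```python
import json

# Hypothetical payload shaped like a real-time mask-stock feed
# (field names are illustrative; the actual open API differs).
SAMPLE = """{
  "updated": "2020-02-06T09:30:00+08:00",
  "pharmacies": [
    {"name": "Pharmacy A", "lat": 25.03, "lon": 121.56, "adult_masks": 120, "child_masks": 40},
    {"name": "Pharmacy B", "lat": 25.05, "lon": 121.52, "adult_masks": 0,   "child_masks": 15}
  ]
}"""

def in_stock(payload: str, minimum: int = 1) -> list:
    """Return names of pharmacies with at least `minimum` adult masks."""
    data = json.loads(payload)
    return [p["name"] for p in data["pharmacies"] if p["adult_masks"] >= minimum]

print(in_stock(SAMPLE))  # ['Pharmacy A']
```

Because every application reads the same feed, a mapping site, a chatbot, and a voice assistant can all agree on which pharmacies still have stock.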
And so that made sure that our early counter-pandemic efforts last year were very, very successful, and our R value was kept below one. Now, as for what measures were not as effective as expected: well, the top-down measures. Any sort of centralized planning, we have discovered, tends to disagree with reality. When we rationed out masks, we initially allocated the distribution of masks corresponding to the population distribution. That is to say, the pharmacies' availability, if you put it on a map, almost exactly overlaps with the population census. And our initial idea was that each average citizen should enjoy exactly the same physical distance to a nearby available mask, and that that is most fair. However, the OpenStreetMap community soon discovered that this was not the case, because not everyone owns a helicopter, and distance on the map does not translate into the time, the opportunity cost, that a person needs to reach the nearby pharmacy. Maybe they have to wait an hour or so for a bus in the more rural areas, whereas the more urban areas with metro systems are much more accessible. Because of this, they reanalyzed the real-time open API, and through a parliamentary interpellation, MP Engel said to our Minister of Health and Welfare: "Minister, I don't think the distribution was fair. I have the numbers to prove it." And Minister Chen simply said: "Legislator, teach us." It was the radical transparency, the real-time availability of the API, the numbers collected by machines and published upon collection, that enabled a co-creative mood: not just demonstration as opposition, but demonstration as demo. So the very next day after the interpellation, we implemented a much fairer rationing scheme, along with pre-registration, pre-ordering, and so on. These people-public-private partnerships are the kind of measures, the model, that are likely to be very effective in the future.
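The difference between distance-fair and time-fair allocation can be made concrete with a toy calculation. All numbers and the square-root weighting below are made up for illustration; they are not Taiwan's real figures or the actual revised formula, only a sketch of why counting travel time shifts supply toward rural districts.

```python
# Illustrative only: two ways to split a fixed mask supply across districts.
# Populations, bus times, and the weighting function are invented numbers.

def allocate(total: int, districts: list, weight) -> dict:
    """Split `total` masks proportionally to `weight(district)`."""
    weights = {d["name"]: weight(d) for d in districts}
    s = sum(weights.values())
    return {name: round(total * w / s) for name, w in weights.items()}

districts = [
    {"name": "urban", "population": 8000, "bus_minutes": 5},
    {"name": "rural", "population": 2000, "bus_minutes": 60},
]

# Initial scheme: proportional to population alone.
by_population = allocate(10_000, districts, lambda d: d["population"])

# Revised scheme: boost districts where reaching a pharmacy costs more time.
by_access = allocate(10_000, districts,
                     lambda d: d["population"] * d["bus_minutes"] ** 0.5)

print(by_population)  # {'urban': 8000, 'rural': 2000}
print(by_access)      # rural share rises once travel time is counted
```

The point is not the particular formula but that, with the raw feed open, anyone can recompute the allocation and demonstrate a fairer one, which is exactly what happened.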
The next question is: responding to COVID-19, Taiwan used the power of civic tech to develop, in a short period of time, the app, the system, that shows the mask inventory of pharmacies. What are some of the points you pay attention to in terms of information security in order to make effective use of civic tech? That's a great question. You see, in Taiwan the National Health Insurance Administration enjoys its own private network, not connected to any internet services, that hosts the centralized database of all the transactions for the insurance records and so on. And since 2003, when SARS first hit Taiwan, we very quickly learned that not just hospitals but also clinics, pharmacies, and so on need to connect to this virtual private network, in order to make sure that information security is protected not just at the edge but also by the network topology. So basically, the idea is very simple. On the cybersecurity layer, we ensure that there is mutual accountability. Whenever any transaction happens, there is a record of a particular pharmacy or clinic having this transaction with the National Health Card that a citizen carries with them. And so when they purchase a mask using the National Health Card, it is used not just for rationing purposes but also for auditing and accountability purposes. That makes sure that each and every pharmacist, clinician, hospital, and so on can benefit from the penetration-tested, defense-in-depth design that powers our existing national health care system. Compare that to any system invented during the pandemic, and you will find that people do not trust new data collection points invented amid a crisis, because people have not had sufficient experience with their cybersecurity or their privacy boundaries.
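Mutual accountability of this kind is often implemented as an append-only, tamper-evident transaction log. The sketch below is a generic hash-chained log, not the National Health Insurance system's actual design; the record fields and identifiers are hypothetical, and it only illustrates why both the pharmacy and the cardholder can later audit the same history.

```python
import hashlib
import json

def append_record(log: list, pharmacy_id: str, card_id: str, masks: int) -> dict:
    """Append a purchase record chained to the previous one by hash,
    so neither party can silently rewrite history afterwards."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"pharmacy": pharmacy_id, "card": card_id,
            "masks": masks, "prev": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append(body)
    return body

log = []
append_record(log, "ph-001", "card-42", 3)
append_record(log, "ph-001", "card-43", 3)
# Each record commits to the one before it, so altering an early
# record would break every later hash link in the chain.
print(log[1]["prev"] == log[0]["hash"])  # True
```

Any design choice like this trades a little storage for the ability of either side, or an auditor, to verify the whole chain independently.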
And so because Taiwan never declared a state of emergency, we basically held ourselves to account to use only the measures and technologies that existed before the pandemic, and to repurpose them for pandemic control. That has the dual purpose of ensuring resilience and reliability, but also the understandability, the explainability, that is at the core of participatory auditing and accountability when it comes to cybersecurity systems. The next question asks: when governments and companies work on digital transformation, there are only a limited number of people who are familiar with digital tech, and there is a risk of high costs due to lock-in when relying entirely on IT vendors. What are your measures and solutions to overcome the lack of talent in your organization? As I mentioned, the open API approach makes sure that people who innovate on the front end, using chatbots, augmented reality, mobile websites, whatever, can all benefit from the bedrock of the cybersecurity measures that we have put in place with the larger system integrators. But they are not in a competitive relationship, because when procuring government systems, we can basically tick a box and say: whatever you make available to human beings, you must also make available to the robots, using the Linux Foundation's OpenAPI standard. Otherwise, just as a vendor that does not implement universal access, catering to people with visual impairments, could of course be disqualified for discrimination, so too, using the same contract language, a vendor that does not comply with OpenAPI, at a negligible cost on top of the cost of making such systems, could be disqualified, for discriminating against robots. Well, we don't quite say it that way, but that's the effect. And so because of that, the OpenAPI is the anchor upon which the startup ecosystem, the civic-tech ecosystem, can build alternative experiences of interacting with people.
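A procurement clause like "whatever humans can see, robots must be able to read" can be checked mechanically. The toy acceptance check below is hypothetical, not real procurement tooling: it assumes a convention where every human-facing route must have an `/api/` counterpart documented in the vendor's OpenAPI paths.

```python
# Toy acceptance check: given the paths section parsed from a vendor's
# OpenAPI document, verify every human-facing page has a machine-readable
# counterpart under /api/. Route names and the /api/ convention are invented.

ui_routes = ["/pharmacies", "/stock"]

openapi_paths = {
    "/api/pharmacies": {"get": {"summary": "List pharmacies as JSON"}},
    "/api/stock": {"get": {"summary": "Real-time mask stock as JSON"}},
}

def missing_api_routes(ui_routes, openapi_paths):
    """Return UI routes that lack an /api/ equivalent in the spec."""
    return [r for r in ui_routes if "/api" + r not in openapi_paths]

print(missing_api_routes(ui_routes, openapi_paths))  # [] -> vendor complies
```

An empty result means the box can be ticked; a non-empty list names exactly which human-only features would disqualify the bid.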
But as I mentioned, the underlying infrastructure remains secure, and so this enables us to tap into the talent of the entire society, not constrained by one or two IT vendors. The next question asks: even in a democracy, there is a clear risk that fake news will spread social division. What can the executive branch and media companies do to prevent social-media-mediated social fragmentation? Well, in Taiwan, the media companies that covered, for example, our presidential election worked with schools, like a middle school where the students could fact-check the presidential candidates as they held their debates and forums: typing the transcript, checking it against the sources, and if they found any discrepancies, it was not just an exercise; rather, it got posted on the live stream, on the media channel, on public television, and so on. And so that ensures that lifelong and basic education in media competence is not just about literacy, about understanding the news, but rather about making the news. As for the administration, I think we need to invest in the digital equivalent of public infrastructures like a university campus or a public park, essentially the online spaces where people can have a pro-social conversation about policies. In Taiwan, the academic network hosts the PTT, which for 25 years has been free of advertisers and shareholders, and yet is one of the most active forums for deliberation on public issues. That's the place where Dr. Li Wenliang's message from Wuhan, about the seven SARS cases, spread to Taiwan in December 2019, and within just 24 hours the people triaged the message, contributing to a pro-social investigation which resulted in us getting advance warning, and health inspections of flight passengers coming in from Wuhan on the very first day of 2020. So that's collective intelligence.
And we also invested in the Join platform, which is a one-stop platform with more than 30 million visits; in Taiwan, with 23 million people, that's a lot. It hosts participatory budgeting, regulatory pre-announcements, the accounting of the national budget, and citizens' initiatives, petitions with 5,000 signatures. Every two weeks, we meet face-to-face with the people who raised those issues and hold collaborative meetings in an interagency fashion, and so on. Of course, all of this takes investment, which is why in 2016 we classified those budgets as infrastructure budgets, even though they're not made from concrete, not tangible construction. Nevertheless, we understood that if we do not have the digital equivalent of town halls and public squares, then our citizens will be forced to deliberate about policy in the digital equivalent of a nightclub, right, with a smoke-filled room, where you have to shout to be heard, with addictive drinks, private bouncers, and so on: namely, Facebook. And I don't have anything against the entertainment sector; I mean, the nightclubs in Taipei are open now. But I don't think those districts should be the places where we hold our town halls and deliberative-democracy events. So, people would like to know: in Japan, anonymous slander and fake news have become a problem in the SNS space. What is the situation in Taiwan, and what are your thoughts on this issue? Of course, in Taiwan we do have disinformation, and we do have the sort of information manipulation, as I mentioned, that tries to interfere with election integrity and so on on social media. It came to a really high degree of intervention in 2018, but for the 2020 election we witnessed less influence of such campaigns on the democratic process.
And the reason why is that in 2019 there was a concerted effort by the social sector, by the professional journalists and so on, to start a norms package, that is to say, what is considered normal for those social media companies to do. For example, the social sector, the g0v ("gov zero") people, a few years ago literally, well, went into the Control Yuan, our national audit branch, and brought out paper copies. At the time there were no digital downloads, just paper copies of the campaign donation and expense records. And digitization was done through participatory optical character recognition, so that investigative journalists had access, for the first time, to structured data about past elections. Of course, the Control Yuan eventually changed the rules, the law, so that you don't have to go through paper and OCR anymore; rather, it's published as open data. Now, in 2019, the social sector applied the same pressure to Facebook, saying that foreign-sponsored political and social advertisements need to be banned, because we also ban such foreign-sponsored campaign donations. And even for domestic sources, exactly who paid for which targeted messages, shown to how many people, all this needs to be radically transparent, in an open-data way. And Facebook was basically faced with the implicit threat of social sanction if it did not conform to this norm, which was set by the people and which the public sector had already adhered to. And so that is why, in the 2020 presidential election, the situation was much better, and our democratic process was not in that much danger from algorithmic, sponsored interventions. The next question asks: the COVID-19 pandemic has drastically changed the way we do business and live, including telework. How do you see the security risks changing before and after the corona crisis? Well, in Taiwan, we've never had a lockdown in the past couple of years.
And so people are by and large not forced to telework. Of course, I'm a teleworking minister; I've been teleworking since 2008, and as a public servant since 2016. So sharing some personal experience of teleworking, I guess, rather than the societal picture, is in order, because we've never had a lockdown. In my opinion, teleworking in places that are not a satellite office but rather a cafe, or at home, and so on brings a very different threat model, a very different configuration. Nowadays, of course, we've seen that more proactive ways of intrusion detection, the so-called zero-trust network configurations, privacy and cybersecurity at the edge, and so on are becoming necessary because of the teleworking configuration. But I would also like to say that good habits, like wearing masks, washing hands, and social distancing, that sort of good habit is also essential when it comes to enforcing cybersecurity rules while people are literally in all these different spaces. So for example, if a company or a large organization adopts teleworking, it pays to make sure that the individual workers understand the difference between their personal desktop environment and the work environment, such as a dedicated virtual machine reached through remote desktop or some other form of virtualization; and the distinction between a browser that runs locally and a browser that runs in a security-isolated environment and just streams the images to the desktop, and so on. There needs to be a clear delineation of when to use which kind of tool, just as, when we're in the office, we make a distinction between bring-your-own devices and the internal configuration of the intranet.
We also need to make a similar distinction, but in our minds, when we're passing through that barrier. Previously it was at the network layer: once you start using a VPN, you're on the intranet. But nowadays, the entire configuration of the way we collaborate, the tools that we use to video-conference, and so on needs always to reflect this reality, to make sure that when people get into a working environment, this working environment adheres to the privacy and cybersecurity rules and norms of the workplace, instead of the outside environment of their own devices. So habit and norm building, I believe, are the most important ideas in a teleworking environment. The next question asks: advanced technologies such as AI and quantum technologies can contribute to improving cybersecurity, but they can also, of course, be a weapon for attackers. How should governments and businesses deal with this duality of advanced technologies? That's a great question. I believe that the best protection against the emerging threats of emerging technologies is widespread social competence. When the entire society understands how to doctor photos, how to photoshop, then people become resilient against the shallowfakes that would make anonymous slander, scams, spam, phishing, and so on effective. The idea is that if each and every person becomes competent in the use of AI as assistive technology, such that people are comfortable understanding the models and demanding accountability and explainability on the front line, then we're not trapped in a situation where only a few percent, the elite, control the rules that govern AI or the use of quantum technologies and so on. The social innovators will then be able not just to hold the technologists to account, but also, as I mentioned, to demonstrate actively, by demoing a better way to adapt to the actual situation in their society. And therefore resilience is enforced by the people closest to the front lines,
instead of having one top-down way of regulating against all the potential threats. So instead of a single system that is robust, we need distributed, decentralized social and computational systems that are resilient. That is to say, of course there will be threats, but just like building materials that can absorb the initial earthquake and then recover from it very quickly, we need to build similar systems when it comes to defending against the emerging threats and uses of digital technologies. The next question asks: when promoting social innovation through digital technology, what issues need to be taken into account in terms of privacy protection, for example of personal data, and what measures should be taken? Please tell us separately for the government and for the private sector. For the government, I believe the point here is to trust the citizens. That is to say, if the citizens trust each other and would like to store data in a place that's closer to the point they trust, the government should not forcibly centralize that data, but should instead invest in privacy-enhancing technologies (PETs) such as federated learning, homomorphic encryption, differential privacy, and so on, which enable the data storage not to overextend or squander that trust. Because trust is not transitive: if a person trusts a data collector, and the data collector trusts a data processor, it doesn't mean the data subject automatically trusts the data processor. But if we apply homomorphic encryption or other PETs, then such trust is not necessary, because the data processor works on an encrypted version of the data and does not need to extend its purview to the raw data that carries the privacy implications. So just like the building materials I just referred to, those PETs need to be part of the basic vocabulary of everyone in the public service. Also, the National Center for High-performance Computing investigates how to speed up such computations, and so on
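Of the PETs just mentioned, differential privacy is the easiest to sketch in a few lines. The following is a minimal, illustrative Laplace mechanism for a counting query, not any agency's production code; the data and epsilon values are invented.

```python
import math
import random

def dp_count(values, predicate, epsilon: float = 1.0) -> float:
    """Count matching records, plus Laplace(1/epsilon) noise.
    A count query has sensitivity 1 (one person's record changes it by
    at most 1), so noise of scale 1/epsilon gives epsilon-DP."""
    true_count = sum(1 for v in values if predicate(v))
    u = random.random() - 0.5                    # uniform in (-0.5, 0.5)
    scale = 1.0 / epsilon
    # Laplace noise via the inverse-CDF method
    noise = -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

random.seed(7)
ages = [23, 41, 35, 67, 52, 29, 71, 44]
# Noisy version of the true count (2); smaller epsilon means more noise.
print(dp_count(ages, lambda a: a >= 65, epsilon=0.5))
```

The data subject never has to trust the analyst with exact answers: the published number is useful in aggregate, yet any single person's presence is statistically masked.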
and we've made strides and breakthroughs in applying post-quantum cryptography in the field of homomorphic encryption, so that we can use such techniques with ease, even on data that was previously too large to process this way. Now, the private sector, of course, comprises the places that certain customers trust; but if people don't trust one private-sector processor, there is of course competition, so they can go somewhere else. So basically, just like the idea of social responsibility about not contributing to carbon emissions, privacy violations can be thought of as a kind of emission, one that externalizes a negative cost onto the entire society. Instead, we need to externalize the good, the best practices that you have invested in: you can share the norms, if not the actual source code, with everyone in your industry, so that people understand which kinds of PETs to use. This not only builds a better brand but actually makes the total trust in this kind of digital delivery service much higher; otherwise, people do not have an incentive to digitally transform their spending habits and their transaction habits. So basically, instead of asking for absolute trust, invest in infrastructures that can make yourself and your business trustworthy. The next question was about what companies and countries should do to improve IT literacy, and my approach would be a shortcut. Well, IT connects machines, but digital connects people, so literacy is not sufficient. When we talk about literacy, it is about consuming information; that was like the era of radio and television. But nowadays, with a smartphone, each and every person is a medium, and so they need to be competent, that is to say, in journalism, as I mentioned before, as well as in the use of data for the public good: how to join a data collaborative, and so on. But of course, such ideas, data bias, data stewardship, are kind of abstract and very difficult to teach, very time-consuming to teach. So the shortcut will be essentially deploying the
kinds of high-impact data sources, like AirBoxes, in Taiwan's primary schools. Most of the primary schools in Taiwan have AirBoxes that measure PM2.5 and other climate-, weather-, and environment-related numbers, and contribute them to a distributed ledger maintained by our national academy. The students previously could not easily comprehend the idea of data bias, or how to work with data. But once they understand that the AirBox they collaboratively maintain actually informs their friends' and families' choices, that before going for a jog, people check the air-pollution map and make sure they only go out when it's safe, and that even large demonstrations to change our energy and other public policies are the direct result of environmental sensing from such AirBoxes deployed in more than 10,000 different places, then this competence very easily makes an impact on their community. And therefore the incentive for the individual students to contribute, and to learn more about data science and so on, becomes very, very high, because they understand they can help the livelihood, the health, the public health of the communities involved. So the shortcut is to connect data collection, analysis, and application to something that has a high social and environmental impact. Capstone projects foster competence instead of just literacy, and I call this PBL, purpose-based learning, because purpose leads to projects, which lead to problems, and then to the skills and mindset to overcome those problems together. The next question is about what kind of IT literacy is necessary for a digital society, and about information security in particular; please share your thoughts on both companies and individuals. As I mentioned, in companies it pays to make sure that we work together in a way that is swift, effective, and safe, that is to say, secure. And we need to begin by empowering the individual to make conscious choices: on multi-factor authentication; on using, you know,
browser tabs instead of downloading the video-conferencing software; on choosing routing and service providers that are cybersecurity-minded; and on making sure that there is always resilience, say three backups in at least two different physical places, and so on. All of these are things a person can do as an individual. Instead of the company enforcing one particular set of rules without explaining them, which would be like lockdown fatigue, right, where people do not know the why of the policymaking, a set of easily understood norms, spread across the individuals and citizens, is, I believe, the fundamental societal infrastructure for good cybersecurity habits. The next question asks: on the one hand, digital technology has positive aspects that enrich people's lives, but on the other hand it can also be used for purposes such as surveillance and regulation, as in the PRC regime. What are your thoughts on this point? Well, yeah, definitely; I mean, it all depends on the shape of power. When I talk about transparency, it's about making the state transparent to the citizens, so power is at the edge, empowering the people. But in the PRC, when they talk about transparency, it's about making the people transparent to the state. The word may be the same, but the direction is entirely different. When I talk about AI, it's always about assistive intelligence. For example, these eyeglasses align with my eyesight, but they don't display, like, pop-up propaganda or an advertisement that I have to, you know, wait ten seconds to close. I know exactly how they work; I can repair them myself, or I can take them down the street for someone else to repair; I don't have to spend years reverse-engineering their inner workings, nor do I have to pay 10,000 US dollars in license fees, right? So the point is that it's assistive technology, because it empowers my own dignity as a person; it connects me to other persons rather more easily; but it does not
replace any individual. And so that is the assistive paradigm of assistive intelligence: it always assists and augments the collective and connective intelligence of human beings. But if we get all the power centralized in one or two single places, well, then that will become an authoritarian intelligence. So my thought is very simple: basically, we need to concentrate on investing in the commons that empower not just the young people but also the senior people, who have a different way of interacting with the system, and enable them to contribute. So it's not just about machine learning; it is about collaborative learning. And I believe the spirit of collaborative learning is the most important thing in democratic societies, so that we can not just defend but also advance our cultures and our virtues as democratic polities. Well, I guess we're out of time. Thank you for the great questions; sorry I didn't have the time to go through them all. However, I would like to conclude by quoting my own job description, which talks about collaborative learning, but also about how to transform from an IT-based mindset to a mindset based on digital transformation. It goes like this: When we see the internet of things, let's make it an internet of beings. When we see virtual reality, let's make it a shared reality. When we see machine learning, let's make it collaborative learning. When we see user experience, let's make it about human experience. And whenever we hear that a singularity is near, let us always remember: the plurality is here. Thank you for listening. Live long and prosper.