It is a particular pleasure, maybe a particular challenge also, to be here today with Satya Nadella, the CEO of Microsoft, because you are one of the most admired business leaders in the world. You have done something which so much fits into the philosophy which we are promoting during this annual meeting: to be environmentally responsible, to be carbon neutral. And you will do even more in your announcement. You do not only commit to be carbon neutral in a relatively short time, but you say: we want to recapture from the air all the bad stuff which Microsoft has sent into the air since its creation, I think 40 years ago. Now let me ask you the first question. Yours is today the most valuable company in the world. And you have always been a promoter of this notion of creating long-term shared value, which we discuss here, as well as the adoption of a comprehensive, unified ESG measurement system. But you have been a pioneer. You have submitted your company to all kinds of criteria. Now, what is your recipe? I put it in a very short question: what is your recipe to do good and to do well at the same time?

First of all, it's great to be back and to do this chat with you, Klaus. And it's fantastic; in particular, this year's forum, I think, has a particularly different tenor. And it's thanks to a lot of what you have set in motion. So when I look at what's the social purpose of a firm or a corporation: last year, perhaps, there were more books written on capitalism and how capitalism needs to be redefined than ever before.
And in one such book, written by an Oxford economist called Colin Mayer, there's a very good definition of the social purpose of a firm, where he says a corporation finds profitable solutions to the challenges of people and the planet. The key word is profitable, because we do know that capitalism does have the ability to allocate resources in the most efficient way. But the other key phrase is the challenges of people and the planet. It's not about creating more challenges for people and the planet, but having real solutions to the problems. I like that. In fact, as a platform company at Microsoft, it's not even about doing ESG for ESG's sake. The core business model of the company is this: if we succeed, will there be economic growth that's broad-based? Will the economic growth be inclusive? Will we build more trust in technology? And of course, can we now even ensure that through our engagement in the world there is a more sustainable planet? So I think it comes down to having that core business design incorporate the idea that when you do well, the world around you does well. And if you get that right, then a lot of things fall into place. But if you're trying to just use these as band-aids, and your core business model, your core operation, is not aligned with the world, then you have a real challenge.

Could you formulate it in the following way: you say not to find solutions only for your clients, but to find solutions for society? That should be the purpose of a company. And of course, by finding solutions for society, you'll also find solutions for your clients and yourselves.

That's correct. And people, I mean, we all exist; I think that's perhaps the most obvious thing we sometimes forget. People and institutions are all part of our society. And they are all on our planet. And so if you don't think about the broader context, the very system that we have come to depend on is not stable.
Satya, when I was preparing, and following, let's say, the attempts over many years, and the US Business Roundtable declaration, when I defend this notion of stakeholdership, of inclusive capitalism, the criticism I always get relates to CEO remuneration. I know it's a very sensitive question, but can you say some words about it?

You know, I think it's definitely a good topic to discuss. I mean, at the end of the day, if I look at the inequities in the world, the remuneration of anybody, whether it's a CEO or whether it's a return on capital, all of those are, I think, worthwhile things to be debated. Now, the question ultimately goes back to: what is the market saying and what is the society saying? There's a fantastic book I recently read called The Narrow Corridor. It's written by, I think, two economists, where they say there is a real, constant tension between what a society wants and, in their case, what the government wants, and these are the two centers of power. And you have to find that narrow corridor. And similarly, I believe, between markets, democracies, and liberal values, we have to find the corridor that works, where there isn't anything that's out of kilter, whether it's CEO pay or, let's talk about it, like somebody described it to me: hey, CEOs are still labor. By the way, there's a bigger return; it's called capital. So if you want to litigate where the returns are, let's litigate all of it.

When I wrote the Davos Manifesto 2020, of course I was confronted with this issue. And I walked this corridor by saying the executive remuneration should not be tied only to profit, but should be tied also to ESG achievements. Would you agree with that?

I would agree with that. In fact, the way I have interpreted it is: I take all of the initiatives we've done, and I actually go make the case to our shareholders that it is in their long-term interest that we do what we're doing.
So for example, I'll give you a very practical thing we did. A few years ago we said we want to extend our parental leave and sick leave not just to our employees, but even to the vendors who work with us. Because after all, they're working right in the same offices, and having very disparate benefits makes no sense. So we said, okay, how do we do that? Of course, we have to pay for it; in other words, we will have to subsidize it. So that's profit out of our bottom line. And at the same time, no employee at Microsoft will feel good about working at Microsoft if they don't feel that we're doing the right thing. And our shareholders should care about our employees feeling good. And so I think CEOs in today's world do have more work to do to communicate clearly that stakeholder capitalism is not somehow against shareholders. In fact, it is for the shareholders' long-term benefit.

It's mostly a clash between short term and long term, and not between, let's say, shareholder capitalism and stakeholder capitalism.

That is absolutely right.

If I may change the subject. In the name of national security, for example, you cannot sell your products in China anymore, I think from 2023 on, to governmental units. Are you concerned that the world is developing in the direction of two circles, each with its own technology rules, and that we actually create a technology Cold War? Because you are very, very well established in China, what is your feeling, your reaction, and what would you suggest?

You know, first of all, I would start by saying, look, every country does deeply care about its national security, and its national interest even more broadly, whether that means caring about trade and having fair trade deals. They care about their national security. And I think that will always be the case. But that said, I think here is the way I would at least urge us to think about what should happen.
If we consciously decouple, whether it's the internet or the trading spheres or what have you, all we will do is increase the overall transaction costs of our economy, and everybody will be worse off. I think that's just what will happen. I'll always go back: you know, I immigrated to the United States in '88. Then I went and studied, and the Berlin Wall fell in '89, and then I went and joined the software industry in '90, and it was kind of like winning three lotteries in sequence. And that 30-year period has been just an amazing period of market access, you know, the ability to expand into all corners of the world. And I do feel in the tech industry we do need to grow up, because the world is a little more complicated than that, and our responsibilities are greater. But that said, I think any decoupling needs some thought into how we can come up with mechanisms. For example, I believe trust in technology is something the world will need more norms around, right? Take cybersecurity: we've called for a Geneva Convention around cybersecurity. We've called for a new...

You were very active.

Or take AI ethics. I think China cares as deeply about AI ethics as the United States. To assume that somehow the Chinese people and the Chinese government are not going to worry about the implications of AI run wild would be a problem. And so therefore I think the United States and China and the European Union having a set of principles that govern what this technology can mean in our societies and the world at large is probably more in need today than it was in the last 30 years.

Now, one of the key issues in the economy of today is how we handle data. And I think it was you who, at least, spoke out for data dignity. And of course we have a tension: how much do we protect privacy, and how much do we use the potential of those data for good? Now, you have expressed yourself on several occasions.
Could you just repeat what your opinion is about this whole issue of privacy, which worries so many people? And how can we guarantee privacy and, on the other hand, still use the data, for example, for medical purposes and so on?

Sure. I mean, a couple of things. One, I would say data, and privacy around data, at an individual level needs to be thought of as a human right. I mean, in some sense Europe has taken a lead with GDPR and has even effectively regulated that. And in our case we took that regulation, and we are in fact hoping to see more of a federal standard in the United States, and the world over. In fact, we have taken some of the data subject rights of GDPR and made them available worldwide. So yes, I think we believe that data, and privacy around data, is a human right, and it has to be protected. You have to be transparent. Now, but here's the interesting thing. I think that this term I like, which we use, called data dignity, goes one step further than privacy, because the data that you contribute to the world has got utility: utility for you, utility for the business that may be giving you a service in return, and for the world at large. How do we account for that surplus being created around data, and who is in control of giving those rights? That's the next level of work, I think, we all need to do, where it's not just, oh, I have privacy and I just give away my data. I should be able to, in fact, control in a much finer-grained way how my data is being used to create utility both for me and for the world and the causes I care about.

Let's ask the public: would you agree with that statement? Yes. I think we should be encouraged, because not everybody cares about their data.

I think everyone should care about their data.

Everybody should care. If not, then that's a different world. That's the reason why I wanted to get you encouraged by the reaction of the public. But could you go one step further? Could you foresee something, I'm not sure I have the right expression,
I don't know whether it has been used: a data wallet for everybody.

A data what?

A wallet. So each person can decide whether they actually sell the data related to them.

Absolutely. I mean, I think these are the kinds of mechanisms we should be experimenting with. Today, if you think about the value exchange in the ad-funded business model (and by the way, ad funding is not a bad business model, because in some sense you're getting a free service), that's the only purely data-driven business model where there is a value exchange. And you could even say that the value exchange is not fair, because the middleman, in this case the ad broker, makes all the profit. What if this thing was split a lot more evenly? The consumer would benefit a lot more than just the free service. The advertiser's prices, which by the way keep going up because it's a second-price auction, would in fact go down. So I believe some of the best economics work should happen around data dignity, and new business models in the 2020s will hopefully allow us to get there.

You spoke, you mentioned already, artificial intelligence. You have written a book, Hit Refresh. I personally, I mean, I even have written this book about the fourth industrial revolution and so on, but I still have difficulties understanding the whole ethical concept around artificial intelligence. What actually needs to happen? Where are the broad lines? What do we want to capture when we talk about ethical rules?

A couple of things. So first of all, why do we think AI does mark a real departure from, let's say, the previous generation of software and software development? I think it's an important topic, because for the first time we have the ability for software to be written, so to speak, by data, as opposed to software generating data. So if you think of AI at a foundational level, that's the paradigm shift.

That's machine learning.

Correct.
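The "software written by data" paradigm described above can be illustrated with a minimal, hypothetical sketch. Everything here (the spam rule, the examples, the numbers) is invented for illustration; it is not any real Microsoft system:

```python
# A toy contrast between hand-written software and software "written by data".

# Hand-written software: a human programmer picks the rule and its threshold.
def spam_rule(num_links):
    return num_links > 5          # threshold chosen by a human

# "Software written by data": the threshold is learned from labeled examples.
# Each pair is (number of links in a message, whether it was spam).
examples = [(0, False), (1, False), (2, False), (7, True), (9, True), (12, True)]

def learn_threshold(examples):
    """Pick the integer threshold that classifies the examples best."""
    def accuracy(t):
        return sum((x > t) == label for x, label in examples)
    return max(range(0, 13), key=accuracy)

threshold = learn_threshold(examples)

def learned_rule(num_links):
    return num_links > threshold  # the data, not the programmer, set this
```

The point of the sketch is the shift Nadella describes: in the second version the behavior of the program is determined by the data it saw, which is also why questions of bias in that data become engineering questions.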
So in some sense, you essentially have data generating your software, or the learning being generated by data. But at the same time, I think we should not be too, what should I say, fast in abdicating our responsibility for what is happening here. For example, we have defined a set of principles around fairness, robustness, privacy, security, transparency, intelligibility of the models that are created, and accountability. But these are not just words, although there are a lot of principles now being discussed everywhere; you need to turn them into software engineering practice. So, for example, if there is a language model that gets created, you need to put into the hands of the people developing the language model the tools to de-bias it, because after all, the model did learn from a corpus of data. So I think we as tool makers, software developers, platform creators are very focused on building the best tool chain, so that anybody involved in the process of creating AI can, in fact, have the ethical principles enforced by design. One of the other things we have learned is, you know, what's the best way to ensure that there's no bias in AI? Have a diverse team. So the best way for us to in fact change the ethical frontier of AI is to have the diversity of the team creating the AI represent the ethics we want, so to speak. And so therefore I think there are several mechanisms we can create that act as governors on how AI gets created, for good, not for the unintended consequence. I just want to make one other point, because regulation can have a real place here, and especially regulation at the time of use. There are two types of regulation you can have: regulation at the time of design, and regulation at the time of use.
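The kind of de-biasing tool described above can be sketched in miniature. This is a hedged illustration in the spirit of published "hard debias" methods for word vectors, not Microsoft's actual tool chain; the three-dimensional vectors and word choices are made up:

```python
# Toy sketch: remove a learned bias direction from a word vector.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def debias(vec, bias_dir):
    """Remove vec's component along the bias direction (vector projection)."""
    scale = dot(vec, bias_dir) / dot(bias_dir, bias_dir)
    return [a - scale * b for a, b in zip(vec, bias_dir)]

# Bias direction estimated from a definitional pair, e.g. "he" - "she".
he, she = [1.0, 0.2, 0.0], [0.0, 0.2, 1.0]
bias_dir = [a - b for a, b in zip(he, she)]

engineer = [0.8, 0.5, 0.3]            # leans toward "he" in this toy space
neutral = debias(engineer, bias_dir)  # no component along he - she remains
```

After the projection, the "engineer" vector carries no information along the he/she axis, which is the mechanical core of what a de-biasing tool hands to a model developer.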
And so I think we should be thinking a lot harder about regulation at the time of use, because facial recognition or object recognition by itself is not good or bad; it is just a technology. It's the use case that is good or bad. And so we have to be able to think about regulation more at run time, not at design time.

But that's sometimes more difficult, because with consumers and users you have a more split approach. You always have strong opposition, which we have seen in other cases. Let me ask you to come to something else. I was very struck: you wrote an open letter recently, I think to your employees. Empowering, that was the phrase which caught my attention: empowering everybody to achieve more in the coming decade. What does it mean? Because you could say that to achieve more means to put more pressure on the people. What did you want to say?

We talk about our mission as empowering every person and every organization on the planet to achieve more. And I felt that at the start of a new decade we needed to put more texture, more definition, more meaning into what achieving more looks like for us as a company, and obviously what it means for society and the planet. I would say there are a couple of different levels to it. The first thing, by the way, that I care deeply about as a platform company, in the fourth industrial revolution where we now have the most malleable of resources, which is digital technology, is to help fuel economic growth, because for all the talk, economic growth around the world is not actually that good. We have a problem: we don't have the economic growth we enjoyed, at least in the beginning parts of the third industrial revolution. So we need to get back to real, I'll call it broad, sectoral economic growth. It needs to happen, and it has to happen in a very inclusive way.
That means it can't just be on the west coast of the United States and the east coast of China. It can't be just in the tech sector. The tech sector is 5% of GDP; it may even become 10%. What happens to the other 90% of GDP is what matters in terms of real economic growth. It needs to happen urban and rural. So I think there are many, many of these divergences which we now need to converge back, I would say, and that needs to be a front-and-center priority. Now, by the way, that economic growth will only happen if two other things are also in place for achieving more. One is trust in technology. The one way we can not only lack economic growth but actually go backwards is if we don't have trust in the very factor of production that's supposed to fuel the fourth industrial revolution. So whether it's cyber, whether it's privacy, we talked about AI ethics, internet safety, these are all big topics where we will need global norms to ensure that there is trust in technology, and we as the platform creators will have to do our part in it. And lastly, I would say sustainability: achieving more means whatever economic growth we achieve is also going to make our planet more sustainable. So whether it's water, whether it's waste, whether it's carbon or biodiversity, these have to be front and center. And so that's the framework we have for achieving more.

I was just impressed by, surprised by, the figure of 5% for the tech industry today. But if you look now at the transformation of the automobile industry, the automobile industry is becoming a digital industry.

Absolutely.

And if you look at the future of industries: for me, I feel the revolutions which will take place using digitalization and data are probably still to come in many industry sectors, but particularly also in the educational and agricultural fields, where we are still very, let's say, backwards oriented.

Right.
I mean, precision agriculture, precision medicine, digital manufacturing: everything is going to be defined by this fourth industrial revolution. But if you ask what's behind the fourth industrial revolution? You've written the book, for which I wrote the foreword.

Exactly.

The fundamental thing you're seeing is the increasing digitization of people, places, and things. Once you have this malleable resource, i.e. digital, you can start doing things you can't do with atoms; you can do them with bits. In fact, you can make your atoms more efficient, where they are, how they move, how they're made, because you can simulate, you can predict, you can automate, and so forth. And I think that's the broad phenomenon. Same thing with precision agriculture: the fact that you can pick the seeds, the time to plant them, how to maximize or minimize the water usage, all can effectively be designed like you would design a semiconductor. And so that ability to take even what is physical today, convert it into digital, and then make the physical better is, I think, what's going to happen everywhere.

We are just digesting, let's say, the digital revolution, now artificial intelligence, but now we have quantum computing. And this may be even a game changer for your own company. Can you share some, you don't have to share your plans, but your general, let's say, appreciation of the importance of quantum computing for the future?

Here's a really good way to think about the motivation. In spite of all of the abundance of computing we have today, there are many problems that are not solvable by classical computers. In fact, if you want a catalyst that is going to capture carbon, that's a computational problem that is not solvable by classical computers.
So in some sense, we need a quantum computer, but we need a real quantum computer, which means you need new physics: you need to discover new physical properties that exhibit quantum behavior and that are stable. Then you need a completely new software stack; everything that anyone has learned in computer science to date is null and void, and you have to relearn it. And that's the kind of approach we are taking: we are building out, as always, a software stack. We even have a new language called Q#. You can go learn it today; you can program in it. And we are, in fact, simulating quantum algorithms on classical computers. In fact, Cleveland Clinic has done something very clever. In an MRI machine, one of the ways you can do much better, have higher precision in discovering tumors, for example, involves a search problem that can't be solved on classical computers. So they've used a quantum-inspired algorithm already to start refining it. So we're very excited about that next level of computation to fuel the economic growth we talked about.

It will take probably 10 years from now, but...

And there are many stops along the way.

Is there any question? We have time for one or two questions from the audience, if you accept.

I'm happy to do it.

Yeah. Let's take the first one from behind.

Thank you. Ash, Global Shaper from Toronto. My question is: what might be some arguments to perhaps help convince businesses that greater data dignity for individuals might be better for them?

I mean, you don't have to convince me, because in some sense it comes down to new business models. For example, let me ask you this: how do you do price competition in a second-price auction? So in some sense, you could say, hey, let me reprice who gets what for data. And so I think what needs to happen is real business model innovation around the ad-funded business model of today. And again, I don't want to say that any one business model is problematic.
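The second-price auction mechanics referenced in that answer can be sketched as follows. The bidder names and amounts are illustrative, not any real ad platform's API:

```python
# Minimal sketch of a second-price (Vickrey) auction, the ad-pricing
# mechanism discussed above.

def second_price_auction(bids):
    """Return (winner, price): the highest bidder wins but pays only the
    second-highest bid, which makes truthful bidding the dominant strategy."""
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]          # winner pays the runner-up's bid
    return winner, price

winner, price = second_price_auction({"ad_a": 2.50, "ad_b": 1.75, "ad_c": 3.10})
# ad_c wins but pays 2.50, the second-highest bid
```

Because the price is set by the runner-up rather than the winner, bidders have no incentive to shade their bids, which is also why the "price competition" question in the answer above is not straightforward: competition pushes prices up rather than down.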
The world needs all business models. That's the point: you can't have just one business model be the only business model, and that, I think, is what's not great for the world. So I do believe that there will be... By the way, take the enterprise market: that's why there is data dignity there. No enterprise wants their data just to be given away; it should create surplus for themselves. Why is that not true in the consumer space? Today the value exchange is a weak value exchange, because you're giving away valuable data for what is considered a free service. But is the free service you're getting really all of the value you should be getting for your data? That is, I think, the question that needs to be asked.

Let me take on this side any question or any comment. Not the case. Let me take the last one.

Good morning, everyone. Great to be here. I have a question on culture. As a former Microsoft employee myself, I left Microsoft in 2005 to focus on building the Youth for Technology Foundation, which is an international education technology nonprofit, and I'm part of the WEF as a result of Hilde Schwab and the Schwab community. Thank you for your efforts to really change things at Microsoft. When you talk about empowering everyone to achieve more, my question is specifically: how are you doing that, to do good and do well, especially with employee culture, to prevent what many like myself suffered from in the early 2000s at Microsoft, which is burnout?

It's a great question, and I'm glad you asked. You should come back now. You know, to me, the lived experience of the people inside the company is everything. As a CEO, I always say that my biggest job is the curation of culture. In some sense, we sometimes forget that in order to get your strategy right and your products right, you need two things that really anchor you: a sense of purpose and mission, which gives you direction, and then culture, which makes it even possible to pursue that mission.
And that's where I think the tone at the top, and the focus on it, matter: the ways we go about it, not just saying we have culture X and we're going to change it to Y, as a one-time change, and then we'll forget about it. That's just not what it is. I mean, every day, all of the 140,000 people who work at Microsoft are going to come and look and ask: is my lived experience close to the espoused culture? And that's the difficult framing of it. I would say whatever change we've been able to achieve is because the cultural meme we picked was inspired by Carol Dweck and her work around growth mindset. The concept is simple: if you go to school and take two kids, one of them with more innate capability and the other with less, but the one with more innate capability is a know-it-all and the other is a learn-it-all, you know how the story ends. That applies to CEOs, that applies to companies, that applies to all the 140,000 people in the company. And so we have taken that and said: how do we confront, each of us, our fixed mindset each day? When it comes to diversity and inclusion, when it comes to customer obsession, when it comes even to how we as a company come together. None of our customers could care less about our divisional boundaries, so we need to come together, and that's where you have to confront your fixed mindset. In fact, even today, sometimes people at Microsoft will come to me and say: Satya, we found the 15 people at Microsoft who don't have a growth mindset. That's not the point. The point is not to go look for the 15 people. It's to be able, for me, to be vulnerable enough to say: I'm not perfect, I'll never be perfect, but I can learn.
That's a good posture to have: a living culture that is constantly keeping up with our own aspirations, because I think we as human beings are never going to be satisfied with the culture around us; our standards are going to keep going up. So that's what we aspire to do, and we definitely are on a journey, and it's a continuous journey. And I think it's time for you to come back.

Now, unfortunately, we come to the end of this fascinating discussion. But I think what I take with me is this: we are living in such a disruptive time, we don't have the answers to all the questions, and managing a company like Microsoft is such a complex challenge. But what I admire, and what we have seen also in this session: you have to remain human. Even if you are at the top, in the top position, you used the word, you have to confess that you are vulnerable. And as long as you know you are vulnerable, you work also to make sure that you remain human.

Before I end and thank you, I just want to appreciate Microsoft's strong partnership with the Forum. We spoke about the need to develop ethical frameworks around the new technologies, which is what we are doing now in our network for the fourth industrial revolution, where Microsoft is very much engaged. And also to thank you, among many things, for the particular engagement with UpLink, a joint project where we want to create a digital platform behind each of the SDGs, the Sustainable Development Goals, to allow everybody to engage and to make sure that people thrive: not just some businesses and NGOs, but everybody should engage behind the SDGs. And here, thank you very much, because it is possible thanks to the collaboration we have. But coming back to the real purpose of this session: it was to give answers to some difficult questions, and I think we got very satisfying answers. Thank you. Thank you so much. Thank you.