Usually, as you have seen, I chair only sessions with the political leaders, but this is a fascinating opportunity for me to have a discussion with a great leader in technological advancement. And Sundar Pichai, I think we have something in common, not only the interest in new technologies (you all got the book which I just published, Shaping the Fourth Industrial Revolution), but we have something else in common. You cultivate a very low-key style; I try to do the same. In Davos, outside Davos, my wife can tell you: a very, very low-key style. But your journey has really been a remarkable one. You came from relatively modest origins in India. You studied in India. You are an engineer, as I am. But, Sundar, you added business school. You entered Google. And you are one of those unbelievable success stories which Silicon Valley provides. Now you are at the top of a company which... maybe I should tell you a story. I was visiting a prime minister, I will not tell you who. And he told me there are only five real global powers in the world. He mentioned three countries, I will not tell you which ones. And then he mentioned Google. So that's the context. And what I would like to ask you first: stepping into the role of leading such an important, I would add also impactful, life-changing company in many ways, you took over from the founder Larry Page, quite big shoes to fill. And since then you have announced a major conceptual shift for the company, moving from mobile first to artificial intelligence first. And change is never easy. So what have you learned since you became CEO?

Well, first of all, it's very exciting to be here. And thank you for doing this. You know, when I stepped into the role two years ago, one of the things that struck me was that the core mission of Google is in some ways a timeless mission. We undertook this journey to organize the world's information.
And if anything, today people deal with more information than ever before. So one simple way to remember it: 10 years ago we launched Gmail, which was a way we searched through text to organize users' email. Today we have Google Photos. Thanks to AI, we can now search through photos. Over a billion photos get uploaded to us every day; we can search them and organize them. So if you type in and search for hugs, we can find pictures of people hugging each other. So the mission hasn't changed, but we can do it radically better with AI. In terms of how we do it, I think maybe that's been a more important realization. As Google, we want to serve everyone in the world. It's a privilege to serve billions of users every day, with which comes a lot of responsibility. We want to be a global company, but we want to be local in every country we operate in. And with this comes the responsibility to engage better with society. So understanding that, being open to feedback, and really interacting and engaging more with the external world is a big part of the realization.

Sundar, speaking about artificial intelligence: it's not just a product, it's a systems enabler and changer. There's a book out called Weapons of Math Destruction, and the author talks about how big data increases inequality and, in the end, threatens democracy, and how algorithms can perpetuate bias or injustice. How do you respond to that? You have a special responsibility. You must have a response to this question.

Look, one of the things I've seen is that there are a lot of important questions about technology, its impact, and what role artificial intelligence plays. First of all, I think it's good we are talking about it. As humanity, the way we solve things is by getting concerned about them. So take climate change as a framework: we all started getting worried about it.
And that's why we spent a lot of time working together, creating things like the Paris Agreement, and then working on solving it. I think about things like AI the same way. AI is probably the most important thing humanity has ever worked on. I think of it as something more profound than electricity or fire. And anytime you work with technology, you need to learn to harness the benefits while minimizing the downsides. Stepping back, when you think about a lot of problems in the world today, it's because we typically have a constraint on resources. AI, for the first time, I think over time, offers a different construct. Things which are constrained and look like a zero-sum game today may not be so in the future. Take education, for example. It's really difficult to educate people in a cost-effective way. AI may fundamentally change that equation. It may make it possible for us to have clean, cheap, renewable energy for the future. So I think a lot of things will play out in more positive ways than people think. But the risks are important. And I think the way we solve them is we think ahead, we worry about them. We do things like being upfront, having ethical charters, thinking about AI safety from day one. Being very transparent and open in how we pursue progress there. And figuring out global frameworks by which we can engage. Just like the Paris Agreement on climate change, using forums like this, I think we bring people together to engage on the hard questions. And I think answers will emerge. But I think it's important to be positive about it, especially in the West. You know, I still today have a wonder for technology. Growing up, I didn't have a telephone for a while. We waited five years; then we got a telephone. It fundamentally changed our life. People came to our home to make phone calls. I still remember the joys of technology. And I think that will be true for AI.
So it's important for us to explain that and bring the world along with us.

But what you describe is, let's say, that AI creates or has a common-good nature. But Sundar, AI plays a very important role in surveillance efforts by governments. It is a key factor also in cybersecurity. How can you avoid that AI in the end leads us into a totalitarian state?

You know, to me, the only way to solve some of these deeper issues is global multilateral frameworks. The kind of questions you're asking need to involve G7 and G20 discussions, and countries have to agree to demilitarize AI. I think that's a common goal countries should work towards. There's no other way to solve it. One of the good things about it is that AI is kind of an equalizer, so I think over time people will realize it's tough to weaponize it, because everyone will have the same ability back. And I think that gives us a framework to think about it. So you need a global stand-down, a global consensus not to use it for military purposes. That is going to be very hard, but I think it is the kind of framework we all need to work towards. And that's the only way out of it, I think.

Let me turn to big data. In an interview last year, you spoke about, and I quote you, "the data belongs to the user; we are stewards of it." It was, of course, in response to concerns about Google's extensive reach over personal and consumer data. Also, Mrs. Merkel, in her speech this afternoon, spoke about the question of who actually owns the data. At the moment, it seems the data are owned by some big companies like yours. Should we trust some big tech companies? Or is there any kind of self-regulation?

Yeah, it's an important question. We operate under the framework that users use Google because they trust us, and that's something easy to lose if you're not good stewards of it. So we work hard to earn that trust every day.
One of the biggest concerns with data, as you saw through last year, is security of data. So for example, when you use Gmail, we work hard to make sure we keep your email safe from all kinds of possible attacks. And I think that's the framework by which we operate. But I always think data belongs to the user, and as companies, we are only stewards of it. We work hard on making data portable. Tomorrow, if you want to use some service other than Gmail, we actually make it very easy for you to take all your data out of Gmail and go use something else. I think that's an important framework to have. AI will actually make some of this easier. I think AI will allow us to put the user more at the center and build trust systems around it too. But I think what you're asking has got to be the foundation for it. I don't think any single company should own these things; that's not the contract we can have with the world.

That's a very important statement, I would say. And not necessarily what the public always feels. The public feels the data is in the hands of the big companies and they do whatever they want to do with it.

You know, I think then they won't use us again. And increasingly, when you take areas like health, if you really want to do good stuff in the world, data security and trust are the pillars of it. So we have to be very careful.

May I ask you a relatively concrete question. Google has a global share of more than 80% in internet searches conducted over PCs or smartphones. And in June last year, I think it was June, the European Commission fined you, if I'm correct, 2.4 billion euros for breaching antitrust rules in the shopping search service. What are the specific responsibilities of corporations in the technology sector who have acquired a quasi-monopolistic position? And of course, underlying this is also the question:
how do we create a tax system which, let's say, fits this new type of company?

Yeah, there are two questions in that. On the first: when you operate a large platform at scale, we try hard to make decisions by putting the user first. Large platforms can be disruptive. So even when you put the user first and make a decision, there are going to be winners and losers, right? And I think that has an impact on us. But from our standpoint, we are pretty determined to stay focused on the user. And we're very confident the products we make benefit users, including users in Europe. And that's the conversation we have with the European Commission as well.

Tax is a great question. As a company, if you look at the last five years, we pay close to 20% in tax, and we are happy to pay a higher amount, whatever the world agrees on as the right framework. So it's not an issue of the amount of tax we pay so much as how you divide it amongst the various countries. The way we are approaching it, as I said earlier: I want Google to be a global company yet local in every country. Today, the tax system works based on where you create the R&D. This is why we are actually investing and adding engineers globally. I was in France on Monday; we are starting an AI research center there, and we're going to be hiring a lot of engineers in France. And we're doing that across Europe. Over time, that normalizes the tax we pay, because it reflects where you create value. So that's one good basis to do that. But we are open to any construct, and we encourage the OECD to actually solve these issues, which would make it much easier for companies to operate.

You depend very much on the performance and resilience of the internet, and of course, there's also the philosophical or ideological issue of internet freedom. So let me ask you two or three questions.
The first one is: at the recent Internet Governance Forum in Geneva, a number of people raised fears that the internet, at least in the form we know it today, will not exist anymore in 10 years. What is your take on that?

You know, to my earlier framing, we always worry about things. Somebody forwarded me an op-ed from 150 years ago, when bicycles were invented. The leading newspapers were worried that bicycles would give freedom to girls, and they would get on bicycles and ride very far away, and it would really cause society to break down. So that's how we worry and constantly make things better, but things turn out very differently. To me, you can think about the internet as a set of technologies, and those will continue to evolve. But you can also think about the internet as an ideal: the ideal of giving anyone in the world a chance to get connected with everyone else, giving them opportunity at a global scale. So today, using Google, there are ranchers in Idaho in the US who export their products globally; there are artists who put up things on YouTube, and they have their audience globally. That's what the internet does. I think that clock is going to march on. That ideal I'm not worried about. As a technology, if anything, I think it'll get better. You see things like blockchain coming up; these are exciting developments. So with the internet, we have to worry about it, but I'm optimistic. I'm optimistic about technology, not because I believe in technology, but because I believe in people and humanity. And I think it's a good construct to move forward.

It's also one of the, let's say, key objectives of the World Economic Forum, with its new center for cybersecurity. But when you speak about this: you have to have equal access if you really want to have a democratic system.
But now, with the recent decision by the US administration to allow a two-speed internet, isn't that already the demise of this democratic concept?

You know, net neutrality is an important concept on which the internet has been built. The good news is, if you see where public sentiment is, people really cherish it; people don't want their traffic to be discriminated against in a certain way. So I think that's an important ideal, and it is giving rise to a lot of good debate. I'm hopeful we cherish those principles and preserve them for the coming years.

Who will actually win and who will lose if you have such a different-speed internet?

You know, part of the reason we, as a big company, advocate for net neutrality: you have to understand that if you walk away from it, it can actually favor big companies. But when Google was a small company just getting founded, we had a chance to reach users. And that's the principle we are trying to protect. So I worry less about Google as a company when it comes to these things, and more about the next set of entrepreneurs building their services and trying to reach users. I think that's the principle we all need to fight for.

Sundar Pichai, let me ask you maybe a delicate question, but a question which is very much on the minds of many people. It relates to the protests in Charlottesville, Virginia. In the aftermath of what happened, a number of technology companies took action to remove hate groups from their platforms. So in some way, you are exercising a, how shall I say, qualification of what's on the platform or not. And this leads to the question: should digital platforms be able to unilaterally decide what content is acceptable and not acceptable?

You know, I don't think that's the right outcome.
As a digital platform, we feel like we are on the cutting edge of the issues which society is grappling with at any given time: where do you draw the line on freedom of speech, and so on. As to the way we approach these things: for example, over the last year, we made tremendous progress in how we deal with violent extremism. The way we did it is by engaging many NGOs, counter-terrorism organizations, and non-profits, and they guide us on where to draw the line, right? So that kind of gives you a framework for how to think about it. Sometimes governments help us by giving clear laws, and sometimes that gives you a clear framework. But I think it's the function of a democratic society to decide where to draw the lines. I don't think it should be in the purview of any single company, or companies, to do that. So we have to figure out how to engage with society, get the input, and that's how we are trying to draw the line.

So it's some kind of self-regulatory mechanism.

We are doing this with the European Union, developing codes of conduct around hate speech. And we are doing that in a voluntary way. And I think that's a good approach.

So you would feel that the German legislation, which came into effect, I think, on the first of January, goes too far, or?

Well, I mean, we believe in engaging with governments. They are trying to solve important problems, and we want to be a constructive partner. So there are many regulations on which we work together, be it GDPR or even the right to be forgotten, et cetera. We are always working together to find the right place to be. And I think that's the way we approach it.

When you look at Google's history over the last months, and I was preparing for this session, I come back to the leadership issue we touched upon at the beginning. There was a former employee,
James Damore, I think, who wrote the memo criticising the diversity initiatives within the company. Was there anything in the memo that you as the CEO took to heart? I mean, as a leader of 70,000-plus employees, what was your reaction?

Look, as a company, we support freedom of speech and we give a lot of platforms to do that. But we have to understand, in the context of a workplace, that the representation of women in tech is very, very low. Technology products affect everyone in the world, so the only way you're going to do it better is by involving more women in the development of technology. I think it's a moral imperative to do that. And so clearly we want to create an environment which is more supportive for women. And it's personal to me. From personal experience: I went to a technical college in India, and my wife was there; there were 20 girls in a school of 400 people. I saw first-hand how hard it is to function in an environment like that. So we're just trying to create an inclusive culture for all Googlers. And that's what the debate was about. I'm glad there was a public debate in which people engaged on these topics.

But it is a larger debate. We listened yesterday to Prime Minister Trudeau, and I come back to it. In your company, 30% of your employees are women, but if you look at the statistics, actually only 17% of the women working for you hold tech jobs. And it's a larger issue, because a study by the American Association of University Women shows that the share of women in tech is actually declining, not increasing. So what are you doing to help the technology sector continue to embrace diversity in gender, in race, in all aspects of social life?

It's an important question. We have engaged a lot as a company over the past years.
We started a conversation in tech by being transparent about the representation we have internally. I think there's a lot of positive news there. If you look at academic computer science programs in places like the US, you definitely see the tide is turning. We have to build on that momentum, and we have to make sure the environment when women come into the workforce, especially in very technical jobs, is positive and supportive. That's what we are working on, but there's a lot of work left to do.

We just have three minutes to go. Google is, as you also mentioned, a very international, global company, but still you are perceived as an American company. What do you do in order to really give the perception of a global company? You partially answered this question already, but...

Well, we've been thoughtfully doing this for the past few years. In fact, if I go back and look at the past four years, greater than 50 percent of our hiring has been outside of the United States. So part of the way we do it is by actually hiring in a distributed way. The second part is engaging with local governments and institutions. For example, look at our digital skills training, which we didn't talk much about, but the biggest urgent issue to do with technology is that we all need to retrain the workforce. Gone are the days when you could educate yourself once and that would give you a job for the rest of your life. How you do that at scale is something very important, and we are doing this locally in every country. We have committed a billion dollars over the next five years. In Europe alone, we have trained three million people, and we localize these programs in every country. And we plan to do a lot more of these things.
And so the way we show that is by actually being a responsible local citizen in all the countries we operate in. And we are very, very committed to doing so.

My last question: you have now been CEO for a relatively limited period of time, and you were before in a more technical function at Google. How would you define, if I asked you, the three or four main complaints... I'm sorry, main attributes which describe your leadership at Google as a CEO? What makes you special? I know you are low-key, but let me ask you, so that many people can afterwards also become CEO of Google.

You know, I think at the end of the day there are a lot of dimensions to the role. But I think it still starts with building great products, and I never lose sight of that. My happiest moments, even in my busy schedule, are when I can spend an hour or two and start my day off with engineers and designers and product people, thinking about the products we are building. It's something I still make sure I spend time on. And I try to build a very collaborative team. I think at our scale, you can't get things done unless people work together. And so I spend a lot of time making sure I attract the right people and create a culture by which we all work together well. Those are probably the two most important things. And the third is I spend a lot of time on partnerships and engaging with the outside world. I think that's a really important thing we need to do well, which is why I'm excited to be here. It's a great place to do that.

And how do you keep work-life balance?

Yeah, it's a good question. Over time, I've learned to have a healthy perspective: there are a lot of things to worry about, but it keeps changing, and there is good news around the corner as well. I'm very optimistic by nature, and I try and carve out time.
The only way I think you do it well is that I am very particular about the activities which I do with my wife or my kids. I decide to do that first, and then my work fits around that. That's one good way to do it.

Thank you. So now we are just in time. On behalf of all the participants, thank you for answering the questions in such an open, straightforward way. And we wish you, for the next 30 years at Google, all the best. Thank you. Thank you so much. Thank you.