Thanks and congratulations to Tia; it's an honor to be here. What I'd like to do is focus on a couple of themes that I think undergird this link between technology and what we might call truth, and indeed trust under siege. In particular, I'd like to look at this intertwining of technology, power, and truth, and see what it yields in terms of what we should do, because there's a lot of explaining and complaining, but at the end of the day we need to take action. And I would suggest that we need to rethink the way we make decisions in this technologically laden world.

To start with, power today, as we've been hearing throughout the day, is scattered. We heard it from the president of the ICRC. We heard it from Ambassador Eisenstadt. Power is scattered to people like the WannaCry hackers, or like the extremist Buddhist monk in Myanmar who disregarded the government's prohibition on his preaching and took to Facebook with his verbal abuse of the Rohingyas and some horrific photographs. The problem with the scattering of power is that there's no corresponding assumption of ethical responsibility for the deployment of that power. In fact, we don't really even know who has the power.

The second power dynamic is a concentration of power in the technology companies. All the time in the news we hear about the so-called Big Five, the Amazons and Googles and Facebooks. But in fact it goes much further, not just to the Ubers of the world but all the way down the chain to the startups. And the fundamental responsibility issue here is that they typically do not, and there are exceptions, but they typically do not think ethics first before putting their technology out there. In fact, many of them, I would suggest, have a proactive strategy of just doing, and waiting until they have a head-on collision with a regulator or with consumers who will stop them. So the question here is: how do we rebalance the allocation of responsibility?
And the starting point, for me at least, is to say that this tagline that they're only a platform is simply no longer acceptable. We can't have online sex trafficking, recruiting of terrorists, and all manner of wrongdoing, and have these companies just saying that they are a neutral platform. On the other hand, we can't have regulators targeting and quashing innovation in ways that can also be negative for society.

Now, the final point about power and technology is that technology has disempowered state institutions. Starting with the law, we see that legal systems lag very far behind technology, which is constantly changing at an increasingly fast pace, and the law simply can't keep up. We see that legal systems are very ill-equipped to deal with the cross-border impact of technology. And, understandably, legislators just don't understand the technology. Similarly, state institutions are falling short with respect to power. There are many complicated examples; I'll stick to one, which is cyber warfare. I don't know of many states that could run a cyber war without recourse to the private sector, or indeed to individuals. So technology has totally disrupted this power dynamic. And the first part of the what-do-we-do question is that we need to make decisions in this new reality, not by thinking about a balance of power that is outdated, even a year or two outdated.

And now truth. Technology has also catalyzed this epidemic of compromised truth. Fake news is a major example, but there are others. There's a Chinese app called Meitu that allows one to take away a few wrinkles and a few pounds in a matter of seconds and then put a photo on a dating app. So these compromises of truth are all a matter of contagion. But in order to do the right thing, in order to make good decisions, we must insist on truth: the kind that is scientifically verifiable or based on social science research.
And to Steven's earlier point, I had the privilege of interviewing Salman Rushdie a couple of months ago. And he said, you know, it's not because you say the world is round that it's round. The world doesn't need you to believe that it's round for it to be so. And I think we all, again, need to be staunchly committed to truth.

More generally, when we put all of these dynamics together, the power dynamics and the contagious compromising of truth driven by technology, we have to ask ourselves what else about our decision making needs to shift. And I would suggest a couple of things. One is that we need to broaden the conversation. It can't be that the innovators and those who control the innovations, be they large corporations or holders of super-majority voting shares in Silicon Valley, are the deciders on behalf of society about when and how innovation is unleashed on society. We need a much broader conversation. I have a personal challenge of trying to figure out how to do this, but it needs to involve academic institutions and think tanks. It needs to involve corporates and nonprofits and governments of all kinds. And above all, it needs to go beyond the US and Western Europe, because the impacts of technology are different around the world, but they are certainly global. And at the moment, the only real checks and balances are institutions like the European Commission and, largely, lobbying in America, et cetera.

The other thing we need to do with our decision making, in my view, is to look at it through three lenses, all focused on humanity. One is the individual. Take, for example, gene editing; and incidentally, everything I'm saying applies across any kind of technology, whether it's Bitcoin or civilian space travel or gene editing or social media.
If we take the example of gene editing, a patient with Huntington's disease wants it now, and understandably so. But at the same time, if we're looking at it through the societal lens, we're very worried about all of the risks of what the experts say is a scissors-simple technique, and about what happens if we lose control of it. And if we look at humanity writ large, we're very concerned about potentially permanently altering the human germline. So all of these questions have potential implications for individuals, for society, and for what I would call humanity writ large.

And then finally, we need to look at this very daunting and complex reality, with its complicated distribution of power and our lack of understanding about where that power sits and who's responsible, and we need to avoid taking refuge in the binary. We seem to be suffering from an epidemic of binary decision-making. As a London resident, I'll call out Brexit as the crowning example of a disastrous binary decision, the only result of which could have been divisiveness and waste. But there are others. A physical example is President Trump's wall: one side or the other. We have Transport for London and Uber: in or out. And I think we should be asking not so much yes or no with these technologies that have both positive and negative impacts, but rather when and under what circumstances. How can we maximize the positive benefits and minimize the risks?