Congratulations to Jane; it's an honor to be here. What I'd like to do is focus on a couple of things that I think underlie this link between technology and what we might call truth and trust under siege. In particular, I'd like to look at this intertwining of technology, power, and truth, and see what it means in terms of what we should do, because there's a lot of explaining and complaining, but at the end of the day we need to take action. And I would suggest that we need to rethink the way we're making decisions in this technological world.

To start with power: today, as we've been hearing throughout the day, power is scattered. We heard it from the president of the ICRC; we heard it from Ambassador Isaac. Power is scattered to people like the WannaCry hackers, or like the extremist Buddhist monk Wirathu, who disregarded the government's ban on his preaching and simply took to Facebook with his verbal abuse of the Rohingya, along with some horrific photographs. The problem with the scattering of power is that there's no corresponding assumption of ethical responsibility for the deployment of that power. And in fact, we don't really even know who has the power.

The second power dynamic is the concentration of power in the technology companies. All the time in the news we hear about the so-called Big Five, Amazon and the rest. But in fact it goes much further, not just to the Ubers of the world, but all the way down the chain to the startups. And the fundamental problem here is that they typically do not, and there are exceptions, but they typically do not think ethics first before putting their technology out there. In fact, many of them, I would suggest, have a proactive strategy of just doing it and waiting until they have a head-on collision with the regulator or the consumer. And so the question here is: how do we rebalance the allocation of responsibility?
And the starting point for me, at least at this point, is to say that this tagline that they're "only a platform" is simply no longer acceptable. We can't have online sex trafficking, recruiting of terrorists, and all manner of wrongdoing, and have these companies just say that they're a neutral platform. On the other hand, we can't have regulators overreaching and quashing innovation in ways that can also be negative for society.

Now, the final point about power and technology is that technology has disempowered state institutions. Starting with the law: we see that legal systems lag very far behind technology, which is constantly changing, and at an increasingly fast pace. The law simply can't keep up. Legal systems are very ill-equipped to deal with the disruptive impact of technology, and, understandably, legislators often just don't understand the technology. Similarly, state institutions are falling short with respect to power, and there are many complicated examples. I'll stick to one, which is cyber warfare. I don't know of many states that could wage a cyber war without recourse to the private sector, or indeed to individuals. So technology has totally disrupted this power dynamic, and the first part of the "what do we do" question is that we need to make decisions in this new reality, and not based on assumptions about a balance of power that are broken, or even a year or two out of date.

And now, truth. Technology has also propelled this epidemic of compromised truth. Fake news is the major example, but there are others. There's a Chinese app called Meitu that allows one to take away a few wrinkles and a few pounds in a matter of seconds, and then put the photo on a dating app. These are all ways of compromising truth. But in order to do the right thing, in order to make good decisions, we must insist on the kind of truth that is scientifically verifiable or grounded in social science research.
And to Steven's earlier point: I had the privilege of interviewing Salman Rushdie a couple of months ago, and he said, you know, it's not because we say the world is round that it's round. The world doesn't need you to believe that it's round for it to be so. And I think we all, again, need to be staunchly committed to truth.

More generally, when we put all of these dynamics together, the scattering of power and the compromising of truth, driven by technology, we have to ask ourselves what else about our decision-making needs to shift. And I would suggest a couple of things. One is that we need to broaden the conversation. It can't be that the innovators and those who control the innovations, be they large corporations or holders of supermajority voting shares in Silicon Valley, are the deciders on behalf of society about when and how innovation is unleashed. We need a much broader conversation. I have a personal challenge in trying to figure out how to do this, but it needs to involve a lot of institutions; it needs to involve corporates and non-profits of all kinds; and above all, it needs to go beyond the U.S. and Western Europe, because the impacts of technology are different around the world, and yet at the moment the only real checks and balances are institutions like the European Commission and, largely, lobbying in the United States.

Another thing we need to do with our decision-making is to look at it through three lenses focused on humanity. The first is the individual.
So if we take the example of gene-editing (and incidentally, everything I'm saying applies across any kind of technology, whether it's Bitcoin or civilian space travel or gene-editing or social media): through the lens of the individual, a patient with Parkinson's disease wants it now, and understandably so. But at the same time, if we're looking at it through the societal lens, we're very worried about all the risks of what the experts say is a deceptively simple technique, and about what happens if we lose control of it. And if we look at humanity at large, we're very concerned about potentially permanently altering the human germline. So all of these questions have implications for individuals, for society, and for humanity at large.

And then finally, in this very daunting and complex reality, with this complicated distribution of power and this lack of understanding about where it lies and who's responsible, we need to avoid taking refuge in the binary. We seem to be suffering from an epidemic of binary decision-making. As a London resident, I'll call out Brexit as the crowning example of a disastrous binary decision, the only result of which could have been divisiveness and waste. But there are others. A physical example is President Trump's wall: one side or the other. We have Transport for London and Uber: in or out. And I think we should be asking not so much yes or no to these technologies, which have both positive and negative aspects, but when and under what circumstances. How can we maximize the positive benefit and minimize the risk? Thank you very much.

[Audience question] One thing I want to ask you about is that, at least under American law, things like Facebook are not really publishers; they don't bear the obligations of publishers. They're kind of highways on which all kinds of garbage can pass.
So they have no responsibility for what travels on the road; they're just a road, right? And yet the minute you begin to talk about regulation in a Western democracy, you begin to feel a psychic chill down the back: you regulate what? And of course it's really not true. And then you come back to the challenges. So I'm just curious to ask you: how much does this become a restriction on our sense of freedom, here and around the world?

It's a great question. First of all, I should say that all of these companies have a lot of margin for proactive ethical decision-making before we ever get to infringement on free speech. And a lot of this will have to be about effective ethical decision-making above and beyond the law, because, as I said, the law will never catch up. And we wouldn't want it to, because the law would undoubtedly cross certain lines. But I should say I'm very pro-innovation, I'm very pro-business, I'm very pro-free speech, and I don't think ethical decision-making tramples on any of that. Even in the US and the UK, and in particular in the US, even the First Amendment doesn't protect some of the speech that I'm talking about. It doesn't protect inciting murder. It doesn't protect certain kinds of hate speech. It certainly doesn't protect child pornography and online sex trafficking. Free speech does not mean shouting fire in a crowded theater. Now, here, we've been talking about