I have the tremendous honor and pleasure to introduce to you Xavier, the savior. He's coming to help us. He's a researcher in political science. And from that perspective, he's gonna show us his view on the transhumanist paradox. I reckon he will start to open new chapters of human history. So give him a warm applause, a warm welcome. Light the fuse, Xavier. Hey, so I'm gonna try in the next 20 minutes to live up to that introduction. Let's try and open a new chapter of history. But first, I just wanna make sure we're all on the same wavelength. So let me ask everyone here: how many of you would accept, say, a US-government-issued, US-government-constructed laptop? Okay, I see one lonely hand in the air, and it looks pretty sarcastic. And if I were to come up to you and say, hey, I'm a member of the US government, I just wanna run a quick virus scan on your computer, how many of you would let me do so? Again, a couple of lonely, sarcastic hands. So I think we're gonna get along great, because today I'm gonna tell you why we need more political involvement in technology. Right up hackers' alley, right? So what is the paradox I'm gonna talk about today? The paradox is very simple. It's that we live in liberal states that protect individual liberty but are completely oblivious to the fact that technology is radically transforming the environment in which we exercise that liberty. Or, to rephrase the paradox as a question: what good is individual liberty, the liberty protected by our states, if someone else designed the playground in which you exercise it? Now, when I speak of playgrounds, when I speak of environments, what exactly am I talking about? What is an environment? An environment is nothing more than the parameters of the possible. It is what gives us the rules of what can and cannot be done.
So to make it somewhat more concrete and familiar: we all know plenty about our natural environment. First and foremost, the first environment which limits us and gives us rules for our lives is our body. And the rules of our body are the rules of our lives. They are the rules of how we think, how we talk, how we move, how we interact with one another. Our body is our primary, first environment. And then we have the larger natural environment: the physical laws, the earth on which we live. In both cases, we've lived with this for all of human history, so we take it for granted that the rules of our body are the rules of our lives. More recently, we've added political environments. And the political environment is simply the laws we agree to as a society. We're used to this as well. But what does it mean when we talk about a technological environment? What it means is that as we live more and more of our lives in and through technology, as more of our lives become dependent on technology, we increasingly rely on and must follow the rules and parameters of the technology we use. Or, to rephrase it: how does technology create this environment? Well, first of all, each new piece of technology requires the reorganization of the world around it. Let's get more specific. We've had cars for over a century now. And at first, the car, like most technologies, was a way of helping us transcend our natural environment, transcend our bodies, the fact that we can't move faster. And yet the technology of the car also required, in order for it to function, in order for it to be useful, a complete reorganization of the world around it. First of all, in the form of roads: we've cut the topography of the world up into roads. We've paved the world so that our cars could function and be useful. You go into any modern city, and you immediately see that it is built around the car.
The way we move in the city today, even when we're not in a car, is built around sidewalks, around zebra crossings, around the rules necessary for the car to function. But even more than that: okay, cars require roads, big deal. The car also requires the reorganization of the world itself, of the planet we live on. It requires the reorganization of the world into resources, which, as we can see right here in this picture, mostly means oil when it comes to cars. But it's not just oil, not just the fact that we have to pillage the earth for oil and gas. It's also the fact that a car is made of many different elements: magnesium, iron, aluminium. All these different elements are needed to make a car. So in order for us to have a car economy, we need to have a global economy, because all these different elements are present in different parts of the world. So we see, just in terms of the car, a simple device that helps us transcend our bodies in order to move faster, that we have to completely reorganize the world. And just because of the car, we live in a different type of environment. We live in a technological environment. But the car is an old technology. We hardly think of it as technology today. Today, when we think of technology, we think of computers, we think of the internet, we think of artificial intelligence. We don't really think of the car anymore. But what the car shows us is that much of human history, certainly the last couple of centuries, can be described as the replacement of our natural environment with a technological playground, the replacement of the natural environment with a technological environment. And this is about to get much more obvious to us, much more dramatic, because technology today is more powerful than it's ever been. In other words, technology's power to transform the world is much bigger than ever before.
When we talk about transhumanism, what is it? Most of you here know what it is, so just a quick reminder: transhumanism is a movement, or many movements, that hold that it is possible and desirable to enhance humans using technology. That is the basic claim, the basic wish, of the transhumanist movement. There are plenty of subgroups within it, but that is the basic wish of transhumanism. Now, how would transhumanism transform our environment? Well, first, we already talked about how the human body is the primary environment of our lives. The human body, its rules and its limits, are the rules and limits of our lives. And so by augmenting, by enhancing the human body, we're talking about radically shifting our primary environment. Now, transhumanists disagree on how exactly we're gonna enhance the body. But what is clear is that our human body is going to be transformed using machines. And therefore many of the limits we take for granted today, about how we think, how we move, how we interact with one another, will no longer be the same in a transhumanist world. Many transhumanist movements also base much of their movement on the idea that eventually virtual realities will become so complex and so rich that they'll be more attractive to us, more interesting, more compelling than analog reality, the reality we live in today. So that's the second way in which we could completely change the environment in which we live. Finally, we often think that for transhumanism really to get off the ground, for us really to begin augmenting ourselves and to be able to create, for example, compelling virtual realities, we'll have to create something akin to a superintelligence, which is to say a machine that is more intelligent than all humans put together. And of course, creating a superintelligence would be a bit like inserting a genie or a god into the universe.
It would completely change the rules of our universe, of our environment. And then finally, to make all this possible, in particular if we're gonna create a superintelligence, it's most likely gonna need enormous amounts of energy. And so one final thing that transhumanists usually assume is that we're gonna have to colonize some large part of this planet, and maybe other planets as well, just to feed the energy needs of the superintelligence. So in short, a transhumanist playground is completely different from the playground of today. The environment in which we could exercise individual liberty in the transhumanist playground is completely different from the one in which we exercise it today. Now, transhumanism makes it clear that there are two types of liberty. The first, which transhumanists say will be greatly augmented by transhumanism, is simply the liberty to choose a specific action in a specific situation. And as we just saw, transhumanism is all about augmenting this type of liberty. By enhancing our bodies, we'll be freed from the constraints of our natural bodies; we'll have much greater liberty. In virtual reality, we'll often be able to choose the parameters of our virtual reality. So we see that transhumanism greatly augments this first type of liberty. But what we also see is that the very idea of transhumanism, the idea of technology so powerful that we could fundamentally transform our bodies and the world around us, gives humanity a second type of liberty: the ability to design the environment in which we exercise the first type of liberty. But although this is relatively clear when we talk about transhumanism and how it changes our environment, both transhumanists and liberals focus on the first type of liberty. We all know transhumanists make great promises about how augmenting our bodies will free us from natural and biological constraints; that's the first type of liberty.
But I wanna talk a bit also about how liberals focus on the first type of liberty. And to be clear, I'm talking here about political liberty, political liberalism. So what do we mean by that? What is political liberalism? First of all, political liberalism only recognizes the first type of liberty, the liberty to act in a specific way in a specific situation. And in order to protect this liberty, it limits the function and the power of the state. So going back to the two types of liberty: it only recognizes the first type, doesn't even consider the second, and it builds the whole state mechanism around the first while ignoring the second. So how does the liberal state function? How is it structured to protect this type of liberty? In short, the function of the state is quite small: to ensure peace, protect individual liberties, and provide some basic services. These are the functions of the state. The state has no role in formulating or enacting collective visions. And the power of the state, which is to say the state's ability to carry out its functions, is further limited by a constitution and individual rights. So the liberal state is built around this first type of liberty, the individual liberty to act in a specific way in a specific situation. But of course, as we've just been discussing, transhumanism isn't an individual choice. Transhumanism is a collective choice. What I mean by that is that what is at stake with transhumanism is our collective future, not just our individual futures. And therefore individual liberty and individual choices aren't adequate ways of dealing with transhumanism. Why? Because the transhumanist movement and all the sub-movements within it would affect us all as a group. They wouldn't just affect the individual who made the choice to enhance himself using a specific technology. They would affect us all.
So on the one hand, and most obviously, what is at stake with transhumanism is the future of the human species itself. First of all, are we going to enhance the human species using technology? And if so, how? What types of enhancements will we use? And in particular, when enhancing the human body, what is the essence we're trying to keep? Is there some human essence that must be retained in order for us to remain human? Or do we even care about being human? So what's clear is that transhumanism is a question about the human species. How are we going to change it? Is it going to be surpassed or merely augmented? Anyone who's ever tried to live in modern society without, say, a phone or a laptop or a car knows that technology is increasingly an entry pass into modern society. And so transhumanism is also a question about the future of society. Because ultimately, if enhancements become the norm, if a specific enhancement becomes the norm, it'll probably become an entry pass into the society of the future. And finally, as we already discussed, the energy needs of the computers of the future will be such that transhumanism is, in all likelihood, also a question about the future of the planet. If we're all living in virtual reality, we may not care that much if the planet is colonized for its energy needs. But some of us might. So we see here that transhumanism is very much a collective choice, in that whatever happens, our technological future will affect us all. It will not just affect those who choose to augment themselves. Now, let's come back to what we were first talking about in terms of playgrounds and environments. In transhumanism, we would increasingly live in a technological playground, because even our body would be primarily technological.
So if we increasingly live in a technological playground, if the rules we live by are increasingly technological rules, if the limits we have to adhere to are increasingly technological limits, then shouldn't we all have a say in the design of this playground? Shouldn't we all have a say in what the rules of this new playground are going to be? So humanity, in a sense, is faced with a collective choice about our technological future: what is the technological environment we want to live in? But then the question quickly becomes: okay, we all want a voice in this, but we're not all hackers, we're not all working on AI, we're not all going to be directly involved in the technological revolution that's going to make it happen. So how can we be heard? The liberal solution, the solution we're currently slouching towards, is the market one. So what does it mean to leave technology to the market, or to leave it to the individual? Because rarely do we hear "leave it to the market." No, they always say "leave it to individual choice." But what does individual choice mean in this case? What does it mean to leave technological development up to individual choice? What is the individual choice we're talking about here? The individual choice, when it comes to new technology, is essentially whether to buy something or not. And so the real choice is made beforehand, when people decide what technology to develop, what research to fund. And as you all know, who are the primary funders of what is probably the most important research field at the moment, artificial intelligence? Certainly one of the most popular, and the one that could possibly create a form of superintelligence. So who funds this type of research? We all know about Amazon and Apple and Google. And most of you also know that perhaps the biggest funder of them all is the U.S. Department of Defense.
So when we talk about leaving our technological future up to the choice of the individual, let's be very clear that we're talking about letting the individual decide whether or not to buy the technologies brought to us by Amazon, by Google, by Apple, and by the U.S. Defense Department. And there's another problem with leaving the choice of a collective future up to the individual. Because a sum of individual choices to buy something, choices people make with themselves in mind, doesn't equal a collective decision. Simply aggregating individual choices doesn't create a collective decision. So the paradox we're getting to here is that transhumanism can't simply be left to the individual, because leaving it up to the individual isn't really giving the individual much of a choice at all: simply the choice whether to buy something or not, something that was created without any input from him. So it seems, on the one hand, that politics must be the means by which we decide. Because what is politics? Forget specific governments for a second. Politics, from the very beginning, is the means by which humans make collective choices about their organization, about the environment in which they live, about their futures. In other words, if transhumanism and a technological future is a choice about the collective environment in which we wish to live, then politics would seem the natural way to decide upon this collective choice. Except that we don't just live in a political society. We live in a liberal society, a politically liberal society. And the specificity of liberalism is that it's a political doctrine that refuses to endorse or enact substantive collective visions of the future.
Or in other words: if politics is the tool with which humans come together and make collective decisions about their organization and their future, liberalism is the political doctrine that has given up this unique possibility and power of politics. So, now, none of you wanted to accept a government laptop. I wouldn't either. But one of the things the example of the laptops makes clear is that although none of us here would accept a government-issued laptop, we're all happy to use the internet, the dark net, satellites, all things that were mainly funded by various governments, mainly the US. So we're all relatively happy to use these technologies that essentially came from our governments. And what the existence of all these technologies shows, the fact that the dark net, the internet, satellites, virtually every major technology can in some way be traced back to government initiative, is that just because we don't consider technology political doesn't mean the government doesn't have its hand in it. What I mean is that just because we aren't coming together as a political entity to discuss and decide upon a technological future doesn't mean that our government doesn't have its hands everywhere in our technological present and future. The difference is that when technology is not political, when technology is not something around which we gather to decide collectively, when we ignore the fact that technology is another type of collective environment in which we all live, we don't simply get the government out of technology. Instead, the government pursues technology for its own ends, and not for the ends of some collectively agreed-upon vision of the future. So, to simplify things drastically, most major government technology initiatives, as we all know, come from the Defense Department.
Most of the government's initiatives in technology have something to do with defense or attack; they have to do with war. And so what this leads us to realize is that making technology political is not giving new powers to the government. In a sense, it's quite the opposite, because whenever something ceases to be political, the government simply has free rein to do what it will in that sphere, because there is no mandate from the people, from the political entity, about what to do there. Only once we recognize that technology is political, only once we recognize that transhumanism would constitute a new environment, a new set of rules within which we would live, and recognize furthermore that we all would like to, and all must, have a say in the design of this future environment, only then can technology become something political once again, instead of being something supposedly neutral, which it is today, something that can simply be used according to the whims and goals of specific parts of the US government and all governments, but especially the US Defense Department, which has such a lovely, large, and untouchable budget. So to rephrase it: I think the essential transhumanist paradox has to do with these two types of liberty. On the one hand, we have what could be called the liberal liberty. It's the liberty we think of when we hear the word liberty: the liberty to choose what to do in a specific situation. That is the liberal liberty. That is the liberty our states are built to protect. But that liberty to choose what to do in a specific situation is nothing if we don't also have the liberty to deliberate upon and choose our common future. For much of human history, this simple fact has been obscured by the fact that we haven't had the technological means to decide about the environment in which we live.
We've decided politically, we've been able to change the laws under which we live, but the basic rules, say, of our body and of the natural world around us have been outside our scope. We haven't been able to deliberate upon them because we couldn't do anything about them. But today we're just beginning to enter the era where we will have that choice, where we have the technological capability to redesign the basic environment, the basic rules and laws under which we live. These are laws much more basic than our political laws. These are the laws of our body, the laws of the type of reality we live in, and the laws and the state of the earth we live on. And so only if we recognize this basic fact, that with our growing technological power we've also opened up a new type of liberty, only if we recognize that this new type of liberty demands input from all of us, that it's a collective decision, only then will we be able to transcend what I like to call the transhumanist paradox. Thank you. Whoa. We could probably sit here and discuss this topic till dawn. Xavier, you really took my breath away. Guys, I suppose we have a tremendous number of questions out there, don't we? Jesus, or Xavier the savior. There, please, shoot. So are you saying that in order to have a transhumanist utopia instead of a dystopia, we would simply need to have the software and hardware, the technology, as open as possible to all kinds of people, as well as open to modification and democratic deliberation by as many people as possible? In theory, I agree, but in practice, by simply opening up, say, software and hardware, aren't you essentially putting our futures in the hands of the most technologically capable? Well, okay, you and I come at this from different points of view, because you're very competent on a computer.
I could barely get my PowerPoint to run. So I think we simply come at this question from two different sides. But that's why you avoided bringing your computer up on stage, huh? So we wouldn't hack it. Exactly. Thanks for the talk, it was very interesting. What you mentioned about the fact that people would not use, for example, an American-government-issued computer maybe forgets that government already vets norms and standards, already issues guidelines, already is involved at many, many levels, without even considering the fact that they were behind the beginning of a technology. So can we still consider that we are really in a liberal framework here, since we already have this involvement at pretty much every level, be it the first idea or the way it's implemented afterwards? But I think that's precisely the point. Within a liberal framework, we have full liberty, because the liberal liberty is simply the liberty to choose between specific things. In a sense it ignores, like you say, the larger environment in which all the laptops we choose from, even if they don't directly come from the government, are vetted by them, or the fact that many of the basic technologies all these things rely upon were initially developed by them. So yes, the fact that the larger environment is in many ways controlled by the government is precisely what liberalism can afford to ignore. That was not quite what I was aiming at... Is it a rule set? Is it our government? Yeah, it's not a rule set. It's something like a guideline: you have the choice whether to actually follow those guidelines for a couple of things. So you already have the liberal part, but the government is still involved. Which does not render the paradox moot, but I think... The fact that we have the choice whether to go along with it or not? Yeah. Well, but that's the thing.
As I said before, when it comes to technology, first of all, technology transforms what it means to be a human. And second, when a technology becomes widely enough adopted, it becomes an entry pass into society. In other words, you can't really meaningfully interact with or take part in a society without also using the basic technologies that make it up. So in that sense, the choice whether to use or not to use isn't much of a choice at all, especially when we consider that the larger choice we're faced with is multipolar. It's not between using something or not. It's a question of what we're gonna build in the first place. Man, several questions. I'll start there. Sir, yes, you. I would like to ask: what if transhumanist technology is only available to the rich? How would you prevent that development? I mean, there's a lot of interest in keeping it to a small number of people. I think that's one of the big worries about transhumanism. And it's yet another argument against dealing with technology within the liberal framework. Because if we stay within the liberal framework of liberty, where you have the choice to buy something or not, then, like you say, the technology will probably initially be exorbitantly expensive. It'll initially be available to those with the most means. And I think that's yet another argument for not staying within the liberal framework of simple individual liberty. Right. Sir. But I don't have a solution. I'm not saying I know exactly how we're going to create the technology and make sure it gets to everyone. But I would say that making technology political, making it a democratic choice, is surely a first step. You're calling for a system reboot. Be careful. We have seen with strong cryptography that government power to regulate technology is not limitless. What makes you think that we even have this choice whether to regulate transhumanism or not?
But it's not a question of regulation. And I think that's a crucial point you just made. Because once the technology exists, regulating it, like you say, is more or less pointless. Take the example of the internet. The internet is basically a tool for communication. And all efforts, even by governments such as China's, to regulate the internet, to squeeze it, to make it different from what it was originally intended to be, generally fail. Even someone like me can get around China's censorship. So I think it's precisely a question of getting beyond regulation and concentrating on what gets built in the first place, of projecting ourselves into the future rather than concentrating on what already exists. Traveling through time, that's the answer. Do we still have time for questions? No? No? One more, one more, the man in green. Okay, one more, last one. Is this working? Okay: do markets, or capitalism, have a part in the society you imagine? It's not necessarily that they don't. But they are certainly not the guiding force that decides what technology gets created. So my argument isn't that capitalism must be taken down. It's simply that it cannot be the primary vector through which we decide what gets built in the first place. Guys, I have to apologize, we have to shut down here, because we would be discussing this till dawn. I hope we'll see you again next year, because this really opens up new chapters of humanity.