Hi, I'm Emily Miller, and I'm very happy to be here today with Gerard Leonhardt, a futurist, speaker, and author who specializes in the debate between humanity and technology. Hello, welcome, and thank you for joining me today.

Thanks for having me.

So, I've been interested in many of the things you've said about digital ethics and how technology and humanity are coming together. Some would say it's technology versus humanity, others technology and humanity. One thing I found really interesting was when you cited the late U.S. Supreme Court Justice Potter Stewart and proposed this as a working definition of ethics: ethics is knowing the difference between what you have a right or the power to do and what is the right thing to do. With that in mind, what challenges do you see arising in the near future for technologists and technology companies?

Yes, of course, that's a very deep challenge, because technology is basically becoming infinitely powerful. We have maybe 10 years left, and then we'll have quantum computing, 5G networks, substitutes for the minerals in our mobile devices; basically, the sky is the limit. As we can see, genetic engineering and geoengineering are becoming feasible, doable, and tech will have essentially no limits in 10 years. Then we have to think: well, if it has no limits, how do we define what we want? Right now we're still struggling with how to get the Internet of Things to work, whether it actually does anything useful, and how to pay for it. But in 10 years, the question will only be: what do we want? Do we want technology to basically merge with us or not?

And I think it's interesting, you made a comment when we were having a conversation just a moment ago about how technology isn't biased one way or the other.
It has no morality, and it is then really up to us how we decide to use technology.

I think that's the challenge with ethics, because first of all, who defines what is right or wrong? We would agree on many things, like not having autonomous weapons kill without human supervision; most people would agree on that. But otherwise, defining right or wrong isn't necessarily easy. For example, if you lose both of your legs in a car accident, of course you should have a prosthesis and become a cyborg, so to speak, because you had an accident. But if you voluntarily give up your legs, so to speak, to get new and better ones, is that the right thing? That's probably not good. And who would decide that? That's Supreme Court material. So that's one thing. The other thing about ethics is understanding that the power technology companies have now is bigger than that of any oil, gas, or banking company, because technology is everywhere. So they have a huge responsibility, but what they do primarily is invent new things, which is good. At a certain point, though, you also have to look at the externalities: what else do you create?

Right, for sure. So how do you see our growing reliance on technology and this quest to transcend the limits of humanity? I think a lot of people think about that, but it's really a challenge then to retain our humanity as a result.

Well, first, I think it's a good thing that we can probably solve most of the problems of society using technology. We're looking at things like unlimited energy in roughly 20 years; diseases like diabetes and cancer, not necessarily curing cancer, but preventing it; food, with vertical farming; water, with desalination. Those things are solvable by technology.
We have solved, for example, that we can all listen to music very cheaply or watch movies; that's been good. But we have to distribute the benefit of technology, and this is what hasn't happened so far. We've had all these achievements, but the benefit has mostly gone to the companies that actually own the technology, a biggest-computer-wins kind of thing. So if we want technology to actually have a collective benefit, then we have to think about how we do that, with taxes, with an automation tax, whatever we have to entertain as a solution. And that becomes more political; then we have to decide what we want.

Right. It has to be for the greater good, and that has to be a mindful choice.

Yes. Well, if we have achieved things like cheap energy through solar, we probably have to license it to countries very cheaply, and currently we're not doing that. Currently the first genetic treatment, called Kymriah, which is by Novartis, costs $475,000 to cure leukemia. And you can only imagine: in 15 years, when we can cure diabetes, it's going to cost a million dollars. So the straightforward approach wouldn't work. Also, artificial intelligence will reduce the number of jobs that are based on routine, so then we have to invest and get people out of routine.

Right. And in the U.S., that's a lot of people.

It is. So one of the things I thought was very interesting in looking at some of your writing is this notion of efficiency: how efficiency doesn't like mystery, efficiency doesn't like imperfection. And I think that is the challenge, right? Efficiency is a wonderful thing to have, but do we lose the things that make life somewhat interesting and mysterious?

Yes. Well, efficiency is interesting: when you don't have much of it, you want more of it. It's kind of like being online: you want nothing more than to be offline when you're too much online.
So when efficiency becomes too much of a force, like in social media, it's a very efficient dissemination of information, but the quality of it, and the human side, suffer a bit.

So I have this huge fire hose of anything I want, but nothing is really valuable.

Well, some of it could be valuable, but in general it's the opposite of the human aspect. The human aspect is much more about meaning, purpose, context, background, and relationships, which are very hard for a machine to define. So when we let the machine take over, the machine will make everything efficient. And if the machine's job is to make paperclips, it'll make paperclips out of us if it can; that's called efficiency. And we would say, well, that's enough with efficiency, because humans are actually the opposite: totally inefficient. Our bodies may be very efficient, but the way we act is not: we have to sleep, we have to take time off, we have to digest, we have to contemplate. We take many years to grow, all these things. And then we die; that's also very inefficient. So if we wanted to change all of that, there wouldn't be much left of us.

So in reading part of your book, Technology vs. Humanity, one of the things you discuss is this notion of amputation: as technology gives us a capability, we then amputate the capability we used to have. And I had an example of this happening when I was in Las Vegas a few months ago. I was at a clothing store, and I asked the person there, what's the best way to get back to my hotel so I can avoid the big tourist areas? I just wanted to give my taxi driver a tip on which way to go, so he didn't take me down Las Vegas Boulevard. And the person behind the counter honestly couldn't provide any sense of direction, not even a street name. He said, well, do you have Google? And I was like, of course I have Google.
But we've lost that ability; local knowledge, to me, seems to be something we are losing. And I would be interested to hear your thoughts on what we need to make sure we're not amputating. Because having GPS is wonderful, but if I don't have it, I still need to know how to navigate. And that skill of actually knowing how to use a map, is that okay to lose?

Well, I think some things are okay to lose, and others are not. For example, if we were to never drive a car again, Germans would say that's terrible, right? But everybody else would say, hey, it's kind of fun to drive a car sometimes, but if we lose that, it's not a big deal. But to give up giving birth, for example, which people are proposing, that we could have children outside the body in an artificial womb; it's not a joke. I mean, you shudder as a woman to think about that, right? But people are proposing it.

That is a ludicrous idea.

So basically, we have to ask: what makes us human? Driving a car does not make us human. But finding our way, I think some of that we should retain. And there's actually a job called the rewilder, which makes people wild again: you go into the forest and figure out how to tap back into those abilities you still have. But what bothers me most is that we may be the last generation of people that actually knows what offline means.

True. That's an interesting concept.

Fifteen-year-old kids are always online. So I think we have to go back and make a conscious effort of saying: you should know how to speak to a person. You should have those skills. You should be able to write, for example. There are also people saying, why should our kids learn to handwrite when they can speak to a computer? I mean, every psychologist knows that if you don't learn how to write, you're not going to turn out so well; you're not going to express yourself. It has other side effects.
So we have to define what those things are: what we want to drop and what we do not want to drop. For example, we shouldn't make a partnership with a robot a socially acceptable, normal thing rather than an aberration. Yes, people do what they want, but normal? I'm not so sure.

No, it's interesting, because I take pride in knowing how to navigate from a map book, but I don't know if anybody cares anymore. It's like the value of that is gone.

Well, when I talk to kids about this, my kids or other, younger kids, it's very often attributed to age, but I think it's really about appreciating humanity, humanism, and valuing what you have. For example, we can instantly read another person within roughly half a second of contact. Machines can't do that. We can tell a story. We can make things up. We can be funny. We can make a mistake. We can do all these things that machines can't do. And now technology is coming to us and saying: well, let's transcend humanity, leave all that baggage behind so we can live forever and be superhuman. To which I would say: that's probably not superhuman. That's probably a downgrade, not an upgrade.

Yeah, interesting. So how does that play into this concept of HellVen, then? You've talked about heaven and hell coming together as a kind of HellVen reality.

Well, you know, I'm an optimist about technology. I come from that background, of course, and I think technology has the potential to solve these really gigantic problems: food, water, electricity. And science is progressing rapidly there.
So this could be heaven, but it could be hell if we use it in a way where we, A, don't give it to the ones that should also have it, like the developing countries, where we don't share the benefit, or, B, use it as a weapon, which is already happening with artificial intelligence and genetic engineering, where all the major countries around the world are saying we're going to be the leader, because that's like having a bomb. I think that's a very bad idea, because we wouldn't survive it. So I think that's the HellVen. It could be heaven, but it's going to take some sort of collective will to actually bring that about.

And leadership.

I think leadership and setting the example.

So I'd be interested to know, from your perspective, because a lot of the power and the control are with corporations: what companies are good examples, that are exhibiting interest or acknowledging that they have a role to play in digital ethics?

This discussion is really exploding right now. In my book, I've said that we should spend as much money on humanity as we spend on technology. So we don't just buy new software; we also hire people with high EQ, we create diversity in the workforce, we hire more women and more young people, and we spend money on that as well, because that's really what creates the value, not just buying new tech. And look at what's happening right now: Apple, for example, is making a key point here. Many people criticize Apple for other reasons, but Apple has always said, our stuff is very expensive, but we don't want to know what's in it. So you can be pretty sure that Apple does not want to know, and Apple has been fighting with the FBI for years about this. And when you listen to Tim Cook, he is quite clear when he says the other business models are surveillance.
So yes, take it with a grain of salt, but Microsoft has also made great steps in this direction, saying they now put the data in Europe, and they have an AI council. SAP has an AI council now. IBM is looking in this direction too, even though for IBM that's difficult, because AI is their centerpiece. But all the tech companies that are really looking forward are now saying: okay, technology is great, but sometimes not using it is even better.

Right, they're acknowledging it. That's good.

Well, as I often say in my speeches: technology is great, but humanity is greater.

I hope so. I hope it stays that way. So another thing I found interesting: the five core human rights that you see forming the basis of a future digital ethics manifesto. Maybe you can tell us a little more about them: the right to remain natural, the right to be inefficient if and when it defines our basic humanness, the right to disconnect, the right to be anonymous, and the right to employ or involve people instead of machines.

Yes. I think the key challenge we have here is that technology gives us this idea that everything has to be efficient, which is great when things don't work; we don't want an inefficient airplane engine, say. Efficiency is something that machines are very good at once they get good enough: software and hardware become efficient, like autonomous cars and so on. But humans are not, because efficiency is not something we appreciate as a value by itself. For example, if you live in a family and everybody's inefficient, it's a mess; you don't like it. But if somebody's really efficient, you wouldn't give them a huge amount of praise either. That's more about personality and other things.
So if the world becomes so efficient that our inefficiency, for example when we're tired, or we just don't have a good day, or we just had some bad incident, no longer has any room, then we're being reduced. That's called reductionism. We're saying: okay, don't worry about doing something that takes a long time, because you can use this tech to cut it down to 5 percent. So no musician ever learns how to play the guitar; they all learn how to play the iPad, and it only takes 10 hours, not 10,000 hours. So that's one of the big things. And it goes with this idea of the right to go offline, to disconnect, because humans connect in a dozen different ways when we communicate in real life, and they're not electronic; the electronic way of connecting is another thing altogether. But if you eat information the way you eat too much food, you become obese and eventually die, and I think it's the same with information and data.

Information overload.

And among the social network fanatics who spend a lot of time on social networks, the highest cause of death is suicide, because it's a complete overload and a losing of yourself. It's like a drug in many ways.

So in some ways, we have to look at technology as a drug.

No, it is an addiction. And I can see that: I have twin girls, so I kind of get to run an experiment with them. One just loves technology; she wakes up, she wants to engage. The other one is able to turn off. So some of that, I think, is personality. But it can be a very addictive situation. And as a parent, how do I think about how I teach her? How do I let her learn about the benefits of technology? We went to see the movie Ralph Breaks the Internet, which was a very interesting way to try to dimensionalize the internet for children.
You're going through this amazing digital world they've created, where they're showing Snapchat, which my kids recognized, and all these things. And then they showed how people just watch cat videos. And I thought, at the end of the day, is that the message, that the internet is cat videos? When I know there's so much goodness that can come out of it, but on the flip side, there is so much that is pretty terrible.

Well, it all comes down to balance, right? With any drug, it's about balance: you can drink 50 cups of coffee, and you're not going to be very happy. And we should not make drugs illegal; most of them, anyway, some we should, but for exactly that same reason. I think it's fine that a German 16-year-old kid can drink a beer; we should not make everything illegal. But there is a context of moderation: if you do too much of it, socially, people look at you funny. If you drink a bottle of brandy for breakfast, you look strange, even though you could.

Yeah.

So it's the same with technology. We have to find moderation. And then we have to be able to use it to actual benefit, rather than hiding behind it, like when you smoke too much or drink too much, you're hiding behind it. I think that will be the challenge. And we need some sort of ethical framework, a social contract around this.

Well, it is interesting, because it's also about having a common ethical framework. For example, I grew up in California, and I experienced a drought when I was a young child. So I learned as a child that you don't waste water; you don't water your lawn.
California has so many people who have moved in from the East Coast and other parts of the world, where maybe they have more rain, more water to water a lawn. And it's a cultural trait, this notion of: just because I can water my lawn and have it be green in what is basically a Mediterranean climate, should I? I think it's that same notion: just because technology gives us the power to do this, should we do it? And that's tough when the opportunity is there to make money.

Well, the thing is, of course, we're only at the beginning of this, and this is the problem. Right now this is, I mean, not a minor concern, but it's not huge; it's not an epidemic yet. But think about technology progressing ever further up the exponential curve: in 10 years, that's roughly another six doublings or so, so from 4 we'd be at 256, more than 60 times as far. We'll be in a world where all the exponential side effects also become bigger: we'll have a lot more addiction, a lot more loneliness, a lot more people who lose themselves in technology. We'll have people living inside virtual reality the way they live inside the smartphone. We'll have zombies, basically, in so many ways. And I think that is something we have to take a good look at, because otherwise we end up with what we did with the oil industry: we used the oil to get gas to drive around, and now we're paying for the PPMs in the atmosphere.

Right, exactly. And we have massive highway infrastructures through our cities.

And there are many scientific findings that are quite clear: if you let your four-year-old kid watch the iPad the whole time, you're going to get certain results.
The kid will think that the beach is really boring, because the iPad isn't there. So, everything in moderation, including moderation sometimes.

Yeah, I think that's important.

It's also about understanding basic humanity in the sense of what we need. The human brain is wired for experiences, relationships, and meaning, some sort of purpose, spiritual or whatever you want to call it. Those are the three things. If we take all that out, we have nothing. We could be an amazing machine, but there would be no purpose.

Right.