There's some fear there that's useful, but generally speaking I think we have at least 50 years before machines can be like humans, which is when we should really be afraid of them. In the meantime, it's mostly about practical things like societal and cultural changes at work. For the next 20 years, it's really about how machines are changing our culture and what we should not let them do. Right now companies are asking what can we automate, but I think the real question is what should we not automate, in order to retain what we are, and that is a much bigger question.

Yes, I think we need to have a moratorium on the development of technology as a weapon, like we have for nuclear: artificial intelligence, genome editing, and geoengineering. But only at the level where we say this can clearly be used as a weapon, right? So we should not develop weapons that would allow people to program humans as super soldiers. Currently that is a little bit too uncontrolled. At the same time, if we can avoid cancer, we have to do that, and if we can make the world better because of AI, yes, but only to a certain limit.

So I propose the Digital Ethics Council: a global organization that does nothing but think professionally about what we should do and what we should not do, and see if we can agree on the rules. You know, very simple rules: if we automate, we pay a tax so that we can help people learn a new job; we don't use artificial intelligence as a weapon; and we don't allow machines to control themselves when it comes to killing. Just really obvious things, you know, that I think we could agree on. But usually humans respond to bad things; we don't change voluntarily. And with climate change, now we're going to eat shit in the next 20 years, right?
So basically we had to wait for bad things to happen. In artificial intelligence, we may see a major accident, like a hundred million genomes used to create a superhuman, that will cause a huge amount of problems and death, and then we move. That could be the case. I hope not, but we may see a sort of small war around geoengineering.

Yeah, I think it's important for our kids to learn that technology is not the savior of the world; it's a tool. The skills that machines will not have for a long time are the skills that we are all born with and have to develop: passion, understanding, foresight, intuition, imagination. So it's much better for us to be more human, so that we can focus on those, rather than compete with the machines. The less you are like a machine, the better for your future. Until just recently, you know, you were a good machine, a good worker, a good soldier, by just doing what the job was. But the future is not doing the job; it's doing a new job, it's inventing your job. And that's what we have to teach our kids.

And I think it's unlikely that we can learn that in school, so the schools have to be much more real-life. I would much rather have my kids go to India for three months, you know, and learn how to deal with that, than maybe get an MBA. But the best, of course, is that they can get both. But for now...