Sure. But let me ask you a personal question. Is there one element of the future that, even though logically you can argue it's a positive thing, subconsciously or inherently or deep in your gut, you are actually afraid of? Something about the future that scares you?

Kevin Kelly: Yeah. There are two troubling concerns I have. One of them is that as we make AIs more capable, particularly if we embody them with some kind of bodies, and we make them ubiquitous, at some point it seems to me that we might come to treat them like slaves. Whether they are slaves or not, I think treating them as if they were slaves might be corrosive to us and our spirit. That concern was made visible for me by some videos of a robotic mule made by Boston Dynamics, a four-footed robot that looks a bit like a pony without a head. It's a robot that walks, and to show how stable it is, people kick it, trying to knock it down. And you watch it and you cringe, because it feels as if someone's kicking an animal, even though it's obviously not an animal. If it had four wheels, we wouldn't cringe, but the fact that it has four legs makes us uncomfortable. And I think we are going to have far more emotional attachment to some of these things we're making, maybe as illustrated by the movie Her, because in fact we are going to engineer emotion into them, because emotion is very, very useful to have. So I'm concerned about how treating these things as servants or slaves might affect us; even if they're mechanical, that might have some long-term effect on our emotions, on humans' emotions.

So it's not so much what happens to the artificial intelligence, because it's artificial intelligence, it doesn't think or feel in the way that we do, but how it affects us subconsciously, because we're treating it as if it were a slave or a servant. Was that the first one, or was that both?

That was one. The second one is related to that: I do worry about when we weaponize AI in cyber warfare and cyber conflict. My concern there is that we are rapidly going into an area where we don't have any rules about what's acceptable and what's not. We have the Geneva Conventions and similar agreements, which we have evolved over hundreds of years, and which restrict, in a certain sense, how cruel we can be. They say you can be this cruel, but not that cruel; this horrible, but not that horrible, in war. And right now we are doing things to each other at the state level, the US, China, Russia, Israel, Iran, and others. We're doing things that no nation admits to doing, even though we're doing them. And we don't have any lines, any agreement about what's acceptable. Is it okay to disrupt someone else's banking system? Is it okay to turn the electricity off? Is it okay to let a plane crash? We don't have any accepted norms for what we will tolerate in cyber conflict. And my worry is that it will take a disaster of some sort before we come to some agreement.

Sam Walton's rules for building a business. This guy made billions and billions of dollars. In fact, he may well have been the wealthiest man on Earth.