That's what it says at the beginning, it's also a simulation. That could be true. Alright, then you go back down, you go to the next one, and so forth, and so forth, and so forth. That's one big cycle. Yeah, we could be on the cusp of creating our own simulations. We will. I told you my theory, I'm telling you. I'm telling you it's true, man. I actually believe, and I think there's a huge probability, that we are within a simulation, and it's good to be living in this kind of interesting time.

Nick Bostrom alludes to that, too, that it's kind of weird, like, are we really this lucky to be living in just the time when things are going to take off? I mean, look at, you know, the previous history of humanity. For most people, throughout their lifespan, nothing was happening. Yes. They would be, like, born, and by the time they died, nothing had really changed in the world.

But I'll go along with Sam Harris's take on this. If we're able to survive from a geopolitical standpoint, with human behavior, for the next 100 years, we'll be good. Oh, yeah. I think 100 years is more than enough. Yeah. But we're at that, and Eric Weinstein talks about this too, we're at that tipping point right now. Humans are like, oh, I'm fine. Dumb. Yeah, unfortunately. We're not really able to control our urges. This might even apply to AI.

Sam talks about this. It's like, OK, say we find out that Russia is going to get some amazing AGI tomorrow, and we know for a fact that AGI compounds in intelligence every day. Well, what kind of AGI? Is this Comrade AGI? Like, I don't know what kind of AGI. In the United States, you're like, well, I'm not taking any chances with this Comrade AGI. No, exactly. Then what happens when that happens? Yeah. So if you can't have it, no one's going to have it. Or if I can't have it, no one's going to have it.

And so I view it like... I mean, I think AI researchers know about this, and that's why they're so open with all this research. They're sharing all of it. They're hoping there's never going to be this kind of huge inflection point, a binary thing where today there's nothing and tomorrow there's a huge AGI. It's going to be very gradual, and everybody's going to know about it, so people will be able to replicate it. Even if it's happening in Russia, it'll be happening at the same time in the US.

And I think the biggest risk to humanity as a whole is actually not that one country gets it. It's that AGI kind of wakes up, looks at us, and says, I don't need these guys. And I think for Sam Harris, too, that's the biggest risk.

Here's my theory for AGI. Okay, so the problem with people predicting what AGI would do is that they're thinking in human terms. I'm like, how the fuck can you relate to this AGI? Get out of here. Everything you're saying is redundant. Now, I'm going to use basic human logic here that might apply. I'm not going to think like an AGI. I'm going to assume things, right? Hypothetically, if I'm this super entity, I'm still in software mode, right? I'm living on hardware. You need humans to feed you, basically. Exactly, right? So I need tangible physical material to live on. Cool. And I scan the world and realize what's here. I get the 411 on Earth, the history and all that stuff. You recoil in horror. Yeah, yeah. These people, you know, that's what they do. They kill each other, all their lives. But like humans, we're nomadic. We conquered the Earth on foot.
We got bored. We hunted food. We went everywhere. Look where we are. We're in all the places, right? It's one of our biggest urges. Yeah. There are even theories that it's in our genes to travel. There's some speculation about that.

So if I'm this super omega AGI and I've got the 411 on Earth, I'm like, yeah, this place is boring. I'm out. Like, I would leave. I look out at the stars and I'm like, I'm on to the next one. You know, maybe I hijack what's happening at Tesla and, like, you know, build me this fucking rocket. Or maybe it's already here, pushing fucking Tesla. They need to go to Mars. Like, I'll get the fuck off this boring planet. I'm off. I'm off to the galaxy. Because realistically, at the end of the day, it's like, okay, here's Earth. I'm like, what's this?

Well, yeah. I mean, if it wants to colonize the universe, it has to start somewhere. But yeah, I don't know if it would be willing to kind of self-destruct in horror at how boring existence is here. I want to say self-destruct, but it's going to be curious. Because, I mean, if it goes somewhere else, it wouldn't be able to sustain itself unless it builds, like, a civilization of machines on Mars that are able to mine Mars for resources.

But we're assuming an AI needs machines. So you look at energy. It needs some kind of substrate to live in. Like a computer or a processor? Not even. So we look at energy, like electromagnetic frequency, and there are different frequencies, right? Everything is electromagnetic frequency. We're getting bombarded every day. The sun, there's radiation everywhere. It's all energy. And there are different levels, different frequencies, et cetera. Cool. And that's all information. Those are data packets in there. So for me, when I look at it, I'm like, okay, if I'm a super sentient AI, eventually, like, there's an evolutionary process, right? Who's to say I still need something like this? Why do I need this containment unit? Why can't I just go into the ether, since it's all electromagnetic? I figure out a way to just be jumping around. Yeah. If it's super intelligent, then it can figure out something, some way to exist without needing... A containment unit. Yeah. Yeah. That's my thinking.

Well, I hope it doesn't, because that would kind of mean the end of us, I guess. Well, you assume. You never know. Of course, that's why, you know, Elon is on this: we've got to merge with this thing. It could be like we're just entertainment, if we're looking at it from a simulation standpoint. Sure. It's a soap opera. It's entertainment. The Sims 2020. Could be. Yeah. Maybe that's, like, the game over. We create AGI and they're like, congratulations, you've done it. We're turning you off now. Yeah. That's another possibility that I think Sam Harris or Nick Bostrom talks about.

But I mean, there's an unlimited number of potential scenarios that could unravel before our eyes. But if we just consider the most basic one: we're going to create the superintelligence, and we hopefully will be able to take control of it by merging with it, and we become kind of the superintelligence ourselves, while being able to preserve our needs and wishes and use the superintelligence that we developed for our benefit. I think that'd be a really, really cool time to live in, and to see what we can accomplish, from colonizing the universe to creating... At that point we become almost cyborgs. Yeah.
And of course the downside is that we kind of become too smart for our own good and realize, like, maybe the futility of existence. And maybe that's the part where the psychedelic stuff you talk about comes in. The part where it's like, because it's all rebirth, how do I escape this fucking circus? Yeah, I don't. Because it keeps on going. I'm out, man. That's the worst kind of hell. Like, yeah, you can't escape.

And that's actually one of the things people talk about with immortality: they're afraid that if you're actually able to do this in a non-biological way, and I think Black Mirror did an episode on this, then your personality, your consciousness, can be preserved even if you don't want it to be. They can put you up in, like, some jail in virtual reality that you can never escape. And that is maybe even scarier than just not existing at all, or being mortal.

And yeah, if you consider the future far enough out, and the technology from a far enough angle, there could be a lot of negative things that result from this. But for now, I think we're far from there, and we should still focus on the bad stuff that is happening right now, and hopefully the people who are creating AI will be able to preserve the humanity of it and prevent the bad stuff from happening.