 So these neural networks and AI programs have a lot of people worried about what our futures are gonna look like. Some people think this technology is gonna cause an even bigger upheaval to human labor, and really our entire way of life, than the creation of the steam engine and the internet combined. But so far, the greatest travesty caused by this technology, according to statements and actions taken by Twitter, Microsoft, and even the White House, has been Taylor Swift's AI nudes going viral. That's right, if a computer takes your job, then hey, that's just the cost of technological advancement, kiddo. Maybe you should have learned how to code instead of going to school to be a trucker. And if the computer that took your trucking job ends up causing a pile-up on the interstate and killing a bunch of people, then again, that's just the cost of technological advancement. You gotta break a few eggs to make a delicious cyberpunk dystopia omelet. But if someone dares use the heckin' computer to create somewhat realistic fake pictures of a rich and famous pretty lady giving nude lap dances to Kansas City Chiefs fans at a football game, then we gotta do everything in our power to shut that down. We gotta send SEAL Team 6 after Prometheus before he burns down the world with this fire that he stole from the gods. So these fake images, some of which got over 45 million views in under 24 hours on Twitter, have actually been floating around some of the seedier parts of the internet like 4chan and Telegram for months already. Now of course, the closer to the surface of the internet images like this get, the more they're gonna be seen and the more exposure they're gonna get. And the surface web that I'm talking about is sites like Twitter, Reddit, Google, the sites that everybody goes on. 
But at the same time, these online spaces towards the surface of the internet are much more regulated, and it's much more likely that extreme measures will be taken to stop the spread of naughty stuff there. Like on Twitter, for example, they actually temporarily disabled searching for Taylor Swift on the platform altogether. This is all you would get while that search block was in place and Twitter staff were in the background cleaning up all of her AI nudes from the platform. Apparently Elon also hired over 100 more people to work at Twitter as content moderators after the scandal. So look at that, Taylor Swift's out here helping people get their jobs back at Twitter. And most of the popular search engines like Google and Bing also scrubbed this stuff from their results; if you search for Taylor Swift AI nudes there, they're just gonna take you to news sites talking about the story. So sure, this approach of just purging content from the surface web kind of works temporarily to keep stuff away from less internet-savvy folks that don't know how to search in the right places. But as we saw with the Streisand effect, right? Like when Barbra Streisand tried to get pictures of her house taken off the internet, this isn't a permanent solution, and it can oftentimes backfire. So these images are still out there. I mean, they're still on the internet if you know how to find them. And they're still definitely lurking on Telegram, since that's like a kind of dark web. I mean, there's a lot of actually illegal stuff going on down on Telegram. I used to know guys that operated unlicensed pharmaceutical delivery services via Telegram. And when you think about it, there's literally federally funded agencies like the DEA and the FBI, and I guess also the IRS, since people who do unlicensed pharmaceutical delivery sales tend to not pay their taxes on it. 
And then of course there's Interpol; there's all kinds of agencies that are dedicated to stopping that kind of activity, but they're not very successful at it. I mean, yeah, they do catch some people using these platforms from time to time, but those busts typically happen due to simple opsec mistakes, like the criminals talking too much, instead of an actual technical flaw in the platform or the creators of the platform working with the police to catch criminals on it. So with something that, well, isn't even a crime yet, and even when it does become a crime, which it probably will, I still don't think it's gonna be that easy to fight fake celebrity nudes on sites like Telegram. And I bring it up because I think it's important to understand the technical aspects of this issue before we actually try to think about legislating it, or whether we should criminalize it at all. Now, of course, you already know that an AI image generation tool was used to create these images, but funny enough, it wasn't Stable Diffusion or any of the local image generation models that you would think would be used to create these deepfakes. It was actually Microsoft's very own Designer web app that made the most viral of these deepfakes, and I think actually all of the Taylor Swift football deepfakes were created in Microsoft's Designer. And what's interesting about this is that Designer, along with every other online AI tool, already has a lot of built-in stops in place to prevent people from being able to generate naughty stuff, like very specifically to stop people from generating nudes and anything racist, offensive, or anything like that. And Microsoft in particular should know more than anyone else about how to implement these checks and balances in their AI, because the internet has been trolling Microsoft's AI creations for almost a decade now. 
Some of you might remember Tay, the AI that Microsoft put on Twitter that was originally supposed to mimic the tweeting habits of a 19-year-old American girl, and it was also supposed to be able to learn from interacting with humans on the platform. And boy, did Tay learn a lot, because after just a few hours of talking to folks on Twitter, Tay had become a full-blown ethno-nationalist whose favorite work of historical fiction was the Holocaust. After Microsoft took down Tay, they made a blog post titled "Learning from Tay's Introduction" where they said that Tay is now offline, and that they would look into bringing Tay back only when they were confident they could better anticipate malicious intent that conflicts with their principles and values, AKA they gotta figure out how to make Tay not be racist. And Tay never did officially come back after all these years, which tells me that Microsoft never actually solved the offensive AI problem. And this is further confirmed by their current generation of AI, which doesn't just tweet but is able to draw, right? It's able to create these offensive images despite all of the guardrails that Microsoft has put in place. Now, since Microsoft and social media companies like Twitter are really good at tracking people across the web, they might be able to identify the people who create and spread these images. And if we ever do get to a point where creating or sharing deepfake nudes is federally illegal, and there is a bill that the Senate just introduced to make it federally illegal, then the ramifications of being identified and going to jail might have an impact on people creating and spreading these images. But then again, CSAM is already illegal, and that's done nothing to stop the spread of it on social media sites. And I'm not saying that they're equal, because obviously the way that I feel about this is that it's a fake image, right? 
Nobody's actually being harmed to create these images, which is why I don't see it in the same light as CSAM or even AI-generated CSAM. But I've talked about that before. So I really see these offensive AI-generated images, or just offensive AI-generated data in general, as a purely technical issue that can't simply be legislated away. So should we even bother with legislating it away? Personally, I don't like to criminalize activities, taking away people's freedom, unless the people taking part in that activity are actually hurting someone. And in this case, I think it's very hard to say that Taylor Swift is actually being harmed by these images. I mean, obviously she isn't realistically being drugged or hypnotized or whatever to create these images; they're just fake images. And I obviously can't speak to how it might be affecting her mentally or emotionally. I mean, I'm not her psychologist, but I do think it's safe to say objectively that these images have not had any negative impact on her career. I really doubt that any of the fake nudes that have been created of her in the past decade have negatively affected her. I mean, this stuff where you create fake nudes of celebrities really isn't new. Like, I've talked about how people used to use Photoshop to basically cut off a girl's head and put it on a naked girl's body that looks similar. And I see this as just the evolution of that with a lower skill barrier. In fact, I would argue that these AI-generated nudes are actually less convincing than the Photoshop nudes from years and years ago. None of the AI stuff that I saw, like the Taylor Swift stuff, has realistic genitalia. The boobs don't look real. The fake images I saw of her giving head don't look like she's actually sucking someone's dick. It looks like she's sucking on a greasy undercooked hot dog that you'd get at a gas station. 
And of course, in classic AI fashion, a lot of the hands have too many or too few fingers. I find it hard to believe that anyone is looking at these pictures and thinking that they're real. And it's even harder for me to understand why people think the creators of these images should be jailed. I guess some people just want death to anyone that makes Taylor Swift have a slightly bad day. But if you are a militant Swiftie, there's a bright side to all of this evil AI stuff. Because like I said, people have been creating Taylor Swift deepfakes and Photoshops for over a decade now, and it's very unlikely that this stuff will stop or even slow down. So why not use this to your advantage? If you're a celebrity that does have real nudes, whether they've been leaked or not, or you're worried about them leaking, or you're worried about a paparazzi taking a candid photo of you having a nip slip or something like that, now you can just say it's a deepfake. Because there's probably tens of thousands of fakes out there. Like, I came across a couple hundred just doing research for this video alone. This technology gives celebrities like Taylor Swift plausible deniability if any real nudes did end up getting leaked. Generative AI tools definitely require less skill from the user to create these images compared to GIMP or Photoshop. And those tools require less user skill than painting or drawing. But at the end of the day, these are all artistic tools, and it isn't right to criminalize people for using them as they please.