AI is becoming part of our world, whether we like it or not. It's been around in various forms for a long time, but now the takeover of AI seems inevitable. Just in these past few years there has been a ton of work done on AI, whether it's art, mimicking people's voices and faces, or just basic stuff like the assistants on your phone, like Siri. And as we continue into the 2020s, more and more AI has been coming out and, in a sense, taking over. I wanted to talk about that a little bit today.

Overall, I'll make a blanket statement: I don't like AI. There are some things that include AI that I do like, I guess. Like I said, personal assistants like Siri and Alexa. Being able to say, hey, play this song, or hey, what's the weather, or hey, look this thing up. It's convenient, it's nice. It's a luxury of the modern age.

But where AI starts to cross the line for me specifically is with art. At first glance, when you're just goofing around with it, it can be kind of fun: okay, I typed this thing into the computer, it spits out an image. And a while ago it was harmless, because you could very easily tell the difference between real art, or a real photograph of somebody, and AI-generated stuff. One of the first examples of that sort of beginner art AI that I think of was a website called pix2pix. If you guys remember that, I think I might have even done a video on it. I did, six years ago. This was the program: you would draw something in the box on the left, and it would spit out a "realistic" image on the right using AI. And as you can see, the results were not believable as a real photograph or as art that somebody made. It very obviously looks AI-generated. All of the textures are all over the place, so you can see that it's pulling things from XYZ place.
And it's very easy, in my opinion at least, to tell that this is AI-generated art. It was basically harmless, just a fun little thing. But as the years have gone by, AI image-generation programs specifically have gotten so, so, so much better. Back in 2021, OpenAI released DALL·E, and they've since followed it up with DALL·E 2. Another program a lot of people have used is called Midjourney. All you do is type in a prompt. You'd type "3D render of a cute tropical fish in an aquarium on a dark blue background, digital art," and it spits out something like this. With this one you can still kind of tell it's AI.

But the stuff that honestly freaks me out is the stuff that actually looks like art. Like this: "a Van Gogh style painting of an American football player." Not only can you say, okay, I want an oil painting, these are the specific attributes I want this to have, this is a hand-drawn sketch. You can even mimic specific artists' styles, which I think is really kind of weird. And again, these programs have just gotten better and better. Like this monkey. This looks real. All of the lighting and the reflections: you can even see in the eyes that the light source is coming from over here, you can see the highlights. It's really, really well done.

I think if you're just using it for personal reasons it's one thing, though I don't really know what I'd define as personal reasons. Where it starts to cross the line is that it's getting so good that we're having a hard time differentiating between AI and real art created by real people. And this is going to be a huge, huge problem. I think it's specifically going to be a huge problem for artists who do concept art, storyboard artists, anybody like that.
Because from a big company's standpoint, why would I pay this person a salary to create concepts for me when I can just type what I want into something like DALL·E or Midjourney or whatever, and it'll give me basically-free concept art in seconds? Not even minutes. Seconds. So that's something I get really scared about, for people like that. And obviously there have been so many layoffs in the gaming industry and the entertainment industry. While not all of those have been because of AI, I think companies are going to start laying people off and getting rid of certain positions because, okay, AI can just do this faster and better than you can. And it's very, very scary.

A slight sidetrack from art as in paintings and drawings: we saw it these last months in Hollywood with the writers' strike. A large part of that was because of AI as well. Writers are worried that producers are just going to go to something like ChatGPT, give it a basic outline of what they want, and get a rough pilot script back. You can use ChatGPT that way. "Write me a pilot script about a boy who wants to be a horse." Title: Hoof Dreams. It's giving me a whole thing. And while what it's writing is probably not perfect, it's something people can work off of without paying actual writers. "I've decided I want to be a horse." "A horse? That's a unique dream, son." "I'm serious. Horses are strong, fast, and free." That's not true. In a video of me slamming AI, maybe I shouldn't be going on ChatGPT and being like, wow, this is entertaining. It sucks. ChatGPT, I don't like you.

I believe we talked about this on the episode we did with Aaron Hansen on the Brain League podcast. Give it a listen, link in the description. Go listen to the podcast. Me and Sean do it every Wednesday. I think we talked about it there.
If not, I talked to Aaron about this another time. Aaron, for one, just doesn't like AI-generated art; he's an artist himself. And he's talked about how some people, AI tech bros, are slamming people on Twitter. Somebody says, oh, this piece of art, this is really cool, I like that a lot. Then somebody swings around and goes, actually, that was made by AI, ha ha, got you. And they're like, oh, well, I don't like this thing anymore. And the tech bros go, well, how can you say that? You just said you liked the thing, but now that you've learned it's made by AI, you don't like it anymore? That just doesn't make sense.

And Aaron has made a very good argument that it's not necessarily about the art itself: with art, it's fair to have the preference of "I want this to be made by a real person." I wholeheartedly agree with Aaron on that. I think it's so valid to say, I want art to be made by another person and not by a robot. And I think if something is made by AI, then there's just no soul in it, you know, literally. So I think that inherently takes the meaning out of the thing.

This is kind of a weird analogy, but imagine somebody gave you a meal. They sat you down, handed you a meal, and said, hey, I spent all evening cooking you this. I made this whole thing. Now what if instead they just handed it to you and went, yeah, I microwaved this real quick? It's a weird and a little bit of a stretch of an analogy, but maybe you get what I'm saying: somebody put their time and effort into creating something to give to someone, i.e. the person viewing the art, and it's inherently less meaningful if it's just kind of spat out by something else. That's what I think, at least.
Beyond scripts and stuff in Hollywood, there have also been a lot of scary things, particularly with acting and with AI mimicking people in general. You know, there's all of the fake AI voice stuff: oh, XYZ person is doing a cover of this song. And while it's kind of funny at first, like, wow, we can just have an AI scrub through somebody's videos, or their whole back catalog of movies, train itself on that voice, and produce a thing that person never actually did. They've been all over the place recently with SpongeBob and Peter Griffin. Schlatt made a really good video a little bit ago talking about AI song covers, because there was an AI cover of "My Way" in his voice. And then he went ahead and did a real cover of "My Way" that wasn't AI. It's funny, but also a little unsettling. Not only because the person "doing" the cover isn't actually doing it, but beyond covers, you can just have an AI say basically whatever you want in another person's voice, and that can obviously be very damaging.

And the same thing happens with deepfakes. Deepfakes are getting better and better every single day. There have been deepfakes that went super viral over the internet because people thought it was the real person, when in reality it was just a body double with another person's face deepfaked onto it. "Let's talk about your Cable ACE Award." "Let's talk about buttered sausage. Talk about buttered sausage, where it comes from, what it does, why is it doing what it's doing?" Corridor made this deepfake, and that was four years ago, so it looks a little dated, but it's still pretty good. And again, this is four years old, so imagine what they could do with deepfakes now.
So it looks a little janky, but watching this on Twitter on your phone, it could very easily be believable: oh shit, that's Keanu Reeves, he's stopping a robbery.

So we talked about AI art real quick. We talked very briefly about ChatGPT and about deepfakes. Then this came up on my Twitter today. Justin, the lovely editor of the channel (Justin, give yourself a round of applause, bravo, bravo), actually inspired me to make this video, because he was like, you should talk about this new thing from OpenAI called Sora. "Sora is an AI model that can create realistic and imaginative scenes from text instructions." This is getting bleak. "All videos on this page were generated directly by Sora without modification." So again, in the background it's a little weird, but the fact that this is all AI-generated, made from nothing but text prompts, is horrifying. The prompt here was just "a stylish woman walks down a Tokyo street filled with warm glowing neon and animated city signage." The fact that it can make something like this completely from scratch, from just text instructions, God damn, dude. It's such a weird thing because it's incredible and horrifying at the same time. My brain doesn't know whether to be more impressed or more terrified.

Even with stuff like this, damn, dude. This in particular scares me for editors and for VFX artists, especially VFX artists; this looks so good. And not just the general crew who work on film sets, but actors as well. Outside of movies and TV, especially for actors who aren't A-list celebrities, a lot of actors get work through commercials. And this kind of stuff looks like an ad. This looks like a weird AT&T ad, the way it's lit and exposed, the way things look; it has a very digital, very clean look. This is so scary. This could be a fashion ad or a perfume ad or something.
And so, man, this entire section of the entertainment industry, of the film industry, could be completely taken over by AI if they wanted to. Just like with DALL·E, just like with ChatGPT and the script I was talking about: why would you pay an entire production crew, hair and makeup, camera people, actors, sound, and then pay to have it edited and colored and everything? Why would you pay for all that as a big studio rather than type in what you want and get it basically for free? It's so scary, dude. And these big corporations are absolutely gonna do that, because they're money hungry. This is fucked. This is so fucked. This looks like a person. Ah, God, that's so scary, dude.

An ad like this, I know it's only, what, 10, 15 seconds long, something like that. But think about getting all of these shots for real, practically. Obviously it's AI-generated, so it's not a real thing, but this looks like it would be shot on location in the salt flats or something like that. If you were to do this for real, at this caliber, it would cost tens if not hundreds of thousands of dollars between filming, actors, production, editing, all that shit. So the fact... oh man, dude, that's so scary. How?

Landscape stuff, oh my God. I just thought about this: stock footage is fucked. There are so many people who shoot things and then sell them to stock footage or stock image websites. So many commercials, so many movies, so much media uses stock footage. You know, if there's a scene in a city downtown and they need an aerial shot, a lot of the time they won't actually go get the shot; they'll just use stock footage and get the rights for it. So the fact that now it's, oh, we don't need the stock footage, we can just have AI do it? That is horrifying.
That is so scary. Animation studios: why pay millions of dollars for animators to make a film when you could just have AI make a fucking movie? Especially for little kids' stuff. You could do the whole thing with AI. You could say, ChatGPT, write me a children's movie script about a little monster whose parents leave him on his birthday. Then you can have AI animate the whole thing for free, and then you can use AI voice stuff. You could probably have it fucking create a voice. If it can mimic different celebrities, what's stopping it from just making a new voice? Or they can just say, hey, Camendillas, we want the rights to your voice for this movie. You don't actually have to do anything; we'll have AI do it and we'll pay you 300 grand. It's so bleak.

And again, this just looks like a bird. Oh my God, dude. This is horrifying. Look at the way the feathers are moving when it's breathing, the micro-jitters in its eyes with the light. It looks so real. I would like to think people would use this stuff for good, but humans are, you know, shitty. People could use this for very nefarious purposes. People could write up a prompt of somebody saying something really bad. People could make AI-generated porn. There's so much shit that can go wrong with all of this, and the fact that it looks so real is horrifying.

Let's read this safety thing real quick. "We'll be taking several important safety steps ahead of making Sora available in OpenAI's products. We are working with red teamers, domain experts in areas like misinformation, hateful content, and bias, who will be adversarially testing the model. We're also building tools to help detect misleading content, such as a detection classifier that can tell when a video was generated by Sora. We plan to include C2PA metadata in the future if we deploy the model in an OpenAI product."
That's good and all, that you can go back and say, oh, actually this was generated by AI. But a lot of the time, at that point, it would be too late. Imagine somebody made an AI video of me beating the shit out of an old woman on the street, and it looks real, and it goes viral on Twitter, blah, blah, blah. Then they come out and say, actually, this was AI-generated, we have the metadata and the blah, blah, blah. That's great, but at that point a lot of the damage would already be done. It's just scary. It's just scary.

"We plan to include C2PA metadata in the future if we deploy the model in an OpenAI product." I'm not familiar with what that is. C2PA is an open technical standard that allows publishers, companies, and others to embed metadata in media for verifying its origin and related information. And C2PA isn't just for AI images; the same standard is being adopted by camera manufacturers, news organizations, and others to certify the source and history of media content. If it's in the metadata, cool. But what's stopping somebody from taking that video, screen recording it, and putting that up? The C2PA data wouldn't be in the metadata of the screen recording. I assume, at least; I'm pretty sure that's how that works.

To show you the ways people are getting around this stuff, here's one example. Here's a douchey photo of me shirtless that I took. Photoshop has a new beta with a bunch of AI features, a lot of them around generating images. And there are times when it's actually really useful; a lot of the time I'll use it for stuff in the background. But what I want to show you is that it's supposed to detect if somebody in the photo is naked, and if you're trying to expand on that using AI, it won't let you. So in this case, we're going to uncrop and just press generate. The fuck?
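To make that screen-recording loophole concrete, here's a toy sketch in Python using the Pillow imaging library. This is not real C2PA (a real manifest is a signed structure, not a plain text tag, and the "provenance" key here is purely hypothetical), but it illustrates the underlying point: provenance lives in the file's container, so re-capturing just the pixels and re-encoding them produces a clean file with no trace of the tag.

```python
from io import BytesIO
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Save a tiny image with a provenance tag in its metadata.
# (A plain PNG text chunk; a real C2PA manifest is far richer and
# cryptographically signed, but it also lives in the container.)
img = Image.new("RGB", (8, 8), "blue")
meta = PngInfo()
meta.add_text("provenance", "generated-by-ai")  # hypothetical tag

buf = BytesIO()
img.save(buf, "PNG", pnginfo=meta)
buf.seek(0)
original = Image.open(buf)
print(original.text.get("provenance"))  # tag survives a normal save/load

# Now "screen record" it: re-capture only the pixels and re-encode.
recaptured = Image.new("RGB", original.size)
recaptured.putdata(list(original.getdata()))

buf2 = BytesIO()
recaptured.save(buf2, "PNG")  # nothing carries the metadata over
buf2.seek(0)
copy = Image.open(buf2)
print(copy.text.get("provenance"))  # None: the provenance tag is gone
```

Same pixels, identical-looking image, but the second file has no provenance at all, which is presumably why OpenAI also mentions a detection classifier that looks at the content itself rather than relying only on embedded metadata.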
So a lot of the time, if you ask it to do a generative fill and there's skin involved, like here, it won't actually do it, because it doesn't wanna generate something naked. Well, now it's doing it, which is cool, right when I'm trying to make the opposite point.

So let's take this for example. There's a lot of useful stuff with generative fill, because basically you select a thing, you hit generative fill, and then you type what you want it to generate. Maybe something like this will work a little better: something that's actually kind of tricky to paint out, if I wanted to get rid of this stuff. And it's hit or miss; sometimes it works well, sometimes it doesn't. Yeah, so it's okay. That's fine and believable. If we wanted to do something a little more drastic here, we could take this whole area and just type in "wooden tabletop." Let's see what it gives us. And not only will it do different objects and materials and textures and stuff, it will also expand. I don't know if it'll do it here, but it will expand upon my body; it'll imitate my skin texture, stuff like that. It'll do a really shitty job with hands. It'll do stuff like this. It's a bit janky, but again, it's in beta. We can just use this to add in whatever we want. Okay, you got a little glass of wine; it does the shadows and stuff. I asked it to make an art piece to put above here, and it made this one. Add in a fork and stuff. You can just add a bunch of crazy shit.

At the end of the day, it's kind of cool, but it also kind of takes the fun out of making shit in Photoshop, if I'm honest. Again, it kind of takes the heart out of it. If we went to ChatGPT and were like, give me an idea for an interesting thumbnail for a YouTube video about cooking, let's see what happens. Let's take every prompt and make... okay. We'll do that. All right. It's a little bit janky, but this was all made via AI.
I had ChatGPT create the idea for the background main image, and I added in a chef's hand and the text overlay. And honestly, this is a pretty good thumbnail. Anyway, I'm not really sure what my point is with the Photoshop part of the video in particular, but it's scary, dude.

AI has taken over a little bit. I think it's becoming dangerous, it's becoming harmful, and I think it's going to continue to get more harmful. It's going to take away jobs. It's going to crush creativity. And in many ways, I think it's going to make people less motivated to follow what they want to do. Because it's like, I would love to be a graphic designer, or, with something like this, I would love to be a thumbnail designer for YouTubers, but what's the point? They're just going to use AI to make their thumbnails. Or they're going to use ChatGPT to write the script for this movie. Or I'm not going to shoot stuff for stock footage websites, because they'll just use AI to make that. I'm not even going to try to become a cameraman, because they're using Sora to make commercials now. I don't even want to become a singer, because they'll just fucking use AI voices to create new songs.

God, we didn't even dive into AI influencers. Oh God, that's its own can of worms. If you'd like to see me dive into that a little more, let me know, and maybe I'll go down that fucked rabbit hole. Anyway, this was a little different from a video I'd normally make, but I thought it would be fun to just sit down and talk about this and give you my thoughts. Please give me your thoughts in the comments. If you stuck around this far, maybe toss a like my way. Wow, I love a like. Don't you love a like? I love a like. AI is taking over the world. We're all fucked.