Oh, good morning, good afternoon, good evening. I'm your lovely host, Eric Jacobs. You may have seen me before. Joining me today are Luke Derry and Derek Gris. We are going to talk about Blender and remote rendering and attempt to hack our way through making it work on OpenShift. Luke is getting his virtual background ready. Oh, that's cool. Did you render that background with Blender? No, although I do have a render I can use. Oh, OK. Look at the hat rack. Oh, yeah, that's right. So Luke, talk to me about remote rendering in Blender in general at a high level. What is it? How would you do it? Blender has a command line interface, and it also has Python hooks that you can use for what we would call remote rendering. Otherwise, you're rendering in the actual app itself. The CLI allows you to render in the background. You can see the -b flag there says, run in the background, don't load up the whole interface. In addition to the CLI, you can also drive Blender straight from a Python file. I don't know that we'll try to do that today, because there's a lot more involved with installing it. You have to build Blender as a Python module. But Blender itself comes with a Python instance built in that has all the objects that you need. So typically, you can run a Python script from the Blender CLI. I think that's what we'll probably try to do today. There's definitely complexity. You can go off the deep end and do all sorts of crazy things. But for the purposes of today, we'll focus on trying to execute a Blender CLI background command that also executes a Python script to do something fun. Well, it's an open source project, right? So the limit of what you can do is, how much code do you want to write? Yeah. And honestly, because it has Python hooks, people write plugins to do all sorts of crazy things. And any of that would be accessible via the Python hooks that it provides. OK.
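As a concrete sketch of what that CLI invocation looks like, here is a small Python helper that assembles the background-render command. The Blender binary path, blend file name, and output prefix are placeholder assumptions, not paths from the stream:

```python
import subprocess

# Hypothetical paths -- adjust to wherever Blender and the .blend file live.
BLENDER = "/opt/blender/blender"
BLEND_FILE = "/tmp/scene.blend"

def render_command(output_prefix="/tmp/render/image", frame=1, script=None):
    """Build the CLI arguments for a headless (background) render."""
    cmd = [
        BLENDER,
        "-b", BLEND_FILE,       # -b: run in the background, no GUI
        "-o", output_prefix,    # -o: output path prefix for rendered images
    ]
    if script:
        # -P: run a Python script inside Blender; the script is then
        # expected to trigger the render and quit on its own.
        cmd += ["-P", script]
    else:
        cmd += ["-f", str(frame)]  # -f: render this single frame
    return cmd

# With Blender installed, this would actually run the render:
# subprocess.run(render_command(), check=True)
```

The flag order matters to Blender: -o has to come before -f, which is why the helper appends them in that order.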
And I was going to say, as we're doing this, I do want to do the current version, which is 2.92, because I think there's a lot of things that you can do in that, especially with the geometry nodes, and there are little experiments we can do today. Sure. What's the, nope, not volume. There's a way to do variables in a Containerfile. I thought it was just ARG, but, oh, ENV, I think, maybe. Oh, an environment variable, yeah. Yeah. OK. So first thing we're going to do is lots of the wrong thing. And then, as far as setup goes, the easiest way to do this is to put the Blender folder into your path, so then really all you're doing is running blender, you don't have to type the whole path to it. Yeah. -v, pwd, colon, /opt, let's see what happens. OK, cool. That is exactly what I wanted to happen here. It's actually not what I wanted to happen. That's what I wanted to happen. OK, cool. So let's see, echo $PATH. So what we're doing right now is we're going to try and build Blender into a container. I guarantee you there's already a container, but why not struggle? Well, yeah. So I've definitely, I started scripting out, I'm a front-end guy by trade, so container stuff is always foreign to me. But essentially, everything is in that tar file. So you just extract it, and wherever you extract it, put that on the path. I'm following lots of horrible practices here. Flying by the seat of our pants. That's not what I want to do. tar xvf /tmp/blender-version-linux64.tar.xz, then move blender-version-linux64 to blender. All right, this is probably going to fail spectacularly. But we'll give it a try anyway. The hallmark of using Linux enough to feel comfortable is being able to remember tar flags. Yeah, right. Do I need a J? Do I not need a J?
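The Containerfile they are fumbling toward here roughly comes down to the following sketch. The version number, base image, and package list are assumptions (the package list matches the dependencies chased down later in the stream: tar's xz support plus the X11/GL libraries Blender wants even headless):

```dockerfile
FROM registry.access.redhat.com/ubi8/ubi

# ENV (not ARG) makes the value available both at build time and at run time.
ENV BLENDER_VERSION=2.92.0

# tar needs the xz package installed to handle a .tar.xz archive,
# and Blender's binary links against X11/GL libraries even in background mode.
RUN dnf install -y tar xz libX11 libXi libXrender mesa-libGL && dnf clean all

COPY blender-${BLENDER_VERSION}-linux64.tar.xz /tmp/
RUN tar -xf /tmp/blender-${BLENDER_VERSION}-linux64.tar.xz -C /opt \
 && mv /opt/blender-${BLENDER_VERSION}-linux64 /opt/blender

# Put Blender on the path so the container can just run `blender`.
ENV PATH="/opt/blender:${PATH}"
```

The quoting problem hit later in the stream comes from the fact that an unquoted RUN line is passed to sh, which expands ${BLENDER_VERSION} normally; wrapping the whole line in a single pair of quotes turns it into one command name, which is why that variant failed.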
I know someone who was unfortunate enough to think that they knew, and then put them in the wrong order and ended up doing some horrible things. Oh, no. Was that you? It was not me. I don't risk it. I always go Google it. Once that happened, I was like, oh, I guess the order matters, or the capitalization, or whatever it might have been. One of the things that just came out recently was from the Blender Foundation. I'm not an instruction author. Really? Yeah. God, Eric, don't give yourself credit. So it's funny, it's like if I do MAINTAINER, then it complains that MAINTAINER is deprecated and that I should set the author field. Oh, this is LABEL. OK. I was saying, one of the things that came out of the Blender Foundation very recently is apparently the Irish studio behind Wolfwalkers, the beautiful animated feature film, used Blender for all of its 3D scenes and then for a lot of its 2D animation. Blender really has been, well, I won't date myself too much, but in college, Blender was something that everyone would try. And it would weed out all the people that weren't really die-hard 3D modelers, because they'd get into it, and the first thing they'd do is try to left-click on things. And it's like, hmm, I can't even click on things. And that would just kill it for them. But the recent UI updates, they've done a lot of quality of life things that have made it more accessible. And I think because of that, people in the industry have taken a serious look at it in 3D, plus VFX, plus anywhere that can use it. And the thing about Blender is there are so many uses. I would call it a Swiss army knife of media in general, because I use it for 3D modeling, I've done VFX with it, I do little pet projects. And even for basic graphical things where a 2D artist would be like, all right, I'm going to load up Photoshop or something, I'm like, yeah, I'll just create 3D models.
And that way I can do anything with it, pull it out whenever, and we've done it. But the latest and greatest things, like Netflix had the, oh man, I can't think of the name of the series or the movie. Give me a topic. A computer animated robot. I can't think of it. Big Buck Bunny? No, I'm just kidding. You're talking about Love, Death and Robots, I think? No, not that one. It was a kids' one, Next Gen. Very different. Yeah, yeah. Yeah, no, Next Gen. Blender was a primary piece of what created that. And I think for a lot of people with kids especially, you can point to that and say, yeah, that was done with Blender. I think one of the series on Amazon that I really liked, Man in the High, no, Man in the High Castle? High Castle. A lot of the cinematic shots in that were done with Blender. Oh, wow. That's fantastic. Yeah, so it's gaining, and a lot of media houses are investing, putting things into the foundation and sponsoring work on Blender. I think generally it's gotten more popular within the mainstream industries that it touches, and it's great to see. Yeah, absolutely. Well, and I think part of it, too, is you're seeing a different generation of artists that are more technically capable and willing to deal with the brick wall of learning a complex tool. I remember it. So I've been using Blender almost 20 years now, which is a crazy thing to say, and I hate saying it. Because you're 20-plus. That's not why. And about 15, 16 years ago, somewhere in there, I taught half a semester, basically a five-month-long class, for high school with Blender. And it was an adventure, and a lot of it was working around the UI and trying to craft an entertaining and engaging class where the results were much more immediate, to keep the kids interested, because there was so much GUI to learn, that brick wall. But I used to do all these demos for classes and colleges and professionals.
And I would blow them away with how quickly I could model and animate and texture and everything. Oh, my god, you're so fast. But I don't like that GUI. I don't recognize it, I don't understand it, and I don't want to learn how to use it. Yeah, yeah. And I think that's where, because I did this. I taught, well, it's been five years now. Five years ago, I taught a middle school, high school class. It was multimedia, but the idea was to just use Blender for everything. And the first class, it's like, all right, go in, change your settings so that left click is select. If you want to do this and not bash your head against the desk, change it to left click so you have that knowledge transfer. And I think generally speaking, getting into 3D modeling and CAD, there's a big, big first initial step where there's no knowledge transfer, because you're working in 3D space. The mentality is different than going from documents where I'm typing and it's all two-dimensional. You have to have other controls to move in three dimensions. So all of the interface is going to be more complex. And I always tell people, anybody who's not doing that stuff, I say Blender is probably one of the hardest programs you'll ever learn, because you don't have the knowledge transfer all the time. Although, like I said, in the latest versions, they have even something as trivial in your head as Control-C to copy. That was not the case three years ago, maybe. Yeah, well, it's that whole thing where Blender's UI was designed from the ground up to be the quickest way for a highly experienced professional to work in 3D. And it really was, from the shortcuts and everything. But that also means every convention you'd be used to from other software is completely thrown out the window.
And it produced some really impressive results. Like I said, nobody using Maya, 3DS Max, or even ZBrush or any of the other competing software could ever keep up with anybody who's very experienced with Blender. If you're an experienced person with Blender, you're literally three times as fast. But that's also kind of interesting, because even with the GUI improvements nowadays, when I show some of the younger people that are getting into 3D modeling and animation and film production the speed compared to using Maya or 3DS Max, and I used to be pretty decent at Maya, they're convinced. They don't even care how hard the software is. The younger people are just like, okay, it's a technical hurdle, I don't care, my life is full of them. Yeah, it's like, it's all right. And then you have things like the Eevee renderer now. I mean, when I think back over the years of using Blender, I think for accessibility, like you said, people want the immediate: I want to be able to see what I did immediately. And Eevee gives you kind of that happy middle ground between what used to be the Blender internal renderer and Cycles, where you get that real time. There are a few things missing between Eevee and physically based rendering, being what it is, but Eevee does what it can. I think it's definitely a really great way to immediately preview changes that you're making. The big thing for me is always the lighting, and there are ways that you can hack around it. Well, yeah, and lighting and look dev in itself is one of those things that not a lot of people are aware of: that's a huge profession. You can literally build your entire career around just lighting and look dev. And one of the people I went to school with, Scott Knapp, he's been all over LinkedIn recently because his portfolio is absolutely insane.
And a lot of it is real-time lighting, the stuff he's been getting into, even though he's the guy that just does film. Like that's his thing, he just does film, he's worked on Marvel movies and things like that. But all of his stuff recently is real time, from Unreal to Blender to Unity and that sort of stuff. And it's really cool to see that kind of taking hold of the industry. You would think that I'd built a container once in my life before. I don't understand what I'm fighting with here. It's like variable expansion combined with annoying Containerfile syntax. I really had to look. If you do xJ, capital J, there. So no, it literally has nothing to do with the path that you're going down right now. This is, yeah, it's quite literally that I cannot figure out the syntax of how to have a variable in the RUN command, even though I'm doing the exact same thing as a different one that has variables in the RUN command. At this point, I don't even understand what I'm doing wrong. Does that have to be... So if I put everything in quotes, the variable expansion works, but then it tries to treat it as a single command, which doesn't work. And if I don't put it in quotes, that doesn't work either. /usr/bin/tar, all right? Let's try /usr/bin/tar. Because it's not replacing it. Yeah, and so then if I put it all in quotes, too many terminals, then it's trying to put that into sh, and it's saying no such file or directory, but it's definitely there. So I'm just so confused as to why this doesn't work. If anybody's actually watching the stream and can tell me what the hell I'm doing wrong, because this doesn't make any sense. ARG can set build-time variables, they can be inspected after, and this is available during the build, fine, setting values, but I want to use the value. Do you just have to put just the blender, like the file, in quotes?
Separate from, looks like xz isn't installed. I'm using, so for the person who said it looks like xz isn't installed, I'm using the same exact command on the same exact image. And it seems to work. cd /tmp/blender, what? You know, file storage issues for some reason? No, something may have changed in the file system. So let me try this again. There we go. So tar xvf blender. Oh, look at that, xz is not installed. Lovely, separate from tar. So what the heck provides xz? Well, let's see. Oh, is the right tar not installed? Maybe, yeah. Lovely, I hadn't actually tried to run it. So I've been chasing my tail on stupidity for a few minutes. I mean, my excuse is that it's too early in the morning. What's yours? We don't get that. It says xz is installed. This is so weird. Oh, no, it's not. Okay, I did it. Yeah, I got it, okay. Hey, look at that, son of a biscuit. All right, so let's go back here. Okay, so it doesn't need to be in quotes. It doesn't need to have the full path. It does need to actually be installed. That makes a whole lot of sense. Yeah, at least somebody's paying attention. Let's try this again. This will be slow as nuts. Actually, I should separate the install from... Thanks, Daniel, look at that. All right. So now the next step is it will untar it. Hey, look at that, all right. Hey. Marginally successful here. And then I'm sure that my move command will fail or something like that. But at least the move command's in a separate layer. So if it does fail, I can try and tweak it. Can one of you all maybe, Luke, grab a blend file or point me to a blend file that I should use for testing this? Yes, I put one out. So if you go. I mean, something's happening. That's good news. Yeah, it's working. Or it's more working than it was before. And we like to call that progress. Eric, I put the link to the repository in chat. In the Zoom chat? Zoom chat, does that work?
No, I just need to find the Zoom chat. Chat. There we go. All right, so. And there's the, it doesn't matter which one. If you do openshiftvisuals, that one has... That's this, this file background. Yeah. Got it. Okay. Then you have to do raw or download, I guess. Raw won't work, I guess. I don't know what I'm doing. So that's fine. All right, so let's. And just for people watching, what I did with this blend file, it has some assets, like it has this HDRI, these photos here I took at the office years ago, in it. So there's a setting in Blender to bundle all the assets. Otherwise, you have to port all of the assets along with the file. It just makes things a little bit more self-contained. Now, I will say it does drive up your file size if you start adding a bunch into the file itself. No such file or directory. So now we're trying to get Blender to work. So you probably, so if you do. Oh, it wouldn't be bin, sorry. That's dumb. It's a library, not a binary. So you may not need that one in particular, because, so the catch is. I just tried to run the help and it said it needed it. So I'm pretty sure I'm going to need it. Try -b -h just for the second. So -b is run in background, so it won't load. No, okay. Still wants it. Silly. Yeah, I remember one of the earliest tasks I had working with Maya on a film was handling directory portability with massive sets of files, because you couldn't embed the assets. There were just gigabytes and gigabytes of them. In the end, terabytes. Yeah. And as far as thinking about Blender workflows on OpenShift, obviously you'd theoretically at least be able to have separation of that, as long as, between pods or whatever you're running, they have access to it. It's just, Blender wants everything to be relative to whatever path it's running from. So.
Somebody, oh, Roddy says the Cycles renderer works headless but the Eevee renderer doesn't. But I don't know. Yeah, so Eevee requires... So this one should be the, I mean, what I would say is, instead of trying to do the help, just try the render. If you want, I can send you the command line flags. Sure. So for those that are watching, it's docs.blender.org that has all the latest and greatest. Yeah, but what do you actually want me to run as the command? Oh, sorry. So do blender and then -b, because we want background, and then we'll do that file, and then -o, because we want to tell it where to render to. So in this case, just point, yeah. Yep. And then give it a file name, just do like image.png. Okay, so that -o tells it where that render output's going to go. Oh, and we need the blend file right after -b. Sorry. We need to point it at the blend file. So /tmp. Yeah. blender, openshiftvisuals.blend. Yeah. And then a capital F would be for image format; we don't need that -F, we don't care about the format. It should, I think this should be enough. Well, I think you have to do lowercase -f and just tell it frame one, like, one. Because essentially, when you render, within the blend file it has frames, and so you want to tell it to render frame one. Nope. Still wants it. Well, I hadn't installed the X11 stuff in this one. But this is a better test, because we're trying to do exactly what we want to do. Go ahead. Yeah. I think you have to install it no matter what, because the Cycles renderer is both a client and a server, and so it talks to itself over SSH, I think. It's kind of silly. But what I remember from four years ago is, if you want to do headless Blender, you have to have a build of Blender that's headless. You have to build it yourself, which means you have to build with the headless flag.
And then you have to, unless you have OpenCL running on the system, you'd have to do something like building with the Cycles device off. Yeah. So Eevee won't work headless, essentially, because it's tying into the GPU. Oh my gosh. Lots of deps here. I used a very minimal... Well, you didn't even do UBI minimal, I would have thought. That's true, yeah. That's fine. Because I was starting to build an image to do this last night, and with my other stuff I've been using minimal, but I guess that would have been even worse. Yeah. It would have been like, sorry, just no. Oh God. Is there like a, is there a libGL? You're further down the alphabet, so that's good. We're gonna work our way all the way to Z today in the stream. Is there a smarter way to traverse all the deps at once, so I don't have to keep doing this in weird sequential order? No, I don't know. Oh, Jesus. I don't know. At least this one, I'm pretty sure it's just gonna be libGL. Hey, look at that. Yeah. Oh, yay. It did a thing. Look at that. It's doing the stuff. That's good. Oh, this is gonna take a while. Well, so this is where some of the settings in that blend file, the 510 is, it's breaking up a giant image into chunks that render. We can also set the resolution as well, so we can mess with that. And we can tweak that stuff in Python too if we want to. Well, those containers, right? The whole point of this is that we can break this up and spin up 10 containers, 50 containers, whatever, each one rendering. And the idea, so I kind of was trying to dig into what this would look like if you were trying to create an OpenShift-based Blender render farm. And generally, I think to me that's where the... It would look like lots of Python and some serverless thrown in for good measure.
Well, so, I mean, it would be this, but what you're doing is you're taking a blend file, you're gonna look at it, see what the resolution you're wanting out is, and then you're gonna send jobs, like this job, each with a render region. Essentially the 512, you know, the 510 or whatever it is, each of those tiles could be sent to a pod to render, and then you suck them all back in and stitch them together, and you get your frame. And so you'd be distributing that out. And theoretically the time to render would go down, because in this case we're CPU bound, and you would be distributing that load across multiple things. Right, and so from what you described, you would have a Python program that you would run as a job by itself, or potentially a serverless application, which would process the blend file, and then it would need API access, like be a service account in OpenShift, to create more OpenShift jobs. Basically it would create a bunch of jobs to render all the parts and put them in some persistent storage somewhere, and then you would need a final job, once all those jobs are completed, that would stitch all the parts together. Right, and theoretically, so what I was kind of trying to investigate was, it would be interesting to use something like WebRTC to be able to do almost direct peer-to-peer types of streaming: here's your blend file, kick me back the image that you render, and have it flowing in. We did a thing. Look at that. Cool, all right, now let's get it running on OpenShift. So first thing I need to do is rebuild the image using our cool dependencies, which is gonna take forever. No, it won't take forever. Yeah, see, I really had every intention of creating an image for you to do this.
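The fan-out described here, splitting one frame into regions that separate pods render before a final job stitches them back together, comes down to some simple rectangle math. A minimal sketch (the tile counts and resolution are arbitrary, not values from the stream):

```python
def tile_rects(res_x, res_y, tiles_x, tiles_y):
    """Split a res_x by res_y frame into pixel rectangles (x0, y0, x1, y1),
    one per render job; the union of the rectangles covers the whole frame."""
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            yield (
                tx * res_x // tiles_x,        # left edge of this tile
                ty * res_y // tiles_y,        # bottom edge
                (tx + 1) * res_x // tiles_x,  # right edge
                (ty + 1) * res_y // tiles_y,  # top edge
            )

# A 1920x1080 frame split into a 4x2 grid -> 8 jobs, one rectangle each.
jobs = list(tile_rects(1920, 1080, 4, 2))
```

Each rectangle would become one OpenShift job's render region (Blender exposes this as a render border), and the stitch job pastes the eight partial images back into the full 1920x1080 frame.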
And unfortunately, I live out in the boonies, and so with my upload speed, the larger the files, the longer it takes, and I really didn't have time to upload. That's cool. But I will try to work on some fun stuff, now that I've kind of started to wrap my head around OpenShift and being able to build images and things. So I'm gonna try to do that. Is that your coffee, Derek? What's up? Is that your coffee beeping? No, that's not me. There's no beeping from my end. Luke, is there beeping on your end? I think it's on yours, Eric. I'm not hearing it. It's coming through the microphone, so somebody's putting a beeping on the line. I'll mute and see. Luke's on, gotcha. Yeah. I don't know. It's not, I don't hear beeping. Oh, now I do. I know what it is, too. My kids have a stopwatch that they play with up here, and it beeps at, what time is it? Right around now. Yeah. I'm always in a meeting or something when it beeps, and I'm like, oh, you're driving me nuts. Oh, that's funny. They hide, this is like their, it wasn't meant to be their playroom, but. I hid the stopwatch. Yeah. That's why I have this makeshift green screen. I just put it up for important meetings where I need to look good. Oh yeah, Blender, so important. OpenShift TV. I mean, I've already exhausted 15 minutes of my 15 minutes of fame now, so. Yeah, there you go. It's not your first time on OpenShift TV, is it? Yeah, I did. I haven't been on the game show yet. Oh, okay. We'll need to get you on the game show. So actually, that's one of the things that I wouldn't mind trying to figure out at some point, not necessarily today, is that this version of Blender has these new geometry nodes. And so, for the game development on OpenShift that you guys are doing, I have the rocket ship. And I still owe you guys a sprite sheet for that, of the rocket flying at different angles, so that you can do the animation a little bit cooler.
And so with geometry nodes, what I want to try to do is have the one ship and then use geometry nodes to actually reposition it on a plane. Because this is something for me, like for our Summit demos, a lot of times we have a 2D artist. If you go back two years, he did a sprite sheet for me of a walking sequence. And it looked real janky, because it's 2D and stuff. But with Blender, the idea would be you create your 3D figure, you do the animation, and then you just reposition the camera. To do that for a 2D game, you position an orthographic camera and then render the different positions. Using this version of Blender, you can do geometry nodes, and then it should render relatively fast. Cool. So what I just proved right here is: one of the things that OpenShift does for security purposes is randomize the UID of the user that the containers run as. And so I picked a random number, and I was still able to run Blender. I should probably run it to completion, just to see if it'll actually be able to write to /tmp. I can prove that in a different way: touch blender-testy. Cool. All right, so I'm allowed to do that. So this will work. So let me log into Quay really quick, because I'm probably not logged in, or my login has expired. Quay.io, copy password. Oh, God, too many things in front of it. podman login quay.io, username is my username, password is this password. Oh, it doesn't paste the password, so we were safe anyway. Oh, I doubt this is gonna work. That's what I thought. Any windows. So let's go to Quay. Oh, you have to create your... Yeah, I don't have a repo for that yet. Let's see, create repository. I should put this in game dev, but this is fine for now. What are you called? blender-remote-ubi8, blender-remote-ubi8, public, empty repo. Repo already exists. That's lovely. Oh, let me do this. Settings, how do I make you public? Okay, let's try this again.
I guess the first time it created it, but it was only half created? 570 megs. Wow. Might take a minute. So 140 of that, the known part, is Blender. They're all like 150, probably. Yeah. Actually, maybe it's more, that's the compressed size. That's probably more. Well, while we're watching paint dry, let me go over here and see if there's actually already a Blender container. Oh, no, it's 200 that is Blender. So 200 of that is Blender. Is there already? Thanks, thanks, internet. Yeah. Usually if you throw a 3D in there, that'll filter it down, but even then. Yeah. I think it... We'll do this. We'll search Quay for Blender. See if anybody's put one here. Oh, our PSAP guys apparently have already done this. Way ahead. Crowdrender. Uh-oh. Maybe I need a tag, go ahead. So this is what we, Roddy and I, were talking about yesterday: that's for an older version of Blender. Yeah, 2.8. And so for the 2.9 series, apparently they haven't updated Crowdrender, or the plugin at least has not been updated. So Crowdrender is a plugin, and it would be doing a similar thing. Interesting. It's unfortunate, it doesn't list... Actually, I think they have a GitHub repo. They do. Bummer, but they don't have the... Oh, it's not public. Well, it's not here. So let's do, was it Blender, crowd, whatever? Well, Alpine actually has a headless build of Blender already as a package, which is interesting: blender-headless. It's maintained by a Mark, rightisle? I'm probably saying his name terribly. Rightisle? Like with an S and a K? No, no Ks. Because that sounds like a Red Hat person. I don't think it's a Red Hat person. It's definitely a Linux person. Let's see. Kubernetes Blender render. Setting up a Blender, you know, using Docker. And I think that's one we looked at yesterday, too. Raspberry Pi cluster, Blender as a batch job, setting up a render farm, Terraform render Kubernetes. All right, let's ask the Google. Sometimes Google's better at tech.
Docker Blender render cluster. But that doesn't sound like Kubernetes. Using Docker and Kubernetes. Docker, Kubernetes, Blender. Why? Yeah, this is the one we were looking at yesterday. Oh, did it work? So if you go to the second part of, so the first part of this is setting up Blender, or setting up a Docker image that uses... So it just gets it working. Oh, vGPU, got it. The second part, I think, actually sets up the containers. Okay, they're still just running it in a container. I don't see Kubernetes with this. Yeah. It's just a container. Bummer. Where are we at here? 162, gosh. Really? I'm gonna do a thing. Okay, can you? To be perfectly honest, what does it say? I don't know what I'm doing. Awesome. I need to put that in more blogs. To be perfectly honest, I don't know what I'm doing. How to script a digital clock, old-school rendering code, hard-coded values. So I don't know. So here's the way this would play out: what we want is to do the same thing we just did, but then add, I mean, essentially, what we can do is add a Python script in there, and the Python script could be referenced from a config map, I guess. Would that work? Yeah, or a secret. There are a number of ways to inject files into a running container. Okay, so, because then what we would do is we would take the command line flags that we had, and we would add one to point to, let's see. To point to a Python script? It's like -P and then point to a Python script. And that will... Hi, Thun. Now, what were you doing? Let's see. Do dash... Command line arguments, for a full list of arguments. Yeah, that's the one. Go down, there's a Python options section. Lowercase p is playback, window position... Oh, look at that. Yeah. Oh, wait, what? Maybe I'm not looking right. Run Blender with an interactive console. Run the given expression as a Python script. Run the given Python script text block.
Run the given Python script file. It's capital P, not lowercase. Yeah. Do you have a Python script to run? Yeah, I can give you some commands to put in. All right, we can do that. Let's see. So the big thing was, what we saw was that it's rendering a giant image. You may want to do something that doesn't do the giant image. Let's say you just want a thumbnail. So theoretically, if you think about the workflows for this, what people might want to do is, hey, I have a blend file, I have a cool thing, render me out four different formats of it. Render me. Render. Sorry. It's a Mad Max reference for anybody who's seen them. New file. render.py. Yeah. It's been a while since I've written any Python. Do I need like a, oh, Python. Get started. Look at that. Thanks, VS Code. Yeah. Get a new file with a Python extension. I just did that. So what you're going to do is import bpy, which is the Blender... bpy? Yeah. So what this is doing is executing the Python within the Blender context. Part of what's bundled with Blender is its own version of Python, and so they have bpy in there. Is that in lib? Oh, 2.92, python. Look at that. And do lib. Lib. All the things. And I don't know where it is under there. Lib things. So do I have to execute python3.7m? No, no. That's the benefit of the CLI. Oh, because Blender loads the Python file with its own Python. Yeah. Got it. Cool. Okay, cool. All right. So import bpy. And then, if we want to just change the resolution. Sure. Let's just change the resolution. I feel like that's probably a safe thing to do. You're going to be. Safe. Well, I say that because I don't really know if it'll work or not. Because we don't know what we're doing. What we'll do, though, because it's going to be in the interactive thing, is we want it to quit after it does its thing.
Okay. Right. So at the end, like, this would be the last line. Yeah. Wait, did you say ops as in options? No, sorry. O-P-S. bpy.ops.wm.quit underscore blender. That's it. And I think... and two parens. Sorry, it's a function. Yeah, a method. VS Code says "unable to import bpy" — and it won't; it's not a global thing. Yeah. Thanks, VS Code. Not smart enough. And then the other thing, the nice thing, is you can actually reduce some of the command line flags that you're putting in, because you could set them here. Well, if you wanted to use environment variables — there's all these things we're passing, the frame number to render, the blend file, the other stuff — that could be in here. All that stuff could be in there. Well, the blend file won't be, because essentially you're saying: run Blender on this blend file and execute this Python script on that blend file. But things like the render settings, the output path, those can all be stored in secrets or config-map types of environment variables, and Python will be able to get to those, so you wouldn't need them on the command line. Right. The idea would be that you set up the config of the pod or whatever is running it, and then it will just run the right way. So how would we change the resolution? So in this case — and I just gotta get this right, because we'll see if this works — there are two contexts, and I'm a little shady on these because I haven't done the Python stuff in a while. Oh, you're shady. All right. Okay. So to set the resolution that it's going to export, you do bpy.context.scene.render.resolution underscore x for the X resolution, and then Y is the same thing with underscore y. And then if you want to add the file path in there — I don't know if you want to do that too. No, we can leave that out for now. Okay.
I just want to test this, basically, is what I want to do. Okay. So in that case, the other thing we need to add is the actual render. The render command, you mean? Yeah. Because essentially, with a Python script it won't do anything that's not in the Python. And right now the Python is just saying: set some things and then quit. And then quit. So it's not going to help. Yeah. And theoretically it wouldn't even save those settings, so you're getting nothing out of this. Right. Let me just, here, real quickly figure out... It's like bpy.ops.render.render or something. Yes, it's render.render. So it's bpy.ops.render.render. Yeah. And that says: given everything else I've got set, render. And in Blender there are two — there's render, which does the single frame, and then there's render animation, I think. Maybe that's not even right; render.render I think should be the right one. I mean, I know the ops call for it, which is bpy.ops.render.render. Yeah. And maybe that might be it — we'll see. You need to set the active scene. All right, so I got the script right. Well, do we need anything else here? We're questioning whether it will be bpy.ops, and that may actually not be needed. All right, well, we're about to find out, because I'm going to run it while you guys are trying to figure it out. Oh yeah, render is in there. Blender dash b, openshift visuals, dash P, temp render... I would assume this is going to work because it will default to the first scene. Oh, temp blender, whatever. And the output — yeah, we didn't set the output. You'll need that one. It's doing something. So it's rendering. It didn't explode. Whether it'll run the actual Python... well, it must have, because now it's only doing 130 tiles instead of 100. Yeah. So it must have. Yay. We did a thing.
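A minimal version of the script arrived at here might look like the following. Since `bpy` only exists inside Blender's bundled Python interpreter, this sketch just authors the file that `blender -b scene.blend -P render_settings.py` would run; the file name and resolution values are illustrative, not from the stream.

```python
# Sketch of the render script discussed above. `bpy` is only importable
# inside Blender, so here we just write the script to disk for Blender's
# CLI to execute. Paths/values are illustrative assumptions.
RENDER_SCRIPT = """\
import bpy

# Shrink the output so we get a thumbnail instead of a giant image.
bpy.context.scene.render.resolution_x = 320
bpy.context.scene.render.resolution_y = 240

# Setting values alone does nothing: actually render, then quit so the
# background Blender process exits cleanly.
bpy.ops.render.render(write_still=True)
bpy.ops.wm.quit_blender()
"""

with open("render_settings.py", "w") as f:
    f.write(RENDER_SCRIPT)
```

The quit call goes last, as discussed, so Blender tears itself down only after the render finishes.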
Celebration. Yeah. All right. So now we want to get interesting here. We're going to rebuild this container file. So we want a volume at, let's go, opt blender data python — so we're going to put the Python script in a specific place. Right. Then opt blender data blend — that's going to be the blend file. Then we want another volume that's... oh, I can do this in an array, I think. How do you know that? Well, VS Code told me. Oh, okay. It says you can. Good job. Yeah. Great, a mount point. And then we'll have opt blender data output. So we'll have three volumes. And the reason for that is that we're going to specify the command, and we're going to have a very specific syntax for blender. You know what, let's just do this for giggles, just to see what happens. So I'm going to build this again. Okay. So now if I run it, it should just give me the help output of Blender. Oh, right — that's actually okay that we did that, because it just indicates it's probably not in the right path. We didn't update... you didn't put it into the PATH environment variable, right? I did not. Okay. Yeah. I mean, on Linux, if you install it, you get blender in the path; if not, you just have to do it manually. Oh, I didn't actually rebuild the image, so we'll see what happens. All right, so blender is there. So the command should be opt blender blender dash h. I have to rebuild the image to make that happen. And then I need to run it, and we get the help. Okay, cool. So now what we want is the Python script. Yeah. Well, I want to make folders that match the things that we expect here, for giggles, and then move things around. So I'll move the Python script into python, I'll move the blend file into blend, and then we can delete that. And then we have our output. Okay, cool.
So now what we're going to do is run blender dash b with — oops — opt blender output blend file dot blend. Oh, it's blend, right, not output. Opt blender blend slash blend file. Good point. Good point. Blend, blend, blend, blend — it's lost all meaning as a word now. Pair programming for the win. And then, let's see, what did we do in here? Looking up the syntax that we used previously. And you can put the flags wherever. Dash capital P, opt blender python script.py. And then we want dash o — temp? Nope — opt blender output image PNG. And then for the sake of simplicity, we'll force dash f 1. Yeah. And it should all be blender underscore data before you build this; it shouldn't just be opt blender. I haven't had my coffee, so come on. And there's probably a way to make this multi-line. No, it's okay. You're doing good here. Yeah, you just put the backslash for that. All right. So I have to rename these now. This needs to be script, and this needs to be blend file. Okay. So now if I run this, hopefully it should just do it. Right. Or explode horribly. Cannot read, no such file: opt blender data blend blend file dot blend. Okay, so this is where we do one of these. Oh — because I didn't mount them. That's good. Yay, success. Okay. Yeah. This was intentional. Yeah, totally. So we want to mount PWD slash blend to opt blender data blend. And then we want, oops, dash v, PWD slash python into opt blender data python, Z. And then we want PWD output to opt blender data output, Z. Look at that. Yeah. I'm a magician. Cloud computing, as they say. I'm a cloud computing. That doesn't make sense. I am a cloud computing. Okay. So now we can try and OpenShift the shiz out of this. So if we go to — I do have a cluster, I have logged in. So let's go. This is the cooking show, right? Sort of. I mean, I didn't have to install OpenShift.
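The local invocation built up above, with the three bind mounts and SELinux `:Z` labels, comes out roughly like this. The image name, binary path, and file names are reconstructions of what was typed on stream, not verified values, so the sketch only echoes the command instead of running it:

```shell
# Dry-run sketch of the container run assembled above. Image/paths are
# assumptions based on the session; adjust to your own layout.
IMAGE=blender-remote
RUN_CMD="podman run \
  -v $(pwd)/blend:/opt/blender_data/blend:Z \
  -v $(pwd)/python:/opt/blender_data/python:Z \
  -v $(pwd)/output:/opt/blender_data/output:Z \
  $IMAGE /opt/blender/blender -b /opt/blender_data/blend/blendfile.blend \
  -P /opt/blender_data/python/script.py \
  -o /opt/blender_data/output/image_ -f 1"
echo "$RUN_CMD"
```

`docker run` takes the same flags if that's your runtime; the `:Z` suffix is what relabels the host directories so the container can read them under SELinux.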
But yes, this is kind of cooking-show-ish. So, skip the tour. We're going to make a new project. Let's create a project, blender. And the first thing we need is... I guess we can do this as a config — no, I need a volume. Volumes, volume snapshots, PVC. Okay. I want a PVC to actually store my stuff. We will call it blender output. Single user, yes. One gigabyte should be enough. Filesystem is fine. Okay. Cool. So now we can actually attach this to a thing. Next, we want a config map. I'll create a config map. Oh, really? Can I not? All right, let me do this as a secret instead. That'll be better. Secret makes sense to me. I don't know what the difference is. Oh, man. Do we want to go down the rabbit hole? No, not really. Not today. Okay. So this is going to be blender data. So the first key is going to be the script. And we'll put the file in as — not presentations — blender, oh gosh, blender remote, python script. Then we're going to add another key value, and the key is going to be the blend file. And we're going to do this, and then this. Oh, bummer. Okay. Well, part of it is a file size limit. Yeah. So let's go back to config map. I can do this with an oc command. No. Okay. I was going to say, the one-megabyte file size is really the limit there. This is going to be exciting. This whole thing is exciting. oc create configmap blender-data --from-file equals script, --from-file equals blend equals... right? Yes. I mean, we can go into the nuances, chat — we're happy to nuance away. Rendering per angle — God, that sounds awful, whatever it is. All right, let's just see if this works at all. Because we are uploading... nope: "Request entity too large." So this is a potential problem. Okay. Because I'm assuming that there are going to be large assets. So we have a couple of options here.
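The `oc` command fumbled through here has the shape below, with one `--from-file=key=path` per file. The project layout is an assumption; the sketch echoes the command rather than assuming an `oc` login. The "entity too large" error is the ConfigMap size cap: ConfigMap data is limited to roughly 1 MiB, which a 15 MB `.blend` file blows past.

```shell
# Sketch of the config map creation attempted above. Key name = file name
# inside the mounted volume; the source path is illustrative.
CM_CMD='oc create configmap blender-data \
  --from-file=script.py=./python/script.py'
echo "$CM_CMD"
```

For the big binary assets, the discussion lands on the right answer: a config map is for small text, and blend files belong in object storage or a volume.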
We can go brute force and just pull the raw blend file from the Git repo as a URL and then do some really hacky, nasty stuff where we fetch it. I don't actually know how to directly add a file into a persistent volume claim. So there's file storage now, somewhere, that lives in the cloud, but I don't actually know how to put anything into it. You know, ideally, in a real company — if we were a real company — the blend file would probably go in an object store or something like that. Oh, right. Is there a way to separate it? Yeah. I mean, really, do you have any image editing software? I do, I have lots of image editing software. So we go in and open up the... oh, you can't, because I embedded it. Right. Hold on, you're going to have to make me some new stuff. Why don't I do this? Or we need a simpler, smaller one. Yeah, Tony, that flawless demo was purely by accident — I cannot take any credit for the fact that this has actually worked so far. We did spend the first ten minutes completely effing it up, but you know. Well, okay. So instead, load the spaceship blend file. Spaceship. I think that one will be... I've lost your repo. I was in the chat here. Nope. Oh yeah, openshift blender. No, that's you. Yeah. Which one? Spaceship. Spaceships. Spaceship. All right. Lego movie. So let's go see. Blend. No. Although it may look really funky, because I think that one's used in the nature one. Oh gosh. Come on, dude, you're killing me here. Well, here, let me review what I can do. Let's just... blend file dot blend. 15 megs. Wrong direction, sir. Hold on. All right, can we just render a sphere or something? Okay. I mean, you do that with Python. Well, yeah. Worst case scenario, we just add it to the script and ignore the blend file. Yeah, that's true. So we would just have an empty blend file.
So we don't even need a blend file. Just need to put a file in. But if you dash B, you need a. File. I'm saying we don't. We don't dash B. We just dash P. Which that sounds weird to say. Will it not? Would that work? Pretty sure. Well, let's try it. We'll try it locally first. Sounds great to me. Okay. So give me, give me the code or paste me some code to draw a circle sphere or something. Let me grab. Because Derek had shared. Let's see. Where's that? Yeah. Here's a link to this. This is a Python script to just render. Render a cube. A cube. Oh, except. Well, yes. What do they do? Yeah. They just run blender dash B dash capital P Python script. And then. I don't see why this renders a cube. It's because, so it's the default, like it's like a new file in blender comes with the cube and the camera already. That's neat. Yeah. Default cube is like in blender circles. Oh, but it needs input, which is annoying. Oh, I mean, you could. I could change that. Yeah, just copy the part without it. Yeah, ignore all the top stuff. Ignore like 90% of that. I mean, you can keep all of that. You only need like, yeah, the bottom portions here. Like I would just copy like from where it says, yeah, that stuff. And you don't even need cycles renderer, just output resolution is fine. Like all that stuff is fine. Oh, we need to. You can just delete those lines. Like are they, are those optional arguments? Honestly, I think you don't reading through this. You don't need any of this junk. Okay. So I see, I see what you're saying. Why are they even using camera object? I don't even know if you need that. Like you could probably just run the Python script as it is and it would probably work without copying any of the stuff that Luke sent you. Oh, so just go back to the original. Yeah. Just run it without a file and just render a cube. Yeah, cause that's what they're doing, right? Like they're setting up a lot of other stuff, but. Yeah. 
Theoretically the default — just give me something with nothing in it — is actually something with something in it. So we want blender dash P blender data python script... we want dash o, blender data output image PNG, dash f 1. Needs dash b? No, you still do dash b. Yeah. But I don't think you need a file still. "Under settings: object has no attribute render. Unable to open a display." I think we need the file. Oh — here, you have to pick the engine, because if it's defaulting... let me go back. You need a dash F, you mean? No, you have to pick the renderer. There you go. Here, I'll just put it in there: bpy.context.scene.render.engine equals CYCLES. Got it. That should do it. Maybe — we'll see. What's it rendering? That's the question we're about to find out, ladies and gentlemen. I'm excited. "Cannot save." Oh, that's my fault. But it did try to do it. Yeah, because it doesn't have a file name. No, it's actually not that — here, I'll show you. Well, I was curious, because it's outputting image with a... yeah, it makes sense. It's because we don't have that folder — it doesn't have group write permission. And we are root group, but not root user. So I just need to chmod 775 blend, chmod 775 output, chmod 775 python. Now I can run this again. And then I can run this again. Blend file? We don't need no stinking blend file. Yeah. Well, I mean, we will eventually. And lo and behold, there is an image, and lo and behold, it is... is it the default cube? Nice. Hey, we did it. All right. That's what success feels like right here. Yeah. Thank you. All right. So let's try and... oh, we don't need a blend file now, so we can skip that; that folder will be empty. We do need the Python script though. So let's do this: from file, we want script equals python... was it script.py? Script.py. All right.
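The "cannot save" failure above comes down to directory permissions: the container runs as a non-root user that is in the root group, so the bind-mounted directories need group write. The fix from the stream, reproduced here on a throwaway set of directories:

```shell
# The container user is root-group but not root-user, so the mounted
# data dirs need group write: mode 775 (rwxrwxr-x).
mkdir -p blend python output
chmod 775 blend python output
stat -c '%a' output   # 775
```

This is the standard pattern for images built to OpenShift's arbitrary-UID convention: grant the root group the same access the owner has.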
So now if we go back to OpenShift and we look at the config maps, we have a config. Oh, I was in the wrong project. Well now. A chat says, wow, which, you know, I read that and I can't think of it as anything other than the Owen Wilson. Wham. That's what I'm imagining chat doing right now. All right. Blender data. We have a script, which has this thing. All right. So this is where it becomes awful because now I have to figure out how to write a Kubernetes job to consume the config map, the image and everything in the right place. Oh, and it's probably gonna explode because there's no there isn't a blend file.blend. Is there a way to have an empty blend file? Well, I mean, you could, yeah, let me hear, let me just open a blend file and see what it looks like. Make me something pretty, Luke. Come on, something small and pretty. All right. A little more than a cube. Share your screen and show us what you're doing. Oh, okay. All right. Share your screen. Oh, I'm disabled. I can't share. Oh, hang on. Oh, yeah, sorry. Well, you need the dash B, but I don't know if you need the blend file. Yeah, I don't think you know, but I'm saying like, I can... Oh, you have your image, your container's already there. Is that what you're saying? Hang on a second. I'm pretty sure you can just delete op blender data blend file.blend. Like you need the dash B, but you can delete the dot blend file. Right, but anyway, nevermind. Yes, I can. That's right. Eventually when we want to do it, like when we want to use it the real way, we would need that. And that's where, I mean, this is where the Python script would be really, the Python script can be the source of record though, because we could load the blend file in through the Python script. Yeah. That's I think most of the other... This would work. This will work even when we have a blend file to use, because we can... Oh, I see what you're saying. Since it's a config map value... So then let's just assume there's not gonna be... 
No, we'll leave that as a volume. Why the hell not? Okay. So that's pushed. We don't need that. Okay, we do need the script. All right, so now we need a Kubernetes job. Jobs, Kubernetes. No, I'm not looking for a job — I need to create a job. We're building jobs here. Containers, image, restart policy. All right, so we're gonna do this, and it's not gonna actually work, but that's okay. So let's go back here. We'll go to topology. I'm gonna add from YAML. So we're gonna make a job. We're gonna call it blend one. We're gonna call the container blender. The image is quay.io slash... what did I call this stupid thing? blender-remote, UBI 8, latest. There is no command, which is probably going to fail, but we'll see what happens. Ah, it's doing a thing. Events generated from job controller. Pods — here's a pod. Container creating, amongst downloading 500 megs. And then theoretically, Eric, you could just have a different pod that hosts a server that presents the image to the web, right? It worked. I mean, it failed, but that's okay — we knew that was gonna happen. Okay, so the next step is to actually... oh, poop, where are my jobs? Projects, jobs. Really, I'm gonna have to file a bug. Workloads, jobs. We can delete you. Good job, that's fine. The question from chat was, what is Blender considered? Blender is just a software application; bpy would be the API into the software application. Well, more generically, Blender is a 3D rendering application. It has a headless mode that can consume Python files. And so in this case, we are taking advantage of that headless mode. We're still using the 3D application for rendering, we're just doing it... I lost my train of thought. API, something something API, it's cool. Yeah, things and stuff. We're doing it programmatically instead of in the application. Using generateName with jobs — oh, please tell me you're fixed, because that would be awful. I've already forgotten.
Let me look at the jobs again, because I know it's gonna fail, but I don't want it to keep trying. Writing a job spec, pod template, pod selector, parallel execution, controlling parallelism, handling pod failures. Okay — with backoff, if a job fails, I have some amount of retries. Okay, I want restart policy. Oh, I don't wanna restart. restartPolicy equals Never. All right, where is restart policy... oh, perfect, you already have it. Yeah, but I don't understand why it kept trying. What's the backoff limit, then? Oh, that's the number of retries before it's failed. Yeah, so I just changed that to one. That's fine, so we're good here. All right, so that was cool. So now we have to figure out how to do the config map, because we need the script in there. Clean up finished jobs, TTL, job patterns, template, advanced usage, pod selector, alternative replication controller, single job starts controller pod. I just want a pod. Workload, pod, here we go. What is a pod using a pod? I want config map. How do I attach a config map? Yeah, there's actually a question from chat about setting up animations — a comment, I guess. But yeah, there's a GitHub repo that actually has a ton of example Python scripts that look neat. Okay, well, maybe we'll try one in a moment. All right, volume, volumes. Okay, here's the volume. So we're gonna be — off to the Kubernetes docs. This is gonna be interesting. Can we do a volume on a job? Well, we're gonna find out the hard way. Blender data, created from keys. oc describe configmap blender-data — we have a key called script. We want to put that... oh, there is no path. I guess that's also script. Okay, then we wanna attach that into the container as a volume mount. So that's gonna go in line with the container name. Oh, this is the template for the pod. Hang on a second — spec, containers. Spec, containers.
Okay, so this has to actually go here, and it needs to be indented to the level of the containers. There we go. All right, let's just try this for now and see if it works or explodes. So what I'm doing is, let's see — blender, create job. We are going to create a job. So I want the job that runs Blender, but I want the config map to actually exist as a volume. We're not mounting it anywhere, which is fine — I just wanna see if it doesn't blow up. And sure enough, it didn't blow up. We got a generated name for the job, which is cool. One error is expected. If we look at this, the log's probably gonna be the same — "unable to open display" — because it's not working right, which is cool. I can't get a terminal because that pod is gone. I thought there was a way to launch a debug pod at one point in time, but that's okay. We're good. So we'll delete you. All right, so we want to mount that config map, which has the script, into opt blender data python. Okay, volumeMounts is at the same level as containers — no, it's at the same level as name. Volumes is at the level of containers. So, job.yaml: volumes is at the level of containers, but volumeMounts is here. Oh, wow. Auto-complete for Kubernetes YAML is pretty rad. Name: blender script. No — yes, sure, fine. Mount path: opt blender data python. Ooh, that's not what I want. Sorry — we have a specific file that we want from the config map; we don't want to mount the whole config map. Using config maps as files, mounting config maps... it's funny, I've looked this specific thing up like 52 bajillion times and I never remember how to do it. Creates two files. I don't want two files — there's an array, and the items. Oh, this might work. Yeah, okay, never mind. This might work. No, it's not going to work, I lied. I need to change the config map. Let's see: delete configmap blender-data.
I want to call the key script.py, which probably won't work, but we'll try it anyway. Okay, that did work. So the key is the file name. So if I end up doing script.py, what I will get is script.py in that Python folder, fingers crossed, read only true. All right, let's try to create this job and we'll see what happens. Open shift, create job. Okay, so in theory, this should mount into opt blender data Python script.py from the blender data config map, which has our little square render doodad, not found blender script for field. Oh, this is the name of the volume mount, which has to refer to the config map, which is config. Hey, hey, look at that. Did a thing, container creating logs. Ah, ah, oh, wow. Look at that. Okay, so does anybody know ahead of time what's going to go wrong here? Or what the problem is? It's a very simple problem. Well, it's last. Yeah, no, none of you? Okay, so we're doing the thing and we're writing output into an ephemeral container. So I actually have to now find the volume that we created to store the data and then put the data in there. Although I just realized that there's actually another problem here, which is going to be really funny, but we can probably fix that with a really, really horrifically ugly hack. So let's go back to, yeah, so it did its thing and then it finished, oh, under quit, output, attribute render scene. Oh, that's not good. I'm not sure that that's a problem. So here's our script. It's probably because it doesn't have a blend file. That's why. So we just set the scene or something like that. Essentially, because you load a blend file in, it comes built in with all the settings for render. Yeah, got it. So we would have to manually tell it a few things, I guess. Okay, but it looks like it created the image anyway. It worked, right? Right. It's just, I think, when it's running the script, it's like, yeah, I don't know what you're talking about. Cool. Oh, why is it still pending? Oh, shit. 
Probably because it exited. No, no, this is something different. Oh, okay. I was gonna say, if it exits with an error, let me. We may be done for today. So I've made a claim for storage, but it's still in pending status. It's not being filled. So there's no GP, like this is a problem with the underlying OpenShift environment. Let me look here. Someone has to go in and approve it or something? No, it shouldn't need to be approved. It should just work. The question is why it's not just working. Oh, it's pending because it hasn't been tried to be used yet, I bet is what it is. I created a claim, but I didn't try and use it anywhere. So maybe that's the problem, but we'll see. All right, so how do I attach persistent volumes into persistent volumes? That's what I want. Binding, user creates has already created a claim, control loop, binds them, and hopefully that's, hopefully it'll work, delete, recycle. All right, volume mounts, volumes. Oh, that's a recycler. Reserving a persistent volume, I did that. Expanding, claims, volume expansion. I want to use the stupid thing, GPV capacity, my gosh. Why is documentation universally so bad? I want to use claims as volumes. Okay, there we go. Job.yaml, okay, so what we want is in the volumes, we need to specify a volume. So this is going to be name, output, persistent volume, claim, claim, name. What is my claim? Name, blender output, okay? And then we need to mount it. So in the volume mounts, now we have name, blender output, mount path, volume mounts, mount path name, opt, blender, data, output. Go back to our job. We will delete this job because we don't care, but it did work. Yeah. Create a new job and we'll go here. Now what we may find is that if it does the same thing again, so it may be generating the image but not storing it and that may be the problem. That's something, I'll try to get the Python ready for you. Well, we're getting close to the time here, so. Yeah, and I may have to... Blender output, not found. 
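Assembled, the Job pieced together across these steps looks roughly like the manifest below: the config map mounted as a file under the Python directory, and the PVC mounted at the output directory. The registry org is a placeholder (the real name wasn't captured on stream), and the command arguments mirror the local run; treat this as a sketch, not the exact YAML used.

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  generateName: blend-
spec:
  backoffLimit: 1          # one retry before the job is marked failed
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: blender
          image: quay.io/REGISTRY_ORG/blender-remote:latest  # org is a placeholder
          command:
            - /opt/blender/blender
            - "-b"
            - "-P"
            - /opt/blender_data/python/script.py
            - "-o"
            - /opt/blender_data/output/image_
            - "-f"
            - "1"
          volumeMounts:
            - name: blender-script
              mountPath: /opt/blender_data/python
              readOnly: true
            - name: blender-output
              mountPath: /opt/blender_data/output
      volumes:
        - name: blender-script
          configMap:
            name: blender-data
            items:            # rename the key to the file name Blender expects
              - key: script.py
                path: script.py
        - name: blender-output
          persistentVolumeClaim:
            claimName: blender-output
```

The `items` list is the "array" discovered above: without it, every key in the config map becomes a file in the mount directory; with it, you pick exactly which keys appear and what they're named.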
Dad duty soon here too, so. Oh, it's just called the output, that's annoying. So let's see. All right, it worked. Yeah, okay, I think it's not gonna work. Oh, it's creating, that's good. This is gonna hopefully start doing its thing in a moment. All right, so that's working. If we look at persistent volume claims, we see that it's bound to an actual persistent volume. So the trouble here is that I have no way to get the data out of that volume. Oh, right. But we need something to provide access to it. You would need something to serve that, yeah, okay. Well, this is cool though. For me, this is a start, because what I wanna try to work on, and maybe, I mean, we can talk about this later, but is trying to figure out how to do a render farm, you know, in some relatively efficient way. Yeah, I'd love to be able to like, throw a Blender foundation, you know, render farm operator or something in the Red Hat Container Catalog, right? If you're like a Red Hat subscriber, just go grab that and toss it up there, right, on your AWS or Azure, whatever public cloud you're using. Throw it on OpenShift and it'll just throw files at it. So we still got the error, but it claims that it saved the thing. So what can we do that's a... Coopie, go ahead. In OC, can you go like SSH into the pod? Pod? No, I would need a pod that has that volume attached to it, which I can do, but the other challenge here is that these are read, write, once volumes, and so, but I can create it like an Apache serving pod, right? The question is how, what's to do? I mean, maybe Eric, you know, maybe we could do another one where we do... Another stream. Yeah, and I'll try to... Yeah, and so probably... Yeah, yeah, yeah, and so probably what we would wanna do is just shove the output into S3, which would be easy, so somebody said SFTP and that would work if I had some place to SFTP into. So right now there's an Amazon EBS volume. 
If I had access to the Amazon console, I could probably dig into that volume or whatever, but the easier thing for us to do would probably be — I don't know if the Python that Blender embeds has S3 support, but we'd have to figure out how to get S3 into the Python script, and then basically write the file somewhere and shove it into S3 using the provided credentials. Yeah. Yeah, I'd love to do another round of this. Yeah, totally, this would be a fun Friday thing. Yeah, and I just linked a Python script there that I put together for animation. I don't know if we wanna try messing around with it or not. Yeah, I can run it locally. And then I'm gonna have to go, guys — I gotta go pick kids up from preschool. Okay, man, we're gonna fire this off. Thanks, Luke, so much for your time today, and we'll do this for giggles and see what happens. All right, cool, thanks. Cheers, man. All right, so here's... full disclaimer, I have not run this locally yet, so I'm curious. Okay, let's see. Oh, it needs the dash dash render anim, though. Well, we'll see what happens if I don't put that. You know what, I can just change this — oh, I can override the command, never mind. So let's do this. So we're gonna run this, and then we're gonna run opt blender blender... I don't even know if that argument's supported anymore. What, render anim? The render anim, yeah. We'll see. Opt blender blender dash b dash P opt blender data python script.py, render anim. Oh, is it piping it into FFmpeg or something? No, that's the second command you need to run, to assemble the animation once you have all the frames together. Got it. Line 75... look at that. Now, that's copy. For some annoying reason, when you copy all with Chrome, I get cruft — only in GitHub raw, it's really weird, I don't understand why it does that. All right: NameError, utils is not defined. Unable to open a display. Oh, where the heck did I leave utils in there? It's right here.
Right at the top — that's easy to replace. So what do we need to fix? One second. Oh, you're fixing it. I'll give you the file, yeah. Got it. Eventual consistency. Yeah. "Post it to an available entry point when created." I'm not sure what Perkinsa means about posting it to an available entry point. Although if we had an app where we could post the output to — just a simple HTML receiver kind of thing — that would also work. But since we're already barfing out the file, we might as well just put it into... well, it would be cool. You can't do it with Amazon because it's block storage. It would be cool if we could just have an Apache container where it dumped the output in the volume into the Apache container, and then you could pull it down immediately, but that would require some more ingenuity. Did you fix it yet? One second. Updated. So we've updated the script. Utils is still there. Shouldn't be. Yeah, I might have to update it on the gist. Yeah, it reverses it. Yeah, that's probably why. Utils is still here: current object, utils create plane. And it's easy to fix as well. Fixing the things, doing the stuff. Just waiting for your gist's eventual consistency. Come on, baby. Still seeing utils. Yeah, one second. Oh, sorry. Copying lots of 00.00.00. Yeah, I bet. All right, give that a shot. All right: primitive plane. Let's see what we got here. Zero, G, paste, max, remove, okay. Set smooth shading is not defined. Let's see, how do we tell Blender to set smooth shading? Can we just ignore smooth shading? You can — delete that. So just delete it? Yes. And the subdivision surface modifier looks like it's potentially gonna fail too, but we'll give it a try anyway. Yep, not defined. So let's comment that out for giggles, see if it works. Mesh has no attribute location. Yeah, this is a bummer. I think we should abandon ship, unless you're super passionate about making this fancy Python script work.
I mean, we'll probably get working for next show or whatever. Yeah, but you have a container image that you can use. Absolutely. It is out there and it works and it's based on RHEL. Look at that, super cool. All right, well, I guess that's a good stopping point for today. Thank you, Derek, for joining. Yeah, this was super cool. I appreciate everybody's patience and all the viewers online for sticking with us throughout this thing. And we will let you know in the future when we're gonna do this again. Thanks so much, have a great weekend for those who are still watching and for those who might be watching this later, we hope you enjoyed. Talk to you later, cheers. Cheers.