OK, so welcome to Dillon Goo Studios' custom technology talk. We're an animation studio that focuses on anime, NPR-style work. We post our stuff on YouTube at youtube.com/dillongoo, and the studio runs fully remotely, all around the world. Today we're going to be talking about the custom build of Blender that we've made, as well as how we run our fully remote studio and our pipeline.

An example of our work is something like this: a fully 3D scene made specifically to replicate the look of Makoto Shinkai, who is of course an amazing anime director, trying to get that 2D style looking as convincing as possible with as much 3D fidelity as possible. This is another example of what we can do. It's the project we're working on right now, and it hasn't been released yet, so, spoilers. This is fully 3D; there's a little bit of Grease Pencil, but most of it is done dynamically through shading, with a lot of shaders for effects as well. Generally speaking, a lot of this is only possible because of the amazing tech we've been developing in-house. Big thanks to this guy, our main developer, LateAsUsual. He's going to go through some of the custom features we have, ranging from light groups and custom nodes to some little rendering tricks. So take it away.

Yep, hello. I'm Chris, although most people know me as LateAsUsual, which is not the best username. I'm here to talk about the custom Blender build we've been using for the last maybe two years. Basically, we kept running into things Blender couldn't do, and instead of switching to something else, we decided to just take the source code; it's open source, of course, so we can do whatever we want with it.

I'll start by running through some of the features we've got. We've added a few custom nodes to Eevee, plus some extra Eevee features like light groups and the ability to detect whether the shadows on an object are cast by the object itself or by other objects. That sounds a bit confusing, so I'll explain it more later; I'll just go through everything one by one.

The first thing we added was light groups. You may know that Cycles has light groups; we added ours before that was a thing, it's Eevee-only, and it works somewhat differently from the Cycles version. In the example here, the face and the body have separate light groups. It's basically a tag-based system: you add the group to every light you want included in it, and add the same group in the material. You can also toggle shadows individually per light group, so some lights can cast no shadows while others cast only shadows. And that's basically it; there's not much more to it. The way it's implemented, there are 128 possible groups, which are just stored in a parameter that's passed to the shaders. It's a really straightforward, simple thing, which I haven't added to vanilla Blender yet because I haven't got around to it, and that's going to be a recurring theme: a lot of the stuff in this talk is quite hacky, and a lot of it will probably never make it into the main Blender build. But the source code is public, so if you ever want to have a look and try things out yourself, you can just download it and run it. I'll talk a bit more about that later.
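Since the talk doesn't show the implementation itself, here's a minimal sketch of that tag-based idea, assuming the 128 groups boil down to a bitmask that both lights and materials carry. The names and the Python itself are illustrative, not Goo Engine's actual code:

```python
LIGHT_GROUP_COUNT = 128

def group_mask(*group_indices: int) -> int:
    """Build a bitmask from group indices (0..127). In the real build this
    would be packed into a shader parameter, e.g. four 32-bit words."""
    mask = 0
    for i in group_indices:
        assert 0 <= i < LIGHT_GROUP_COUNT
        mask |= 1 << i
    return mask

def light_affects_material(light_mask: int, material_mask: int) -> bool:
    """A light contributes to a surface if they share at least one group."""
    return (light_mask & material_mask) != 0

# Hypothetical example: the face material only receives the face lights.
face_light = group_mask(0)     # group 0: "face"
body_light = group_mask(1)     # group 1: "body"
face_material = group_mask(0)

assert light_affects_material(face_light, face_material)
assert not light_affects_material(body_light, face_material)
```

The per-group shadow toggles described above would just be a second mask tested the same way.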
So here's an example of just the light groups. Each of these cubes has a different light group attached to it, and each of the lights has a different light group as well, which is really handy for lighting characters and environments separately without all of your lighting interfering with everything else. You might ask why we didn't use compositing for this; the answer is that we didn't really know how to use the compositor. That's the truth.

Next up: if you do NPR work, you probably know the Shader to RGB node, which is the main tool people use. You chuck in a Diffuse shader, take its output, and run it through color ramps and so on. We needed a bit more control, though, so we added a custom node called Shader Info. Again, not a great name, but it's essentially a deconstruction of the Diffuse shader; this is what the plain Diffuse shader looks like, just for reference. The node gives you separate passes. One is just the lighting, without any shadows and without any lighting from the world. One is the shadows that are only cast by other objects and not by the object on itself; you'll see the top cube is entirely flat, because it doesn't cast shadows on itself. Then there's a separate shadow output for only the shadows an object casts onto itself, ignoring other objects entirely, which is really important if you're doing face shading on a character and need the shadows from the nose or the hair to be separate. And lastly, you can get the world lighting information on its own.

It supports a normal input, same as a regular shader, and there's also a world position input, which controls where the shadows are sampled from in the scene. Here's a regular scene with a cube casting a shadow, and you can offset the shadow position anywhere you like, or add noise to the shadow to get distortion effects like this. It's a bit tricky to use; if you offset the position too far, the entire surface can turn black, because the whole thing is considered to be under the floor. But it's useful for things like grass textures, where you don't have a huge amount of color information to work with, all you've really got is light and shadow, and being able to define the edges of your shadows in the grass is really useful.

So that's basically that. Just to run through it again: you get all of your separate outputs, and you can do lots of fun stuff with them when you're doing character shading. It means you don't have to go through that whole Diffuse-into-Shader-to-RGB pipeline.
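To make the decomposition concrete, here's a rough sketch in plain Python of how those separate outputs might be recombined for toon shading. The thresholds and the combination logic are made up for illustration; they're not the node's internals, just the kind of thing the separate passes enable:

```python
def toon_ramp(x: float, threshold: float = 0.5) -> float:
    """Hard two-tone ramp: the NPR stand-in for a smooth falloff."""
    return 1.0 if x >= threshold else 0.0

def shade(light: float, self_shadow: float, cast_shadow: float,
          world: float) -> float:
    """Combine per-pixel Shader Info outputs (all values illustrative)."""
    lit = toon_ramp(light)
    # Self shadows (nose, hair) and shadows cast by other objects can get
    # different thresholds, which is the whole point of splitting them.
    lit *= toon_ramp(self_shadow, threshold=0.5)
    lit *= toon_ramp(cast_shadow, threshold=0.3)
    # World lighting comes back in as a flat ambient term.
    return lit + 0.1 * world

print(shade(light=0.8, self_shadow=1.0, cast_shadow=0.2, world=1.0))  # 0.1
```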
The next thing is the Screen Space Info node. If you've done anything with refraction shaders, you'll know you can do some fun things with them, but they're a bit tricky to use. This is again a deconstruction of an existing Eevee concept: you input a position in camera space, and it gives you the color of the scene and the depth texture of the scene at that position, which you can use to build custom implementations of refraction-type effects.

Here's a shot from a short we released about a year ago, I think, using some custom refraction stuff. It can be a bit cleaner than a physically accurate refraction shader, and you can use it for lots of interesting effects. This example is a simplified refraction where you just scale the camera-space coordinates down a bit, so you get a sort of zoomed-in view on this plane in the middle. It's also getting the depth output, which is useful if you want to do really funky stuff, which I'll explain in a second.

For example, you can use it for custom smears. There's no geometry being deformed here; it's just a plane in front of these two cubes with this shader applied, adding some noise to the position. We use this constantly for fight scenes and the like, where it would be really difficult to rig a character with dynamic smear effects built into the rig, so it's easier to just paste this on top. It's another thing you'd probably do in After Effects or some other compositing package, and we just didn't want to. And it's actually really fun to play with: you can see it and keyframe it in real time in the Blender viewport, which is really handy.

Here are a couple of other little things I made to show it off. You couldn't really do this with a Refraction BSDF; it would be a nightmare to work with. This one is a bunch of distortion textures layered on top of each other to make this slicing effect. And here I've run Cell Fracture on a plane, stored the object position in a UV map, and when the plane distorts it takes that object position with it. When you convert that to camera space, you get this effect where it peels away parts of the background and carries them along. Again, nothing but the plane is being deformed, so you can apply this to any scene and it'll work in basically anything, as long as there are no other refraction shaders underneath. That is one limitation: you can't overlay multiple of these shaders on top of each other, because there's only one refraction pass, so it just renders whatever's on the top layer.

Here's another example, something I made recently. Using that depth texture output, you can re-project the scene back onto itself and basically have one shader that covers the entire scene, on top of all of your existing shaders, essentially for free. No geometry is being moved around here at all; there's just a plane in front of the camera, and the rest of the scene is completely unchanged. It's pretty cool looking, but it can be a little impractical: you can't use alpha blending or other screen-space effects within it, and it messes up things like ambient occlusion. Still, there's a lot of fun to be had with it. Don't ask me to explain how this works; you can talk to me later if you want to know more. I think I actually uploaded the nodes for this when I posted it, back before I'd done this talk and nobody knew what those custom nodes were. So now you know; there you go.
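As a rough illustration of what all of these effects share, here's the offset-sampling idea in plain numpy, standing in for what the shader does per pixel with the node's color output. This is an assumed simplification, not the node's source:

```python
import numpy as np

def distort(scene_color: np.ndarray, strength: float = 0.05,
            seed: int = 0) -> np.ndarray:
    """Smear-style warp: sample the scene color at noise-offset positions."""
    h, w, _ = scene_color.shape
    offsets = np.random.default_rng(seed).uniform(-strength, strength,
                                                  size=(h, w, 2))
    ys, xs = np.mgrid[0:h, 0:w]
    xs = np.clip(xs + offsets[..., 0] * w, 0, w - 1).astype(int)
    ys = np.clip(ys + offsets[..., 1] * h, 0, h - 1).astype(int)
    return scene_color[ys, xs]

def zoom(scene_color: np.ndarray, factor: float = 0.8) -> np.ndarray:
    """'Simplified refraction': scale sample coordinates toward the center."""
    h, w, _ = scene_color.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xs = np.clip((xs - w / 2) * factor + w / 2, 0, w - 1).astype(int)
    ys = np.clip((ys - h / 2) * factor + h / 2, 0, h - 1).astype(int)
    return scene_color[ys, xs]
```

Layering several such warps, or driving the offsets from a texture instead of random noise, gives the slicing and peeling effects shown above.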
Then the next thing up is curvature, which is another really handy feature, particularly for anime styles. There are lots of examples here where curvature matters, like the rim lighting and curvature on this post box scene, or on the railing up here.

This is basically the same as the curvature shading you get in the Workbench solid-view rendering, but with one difficulty: Eevee doesn't have a normal pass, only a depth output, which makes it a little less accurate and a little noisier. It also has some quirks, like not really working with smooth shading; everything looks flat-shaded to the curvature shader. For our use cases that's fine, because we mostly use it for rim lighting on round things, and on sharp things we want the flat shading anyway.

Here's an example: it does curvature on outside edges, cavity on inside edges, and it also has a separate rim output. The scene curvature output is signed, so it's either positive or negative, and if you want the cavity output you just multiply it by negative one. The rim lighting is a sort of edge-detection thing, which you can use for quick-and-dirty line art. It's mostly an effect our art team uses; I don't use it that often because I don't really know how to do that kind of art. Here's the rim lighting on the right, and the regular curvature on that post box scene from earlier on the left. There's not much more to it than that. The inputs should be fairly self-explanatory: Samples is how accurate it is, at the cost of performance; Radius is how wide the effect is; and Thickness, for the rim lighting, is how far away things behind an object need to be to trigger the rim effect.

There's one more thing, which is the Set Depth node. If you've used the "In Front" checkbox on objects, which renders an object in front of all other objects, this is basically the same idea, except it lets you change the depth to anything you want. It's similar to what game engines call pixel depth offset; I think Unreal and Unity both have that. Here we're just using it for these eyebrows: you chuck your shader in before the output and feed in a custom view depth. Here's an example on a cube; it looks like it's being moved towards and away from the camera. You could do that with geometry nodes as well, but this is an entirely shader-side solution, so you can render things further back or further forward without moving any geometry or adding extra modifiers. Mainly it's used for things like eyebrows on characters, but you can use it for some VFX as well, spooky portal effects and that kind of thing.
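Conceptually, the node swaps out the value used in the depth test. A tiny sketch of that, with made-up numbers rather than Eevee's actual depth pass:

```python
from typing import Optional

def depth_test(fragment_depth: float, buffer_depth: float,
               override_depth: Optional[float] = None) -> bool:
    """Return True if the fragment is drawn (i.e. closer than the buffer).
    Set Depth substitutes its own value for the geometric depth."""
    depth = override_depth if override_depth is not None else fragment_depth
    return depth < buffer_depth

# The eyebrow case: the geometry sits behind the hair (2.0 vs 1.5), but an
# overridden depth of 1.0 wins the depth test without moving any vertices.
assert not depth_test(fragment_depth=2.0, buffer_depth=1.5)
assert depth_test(fragment_depth=2.0, buffer_depth=1.5, override_depth=1.0)
```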
So that's the Goo Engine build. There are way more features than that in it, but those are the visually interesting ones; there's also a bunch of extra tweaks and UI changes that our team requested. A few things that were originally in it have since been ported to the main Blender release. For example, the Volume Cube geometry node, which I think was quite popular, was originally made to do some gooey effects like this, but it got moved into main Blender because I thought it might be handy and I had time to properly write out the code and make it actually usable.

The other thing to mention is that the Goo Engine version is not an LTS release. We keep it basically up to date with master as often as possible, because we always need the new features more than we need the stability of keeping the old ones around. That does cause problems in our pipeline sometimes, but we generally work on quite small productions that only take three to four months at a time, so we can update whenever we need to. It does mean the commit history on the source code is very, very messy, so please don't read any of my commit messages; they're all a bit of a mess. So yeah, that's basically it for the custom features. I'll hand it back to Dillon to talk a bit more about pipeline things now.

As you guys can see, he's very talented and we would be lost without him. So that's most of the Goo Engine stuff. We've obviously got a lot of anime NPR research going on, and if you want to know more about the work we've actually been outputting, I have another talk tomorrow in the theater that highlights that, so you can see some examples there as well. But in the meantime, let's talk about our favorite thing: pipeline, yay.

If you don't know, pipeline is a bit of a nightmare for most studios. It's a lot of organization, especially working between different files, different people, different locations. We're a fully remote studio with people all over the world; in fact, I don't think we have a single group of people in the same city. So that's fun. I actually met him for the first time today.

The biggest question we get asked, of course, is what file syncing software we use. A lot of people talk about Google Drive, Dropbox, and a couple of other big ones. The biggest limitation we found with those is storage space, which you have to pay for, and we're a small studio, so we don't like doing that. We use Resilio Sync, which is a P2P service with no file limits. Because it's P2P, it's fully local: there are no servers, just files syncing directly between people's computers. Everything gets updated locally for each person, and every change they make gets uploaded to everyone else. It's a fairly hacky solution, but it actually works quite well. The one gotcha is making sure two people aren't working in the same file at the same time, because that will overwrite changes. That's the only issue; everything else is pretty much flawless. It also works really well with our library override workflow, which is on a different slide; we'll go over that in a bit.

Anyway, shot tracker stuff; I'm sure you're curious about that. This is something a lot of studios use Shotgun for. I've used Shotgun once and never wanted to again, so it's not even on this list. A quick overview if you haven't worked in a studio before: a shot tracker is basically a spreadsheet, that's really all it is, a spreadsheet with status updates. You can track how far the project is from finished, track who's finished which shots, and assign shots to people.
Shot trackers are pretty important once you start working on a team, and getting that organization down was one of my first priorities. The first thing I used was Google Sheets. It's free, it's a website, and it's very customizable, and we got a lot going with it for, I think, two or three years. It got by; there was a lot of redundancy and it was messy, but it worked. We even had integrations built into our add-ons that let Blender communicate with Google Sheets directly through the API, which is something I loved about it. But I kept thinking: one day I'm going to make this pretty. So I looked into Notion, a documentation tool that gave us a bit more customizability in how we do things, but Notion didn't have a good API for Blender, so we dropped that pretty fast. And then we found Coda. A friend of mine described Coda as Google Sheets on crack, and that's pretty much all I have to say about that. But really, it combines the documentation features of Notion with full customizability over how the UI looks, progress bars and so on, and its tables are directly referenceable through the API from Blender. That's been huge. We've been using it for a couple of months now and it's been very, very good. So: definitely recommend Coda, and definitely do not recommend Shotgun.

All right, now something I'm actually quite proud of: the synced timeline with continuous integration tool. I'm still working on the name. One thing we found working across so many projects is that as people updated their files, whenever an animator finished a shot, they'd playblast it and send it over, and we'd end up with a bunch of shots and no idea how they looked in context. We'd have to spend time manually importing the files into a timeline to check that everything matched up. That workflow was fine, and we certainly did it, but I'm lazy, and as a lazy person I spent even more time building an automated solution.

So we have this synced timeline with continuous integration workflow, built on Blender's VSE, the Video Sequence Editor, which I think is godlike. It's amazing; you should definitely use it. I've actually stopped using Sony Vegas. Thank the Blender gods. It's incredibly powerful, specifically in being able to generate essentially empty strips. I use a script to generate the timeline with empty strips that point directly at a specific folder structure. That structure is predefined as part of our pipeline organization; it has to be rigid, but it means we know exactly where each published file will end up, so it can be referenced. Every video that gets playblasted gets published onto the server through sync.
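A minimal sketch of what such a generator can look like with Blender's Python API, to be run inside Blender. The folder layout, layer names, and channel order here are hypothetical stand-ins for the studio's actual pipeline structure, and unlike the real tool, this simply skips layers that haven't been published yet instead of creating placeholder strips:

```python
import os
import bpy

LAYERS = ["layout", "animation", "effects", "final_render", "comp"]
PROJECT_ROOT = "/project/shots"  # e.g. /project/shots/sh010/animation.mp4

scene = bpy.context.scene
if scene.sequence_editor is None:
    scene.sequence_editor_create()
sequences = scene.sequence_editor.sequences

frame = 1
for shot in sorted(os.listdir(PROJECT_ROOT)):
    shot_length = 0
    for channel, layer in enumerate(LAYERS, start=1):
        path = os.path.join(PROJECT_ROOT, shot, f"{layer}.mp4")
        if not os.path.exists(path):
            continue  # this stage hasn't been published yet
        # Higher channels cover lower ones, so the newest stage wins.
        strip = sequences.new_movie(name=f"{shot}_{layer}", filepath=path,
                                    channel=channel, frame_start=frame)
        shot_length = max(shot_length, strip.frame_final_duration)
    frame += shot_length or 1
```

Since every artist's published playblasts land in the same synced folder structure, re-running or refreshing this is all it takes to rebuild the current cut.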
As files get updated, we can open this VSE file and just refresh it, and it'll pull in every single update automatically. So we get an up-to-date version of the project at a glance without even breaking a sweat. As you can see here, we have a few layers (I'm pointing at my own screen for some reason): Layout, Animation, Effects, Final Render, Comp, plus some shot labels, which are very important. This is all automatically generated, and as each layer gets filled, it covers up the layer beneath it, so you always see the most up-to-date version, and you can hide layers if you want to see the older stages.

Then we have library overrides. Everything so far runs over sync, as you can tell, including the continuous-whatever-it's-called timeline, because it gets updated automatically even on someone else's computer. On top of that we use library overrides, which are basically Blender's version of referencing, if you've used Maya (I'm sorry). Library overrides let one Blender file reference data from another, and we use that for character rigs and environments, which is a big use case for our studio. The rigs and environments live in their own source files and get referenced into the actual animation files. That helps because if a character rig isn't done yet, we can update it later and it propagates through all the files, and everything's fine.

But we also use it for something else, a more recent thing, which is actually pretty amazing: we've built a finaling and rendering workflow that library-overrides the animation files themselves. We have an additional override layer where the animation files are linked into another file where we work on the effects. When animators update their animation, the effects don't have to be redone or re-synced; everything just matches the updated animation work. That has paid off immensely as we work with several people around the world. We also have a finaling file that just sets up our render layers, and that doesn't have to be touched every time someone makes an update either. So we have this layered redundancy that lets everyone work and update things at their own pace without destroying anyone else's work, which has been hugely helpful, and is probably the reason why I got some sleep last night.
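For reference, the basic link-and-override step that this workflow automates looks roughly like this in the Python API. The paths and collection names are hypothetical, and the exact override calls have shifted between Blender 3.x versions, so treat this as a sketch:

```python
import bpy

ASSET = "//assets/characters/hero.blend"   # hypothetical source file

# Link the character collection from the asset file without copying it.
with bpy.data.libraries.load(ASSET, link=True) as (data_from, data_to):
    data_to.collections = [name for name in data_from.collections
                           if name == "CH-hero"]  # hypothetical name

for coll in data_to.collections:
    # Instance the linked collection into the current scene...
    inst = bpy.data.objects.new(coll.name, None)
    inst.instance_type = 'COLLECTION'
    inst.instance_collection = coll
    bpy.context.scene.collection.objects.link(inst)
    # ...then make it locally editable via a library override, so animators
    # can pose the rig while the source file stays the single source of truth.
    inst.override_hierarchy_create(bpy.context.scene, bpy.context.view_layer)
```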
Then we have Render Check. Render Check started as a very crude thing I was working on, just a batch renderer. Obviously we render a lot, and we have a lot of files, especially with so many people, since you want to split every shot into a different file for distribution. With so many files and so many things to keep track of, I wanted a way to automate batch rendering and project status. And I made a really, really bad one, with Tkinter in Python, I think. It was bad; it looked like a Windows 95-era UI or something. But I got it done, it worked, and then I showed Late and said: hey, I've got this thing, do you think you can make it better?

And so he made this, which honestly I don't think we could live without, given how much value it provides for project organization. We have so many files; I think this project has something like 100 shots, I can't read it from here. The shots all get listed out automatically from the project folder, and it also scans the render files. Every rendered frame gets accounted for and put on a progress bar, so we know exactly which files have been rendered and which haven't, which is otherwise not very visible. There are a couple of other rendering solutions out there, but we found that this one, as a custom solution, gives us exactly what we need. It also has a lot of little conveniences: you can open the blend file from this window, open the folder to check on the renders themselves, and even open a video player, Blender's own player, which has frame-by-frame stepping, which is really cool. So that is Render Check.
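The core of that scan is simple enough to sketch: compare the frames on disk against each shot's expected frame range. The file naming and layout here are hypothetical, not the tool's actual conventions:

```python
import os

def render_progress(render_dir: str, frame_start: int,
                    frame_end: int) -> float:
    """Fraction of the expected frame range present on disk, assuming
    frame files end in a four-digit frame number (e.g. shot_0042.exr)."""
    expected = set(range(frame_start, frame_end + 1))
    rendered = set()
    for name in os.listdir(render_dir):
        stem, ext = os.path.splitext(name)
        if ext.lower() in {".png", ".exr"} and stem[-4:].isdigit():
            rendered.add(int(stem[-4:]))
    return len(expected & rendered) / len(expected)

# e.g. a shot with frames 1..100 rendered up to frame 61 reports 0.61
```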
And now we have an announcement. As you guys already know, Late mentioned it at the beginning, but officially, Goo Engine is going open source, and it's on GitHub right now, as of yesterday, I think. This is the first time we're letting other people try it out, and we have no idea what you're going to find that breaks before we do. You probably will, so let us know. Check out the GitHub if you want, and we also have the pre-compiled version on Patreon, if you want to support us so we can continue developing Goo Engine. We have a lot of things in the works and a lot of new features we want to add. I'd really appreciate it if you check it out; I think you'll like it, because I've been enjoying it for the past two years. Not every feature was there two years ago, but you know what I mean.

That's the biggest thing for now. If you have any questions, feel free to ask; we have a bit of extra time. But just to let you know again, I have a theater talk tomorrow that goes over a bit more of the NPR stuff, with some examples we've made, and I also have a classroom talk tomorrow about cameras, because I'm a director and an animator, but mostly a director. So, any questions?

[Question about the gaps in the screenshot.] Oh, yeah. That's simply because we had some re-renders, I think, and also changed some frame ranges for production while running on two hours of sleep. That's also probably true: we took this screenshot from our Discord server, and I'm pretty sure the message with it was "hey, this is broken, what's going on?" So it usually works better than that.

[Question about library override issues.] Yes, we have all the issues. Library overrides are a case of God giveth, God taketh away. They have so many advantages and are absolutely necessary for pipeline work, but they do cause a lot of crashes and long loading times. The biggest thing we've found that helps is turning off auto-resync for library overrides; it's in the experimental features. Turning it off prevents those really long file-open times, but you'll then have to manually resync some things, or you can just save after opening, maybe reopen with resync on, and it'll be fine. Lessons learned. There will be more, I'm sure, but that's the main reason we keep updating the build regularly: library overrides were being developed at the same time. Just to repeat that, we update the build instead of using the LTS almost specifically because of library overrides. They've had a lot of important bug fixes that we really need, so we put in the extra effort primarily for that, along with a couple of other features.

Question back there? [Question about difficulties with Mesh Trails.] Late can probably talk about that a bit more, but Mesh Trails works pretty well; I'd say it's great. I recently added Alembic support to Mesh Trails, which I'd forgotten to do before, so now it fits better into studio pipelines. It does have some quirks because it's an external add-on; for example, you can't use it with library-overridden meshes, because it needs to be able to edit the mesh data. But we've used Mesh Trails on basically every production since I made it. Yeah, every production. That clip you saw right at the start is another Mesh Trails showcase; you saw some trails being used in it. Mesh Trails was a fairly straightforward thing: we needed mesh trails, so we made the add-on, and it's worked fine ever since. That's all there is to it; it just works. Hopefully that answers your question.

[Question about whether these features will be merged into master.] I'd really love that, but that's a good question for Late. Well, the Screen Space Info node is about five lines of shader code, because it just exposes internal data that's already there, and that's the main reason I haven't put it into master: it's a hack that exposes internal data that might not always be there, and it obviously doesn't work in Cycles, because Cycles is a completely different engine. The same goes for the Shader Info node. Actually, with the new Eevee rewrite that's coming along, a lot of the shader code has been juggled around, which broke a lot of my code; it's since been fixed, in a bit of a hacky, roundabout way. And when the new Eevee arrives, which is coming sometime this year, maybe next year, you'd have to ask the Blender devs, it will probably break a lot of our custom stuff again, because the shading pipeline will be different. So chances are that node is going to go away at that point anyway. The source code should be pretty easy to port over if somebody wants to; I'm sure somebody will. One thing about our custom build is that we almost exclusively use Eevee; I think we actually turned Cycles off at one point because we didn't want it. Because of that, it's not super compatible with master, which is one of the reasons we have our own branch.

Yes, in the back? [Question about why Resilio Sync.] I think a lot of the answer to that question is money: some of the alternatives cost money, and I didn't want to pay for that. The other part is ease of setup, because Resilio Sync has a very low barrier to entry.
We have a lot of people who are not super technical who might want to join, and there's a lot of room for error in anything that takes three or four steps to lock or unlock a file. With Resilio Sync it's completely automatic: as long as it's hooked up, it just works. We just ask people to be a little careful and we're good to go. That's one of the reasons I like it, but honestly, it was also something I knew would work because I was familiar with it, so we just ran with it. No worries.

Yes? [Question about which upcoming Blender features we want.] Which features from the main branch do we want, is that what you're asking? Nothing pressing at the moment. Honestly, the geometry nodes work is pretty interesting; those releases usually have a lot of great updates, and they're absolutely worth pulling in, because we use geometry nodes quite a bit, especially for effects. So probably geometry nodes, and if there are any more library override updates, then definitely those. Cool. Any other questions?

Yes? [Question about versioning and backups.] Yeah, that's a great question. We keep local versions of those files on people's computers, in their user folders, and then we have a publish button that pushes a file out to the VSE timeline. The old versions all stay in people's working directories, which we can all reference because we all have access to them. So it's not that we have an automatic backup system per se, but the versions are there. If something's ready to go, you publish it; if it's something we don't really like, we can revert it, because the previous file is still there. Nothing gets overwritten. We've just never had a reason to add another layer of backups on top of that.

Cool, any other questions? All right, thank you very much. I'll see you guys tomorrow at the theater talk. He's amazing.