Oh, see, the inspirational music just ended. That means it's time to begin Logik Live. Greetings, everybody. Welcome, welcome, welcome. I'm so happy to see you all, see your smiling faces, or what I'm assuming are all smiling faces. Everybody, please make sure to set your chat settings to "all panelists and attendees" so that everybody can see what you're typing. And welcome to Logik Live. Let's get this thing going here. All right, Logik Live, as always, is brought to you by CineSys.io: solutions, development, integration and support. These guys have been my personal reseller for 15 years, and we could not do what we do at our company without them. For more information about their remote streaming solutions, please check out their website at CineSys.io. CineSys, supporting Flame artists since 1997. It has been a huge couple of weeks here in Logik Land. We launched the new forum at forum.logik.tv. I see Randy's here. A round of applause for Randy McEntee, the brains behind the operation. If you haven't had a chance to check out the new forum, please do. Just go to forum.logik.tv and sign up. We discussed it on the show last week; we had several hundred members. Randy, if you could give us some stats in the chat, that would be great. I think it was 40,000 page views as of last week or something like that. Oh, it's really 50,000? Wait a minute, I'm gonna play something for that. There you go. It was Randy, by the way, who taught me how to add sound effects to my Stream Deck. So, 50,001. There we go. I see a theme here. But the forum is going strong, and I'm really excited today. We're gonna be giving away one of the Logik phone chargers to a lucky entrant in the two-sentence tips category that Randy set up over there, which is a great collection of tips and tricks. We also launched a new One Frame of White for 2020. One Frame of White is back and better than ever.
So for those of you on the list here who aren't familiar with One Frame of White, and I can't imagine there are any of you, but just in case: One Frame of White is the greatest creative challenge of all time. Your challenge is to make the most amazing thing you can think of using only the tools in Flame, and the theme for this year is joy. Because dammit, I don't think we've ever needed something funny or something happy or something inspiring more than we do right now. Maximum length of entries is 30 seconds, and the contest is underway. Entries are due on September 30th at the end of my day, and we're gonna announce the winners on October 11th here on Logik Live. Everyone who enters receives a 30-day license of Sapphire OFX, which you can use in your project, and you can download a 30-day trial of Flame. Details on how to do that and everything else are at oneframeofwhite.com. And hey, let's talk about prizes for a minute. We've got a tremendous lineup of prizes this year. First place is a Dell Precision 7750 mobile workstation, courtesy of our friends at Dell, Intel and Nvidia. And this is an actual frame I pulled off of the Dell website. So that looks like Flame, and seeing it on someone's website other than my own or Autodesk's is a beautiful thing. This is a fully loaded workstation, and I'm looking forward to giving it away. I can't wait. Second place is a 12-month license of Flame, and third place is a 12-month license of Flare, courtesy of our friends at Autodesk. Fourth place is an Io 4K from AJA. Fifth place, a 12-month license of everything that Boris offers in the Boris FX suite. Sixth place is a $500 store credit to actionvfx.com, the absolute best stock footage on the internet. Seventh place is a set of AirPods Pro from our friends at CineSys. And eighth place is a $99 gift certificate to fxphd. Again, the contest is on now. There's still some time left. It's never too late.
So if you want to sign up, head on over to oneframeofwhite.com and fill out a registration form. All right, let me stop the share. And there we go. And let's invite Christoph to turn his camera on. Hello, Andy. Hey! Hi, how are you? You know what? It's always golden hour behind you. Yeah, you don't want to see the real background, so I thought I'd make Zoom do some cheap keying for me. So yeah, from an undisclosed location somewhere in Hamburg. Well, welcome back, man. You were on the fourth episode of Logik Live, showing us Nuke for Flame artists, and we're at episode 26. It's been six months that we've been doing this, and I just want to thank you so much for being a part of it in the beginning and helping to get this thing going. And welcome back. Thank you for getting the whole thing rolling. I mean, this is awesome. Amen. Well, let's dive right in, man. You're going to show us some beauty techniques in Flame, correct? Correct. So I'm going to start my screen sharing here. So now you should see my Flame. And the first thing I want to talk about is the A2 Beauty shader. I think everybody knows about it by now. And I've seen in the audience, Alex Oz is here, and about a year ago he gave me a really helpful rundown. So I'm going to try my best to repeat that for you guys in the audience, because A2 Beauty is a really powerful tool. So what A2 Beauty is trying to replicate, for those who don't know, is a procedural way of doing the dodge and burn effect. If you Google "dodge and burn", you'll find a plethora of tutorials that suspiciously all run a little bit over 10 minutes to game the YouTube ad system, and they tell you absolutely nothing about this except that you can dodge and burn in Photoshop. The technique originated in the photochemical process: it actually meant that you dodged or burned parts of your image by exposing them longer or shorter, or by hiding them behind a little flag.
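Stripped of the darkroom mechanics, dodging and burning is just a matte-localized exposure change: lighter for a dodge, darker for a burn. Here is a tiny NumPy illustration of that idea; the function and its parameters are my own hypothetical simplification, not any shader's actual code, and real work happens on log or linear plates with soft mattes:

```python
import numpy as np

img = np.full((4, 4), 0.5)                      # mid-grey stand-in plate
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0                            # in practice this would be a soft matte

def dodge_burn(image, matte, stops):
    # dodge (stops > 0) lightens, burn (stops < 0) darkens,
    # mimicking longer or shorter darkroom exposure in the matted area
    gain = 2.0 ** stops
    return image * (1.0 + matte * (gain - 1.0))

dodged = dodge_burn(img, mask, +1.0)            # one stop brighter inside the matte
burned = dodge_burn(img, mask, -1.0)            # one stop darker inside the matte
```

Outside the matte the plate is untouched; inside, the pixel is simply scaled by the exposure gain.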
So that was the original idea, and what we are seeing now is the procedural way of doing it in Flame. And A2 Beauty, as I said, is a Matchbox shader. So we're gonna pull that right up, and it's right up here at the top of the list. And I'm gonna zoom in here a little bit, get rid of my panel. So now you should all see that. And I'm gonna concentrate on her nose here. So if we switch to the result view, you can already see something happening, and the first thing I usually do when starting anything in A2 Beauty is dial the recovery amount back to zero. So really, I don't want any recovery happening, which means restoration of details. I wanna set up my fix first, which is this area here. So I'm gonna dial this up, and if you haven't got any ratio or angle set up, this will look at first glance like a normal blur. So I'm gonna set it up to somewhere like here. And then you can give this a ratio, something like that. You can pull it above one; I don't think a lot happens above one. So now, you see, it looks like a directional blur. We've seen that before. We can now change the angle. And as I said, I'm concentrating on the nose here. So if we go back to the source view, you can see I'm trying to find a setting where I lose all the spots here, and I especially picked that image because we've got some big dimples and we've got some small moles. So it's a nice mixture of different things we normally would have to tackle with front offsets or things like that. So let's see. Let's go back to the result view, bring this to somewhere like here. So now I'm losing all of that, and now I can start to bring back my detail. I want to do this very slowly. And now you see you're getting this pore structure back. We're not getting the zits back, but rather just a little bit of texture. And you can further refine that by changing the size. So this is the size of the restore. If you dial this down too low, this will look very strange, very pattern-like, so to speak.
And you can also change the search radius, which changes the area around each fix, how big that search area is. So let's set it to something like here. So now, if you imagine we masked this just for the nose, this would be a good start to give me a clean patch of that nose rather than the original nose. What we've also got here is the ability to bring back highlights and shadows. I would advise you to use these really, really carefully, because if you dial them up, you're restoring the original highlights. So you can see, I just turned this to a value of 0.27 and already I'm getting the highlights of the zits back onto the nose. If I crank this up, you see now I get all the highlights back; same with the shadows. And what I'm doing right now is basically what you never ever wanna do: crank up both to one, because then you're landing on your original image. You're restoring back all the highlights, all the shadows, and all of this is for nothing. So let's bring this back to a really, really reasonable amount, something like that, or leave it off altogether. And lastly, if you wanna see where the effect is taking place the most, you can switch on this dodge and burn preview, which basically is a difference key showing you where the effect happens the most. So this is the basic setup without any matte supplied. But we all learned from Andy's great course last week that we can use machine learning to generate ourselves some mattes. So I'm gonna skip over that part, because we all learned that. I'm just gonna duplicate my A2 Beauty shader, because it's got two matte inputs. So I'm gonna connect my source first, like that. The first matte input is for the matte itself, to localize the effect. So if we switch to that now, and switch off my preview, you see the effect is just happening inside the matte. I want you to pay close attention to the bottom of the nose here, because we get a little bit of an edge.
I don't know if you can see it with the compression, but this is not really a smooth transition, and there's a reason for that. Right now, A2 Beauty is looking everywhere in the picture for the detail search, to bring back the details, and that can lead to some hard edges. But we've got the second matte input, which is a custom detail search area. With that matte we can tell A2 Beauty to only look in those areas for details to restore. So if we hook that up here and switch on "use custom detail search", and I'm gonna zoom in really close here, you see we're getting a much softer result. This is not a blur on the matte or anything; this is really just constraining the detail search. And especially if you're tackling larger things, for example this crease here, a custom search can help quite a lot, because if you're working close to the eye, for example, you might get details of the eye itself back into the restore. So it's a really helpful thing that you can define this custom area. Another thing I want to say is that you really wanna use A2 Beauty locally. You want to apply a different setting to the nose, for example, than to the cheeks, at least if you're in this close, like in our example here. I know a shader like A2 Beauty lends itself to being used as a one-shot solution, but I think it really is more powerful if you think of it as: okay, I'll use this one just for the nose, and then I'll pull up another one and hook that up, oops, let me change that, for example for the cheeks, and give that its own unique settings. And I think that way you can achieve a much better, much more natural look. So that's working properly; we can go with very little here. Yeah, but that's basically the idea. And before we leave A2 Beauty, I want to introduce its cousin to you, because while developing A2 Beauty, oops, sorry, Autodesk also developed the Washer. So the Washer has got a different algorithm.
The idea is principally the same, but it's got a different algorithm. It's faster, it's maybe not as precise, but in some use cases I've found it quite helpful. This one has got just two inputs, so you can't define the search area in this one. And for those of you who have played around with Silhouette Paint, for example, this does something very similar to the blemish brush in Silhouette. So we've got the wash amount and the grain amount. So if we dial the grain back down to zero, no, no, to one, one is the minimum, okay. So the wash amount is the amount of blur, basically, and then with the grain we can reintroduce some of the original artifacts, and we can define a threshold for that. So this is not really the best use case, but I wanted to introduce it to you anyhow. Yeah, I want to echo what Randy just put in the chat. Like, I mean, I saw in the release notes that there was a Matchbox called the Washer, and then in my ignorance I just assumed it was like the paint tool and said, well, if I ever need it, I'll go and look that up. Where was this? Oh, my ignorance. I could have used this a million times in the last couple of months. And it's one of those tools, you'll pull this up, and I would say four out of five times you'll choose A2 Beauty because it works better. But there are these one or two shots, one or two instances, where it just doesn't do it, and this one works. So that's why I wanted to highlight that it's there. And by the way, the same goes for the shading: this does the exact same thing as the shading sliders in A2 Beauty, so it restores the highlights and the shadows back on top. So use them sparingly. These are really, really, mm-hmm. Yeah. That's great. And Christoph, thank you so much for not only opening my eyes and everybody's eyes, but the way you illustrated using A2 Beauty I thought was perfect. Thanks so much. Hey, tying in the ML stuff from last week was a beautiful, naturally flowing transition.
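As described, the Washer is essentially a blur with a thresholded grain restore on top. A rough NumPy sketch of that idea follows; this is my own guess at the behavior for illustration, not Autodesk's actual algorithm, and the parameter names and ranges do not match the shader's:

```python
import numpy as np

def box_blur(a):
    # simple 3x3 box blur with edge clamping, standing in for the wash blur
    p = np.pad(a, 1, mode="edge")
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def washer(img, wash=1.0, grain=0.0, threshold=0.0):
    # wash: how far toward the blurred plate we go
    # grain: how much original texture is reintroduced on top
    # threshold: only detail stronger than this counts as texture
    blurred = box_blur(img)
    washed = (1.0 - wash) * img + wash * blurred
    detail = img - blurred
    detail = np.where(np.abs(detail) > threshold, detail, 0.0)
    return washed + grain * detail

rng = np.random.default_rng(2)
img = rng.random((8, 8))
```

Note what this model predicts: a full wash plus a full grain restore with no threshold simply rebuilds the source, which is the same "both restore sliders at one lands you back on the original image" trap as the A2 Beauty shading controls.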
So thank you for always thinking about, you know. We're going to set this up as a multi-part series. So I'm going to. This would be like a year-long arc. Right now, on to the next episode. That's right. But I just want to give mad props to Alex for spearheading the development of this tool. It's just been like a gift to the whole community. So thank you, Alex. It's great work. And I use it every day. It's such a great tool. So thanks to Alex and everyone involved. Randy was just asking a question: does anybody know what makes this different from a median blur? Ivar? Ivar, are you there? Moving on. OK. Maybe Alex can chime in later. It behaves differently than a median blur, but I can't give you any technical specs or anything. But let's go to the GMask Tracer, or what I would like to call my ode to the GMask Tracer, because I feel this is such a powerful tool. And it got some bad rap when it was first introduced, because it was different. It was different from what we knew from the GMask. And what I also only realized over time is that this is not only technically based upon Action, but it is an Action optimized for making mattes. And if you know beauty work, you need a lot of mattes. And yeah, machine learning is all swell, but in a lot of cases you need your own dedicated mattes. And the first thing I want to show, because I've been asking for this for ages, is that since Mocha 2020.5 you can import shapes from Mocha into the GMask Tracer, which is so awesome. And I'm going to stop. Oops. Where is it? Where is my GMask? Wait. I just wanted to remind everyone. My mistake. I was just going to cover for you. I just wanted to remind everyone that if you do have a question, feel free to put it in the Q&A panel, just in case someone doesn't have the chat window open. That way we can ask them. So, just a quick sidestep into Mocha. I want to jump into Mocha here very quickly.
So this is something I like to do with a lot of my beauty shots if the tracking isn't too hard: just to prep it, basically track a lot of the planar surfaces I can find in a face. And even if the brief doesn't initially cover, for example, the forehead or something, once I get down to tracking a shot, I just like to do these as well and use them later down the road. And this has been quite helpful. So I'm thinking of Mocha more and more as kind of a prep tool: not just using it when the traditional trackers don't work, but rather running the shot through it like through a match-moving tool, generating those tracks, and utilizing them later down the road. And what's now really great is that you can select any of these and say "export shape data". And you see, now you've got a preset for the Flame Tracer, which is the GMask Tracer. And you can export either a baked shape, with the basic option, or with the animation moved to the axis, which is really great as well. So that's that. And I already loaded that up into my GMask Tracer, like here. And there it sits. Pretty neat. That's good. Another thing a lot of people don't know about the GMask Tracer, and that's what I find really cool, is that it's got an integrated keyer. And for beauty work it's really nice to have a localized keyer, because I know I'm only going to key inside my matte. So I can select that shape, go over here, and go to the Tracer down here in that menu. And switching to F8 to get my result view, and hitting it again to get the matte view, I can now set up these two boxes. Now the colors might look familiar to you, because the red is for the foreground and the green is for the background, as with other keying tools. So you just pull the green box over what you want removed and the red over what you want kept. So you define your foreground and your background, basically, like that. And the nice thing, what I really like about this kind of keying, is you can pull up as many of these as you like.
So if I need another background to get rid of that stuff up here, I'll just add it, bring it there, get rid of that. And I can also animate this. So if I find a frame, for example here, let's say for argument's sake that I want to get rid of this one as well, I can add another analysis here. So now I have to move my boxes, of course, like so, like so. And now Flame will interpolate between those frames. And this way I can set up a really good key, even for the little localized stuff. But I also use this for classic green screen keying, and it's so nice to have a separate dedicated keyer for each and every shape. It's basically turning the whole concept inside out: rather than starting off with one key and then using various shapes to bring in other keys, you're starting off with the shapes and setting up individual keys until they form a complete whole. I find this a really nice and intuitive way of keying, so I wanted to highlight that. I was so happy to see this introduced back in the GMask Tracer when the tool came out. I remember when the tracer boxes were first introduced, back in 2004 or whenever it was that they were brought into the modular keyer. And I mean, I remember the sample footage. It was like a lion, you know, big mane and everything. And it's really one of those underutilized or underappreciated tools in Flame, especially as you described, because you can set that stuff up. And it really, really behaves differently than the old tracer in the old GMask. It's a different algorithm altogether. So I encourage everybody to give this a shot. I think it's great. Christoph, there was one question in the chat, maybe just a clarification, about the mask you exported from Mocha into the GMask Tracer. Are there options for axis or shape animation? I think that's basically the one that I highlighted. So you've got the basic. Let's try this out. I haven't tried it, so I'm winging it now.
But hey, it's Logik Live, exactly. So we're going to export the basic shape. That's the left eyebrow. We're going to save that, and I'm going to name this "basic". All right, gotcha. So this is my shape. Sorry, my keyboard switched to German for some reason. So now we're back. Yes, great. OK, so this is the shape we just had, and we've got the animation on the axis. This is what it looks like in the channel editor: position, rotation, scaling, I think shearing as well, is all on that shape. So let's see what the basic one looks like. I'm going to pull up another GMask Tracer and load that up. Left eyebrow, basic. This looks a little bit different. I think there's a little bit of an alignment issue here. But as you can see, we don't even get an axis imported, so the animation seems to reside in the shape itself. Yeah, sorry, this was totally winged, and this is not the result I would have expected. I always went with the axis-and-shape option, which works neatly for me. But yeah, we'll have another look at this. Oh, thanks for giving it a shot, though. All right, I imagine if there's more shape animation versus axis animation, you can get those separately. So the next thing I wanted to highlight is that, like Action, we've got multiple outputs out of the GMask Tracer. And what I find really helpful about that, OK, the one idea is, yeah, you've just got one node instead of six old-style GMasks. But the other thing is, if you look inside here, I use the motion vectors to drive these shapes. So if I switch now through the different outputs, those six are all driven by the same analysis. So you do one analysis, and if that works, you can drive six mattes with it. And we know that motion vectors are a little bit fidgety in regards to caching and all that. So at least that way, you only have to cache once and not six or seven or eight times or something. Honestly, I hadn't heard that.
Honestly, that's news to me. Yeah, maybe it's only on my side at this point. I wish people would speak up about it more online. So another thing that maybe some people don't know about, and maybe that's why some people don't use motion vectors for roto, is, let's take a look at our example here. So we've got, once again, this shot. So now let's switch to matte view, like that. You see the matte is traveling, but the shape stays in the same position. Well, this is due to a technical limitation: first the shape gets drawn, and after that the motion vectors get applied to it, and that's why the GUI can't move along with it. Well, that makes rotoing pretty hard, but there's a shader for that. In this case it's a camera shader that Autodesk introduced, the matte viewer, and you can attach it to the camera. And by using that, if you go to the comp view, which you've got in the GMask Tracer even though you basically just output mattes, you can fill that area up. So you still can't move the GUI itself, but you can now interact with it and adjust it. Oh my God. Oh, sorry. Oh, that's great. And you can also set this up in a couple of different ways. You can choose your color, obviously. You can also choose to only show the semi-transparent areas, which is quite helpful if you want to get in here with a little bit more detail. And you can also do a slap comp, like that. And of course, once you've got it set up, this works on every output. You just have to activate it, like that. So now you've got it there, or on the cheek, like that. Set it up like that. So there you go. This is the greatest. I've used motion vectors with the GMask Tracer before, but I had no idea about that Matchbox. The manual! I had a similar reaction when I learned about it. So it's pretty helpful. Oh, this is the greatest episode ever. Sorry, everybody else. You're making me blush. So... Wow, how long has that been available, the matte viewer?
I know I see... 2018 or 2019. Wow. Oh, since year four. Thank you, Charles. I stopped reading release notes; that's probably how long it's been. Oh, thank you. So there's one more thing I wanted to show with the GMask Tracer. And that is, as I said at the beginning, it is basically an Action optimized for masks. Action is a 3D compositing environment, right? So what you can do, and this is actually something I did on a job today, and it was quite helpful. I keep pulling up Action; no, that's the GMask Tracer, right? So some of you might have noticed the GMask Tracer contains a camera, which means we can import ourselves a match move. Like that. Fine. So now, oops, I have to switch my camera, of course, like so. All right. So now I've got all my tracking markers, like that. And if my shot has already been match-moved, these can make my life a lot easier, because, now let's get rid of the material quickly, so that one, we can now take a GMask and attach it to that. And basically that GMask now gets filmed, so to speak, through the camera, oops. So now that sits in there. And this is pretty neat. And big shout-out to Fred Warren and to Randy, because my first posting on the new forum was about this. When I prepped this for today, I thought, yeah, and now I want to take this to the Tracer. Now I want to key that out-of-focus lady. I went to F8 to get into my object view, and this happened. Wait, let me see. Let me show this once more. So suddenly my shape got to a position where it went back to default. And there's a little knob you have to switch, and I couldn't find it for anything. And to make another point: I asked this question, I think a year ago, on the Logik group, and I couldn't find my posting anymore. And I know Randy made a big point of the fact that Facebook is poorly searchable. So I posted on the new forum, and now it's there for eternity for anybody to look up, if you don't find it otherwise: under the mask tab, here's this alt of frame button. You have to change that.
Oh my God. Show your camera. And now you can go to the Tracer, add your analysis. So now you've got a 3D-tracked shape utilizing the GMask Tracer. This is pretty neat. Okay, so just in the course of 60 seconds, you've shown the world something they didn't know existed, you've solved like five problems that everyone struggles with every day, identified a button that we've never played with, and justified the whole creation of the new forum. So, I mean, you're clearly the greatest guest we've ever had here. Oh, stop it. Yes. All right. Okay, top that. So, Christoph, this is amazing. Cool. Thanks so much. And by the way, I should give one big shout-out at this point, because this is all from the fxphd courses I did. And I just wanted to say a big thanks to John Montgomery for letting me use this material today. And yeah, if you want to listen to me babbling a little bit more, there are a couple of courses on fxphd where you can do just that. Yeah. In the chat, MB mentioned that you had shown some of these as part of your fxphd courses. So definitely check them out. For anybody who already watched them, sorry, it's a little bit of a repeat, but hopefully there's also some new stuff in there. Okay. So the next item on my list would be a little selection of beauty Matchbox shaders that I find really helpful. So I'm going to switch over here. The first one is by Lewis Saunders, and it's Wireless. It's really a neat little wire removal. So the way this works, oops, I have to reset this, we're going to start fresh. In the result view, and only in the result view, you'll get this little line here. So you can now pull up a start and an end. You can animate these; they're actually available in the channel editor, so you can track them if that's applicable in the shot. So now you can draw a straight line, or, as this hair that we want to remove is not straight, we've got a curve control to adjust the curvature.
And we've got the hook, which basically pivots it to the one side or the other. So these are the controls you have to define a wire you want to remove. What I should mention is that initially this one is mostly off and you don't see that line, so you have to switch on "draw workings". Once you've got it in position, you turn it off and you dial up the radius just until this goes away. And what it does is it's kind of a zip brush: it tries to clone in from left and right until it fills that up. And you can also adjust the angle at which it restores. It wouldn't make much of a difference on this very out-of-focus skin, but if you've got a sharper, more defined background, then it's worth playing around with the restore mix and also with the angle to get it into shape. It is prone to a little bit of artifacting, especially here on the edge of the frame. Let me see if I can switch those icons off. Yeah, there. So you see you're getting this hard edge. So often this needs a little bit of cleanup work, but it's a great way to get rid of little stray flyaways. And what's really neat, you can stack a million of these on top of each other and it's still really, really fast. So I've had hair jobs where I literally put up, I think, 40 of these Lewis Saunders Matchbox shaders one after the other, and you could still shuttle through that clip. That was awesome. And then we get to Mr. Clean of Flame, which is everybody's best friend, the crok beauty shader. And I don't think I have to say much about this. It's, yeah, I think one of Ivar's best pieces, honestly. This is amazing. So I'm going to reset this back to default. And the one thing I'd really say is: play around with the settings, find the way it works best for you. The only thing I find you need to take notice of is that you can define the skin color, so you always get a little bit of a tint, which can be really, really helpful. If you don't want that, just turn the shine amount to zero.
That way you're getting something akin to the Washer. It behaves differently, of course, it's different math under the hood, but this is the way I like to set it up: shine turned off first, and in the end I might play around with it if I want to enable such a look. But probably Ivar can say much, much more about this. I just wanted to quickly highlight it, because this has been used, I think, to clean up skin, of course, but I've read about floss being cleaned with it, chicken, and curtains. So this has been used for everything. I've used it for clothing, used it for tablecloths. See, Mr. Clean of Flame. Then another Lewis Saunders shader I really like is his blur. It's kind of a median filter, really, but a really fast one, and it does something really nice, I find, to the skin. So once again, bring the edge preservation, which is the restore, down to zero, and then dial up the blur. That way it just looks like a normal blur. But once you bring in edge preservation, you really get this defined-area look. So this might be helpful in combination with, for example, the next technique I'm showing, which is high-pass filtering. So let's copy this over here. High-pass filtering, or frequency separation, is basically the idea that you can work independently on the color aspect of your image and on the details aspect of the image. And there is a shader for that, of course, which generates you this high pass, which is just the details of the image. Now this has got a very low setting, excuse me. Sorry, there. So if we crank this up a little bit, like that, we're getting somewhere. So then the idea is that with a blurred version of your color layer and this, you use a Comp node to recombine them using an overlay mode. And there are different ways you can achieve this. There are, I think, 50,000 different ways to achieve high-pass filtering. But this is one way that works really nicely.
And now you can imagine, for example, let's take a look at that zit: we can now just use Paint to clone that out. I'm doing a poor job here, but something like that. So now I've just painted on the details without affecting the color. Or, stamping that, we can now try to get rid of this redness there. So, let me set up a context here, like that. All right. Context view to the rescue here, let me just clone this away. So you see, I can keep some of that detail in there. So this is a pretty neat technique. And maybe we can blur this up. So you see, we're keeping, let me undo that, redo that. So this is a nice way of blurring the underlying color to get rid of, for example, splotchiness in the skin, while you keep that skin texture. So that's a really neat way of doing it. And with the shader, you can adjust the strength after the fact, so you can bring this up to get more detail, or bring it down. I wouldn't use it the way I'm showing it here, on the whole picture; it's rather something for a small area, but for illustration purposes I think this works. Okay. Randy was just wondering in the chat: does that pass the difference test? Like, if you difference-matted the before and after? Let's see. I mean, I wouldn't think so, because I put an arbitrary value here in the strength. So I think if I really wanted it to pass the before-and-after test, I'd probably use something like a multiply with a blur and set it up more traditionally, but I just really like the ease of use. So let me think, how would we set this up? Hmm. I'd have to look it up, sorry. I think right now it wouldn't pass the difference test, because this is really something I use on a localized area; I don't use it for the whole face. This is actually what I like, that I can adjust the strength to taste. So I don't need it to be exactly the same, but yeah.
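The frequency-separation idea above, and Randy's difference-test question, can be pinned down with a few lines of NumPy. This is an illustrative sketch under my own assumptions, not the shader's actual code: any low-pass filter stands in for the blur, and the overlay-mode comp is modeled as a simple add of a zero-centered high pass. The point is that recombining at a strength of exactly one rebuilds the source, so the difference matte is pure black, while any other strength fails the test:

```python
import numpy as np

def box_blur(a):
    # crude 3x3 box blur with edge clamping; any low-pass filter works here
    p = np.pad(a, 1, mode="edge")
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

rng = np.random.default_rng(0)
img = rng.random((8, 8))          # stand-in for a float image plane

low = box_blur(img)               # the blurred "color" layer
high = img - low                  # the high-pass "detail" layer

# Strength 1.0 rebuilds the source exactly: the difference test passes.
assert np.allclose(low + 1.0 * high, img)
# Any other strength scales the detail, so the difference test fails.
assert not np.allclose(low + 0.5 * high, img)
```

So the adjustable strength that makes the shader convenient is exactly what breaks the difference test, which matches the answer given on the show.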
I know that there's a way to set it up pretty easily with a multiply, but before I get stuck here, if it's okay with you, I would rather move on. Oh yeah. No, no, totally. Cool. All right. So one thing I was always envious of the After Effects guys for was spline blurring, because it's really neat for hair retouch if you can blur along a spline, and for other use cases, of course. And once again, Lewis Saunders did a great shader for that. That shader needs a matte input, and that matte, and this is the only thing you really have to get your head around, needs to define not the area where the blur is applied, but rather the spline along which it's blurred. I'll show you what I mean. So I wanted to blur that side of her hair to get rid of all these flyaways. And I wanted to do it in a way that it's got the correct direction here, and is a little bit more curved down here, so that it follows along the underlying line of that hair. And the way this works is, if we take a look at the matte, everything that's totally white gets ignored, everything that's totally black gets ignored, but all the gradient stuff in between, that's where the spline gets picked up, so to speak. So if we connect this to the spline blur and take a look, now you see what's happening. This is of course total rubbish, but these are the default settings. So if I bring this down, you can see what's happening, but you see that the blur happens in a different direction here than, for example, up here. And if I gave this a little bit more care, I could really follow along that curvature of the hair with that spline blur. Of course, what I need to do to localize this is to set up a second matte, this one for example, like so, and then comp that on top, like so. And that way you could patch this through to the end.
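The gradient-matte behavior described here can be mimicked in NumPy: blur each pixel along the isolines of a ramp matte, skipping pixels where the matte is pure black or pure white. This is a slow, toy sketch to show the geometry, not the shader's actual algorithm; the function name and parameters are invented.

```python
import numpy as np

def spline_blur(img, matte, samples=4, step=1.0):
    """Blur each pixel along the isolines of a gradient matte.
    Pure black or pure white matte pixels are left untouched."""
    h, w = img.shape
    gy, gx = np.gradient(matte)           # matte gradient per pixel
    out = img.astype(float).copy()
    for y in range(h):
        for x in range(w):
            m = matte[y, x]
            if m <= 0.0 or m >= 1.0:      # white and black get ignored
                continue
            dx, dy = gy[y, x], -gx[y, x]  # perpendicular to the gradient
            norm = np.hypot(dx, dy)
            if norm == 0.0:
                continue
            dx, dy = dx / norm, dy / norm
            acc, n = 0.0, 0
            for t in range(-samples, samples + 1):
                sx = int(round(x + dx * t * step))
                sy = int(round(y + dy * t * step))
                if 0 <= sx < w and 0 <= sy < h:
                    acc += img[sy, sx]
                    n += 1
            out[y, x] = acc / n
    return out
```

With a horizontal ramp as the matte, the blur runs vertically, along the ramp's isolines; a curved ramp would bend the blur to follow, say, the flow of the hair.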
And this is just the one thing to know: even though the shader's got six or seven inputs, you only effectively input to the front and the matte, and the matte is where the spline gets applied. It's worth playing around with, because you can achieve some really great stuff with it. So the last thing I want to really quickly mention is the crok skin shader, which is just a skin generator. And this is helpful, for example, when there's already very little detail on a person's skin, or if that part is so messy that you want to introduce some smoother texture; then this can really be helpful. Of course, you then need to track it in, maybe do a high-pass filtering to apply it as an overlay and all that, but you've got some nice options here to set up different skin types. So you can introduce details, you can... What can we do, for example? We can enlarge the pores, like so. So you can really set this up to your liking and then track it into your shot. This is something I don't use too often, but I find it really, really helpful that it's there. So that would be my top six beauty shaders, I would say. That's great. Cool. My goodness, what a show. Does anybody have any questions about these shaders, or any final questions about these shaders for Christoph? All right, on to the next thing then. Okay, I've got one more thing, and I think that's something... That's also from one of the FXPHD classes, but it's also one of these things I found out totally by accident, and it was kind of a "what on earth" moment. Because I read online quite a lot the request: can't we please have a warp stabilizer in Flame? And I hate to say it, we've got that. It's been there. Kind of. All right, so the example is this. This is a clip from my class. The retouch is all done on that. So now we need to stabilize it, because if we take a look at it, let's do that on the timeline maybe. Where we got... Let's take a look at the original rush. Come on, Flame.
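A skin generator of this sort boils down to procedural noise shaped into a pore-like detail layer. Here is a toy sketch of that idea (it has nothing to do with the real shader's internals; the name and parameters are invented), producing a zero-centered layer you could overlay via the high-pass recombine from earlier.

```python
import numpy as np

def skin_texture(h, w, pore_size=2, strength=0.1, seed=0):
    """Blur white noise into a soft, zero-centered pore-like detail layer."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((h, w))
    k = np.ones(2 * pore_size + 1) / (2 * pore_size + 1)
    noise = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, noise)
    noise = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, noise)
    return strength * noise   # add onto a smoothed skin layer to fake texture
```

A larger pore_size gives bigger, softer pores, roughly the "enlarge the pores" dial; strength scales how visible the texture is.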
What's going on here? I'll do a run play. I'll do a run play and then it should work. So the thing is, we've got this movement of her hand going down while the camera pans up, and she caresses her skin, so that goes down a little bit. So it's just a little bit jittery, and you want to stabilize that. I thought, yeah, well, this is something nice to do for class, we can play around with planar tracking a little bit or something, and it all didn't work. So let's stabilize that shot for the love of it. And this is when I came across this little nugget. So I did a motion analysis. You all know that. I'm so excited for this. All right, let's do this from the start. I'll pull up the setup anew so that we are fresh here. So now we connect our motion analysis. And down on the tracking tab, we can set up reference frames. The idea behind that was that you could use this to stabilize shots: you set up a reference frame to say, okay, now this gets locked to that frame. So if we go to the result view, you see what's happening up here. Flame tries to lock that picture into position for the whole time, and we get all these artifacts, and this is not what we want. However, what we can do is set up multiple frames, like so. And now, this is the beauty of motion vectors, we can set these up in between. And I'm doing this arbitrarily now, but in the original class I actually took a good look at where it would make sense to set a frame. And what I came up with in the end was this. So this is not like every four frames or something; rather, I took a look at where the frames made sense, where I liked the movement, where it wasn't too slow or too fast from the previous picture, but where it had a good, nice movement. And that's where I set up a reference frame and let Flame interpolate the rest in between. And that really worked nicely.
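The reference-frame idea, lock the picture at hand-picked keyframes and let the in-betweens be interpolated, can be illustrated in one dimension. A toy sketch assuming we already have a tracked position per frame (Flame does this with dense motion vectors, not a single track, and its interpolation is more sophisticated than linear); the function name is made up.

```python
import numpy as np

def stabilize_offsets(positions, ref_frames):
    """Per-frame correction offsets that snap the tracked position onto a
    path linearly interpolated between hand-picked reference frames."""
    positions = np.asarray(positions, dtype=float)
    frames = np.arange(len(positions))
    # the smooth target path passes exactly through the reference frames
    target = np.interp(frames, ref_frames, positions[ref_frames])
    return target - positions   # add this offset to each frame to stabilize
```

At the reference frames the offset is zero, so the picture is locked there; everything in between is eased toward the target path rather than frozen, which is what avoids the artifacts of locking the whole shot to a single frame.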
And now, it's probably not the best example to show over Zoom because it's a very subtle thing, but I got the feeling that the movement was much smoother, and it took good care of the corresponding, or correlating, movements up and down of the camera and the fingers and all that, with very few artifacts to paint out between the fingers. So give that a shot, because that way you can really build yourself a kind of warp stabilizer, and in my opinion, with more control over it than, for example, in After Effects. That's fantastic. All right. Oh, yeah. Wow. I'm just speechless. This was great. Cool. Glad you liked it. Oh, this was great. I'm also starting like 75, you know, beauty jobs in the next six weeks. So I feel, you know, energized. Oh, my God. This is great. I'm just, thank you, man. Thank you for this. And thank you for all that you do to share, you know, share knowledge and stuff with the community. It's just, it's really wonderful. It kind of embodies the spirit of everything that, you know, we've been trying to do here. So from the bottom of my heart, man. Thank you so much for having me. Of course. Does anybody have any other questions for Christoph? I think everybody's speechless. I guess Randy wants to know what your day rate is. He'll have his people call your people. Cool. Oh man. All right. Amazing. Cool. Well, thanks man. I appreciate it. If you wouldn't mind, I'm going to stop your share here. Sure. That way I can see your beautiful face. I can say thank you face to face. And this is great. Let's give away some prizes. Who's up for that? Everybody, of course. All right. Oh, that's two weeks in a row. There we go. Right. Here we go. It's Logic prize time, ladies and gentlemen. I've got, count them, two of these gorgeous, beautiful, I'm going to hold them in here so they don't fall out like last week, Logic phone chargers. That's right. Big bucks, no whammies.
Okay. This has now become one of my favorite parts of hosting Logic Live. I have a list here of everybody who's in the meeting. I'm going to share the screen. I'm going to go to the random name picker website, which is brought to you by, um, GoDaddy, I guess, and many other things. All right. So I'm going to put the names in here. And I'm going to pick a random name. Everybody ready? And the winner is... Mindy. Oh my goodness. Let's give it up for Mindy. Congrats, Mindy. Wonderful. All right. But that's just one of the Logic phone chargers that I have to give away today. The other one is for the two second tip, I'm sorry, two sentence tips over on the forum. If you haven't had a chance to check them out, please do. They're fantastic. I think it was Jamie Beckwith who had one about BFX, like the endless struggle of, oh, there it is, thank you, Randy, who just posted the link in the chat, the endless struggle of, I have this setup, I don't want to slide it like two frames or whatever. It was just a great tip about making it a BFX, and then you can offset the BFX. By all means, please head over to the forum and, oh, Randy, you've got to set your comments to all panelists and attendees. So Randy's going to do that and repaste the link. And I am going to pick another winner. So we're going to give another one of these away to one person who contributed a tip. So let's go back over to the random name picker. Delete all those. These are all the wonderful human beings and members of our community who went ahead and contributed something to the two sentence tips. So let's go here. Pick another name. Carlos Campos. Let's give it up for Carlos, ladies and gentlemen. I will reach out to Carlos and make sure that he gets his Logic phone charger, courtesy of our friends at Cinesis. Wonderful.
All right. Thank you so much. And let's close this thing out. I can never move my mouse. There we go. Nice. All right. So coming up, everybody, on Logic Live. Oh, you know, I almost forgot. You two, Mindy and Carlos, you can be just as cool as Carrie and Miriam and Quinn and Peter, whose charger is in the mail, by the way. Make sure you share a picture of you with your Logic phone charger. Okay. Thank you. This is what happens when you don't remember the slide show that you built for Logic Live. We've got an action packed few weeks coming up on Logic Live. I'm going to go ahead and paste the sign up links, or the registration links, in chat. Next week we are off. Okay, so not so exciting next week, but after that we're coming back on October 4th with the One Frame of White reunion show. We've got Pat Byron Wall, Caleb Cahill, Gabriel Gerito, Darren Hoffmeyer and Greg Malone, Greg Paul Malone, excuse me, I'm never going to live that one down, who's here with us today. And they're all going to join us for what's sure to rival any Real Housewives reunion show. But it'll be great to catch up with everybody and see how they're doing and talk all about One Frame of White. That's followed by October 11th, which is going to be the big one. And that, my friends, is how you lose a star. It was in his rider, really. I should show you all the demands he had for the green room; it was amazing. On October 11th, we're going to announce the winners of One Frame of White 2020. And you'll be the first to hear this, but the lovely and talented Grant Kay is going to be live with us here as our announcer. I'm so looking forward to that. Oh, Charles, you were there for the green M&Ms. Charles and I shared a celebrity rider experience once.
October 18th is the role of a Flame assistant with the amazing Amanda Elliott, and October 25th, the Mocha deep dive with Mary Poplin of Boris FX. So as I said, One Frame of White is on; you can enter now at oneframeofwhite.com. And please, please, please, if you haven't already done so, sign up over at forum.logic.tv. Randy posted a link. Randy, would you mind posting that link again in the chat? There it is. If you haven't signed up, use that link and Randy will bump you all the way up to 12-bit. We have different rankings for the forum, 8-bit, 10-bit, 12-bit, 16-bit, based on participation, and each one unlocks different areas of the forum. So Randy will bump you up. This episode and all past episodes are available at logic.tv. And if you haven't checked out the Logic podcast, please do; I'm going to have more episodes for you coming soon. And I want to thank everybody who helped with the subscribe-to-the-YouTube-channel telethon that we had over the last week or so. Thank you, Randy. I want to get those numbers up to a thousand, and we added, I think, almost a hundred last week. So if you haven't subscribed, please go ahead. Thank you. Oh, your wife subscribed too. My kids subscribed. So thank you, everybody, for helping get the numbers up. And of course, thanks as always to our friends at Cinesis for sponsoring Logic Live. That's going to do it for Logic Live this week, everybody. Have a great week and we'll see you next time. Thanks again, Christoph. Thank you.