This show is brought to you by these lovely people. Hey there, you lovely NPR enthusiasts. Experience the deluxe normals show this time. The highlights are: first, we continue our journey towards the perfect toon. Second, Demon Slayer Cats 2, a tearjerker. And third, Marshall Mayhem, a musical. Welcome to the BNPR show, a celebration of stylized rendering. Want to see way cool NPR artworks? Please enjoy.

Perfect Toon shader series, but with geometry nodes. In essence, geometry nodes work like this: get something from a mesh, modify that something, and then output the modified something. In our case, we output the modified something to be used inside the material. Attributes are data attached to or generated from a mesh. The geometry nodes input and output will not auto-populate themselves, meaning you have to assign which attributes to take as input and which output they go to for later use.

Geometry nodes will not output normals like other data, and they have limited support for custom normals. But all is not lost. We have the mesh data attributes UI. We can create a normal output using the attributes UI for the geometry data. And here's the setup. Add a new attribute. We give it a name, Face Corner as the domain, which we know from past encounters in the data transfer setup, and Vector as the data type. To see the attribute in the material, we use the Attribute input node.

You might have noticed by now that this is a multi-step setup. In short, and to hammer it home: you get data from geometry nodes, modify it, output it in a format that's usable in a material, and finally use it in a material. This is similar to coding shaders in Malt, but with a much more node-heavy interface. It might be harder to parse if you don't know what is where. The normal that you get directly can be flipped, caused by flipped faces. Plus, it's not normalized. More on this later.

And now to the attribute domain. When set to Point, everything will get interpolated as a smooth normal.
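A minimal numeric sketch of the two attribute domains, in plain Python with hypothetical values (this is not Blender's actual internals, just the idea):

```python
# Hypothetical unit normals of two faces meeting at an edge.
face_a = (0.0, 0.0, 1.0)
face_b = (0.0, 1.0, 0.0)

# Point domain: every corner at the shared vertex gets one averaged
# normal, so shading is smooth across the edge.
point_normal = tuple((a + b) / 2 for a, b in zip(face_a, face_b))

# Face Corner domain: each corner can keep its own face normal,
# which is what makes split (hard) edges possible.
corner_normals = [face_a, face_b]

print(point_normal)  # (0.0, 0.5, 0.5): one shared, smoothed direction
```

Note that the averaged vector is shorter than length one, which is exactly the normalization problem the show circles back to.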
When it is set to Face Corner, we can have split edges.

And now let's go on a tangent to explain what normalizing is. Normalizing is changing a bunch of vectors from whatever values to vectors with a length of one. The reason we normalize vectors is so that we have a set of predictable vector component values. This means that all the component values, X, Y, and Z, sit in a predictable range of -1 to 1. The important part here is the direction. Let's compare the incorrect normals to the corrected base normals. Why do they differ? Because linear interpolation makes the vector length less than one, and that causes the smoothing differences compared to the normalized vector, which is the normalized normal in our case. And that's the reason that when we need smoother normals, the normals need to be normalized.

Backface normal correction can be done in the shader and not in geometry nodes, because we can just compare the back face and the front face and get the correct normals in the shader. Split normals are the same as auto-smooth and edges marked sharp. To tackle these, let's look at them one at a time. And a side note: if you've not noticed by now, we are understanding the technical workings of normals in Blender feature by feature, reproducing them with their quirks, and trying to solve the quirks to get the ideal normals that we can manipulate to get our perfect toon.

So back to split normals. Let's reproduce split sharp. Select the edges to split the normals and make them into a vertex group. To solve over-splitting of the vertices, we add a Capture Attribute node set to Boolean type on the Point domain. Another problem: after the mesh gets subdivided, the split edges appear everywhere around the original split edges. To solve that, we only select edges in the vertex group with a value of 1.0 using a Compare node. That limits the splitting, because the data coming out of the Compare node is much cleaner. Now to reproduce edge split by degrees.
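But first, a quick sketch of the normalization tangent above in plain Python (hypothetical vectors, not Blender's internals):

```python
import math

def length(v):
    return math.sqrt(sum(c * c for c in v))

def normalize(v):
    # Scale a vector to length one; the direction is what matters.
    l = length(v)
    return tuple(c / l for c in v)

# Two unit normals, 90 degrees apart (hypothetical values).
n0 = (0.0, 0.0, 1.0)
n1 = (0.0, 1.0, 0.0)

# Linear interpolation halfway between them shortens the vector,
# which is what causes the smoothing differences described above.
mid = tuple((a + b) / 2 for a, b in zip(n0, n1))
# length(mid) is about 0.707, not 1.0.

# Normalizing restores unit length while keeping the direction.
fixed = normalize(mid)
# length(fixed) is back to (essentially) one.
```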
Add an input with a min of zero and a max of 180. Add the Edge Angle node, convert it to degrees, then compare it to the angle we feed into the geometry node group. And just like that, we have split normals by face angle.

To transfer the split normals while retaining vertex indexing (if you forgot about vertex indexing, please refer to the previous BNPR show), we use the original geometry but with the modified normals. We transfer the vectors following the geometry index of the face corners and use the original geometry data. If you think about it, it is similar to how we do custom normals the traditional way. To solve floating-point errors that make some faces flicker in and out of shadowed and lit areas, we need to add a very small number to the split angle of our geometry node group input. And that is part four of our Perfect Toon shading journey.

But now for part five. The goal of part five is recreating the normal Data Transfer modifier, but with so much more control in geometry nodes. To set up a basic proximity transfer, use an Object Info node to set the source object. Original will behave as if the origin point was at world zero, while Relative will overlap the meshes. To capture vertex normals from face normals, we get the geometry Normal vector, then capture the attribute on the Point domain. To transfer the normals from another object, we use the Transfer Attribute node with the Vector type and Nearest Face Interpolated as the mapping.

And if that is the basic normal transfer setup, we can add other features too. One, a mix factor: this is the mix between the original vertex normal and the normal from the source object. Two, to toggle between Original and Relative space, we just put a Switch between the two options. Three, vertex group invert, which is a switch on the mix factor; the invert is one minus the factor. And four, max distance, which is a number we set. True is white, false is black.
This is a factor to mix between the base normal and the source normal. In all of these, we use switches to turn the features on or off.

Next, to do projected face-interpolated transfer, we need to learn about ray casting. The Raycast node casts rays in the direction of the Ray Direction input. In our case, the ray direction is our normal direction. From that, we get the Hit Normal, effectively transferring the normal hit by the ray cast to our destination mesh. To get the correct backface normal, we need the backface normal selection in our shader, as shown earlier.

The transformation space also determines the result of the normal. When the transfer is in object space and the object is transformed using scale and rotation, the transfer is wrong. When the transfer is inside the geometry nodes, aka relative space, transforming using a Transform node, the transfer is correct. When using object space, we can counter the transform by subtracting it off and get the correct normal.

To properly set up a projected transfer, we need to take into account a few things. First, pick either the negative or positive direction of the ray cast. Second, if both directions hit, pick only the shortest ray. And third, for problematic areas like the nose, we'll use the same solution from the Data Transfer modifier: we transfer from another source mesh and overwrite the normals.

When doing geometry node normals, we encounter these problems. First, the geometry node normals stay put when the object rotates in world space; they are disconnected from world space. To solve that, we just need to rotate the normals by the rotation from the Object Info node at the end of the node tree. This can also be done in the shader using the same method, but with a driver. And second, geometry nodes modifiers must be at the end of the modifier stack to make sure they are not overwritten by other modifiers. Rig the transfer object, aka the source normal mesh, similarly to the main mesh to sync the deformation.
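The world-space rotation fix above can be sketched like this, in plain Python with a hypothetical Z-axis rotation standing in for the rotation that the Object Info node would supply:

```python
import math

def rotate_z(v, angle):
    # Rotate a vector around the Z axis by `angle` radians, standing in
    # for applying the object's rotation to a relative-space normal.
    c, s = math.cos(angle), math.sin(angle)
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)

# Hypothetical: a normal computed inside the geometry node tree
# (relative space), while the object itself is rotated 90 degrees.
relative_normal = (1.0, 0.0, 0.0)
object_rotation = math.radians(90.0)

# Without this step the shading stays glued to the object; rotating the
# normal by the object's rotation reconnects it to world space.
world_normal = rotate_z(relative_normal, object_rotation)
# world_normal is approximately (0.0, 1.0, 0.0).
```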
With geometry nodes, we have the flexibility to transfer any attribute from any object to our destination mesh. So we can chain it, like in the data transfer modifier setup, but please keep the chain short for the sake of sanity in the long run. With all the knowledge we have here, we are ready to handle more complex custom normals with geometry nodes, and that will be in a future BNPR show. So please subscribe to stay informed.

Alrighty, are you overdosed on geometry nodes yet? Well, then this water particle trails tutorial by Simon3D is for you. Add a mesh object as the particle emitter, add a metaball as the particle object, and then scale it. Now for the trail: add a curve object, add a force field of the Curve Guide type, and make the trail by extruding the curve. Increase the lifetime of the particles. To add randomness to the scale, increase scale randomness until you are satisfied with the result. To break up the metaballs, select a point on the curve and press Alt+S, which shrinks or fattens the radius of the curve. Ctrl+T tilts the vertex to corral the particles around. To make some particles move faster or slower than others, slide up the lifetime randomness value. And lastly, we don't want any gray particles, so please add a shiny material of your liking, or you can watch Simon3D's video to copy his lustrous material.

Demon Slayer Cats 2: Rengoku versus Akaza by Dilungu. What if you made a fan animation that looks better than the actual movie? Well, that's the case with Demon Slayer Cats 2. It is based on the manga, and it started production before the movie came out. The result is way more tear-jerking than the official movie. Mind kind of blown, and it's a must-watch.

Blown Apart, Marshall Mayhem by Tiny Media. A musical animation made with Blender is rare, and when there is one, we have to feature it, like this one. So no more spoilers, please go watch it.

Dragon Ball Gohanverse, episode nine by Daiya Tomodachi.
Now, not to spoil this episode: first, Piccolo sad. Second, Gohan reads. Third, poor Piccolo sad again.

OTT, The Final Battle by Bucket Boy. OTT stands for Own the Throne. Previously, we said we wanted a longer fighting scene. Now we have it here. Enjoy all the grittiness of this fight.

Wow, we hope you're all still with us. We know the geometry nodes and normals tutorials are pretty hardcore. But if you still want more, please visit the show notes linked under this video. And now for the most important part: the show is only made possible by these kind-hearted people. Please thank them kindly. And before we go, one final question. How much would you pay for a head modeling course?