 Can you hear me? Okay. Two weeks ago I was at the top of a roller coaster, and for a few seconds that was the feeling inside me; every time I start a new talk or a new workshop, this is my sensation. So I need a little boost to start. Thank you, and welcome. I am Valerio Fissolo, the 3D lead artist at Protocube Reply, my company in Turin, Italy. This is my eighth time at the Blender Conference since 2012. I don't usually take many photos, so this is the only picture I have from the first conference. Over the years I have been here with my colleagues presenting projects about 3D printing, configurators, CGI animation and visualization. This year I am here to present the work I have done with my team, which is the title of this presentation: predictive fabric design. What does that mean? Predictive fabric design is a project that allows us to create renderings and textures of fabrics before they are woven. Have you ever seen a fabric texture made with Substance Designer or something similar? That is not good enough for this market, because that kind of texture is fake and very approximate; it is not physically correct. Our goal is to go deeper and create textures that are physically correct. So what is the problem for our customers? Fabric designers are very fast in the creation phase, inside their CAD software, but then they have to wait one month to see a real patch of the prototype, because the industry is very slow: weaving a small patch of fabric takes more than a month. Then, to digitize the preview with Vizoo or similar technology, they need the real patch, and after that they do a photo shoot of it. So the goal is very simple: reduce the time to market of this process. We are working on a solution to shrink this phase, which accounts for 85% of the entire process. 
As I said, we are working on a natural approach that is very similar to a simulation. That way we have 100% control over the fabric, down to the dimensions and color of a single wool fiber. A fabric has three basic elements: the color of the fiber, the yarn, and the weave, which is the structure of the fabric. The color is simple: we capture the fiber color with an instrument such as a colorimeter. The difficult part of this approach is showing on the monitor the real color captured with the tool, but that does not involve Blender, so it is another part of the project. Finally we arrive inside Blender: we have reproduced the real yarn thanks to Animation Nodes and also some Python scripting. The project is currently developed in 2.79, because it started in 2018; I hope to migrate to 2.8 next year. In this shot you can see how the simulation of a single yarn is managed by Animation Nodes, fed by external parameters that arrive from our platform. These parameters are the number of twists, the direction of the twist, and the number of wool fibers inside the section of the yarn; this last number, for example, also defines the diameter of the yarn's section. So this is a very microscopic approach to the problem. This is the first step of our optimization: once the yarn is set up, we do a step to reduce the resources needed for the final render, baking it into four maps. We currently don't have a solution faster than Blender Internal, so we use the Blender Internal bake system to capture the four maps at this resolution in less than 10 seconds; this is one of the issues with moving to 2.8. After that we use those maps to build a shader applied to a mesh that is much less detailed than before, just a cylinder: the base mesh is made of 8 quads, and all the rest is procedurally generated by the modifier stack. This is a view of the steps of our modifier stack. At the same time, we use this approach to generate the spool. 
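The yarn parameters mentioned above (number of twists, twist direction, and the fiber count that drives the section diameter) can be sketched in plain Python. This is a minimal illustration under assumed conventions: the circular packing model, the S/Z direction sign, and the function names are my own assumptions, not the actual Animation Nodes setup from the talk.

```python
import math

def yarn_diameter(fiber_count, fiber_diameter):
    """Estimate the yarn section diameter from the number of fibers.

    Assumes the fibers pack into a roughly circular section whose area
    is the sum of the single-fiber areas (a simplified packing model).
    """
    section_area = fiber_count * math.pi * (fiber_diameter / 2) ** 2
    return 2 * math.sqrt(section_area / math.pi)  # = fiber_diameter * sqrt(n)

def fiber_helix(length, twists, radius, direction=1, steps_per_twist=16):
    """Sample points of one fiber twisted around the yarn axis.

    `direction` is +1 for a Z twist and -1 for an S twist (hypothetical
    convention). Returns a list of (x, y, z) tuples along the yarn.
    """
    n = max(1, int(twists * steps_per_twist))
    points = []
    for i in range(n + 1):
        t = i / n
        angle = direction * 2 * math.pi * twists * t
        points.append((radius * math.cos(angle),
                       radius * math.sin(angle),
                       length * t))
    return points
```

Inside Blender these samples would feed a curve or mesh object per fiber; here they stand alone so the parameter-to-geometry relationship is visible.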
For the spool, the yarn is wrapped around the cardboard with a curve, so we can create something more realistic than a single yarn. In this case we use Animation Nodes to change the code and the name of the yarn for each render. In our database there are more than 2,000 yarns, and our customers generate more than 6,000 fabrics per year, so automating the naming of each single yarn is also important. The next step is similar to the spool: we are now able to generate the fabrics. We have built a specific integration with our customer's CAD software, which can now export data directly into Blender; this data defines how the fabric is made. The render times are currently acceptable, because we render two types of output: the preview at 1K resolution and the final render at nearly 6K resolution. Times go from 2 to 20-30 minutes per render in Cycles. The preprocessing of these renders is heavy: 80% of the render time is preprocessing, and the final rendering phase itself is very quick. It is also very light in terms of memory, so we decided to use CPUs inside our cloud render farm, because it is safer for the customer. Not every render is the same: when a fabric has a lot of hairy fibers, for example, the rendering time becomes very expensive. Now let's jump inside the platform as the customer sees it, without going too deep. Our customers feed the repository of colors and yarns through a graphical front end, and all the data you see on the right can be managed by the platform and defines the final variables of the fabric. As you can see, between the fiber size, the yarn, the number of twists and so on, every render request is unique. And this is all the designer sees of our platform. On the left you can see how the designer sees the fabric during the design phase: very ugly and not realistic. We have added something like a render button inside the software that transfers the data to our platform and makes the render. 
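The CAD data that defines "how the fabric is made" is conventionally expressed as a weave pattern: a binary matrix over warp ends and weft picks. A minimal sketch of how such a matrix could drive the yarn geometry, assuming a simple over/under offset convention; the matrix layout and names here are illustrative guesses, not the actual CAD export format from the talk.

```python
# Rows are weft picks, columns are warp ends; 1 means the warp passes
# over the weft at that crossing (assumed convention).
PLAIN_WEAVE = [[1, 0],
               [0, 1]]

TWILL_2_2 = [[1, 1, 0, 0],
             [0, 1, 1, 0],
             [0, 0, 1, 1],
             [1, 0, 0, 1]]

def warp_offset(weave, pick, end, amplitude):
    """Vertical offset of the warp yarn at one crossing: +amplitude when
    the warp is over the weft, -amplitude when it is under. The pattern
    tiles across the whole fabric, so the indices wrap around."""
    rows, cols = len(weave), len(weave[0])
    over = weave[pick % rows][end % cols]
    return amplitude if over else -amplitude
```

Tiling a small matrix like this over thousands of crossings is what lets one compact CAD export describe an arbitrarily large rendered patch.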
When they press render, a script collects all the data needed to create and manage the 3D scene. This is an example of the final render: a patch of 22 × 14 cm at a resolution of 600 dpi, ready to be printed on paper, so a very high resolution. With technologies like Vizoo or photography we can make a texture of the fabric at a resolution of about 2K, because beyond that the lens is out of focus: yes, I can take a photo at 8K resolution, but it is not sharp. With the rendering system I can go up to 8, 10 or 16K resolution, although that is not very interesting for the customer. Another interesting output for this market are the texture maps for digital materials: we are able to export seamless maps after the render using render layers. This is another example applied to a jacket, and one more complex example of a felted fabric, where the structure is not regular but very mixed. Another type of output is an animation of the fabric, which is very light. We are very near to the goal; we hope to finish the first part of the project by the end of the year. But it is only a first step, because we are sure that once we pass this checkpoint, many useful tools can be developed for this market in the future. About the roadmap: we are close to finishing the first release, and in the first months of 2020 we want to migrate the platform to Blender 2.8, which involves a number of issues. If you are interested in topics like custom nodes in Animation Nodes, or the shader nodes that let us see what happens inside a material shader, we are open to discuss. So that's all, thank you and have a nice day.
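The "nearly 6K" figure for the final render follows directly from the printed patch size quoted above; a quick check of the arithmetic (1 inch = 2.54 cm, helper name is mine):

```python
def print_pixels(width_cm, height_cm, dpi):
    """Pixel dimensions needed to print a patch at a given dpi."""
    to_px = lambda cm: round(cm / 2.54 * dpi)
    return to_px(width_cm), to_px(height_cm)

# The 22 x 14 cm patch at 600 dpi from the talk:
w, h = print_pixels(22, 14, 600)  # (5197, 3307): about 5.2K wide, "near 6K"
```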