Hello everyone, it's a great pleasure to be here to present some of my work. My name is Matthieu Vitse and I come from Paris, France. The vast majority of my work involves using Blender, but I wear multiple hats. On the one hand, I'm the CEO of a company that deals with video analysis and 3D reconstruction of complex events from heterogeneous sources; we mostly work for the French justice system and on institutional projects. To get a better idea of what we do, I really recommend watching the amazing talk by Nick from Forensic Architecture at last year's conference. On the other hand, I've been involved over the last ten years, first as a PhD student, then as a researcher, and now as a freelancer, in research projects that led to the development of an add-on called blenDIC, which is now used by aerospace engineers to perform what we call virtual material and structural experiments. So I will take the remaining eighteen-ish minutes to tell you the story of that add-on and how it helps engineers do their job.

Let me first explain what we call material and structural experiments. As researchers or engineers, we usually have to solve real-life problems: we see something that we want to understand. In our case it might be cracking in reinforced concrete; we want to predict and follow the tip of a crack propagating along a beam. A first stage consists in conceptualizing the problem, basically making a mathematical model of the phenomenon. Once this is done.
We may then produce a computer model that basically translates the equations into code. Experiments can then be performed, sometimes at different scales and under different assumptions, to prove both the computer model and the mathematical model. During all these steps, multiple sets of data are generated in order to validate the whole process. In this talk I will be focusing on that last step, the experimental part, sitting between the computer model and the real-world phenomenon.

When designing new materials or structures, we must study what we call the mechanical behavior of these materials at different scales, in order to answer different questions: how will the structure deform? How will it crack? What will be its mechanical response to given mechanical stresses? The experimental campaigns I mentioned may require testing more or less large structures, using more or less complex or large machines. In this talk I will refer to samples, which are rather small pieces, from a few nanometers up to a few centimeters, and to structures, which are assemblies of these samples, ranging from a few centimeters up to much larger pieces, meters long. Each scale provides very meaningful information. These experiments may be difficult to set up, though, depending on the shape of the sample and on the phenomenon we want to characterize: purely mechanical behavior, fatigue, or coupled problems such as magneto-mechanical ones.

So, I told you earlier:
I was going to tell you a story. It started as a non-funded research project, and it ended up as an add-on developed in partnership with a French startup company called EikoSim that is now being used by engineers on Ariane 6. This add-on is called blenDIC. You can probably guess that the "blen" part stands for Blender, and DIC stands for digital image correlation.

So what is DIC? It's really simple, with no math or symbols that would scare the hell out of you. I show you here a representation of what a two-dimensional DIC test looks like. We have a testing machine, which you see here, that is going to be pulling on a sample, and while it pulls on the sample, a digital camera takes pictures at different time steps. We basically record the deformation of the sample, and especially these black and white dots, called a speckle pattern, that are painted onto it. By looking at how these dots move, we can see cracks appearing, but we can also get to the mechanical behavior of the sample being studied. Depending on how the sample deforms and on how large it is, one may need one digital camera, usually for 2D motion, or two digital cameras.
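As an aside, the core idea just described, tracking how a small patch of speckle moves between two pictures, can be sketched in a few lines of Python. This is a minimal integer-pixel sketch using zero-normalized cross-correlation; real DIC codes add subpixel interpolation and optimization, and all names and parameters here are illustrative.

```python
import numpy as np

def track_subset(ref, cur, center, half=10, search=5):
    """Integer-pixel displacement of a speckle subset between two frames.

    ref, cur : 2-D grayscale arrays (reference and deformed image)
    center   : (row, col) of the subset centre in the reference image
    half     : subset half-width, so the subset is (2*half+1) pixels square
    search   : search radius in pixels around the original location
    Returns the (drow, dcol) maximising the zero-normalised
    cross-correlation (ZNCC) between reference subset and candidate.
    """
    r, c = center
    sub = ref[r - half:r + half + 1, c - half:c + half + 1].astype(float)
    sub = (sub - sub.mean()) / (sub.std() + 1e-12)
    best_score, best_d = -np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = cur[r + dr - half:r + dr + half + 1,
                       c + dc - half:c + dc + half + 1].astype(float)
            cand = (cand - cand.mean()) / (cand.std() + 1e-12)
            score = (sub * cand).mean()
            if score > best_score:
                best_score, best_d = score, (dr, dc)
    return best_d

# Tiny synthetic check: a random "speckle" image shifted by (2, 3) pixels.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (2, 3), axis=(0, 1))
print(track_subset(img, shifted, (32, 32)))   # → (2, 3)
```

Repeating this for a grid of subset centres yields the displacement field from which strains and crack patterns are then extracted.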
Two digital cameras are what we call a stereo system, for 3D motion. And when the sample gets larger, you need multiple sets of these cameras, for example one per surface that you want to study. I mentioned classical cameras, but one may also use thermal imaging, high-speed cameras, or a mix of all of them if that is required. So in the end that involves lots of cameras, machines, cables, stands, lighting systems, and all the structures you need to hold everything together.

To give you an impression of what the overall pipeline looks like, I made this figure. As I mentioned earlier, we start with a simulation and a design: we want to measure something. Then, when the simulation is done, we move to the experimental part, where we have to set up and run the actual experiment, which generates a lot of data, in our case mostly digital pictures, that we need to post-process to extract mechanical information. The first part is, let's say, virtual: it happens on a computer, and it may take a lot of time, weeks and sometimes up to months, to write and validate the code. But the most costly part is probably the experimental one, because it requires a lot of people: engineers, technicians, and researchers, all together on one machine. The setup alone can take weeks, and the duration of the experiment itself depends on what you want to do, but count about a week. If everything goes well, the post-processing gives you what you were looking for; if not, you go back to the previous step, or start over entirely if the results don't match the real phenomenon you are trying to understand.

So, something like ten years ago, a colleague asked me: "That free software you told me about, Blender something, can it model a real digital camera?" And I said: yeah, I guess. So we started investigating how we could use Blender as a tool for designing these complex experiments before doing them for real, with one specific experiment in mind that I'm going to show you right now.
It is a beam-column assembly, a reinforced concrete structure: the beam is about two meters long with a cross-section of approximately twelve centimeters, and the column is one meter long, which makes for quite a big structure for lab experiments. The beam was clamped, so blocked at its two ends, and we moved the end of the column following the kind of snail pattern you can see here. The goal was to study the cracking at the node, the junction between the beam and the column: to see how cracks would appear on the top surface, but also at the rear junctions.

So this is what the test looked like once set up. If you think it's a mess, you're right: it was a mess. There were ten digital cameras and multiple light sources, cables everywhere to power everything and transfer the data, a huge machine in the bottom-left corner, called a hexapod, which was there to enforce the snail pattern I showed you, and cameras going up three or four meters high. It took something like three to four weeks to set up this whole test, which means nobody could use the machine in the meantime. It was also using all the digital cameras available, which means the other researchers could not use those cameras during the test. So yeah, it was a very complex experiment to set up.

Just to show you what the process looked like: here is the result on the top surface of the node. The researchers were able to show, as bright lines, the cracking patterns appearing while loading the sample; all the bright lines are cracks, basically. Which looks nice, but the results were not completely consistent with the theory, with the math behind it, mostly because the clamping of the beam was not perfect.
It started rotating a bit, so the real results did not match what was expected. And remember what I said: when it doesn't match, you have to start over again. So what we did instead was a kind of post-experiment reverse engineering: we modeled the test that had already been run, to compare virtual images to the actual ones. The goal was to see whether modeling everything in Blender prior to the experiment could save us time on the actual setup: we could check whether some zones might be occluded, whether the light sources were in the right place, whether the cameras were well positioned. You have here, for example, four images from the 3D model, four views from the cameras. And even though the positions were slightly different from the real test, we found that the 3D model matched the real experiment well enough for our needs.

So what we basically did was add an extra step to the virtual part of the pipeline I showed you earlier, which consists in preparing the full experiment in 3D in Blender before doing the test in real life. The goal was to significantly reduce the setup time, from weeks to only a few days, to improve the conditions of the experiment itself, but most importantly to avoid that feedback loop and any problem that would make us start all over again. We published these results a few years ago, even if, to be honest, it was a bit difficult to find a publisher, because people were reluctant to see 3D as anything more than something that looks good.

And although all of this works well in a research lab, where people use these tools and routines on a daily basis, the transposition to an engineering lab was not straightforward, as people there may only perform a handful of tests a year. That's why, as I was starting freelancing, I decided to team up with a French startup company called EikoSim, which specializes in software development for this specific kind of test,
the DIC test, and which already knew and used Blender. So we developed an add-on, blenDIC, that would make engineers' lives easier, at least when it comes to using Blender. Among the different specs, probably the first one was a lighter UI. I guess it has happened to everyone: you open Blender for the first time and you see all the panels and everything. It kind of scared them, so they wanted something that contains only the few buttons they might need, with all the routines we had developed wrapped in operators, so that they would have a kind of cooking recipe to work with. What mattered most to them was the ability to load their simulation: start from the simulation of the material, bring it into Blender, live-deform it there, take virtual pictures, and output these rendered images in different configurations, as well as the camera locations and all the camera parameters, in order to optimize their test as much as they could.

That's pretty much what we did over the last four years, releasing blenDIC as part of a tool called EikoTwin Virtual, which starts by converting the simulations and then guides the engineers through the whole process. It is part of a suite of software that not only helps them set up the test, but also post-process and validate their data. So here's an example.
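A quick aside on those exported camera parameters: the pinhole intrinsic matrix that DIC software consumes can be assembled from Blender-style camera settings (focal length in mm, sensor width in mm, render resolution). This is an illustrative sketch assuming horizontal sensor fit, square pixels, and a centred principal point, not blenDIC's actual exporter:

```python
import numpy as np

def intrinsic_matrix(focal_mm, sensor_width_mm, res_x, res_y):
    """Pinhole intrinsics K from Blender-style camera settings."""
    fx = focal_mm / sensor_width_mm * res_x   # focal length, in pixels
    fy = fx                                   # square pixels assumed
    cx, cy = res_x / 2.0, res_y / 2.0         # principal point at image centre
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

# A 50 mm lens on a 36 mm-wide sensor rendering at 1920x1080:
K = intrinsic_matrix(50.0, 36.0, 1920, 1080)
print(K[0, 0])   # fx ≈ 2666.7 pixels
```

Paired with each camera's position and orientation, matrices like this are what lets the DIC software relate pixels in the rendered (or real) images back to points on the sample.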
We have the design office, which worked on the simulation of a crash with a dummy. Both the displacement of the dummy, if I can play that again, yeah, both the displacement of the dummy and the deformation of the base of its seat were important to measure. So once they had the simulation, they went to the experimental office and told them: okay, we need you to design an experiment that measures the deformation of both the dummy and its seat. This is a kind of mock experiment that was made to prove the overall process. The experimental team was able to model their test in Blender: loading the geometry first, testing it, rescaling the speckle if needed, adding some environment and some digital cameras, one stereo system focusing on the dummy and one focusing on the seat, in order to measure what they wanted. They were also able to add some markers on the dummy, to serve as reference points in the scene. Once ready, the engineers could live-preview what they were doing based on the simulation files, render the images, and compare them to the actual simulation; hopefully they would find the same thing. And with this pipeline they could try different things: different lighting systems, different cameras, and so on.

There were quite some challenges in developing blenDIC for this kind of thing.
The first one was the ability to feed 3D image data from the simulation directly into Blender. EikoSim developed a converter to extract the data from the simulation, and I developed a custom importer that feeds this data directly into geometry-nodes attributes, in order to live-deform everything according to the scientific data. We also needed extra information in the way we handle digital cameras, stored in the bpy camera data, to remember the camera models we were using and, for example, which cameras were paired together into which stereo systems, as well as a custom depth of field that we developed, which is slightly different from the one in Blender. And the most interesting part was a custom render routine that takes heterogeneous digital cameras into account, mostly cameras with different resolutions, which is not something Blender natively does: here we might have different types of cameras, and we need to account for that when rendering.

So, back to the dummy: the simulation was done.
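To make two of those challenges concrete, here is a small stand-alone sketch with made-up data and names (the real code lives inside blenDIC and talks to bpy): first, the flattening step needed because Blender's attribute API (`foreach_set`) takes one flat float buffer per attribute; second, a mock render loop that swaps a single scene-wide resolution per camera, which is what a custom routine has to do since Blender keeps only one render size per scene.

```python
import numpy as np

# --- 1. Feeding simulation fields into geometry-nodes attributes ------------
# An (N, 3) displacement field from the FE converter must be flattened before
# attr.data.foreach_set("vector", buf) can push it into a point attribute that
# a Set Position node then reads. (Fake data for illustration.)
n_verts = 4
disp = np.array([[0.0, 0.0, 0.1]] * n_verts)   # every vertex moves up by 0.1

def to_attribute_buffer(field):
    """Flatten an (N, 3) vector field into the 3N float32 buffer foreach_set expects."""
    return np.ascontiguousarray(field, dtype=np.float32).ravel()

buf = to_attribute_buffer(disp)
print(buf.shape)                                # → (12,)

# --- 2. Rendering heterogeneous cameras -------------------------------------
# Mock loop; render() stands in for bpy.ops.render.render, and the camera
# list (names, resolutions) is purely illustrative.
cameras = [
    {"name": "stereo_L", "res": (2048, 2048)},
    {"name": "stereo_R", "res": (2048, 2048)},
    {"name": "overview", "res": (1280, 1024)},
]

def render(camera, frame, res):
    # In Blender this would set scene.render.resolution_x/_y first,
    # then render and save the frame for this camera.
    return f"{camera}_f{frame:03d}_{res[0]}x{res[1]}.png"

outputs = [render(cam["name"], frame, cam["res"])
           for frame in range(2)                # two time steps
           for cam in cameras]
print(outputs[2])                               # → 'overview_f000_1280x1024.png'
```

In the add-on, the same per-camera bookkeeping also carries the stereo-pair information stored on the bpy camera data.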
We could then go to the experiment, where you can see we have these four cameras. The environment is a bit richer than what the engineers would normally set up; that was mostly to show off for the conference. But what you can see is a LED panel and these four cameras targeting our dummy. Once ready, we could render the four sets of images. The rendering is mostly done in EEVEE; if we need more complex lighting, we can switch to Cycles, but usually, deforming the sample like that over 10 to 15 time steps and rendering the four cameras takes four or five minutes, basically the time for the engineers to go get a coffee. When they come back, they can process these images in their favorite software, in our case EikoTwin DIC, to measure the deformations of both the dummy and the lower part of the seat, and compare them to their simulation.

That was just one example of what we can do; I didn't go into too much detail about all the other possibilities blenDIC offers, because it would get a bit boring. But there are tons of plans for the future of this add-on. The first one is code refactoring, using more and more geometry nodes, especially for live-computing things like depth of field when you start moving cameras, as well as handling procedural environments, which can be useful for the engineers when designing what they want to do. We also plan on using the asset libraries quite extensively, so that users can build their own, with their own machines, their own cameras, and all the gear and accessories they may have, but also providing assets that commonly show up in the different tests engineers run in this industry. There are also plans for Cooper, the optimization toolbox, where we can for example determine camera locations and angles for a given quantity of interest, using routines that would
directly optimize the locations of the different cameras.

But there are many other examples of what Blender can do to help researchers and engineers in our field. Another project we've been working on uses a robot with a digital camera mounted on it. The idea is to lower the number of cameras used during a test and to add flexibility to the setup, moving the camera to the spots that are the most interesting in order to get the most relevant information. It's still an ongoing project, and there are still lots of things to work on; you can see the glitches, and for now we are just moving the camera to target one spot at a time. We're not even sure we'll be able to do all of that in one go; we may have to pipeline it with the other software packages. But that's the kind of thing we're working on.

So, thank you very much for your attention. Feel free to reach out during or after the conference if you have ideas, questions, or remarks. I would also like to thank the many people who directly or indirectly worked on and contributed to these projects, through code, debugging, or just cool ideas. Thank you very much, and enjoy the rest of the conference!