OK, I think we can start. My name is Vasili Shishkin. I'm an orthopedic surgeon, but I use Blender in my daily practice, and today I'm going to be talking about creating orthopedic surgical implants with Blender. I work at the Central Clinical Hospital of the Russian Academy of Sciences in Moscow. This is a picture of my hospital. Apart from being a general hospital, it is also a scientific institution, which gives us the opportunity to carry out research projects, and we currently have a project on 3D technologies in surgery. This is me at work. As you can see, we're using Blender in the operating room, helping surgeons make the correct decisions and bring the previously planned surgery to life.

Just a couple of words about orthopedic surgery. It's a branch of surgery that mostly deals with bones: bone fractures, deformities, and other diseases that affect the joints. Today we'll be talking about joint diseases and problems with joints. Usually, when there's a problem in a joint, any joint in the human body, the person eventually comes to joint replacement surgery. The joint gets replaced by a metallic structure that takes over its function, which reduces the pain and restores some mobility to the joint. But this approach has some issues. First of all, implants come in standard sizes that do not fit all patients, and sometimes the surgeon has to adapt the patient to the implant: he has to cut away more bone just to make the implant fit in place. They're made out of metal and they are not anatomical, meaning that they are meant to articulate, to move around, but they do not fully mimic the bone itself. And in 15 to 20 years a revision surgery is needed, so these implants do not last long.
With that in mind, the Russian professor Alexander Voronsov proposed his own solution to this problem: creating individual joint replacement implants using molding techniques. In his understanding, restoring the anatomy restores the function while preserving the anatomy of the joint. He performed a lot of operations, and this was basically his workflow. He took a plastic model of a bone, made a mold out of it, and cast into that mold the special cement that is normally used to hold standard implants in place. In that way he produced the implants that were later implanted into patients. As you see, they're quite anatomical; they look like the real bone. He wrote a book called Individual Joint Replacement Surgery and Art, and one of the questions he raised in that book is whether surgery is an art, and whether a surgeon needs to be an artist. He himself was a sculptor, so he used his sculpting techniques in his work to get the correct anatomical shape of the implant.

And this is where Blender comes in. Before we start to work in Blender, we need material to work on. So we make CT scans of the affected, problematic hand, leg, or arm, and then convert those into 3D models using specialized software. We create surface models, and the healthy model serves as a template that we can use to restore the anatomy of the affected one. On this slide you see that the green model is the healthy hand, mirrored so that it represents the right one, and the yellow one is the one that has the problem. At the 2015 Blender Conference I raised an issue about model alignment, because when we have these two models, we have to align one against the other to understand where the deformity arises. I got great feedback from the Blender community; people contacted me with different ideas, and eventually I came across an add-on for Blender called the ICP add-on.
ICP is an iterative closest point alignment algorithm that is used to minimize the difference between two point clouds or meshes: one object is held in place while the other gets rotated and translated around it so that it best matches the reference. It was introduced in the 1990s but only gained wider adoption in the 2000s. So this is what it looks like. It offers a semi-automated alignment process. The first option is picked point alignment, which gives you a rough alignment of two separate meshes, and the other is the ICP alignment itself. The add-on was written by Patrick Moore. He's a dentist himself, so he ran into the same problems that I had when planning surgery. I've included the GitHub link so you can go and check it out.

And this is how it works. The first step is the picked point alignment. You have two models, and you simply select corresponding points on each model in order to snap them together. This is a video showing the process. This is a really severe elbow dislocation and fracture, and that's the problem we see here: the bone is fractured. So we made a CT scan and took the healthy bone, shown in green, and we need to align the green one to the yellow one. We select both of them, choose the picked point alignment option, zoom in, and pick points that we know are not affected by the deformity, that are healthy. Usually three points are enough, but the more the better. You select them, hit Enter, and it snaps one over the other. This is quite rough, but you still get an understanding that the problem is somewhere around the elbow joint; it just needs a bit of fine tuning. So this is the next step: running the algorithm itself.
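To make the two steps concrete, here is a minimal sketch of both in plain Python, reduced to 2D for readability. This is my own illustration, not the add-on's actual code: `best_rigid_2d` is the closed-form rigid fit you get from picked correspondences, and `icp` repeats that fit while recomputing correspondences by nearest neighbour, which is the core of the ICP idea. The real add-on works on full 3D meshes inside Blender.

```python
import math

def best_rigid_2d(src, dst):
    # Closed-form least-squares rotation + translation between paired
    # 2D points (the "picked point" step): minimizes sum |R*p + t - q|^2.
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(q[0] for q in dst) / n; cdy = sum(q[1] for q in dst) / n
    sc = ss = 0.0
    for (x, y), (u, v) in zip(src, dst):
        ax, ay = x - csx, y - csy
        bx, by = u - cdx, v - cdy
        sc += ax * bx + ay * by          # cosine accumulator
        ss += ax * by - ay * bx          # sine accumulator
    theta = math.atan2(ss, sc)
    c, s = math.cos(theta), math.sin(theta)
    return theta, (cdx - (c * csx - s * csy), cdy - (s * csx + c * csy))

def transform(pts, theta, t):
    # Apply a rigid transform (rotation by theta, then translation t).
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + t[0], s * x + c * y + t[1]) for x, y in pts]

def icp(src, dst, iters=20):
    # ICP: alternate nearest-neighbour matching with the closed-form
    # rigid fit until the moving cloud settles onto the reference.
    cur = list(src)
    for _ in range(iters):
        matched = [min(dst, key=lambda q: (q[0]-p[0])**2 + (q[1]-p[1])**2)
                   for p in cur]
        theta, t = best_rigid_2d(cur, matched)
        cur = transform(cur, theta, t)
    return cur
```

This also shows why the rough picked-point step matters: nearest-neighbour matching only finds the right correspondences once the two clouds are already close, which is exactly what the manual snap provides.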
So what you do is use weight painting to select the region that is relatively healthy on each of the models, the affected one and the healthy one, and after that you execute the algorithm, which finds a way to align those bones in a really sophisticated manner. You see it moving one model over the other, and the blue part sticking out is what the replacement implant should look like. So this is the correct size and the correct shape of the implant to be made for this patient. Possible areas of use for this algorithm include dentistry and facial reconstruction surgery, but basically any field that needs to align one object to another: architecture, 3D scanning, anything. So I encourage you to check out the GitHub link and have a look at that algorithm; it's pretty interesting.

Now, the next thing to do after we have defined the shape and form of the implant is to somehow produce it, and here we come to 3D printing. Unfortunately, we do not yet have printable materials that can be implanted inside the patient's body. So we go back to the professor's casting approach: we need to make a mold. I did some research around the web and found a guy who had already done this, but he was working with a brain. They 3D printed a scaled brain model, made a mold out of it, and did some casting. The story had a kind of weird ending, because in the end the guy just ate his brain. I guess that was the whole point of the scientific research, but nevertheless it was a proof of concept for me that this could be done. Instead of printing a model and making a mold out of it in real life, though, I decided that the mold could be made in Blender, and that turned out to be relatively simple: the model was taken, and using Boolean operations and some primitives a mold was created and then printed out.
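The Boolean step is easy to picture as set subtraction: a solid block minus the bone leaves the mold cavity. Here is a deliberately toy sketch of that idea, where "solids" are just sets of integer voxel coordinates rather than real meshes; Blender's Boolean modifier does the equivalent operation on polygon geometry.

```python
# Toy illustration of making a mold with a Boolean "difference":
# a solid block minus the bone leaves the mold cavity. Solids here
# are sets of integer voxel coordinates, not real meshes.

def box(x0, x1, y0, y1, z0, z1):
    return {(x, y, z)
            for x in range(x0, x1)
            for y in range(y0, y1)
            for z in range(z0, z1)}

def ball(cx, cy, cz, r):
    # crude stand-in for the bone model
    return {(x, y, z)
            for x in range(cx - r, cx + r + 1)
            for y in range(cy - r, cy + r + 1)
            for z in range(cz - r, cz + r + 1)
            if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r}

block = box(0, 10, 0, 10, 0, 10)   # the primitive the mold is cut from
bone = ball(5, 5, 5, 3)            # the implant shape to subtract
mold = block - bone                # set difference = Boolean difference
```

In practice the mold is also split into two halves with a second Boolean cut so the cast implant can be removed, but the principle is the same subtraction.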
Now for some clinical cases. There will be some pictures of surgery, so if anyone's not comfortable watching blood, you can just close your eyes for a moment. In the first case, a patient had a tumor of the distal radius, around the wrist. This is basically what a CT looks like without 3D reconstruction: just 2D black-and-white slices that the surgeon has to scroll through, using his imagination to really understand the situation. On the right you see that there's some kind of big hole under the wrist. So we made a 3D reconstruction: the red one is the affected bone, the green one is the healthy one, and we aligned one over the other. The affected bone with the tumor had to be removed, obviously; then we created a mold, 3D printed it, and cast an implant from it. In the top picture you see the resected bone, the piece we removed from the patient's hand, and this is the surgery, the implant positioning. It snapped right in, as if it had always been there, with a tight fit. This is the postoperative X-ray showing good alignment, and this is the functional result. The patient may not have the best flexion and extension of his hand, but he regained strength. He was a car mechanic, so he was able to get back to work, and he was happy about it.

Now the other clinical case: a severe elbow fracture, a comminuted fracture of both forearm bones. We made a 3D reconstruction. If you're looking at this and thinking that you don't really understand what this thing is, you're not the only one, because many surgeons who take X-rays and CTs still do not have the full information about the injury and discover interesting things during surgery, so this has to be really carefully planned. So we made a mold in Blender, where we also modeled a fixation stem to hold the implant in place, and tried the surgery virtually. Then we printed the mold and cast the prosthesis. We used approved surgical cements.
In this case it was a specific cement that's non-toxic, bone-like, and integrates into bone, and we used a stem made out of biodegradable plastic that dissolves over time, so in a couple of years the patient simply ends up with the implant held in place without that plastic stem. This is the comparison of the bone that was removed from the patient with the created implant. As you can see, the bone does not have a really even shape; it's a kind of oval, triangular-like thing, but the implant fully restores that. This is the X-ray. Unfortunately, the implant is not visible on X-ray, but you can see the space where it's positioned. And this is the most interesting part: the thing works. This is rotation inside the elbow. We tried that during surgery and it works fine. Within one month the patient had good function of the elbow: flexing, extending, rotating. She's 74 years old; she was able to get back to her normal life and was not disabled.

So the creation of individual surgical implants gives the surgeon a better understanding of the problem, restores the anatomy of the bones, enhances the precision of surgery, and gives good functional results. But we've run into some issues, and those are related to segmentation. Segmentation is the transformation of 2D CT slices into a 3D model. Usually it's performed in dedicated software such as Slicer or InVesalius and a couple of others. It takes time and produces a big STL file that's really hard to work with, and it uses an algorithm called marching cubes. Now, there is a possibility of visualizing DICOM data in Blender. I've tried this myself and talked to people doing it, but apart from visualization you cannot do anything more with it. You can scroll through the slices and look at them, but we need a 3D model.
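Since marching cubes keeps coming up, its 2D analogue ("marching squares") may help show what the algorithm actually does: it walks a grid of scalar values (for us, CT densities) and emits geometry wherever the field crosses a chosen threshold. The sketch below is my own illustration, not the Blender implementation; the 3D version does the same per voxel cube and emits triangles instead of line segments.

```python
def marching_squares(field, iso):
    # 2D analogue of marching cubes: walk every grid cell and emit a
    # line segment wherever the scalar field crosses the iso threshold.
    segs = []
    for i in range(len(field) - 1):
        for j in range(len(field[0]) - 1):
            a, b = field[i][j], field[i][j + 1]          # top corners
            d, c = field[i + 1][j], field[i + 1][j + 1]  # bottom corners

            def cross(p, q, v0, v1):
                # linearly interpolate the crossing point on one edge
                t = (iso - v0) / (v1 - v0)
                return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

            pts = []
            if (a < iso) != (b < iso):
                pts.append(cross((j, i), (j + 1, i), a, b))
            if (b < iso) != (c < iso):
                pts.append(cross((j + 1, i), (j + 1, i + 1), b, c))
            if (d < iso) != (c < iso):
                pts.append(cross((j, i + 1), (j + 1, i + 1), d, c))
            if (a < iso) != (d < iso):
                pts.append(cross((j, i), (j, i + 1), a, d))
            if len(pts) == 2:
                segs.append((pts[0], pts[1]))
            elif len(pts) == 4:  # ambiguous saddle cell: pair arbitrarily
                segs.append((pts[0], pts[1]))
                segs.append((pts[2], pts[3]))
    return segs
```

Running this on a CT-like grid with the iso value set at bone density would trace the bone outline on each slice; stacking slices into cubes gives the surface mesh that the segmentation software exports as STL.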
I've come across a marching cubes implementation made for Blender (there's the link below), but unfortunately I haven't managed to get it working correctly, so I'm calling out for help. If anyone knows how to get this working, or has any other suggestions for solving this problem, please get in contact with me. I'll be around the conference, and my contact details are below. Hopefully the solution to this problem will get us one step closer to creating a fully functional orthopedic surgery add-on for Blender. Thank you for your attention.