So now we can move on to the last presentation of today. It's my pleasure to welcome Hernán Morales, who is currently working at Philips but used to work here at UPF some time ago. He is now responsible, in the Philips research group in Paris, for modeling of fluids and modeling of solids. So please, Hernán. Thank you. Can you hear me well? Thank you for the invitation, and I also thank the previous presenters, because their talks bring up a lot of the elements that I am going to show and share here. So today I'm going to talk about Eulerian and Lagrangian solvers for cardiovascular problems. In this presentation I will go through a general overview of cardiovascular modeling, then briefly through a first example from my previous work on neurovascular flow, and then some ideas about cardiac flow. As you have seen during this summer school, the cardiovascular system is an organ system whose goal is to permit blood circulation and the transport of nutrients, oxygen and waste all around our body. It provides nourishment, helps fight disease and, as we have also seen, maintains homeostasis; the study of blood flow is what we call hemodynamics. We have seen as well during this summer school that the system responds to stimuli: each time you start running, muscles start to demand more blood, and arteries react to that new hemodynamic condition; if you start thinking, more blood goes to your brain; if you start eating, blood goes to your belly. It is also associated with disease, as we have seen in other classes, and as I am going to show you with brain aneurysms. To study hemodynamics we have mainly three different approaches. One is pure theory: we have physical principles that we can express mathematically. We also have pure experiments; we have seen a couple of talks with PIV, for example, which provide just one version of reality but are limited by what the devices are able to measure. And then we have numerical modeling.
And what can numerical modeling offer us? Basically, we have full control of the experiment, which is very nice: I can set up my boundary conditions, I can change my flow rate, I can measure wherever I want to measure. It's cheap, it's portable, and I can test unfeasible conditions. For example, I can make the heart beat faster or slower, I can place a device, I can test a drug, and there will be no harmful effect on the patient, because it is a virtual patient. And of course it's reproducible. How I use modeling in particular is, first, to better understand the physics. In the first example you will see that although we consider Navier-Stokes as the governing equations for flow as a continuum, the effect of those equations in complex anatomies, when we have a stenosis or an aneurysm, is still unknown. Second, to provide virtual data so that other tools can be developed on top of it. It brings in new information, like the pressure or the stresses on the wall, and it also brings what we were showing yesterday with the FDA: the prediction of the behavior of an endovascular device when it is placed in an anatomy. So what is the recipe for flow modeling? Of course we need to define a domain for the flow, where it moves; in case we are interacting with a solid, we also need the solid domain. We need boundary conditions, so pressure or flow rate, or displacements or stresses if FSI is included. And we need material properties: solid properties, the blood density, and whether I consider my blood as a Newtonian or non-Newtonian fluid. Regarding the fundamental equations, in this case we are solving the flow transport equations: the mass and momentum conservation equations. For a continuum we need to discretize, as was previously mentioned, and this brings numerical modeling into the scene. I want to highlight two big families: the Eulerian approach, which was already very well explained by Professor Oñate,
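The "recipe" just described (a flow domain, boundary conditions, material properties) can be sketched as a minimal configuration object. This is purely illustrative: all class and field names are my own assumptions, not the interface of any real solver, and the numeric values are typical literature figures for blood, not study data.

```python
from dataclasses import dataclass, field

@dataclass
class FluidProperties:
    density: float = 1060.0     # blood density [kg/m^3], typical literature value
    viscosity: float = 0.0035   # dynamic viscosity [Pa*s], Newtonian assumption

@dataclass
class BoundaryCondition:
    surface: str                # e.g. "inlet", "outlet", "wall"
    kind: str                   # "flow_rate", "pressure", "no_slip"
    value: float = 0.0

@dataclass
class SimulationSetup:
    domain_mesh: str            # the discretized flow domain (hypothetical path)
    properties: FluidProperties = field(default_factory=FluidProperties)
    boundary_conditions: list = field(default_factory=list)

# A hypothetical set-up for an aneurysm case: one inlet flow rate,
# one reference outlet pressure, and a rigid no-slip wall.
setup = SimulationSetup(
    domain_mesh="aneurysm_mesh.vtu",
    boundary_conditions=[
        BoundaryCondition("inlet", "flow_rate", 4.0e-6),  # ~4 mL/s in m^3/s
        BoundaryCondition("outlet", "pressure", 0.0),
        BoundaryCondition("wall", "no_slip"),
    ],
)
```

The point is only that every ingredient in the recipe has to be stated explicitly before any equation can be solved.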
which is basically a discretization fixed in space, widely used and mainly used for flow modeling. On the other hand, we have the Lagrangian approach, where the numerical discretization is attached to the material. To exemplify this (I will regret it in the next slide, but in my view it works): we are riding a bicycle and we want to know our velocity, pressure and temperature. If I take my phone and turn on the GPS, then at each position I can measure and evaluate my temperature or my velocity. This is the Lagrangian approach, and it is very intuitive. On the other hand, someone else could be measuring what is going on: you define, okay, from this tree to this tree someone is passing, and I am going to take note of the temperature, the position and the velocity of that person. So forget the previous slide and focus now on the standard way of looking at the discretization. We have a fixed mesh, and at each of these nodes I evaluate my velocity or my property. In the Lagrangian approach, this mesh can deform, and that brings two examples: finite elements and meshless methods. To go from one to the other: first, here I am writing the Navier-Stokes equations in the Eulerian formulation, the mass conservation equation and the momentum conservation equation. There we have the local derivative and the convective derivative. Local means, for example, speeding up at a certain point while biking; and convective could be, for example, a slope, or a sunny area that warms me up. These two terms together constitute the variation, in this case the acceleration. On the right-hand side, we have the gradient of the pressure, the viscous term, and the body forces, in this case gravity. If we use the material derivative, we can go from the Eulerian formulation to the Lagrangian formulation: the local derivative and the convective derivative are then combined into one single term.
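The equations described above can be written out as follows, for an incompressible Newtonian fluid (with $\mathbf{u}$ the velocity, $p$ the pressure, $\rho$ the density, $\mu$ the dynamic viscosity and $\mathbf{g}$ gravity):

```latex
% Eulerian form: local + convective derivatives on the left-hand side
\nabla \cdot \mathbf{u} = 0,
\qquad
\rho\left(\frac{\partial \mathbf{u}}{\partial t}
        + (\mathbf{u}\cdot\nabla)\,\mathbf{u}\right)
  = -\nabla p + \mu \nabla^{2} \mathbf{u} + \rho\,\mathbf{g}

% The material derivative groups the local and convective terms into one:
\frac{D \mathbf{u}}{D t}
  = \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}

% Lagrangian form: the convective term is absorbed because the
% discretization follows the material
\rho\,\frac{D \mathbf{u}}{D t}
  = -\nabla p + \mu \nabla^{2} \mathbf{u} + \rho\,\mathbf{g}
```

In the bicycle analogy, $\partial \mathbf{u}/\partial t$ is the speeding up at a fixed point, $(\mathbf{u}\cdot\nabla)\mathbf{u}$ is the change I experience by moving along the slope, and $D\mathbf{u}/Dt$ is what my phone's GPS sees directly.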
So the question is: okay, we have a problem; which technique should I use? Obviously the answer is that it depends on the real question. What is the question I want to answer? Is it a fluid problem, a solid problem, a heat-transfer problem, a combination? Is it static or dynamic? Which modality is providing the data: ultrasound, CT, MR, 3D rotational angiography? Do I really have the patient anatomy? Do I have the boundary conditions? So here are two examples: one on neurovascular flow and one on cardiac flow. The first one I am going to explain is how we use an Eulerian approach to solve the Navier-Stokes equations for aneurysm hemodynamics. The clinical question is: the physician would like to know how many flow diverters are required to ensure the success of the treatment. I am going to explain the technique and the pathology, in case you are not familiar with it. Aneurysms are abnormal dilatations of an artery. We still do not know well the reasons why these dilatations appear. They are usually asymptomatic; we can find them at the base of the brain, at the bifurcation points. The problem is that when they break, it can be lethal. So let's say the physician decides that the risk of rupture of the aneurysm is higher than the risk of the treatment, so we are going to treat it, and in particular we are going to use endovascular treatments. The goal of these treatments is to isolate this abnormality from the blood circulation by triggering the coagulation cascade. In the first example, we have endovascular coils. Imagine the aneurysm is a room with no chairs: you can enter, move around freely, and eventually leave. But if we put some chairs in the middle, it is going to be more complicated to go through the room, and eventually you will want to sit down and stay there forever, and clot.
So that would be the first one. In the second one, the room is still a big, wide-open area, but the chairs are not fixed, so they can move and eventually leave. That is a complication, so I need to put some bodyguards to block people from taking chairs out of the room. That would be the second one, a combination of stent and coils. And the last one: I can put many bodyguards at the entrance, so it is not that easy to enter, and if I manage to get through and enter the room, it is again empty, but then it is also costly to leave. That is the flow diverter, the therapy I am interested in. As I said, the success of the therapy is going to be highly dependent on the hemodynamic outcome. Here I have an example of high velocity and high wall shear stress for the untreated condition. If I put, for example, one row of chairs, I am going to get a certain reduction, but it might not be sufficient. But if I have plenty of chairs, then the hemodynamic forces are going to be lower, and hopefully that is going to trigger the coagulation cascade. So this is the scenario: we have an aneurysm, we decide to treat it, and from the available treatments we choose flow diverters. The advantage of the flow diverter is that it is just one device, so I do not need to put the physician under the stress of handling many tools, and I can also shorten the intervention; but it is not clear how many flow diverters I should put. This is the question from the physician. What we see on top is contrast being injected; this is an X-ray image, so we can see the contrast projected, before the treatment. Then after the first flow diverter, this is what we see, and the physician is actually comparing these two images and, surprisingly, scoring them: okay, now it is halfway filled; they rank it and say whether this is all right or not.
Then eventually I put in a second flow diverter, but they do not know how many are needed, and of course these devices are costly. From a hemodynamic point of view, from a mechanical point of view, this is actually scary to me, because the amount of contrast that is able to enter the geometry is related to the amount of flow going through the artery, and what happens is that the catheter inside is creating some damage to the wall; the wall is biological tissue that reacts, maybe by reducing its diameter. So the flow before and after the flow diverter has changed; we do not know by how much, but it has changed, so we need to compensate for that, and we do not actually know how to compensate. This is the available data: we have very nice image resolution, probably one of the best, 0.3 millimeters, although we do not have temporal resolution for the 3D anatomy; and then we have a sequence of contrast being injected, which is a 2D projection. The assumption we make is that, because in the brain the arterial diameter does not change much, we consider this a rigid wall, and based on this spatial resolution I say, okay, here is my wall. This is very important, because it allows me to make the rigid-wall assumption, and especially to use an Eulerian approach; most of the studies in this field use this approach, not a Lagrangian one. As I said, the goal is to quantify the effect of the flow diverter before and after. How does simulation support that tool? We ran a lot of simulations on many, many patients, following this pipeline: from the image we segment, we limit our region of interest, we do the mesh generation, solve the equations, and do the visualization; and since eventually we need to put a flow diverter in at some point, we deploy it virtually, remesh, and run the simulation again. I am going to go through three articles. The first is on the pulsatility of the flow. For that, when we started this
project four years ago, we took just seven vascular models, and what we did was vary the flow rate. As I said, we have full control and can test unfeasible conditions with numerical models; we cannot ask the patient, could you speed up, because I want to see the increase of blood flow in your artery. We cannot do that. So what we did was scale this flow rate up and down; we did not change the shape of the waveform, only, as I said, the mean flow rate. What you see on the left is velocity and on the right is wall shear stress. The important thing is that this variation corresponds to different segments of the artery, this line at the bottom corresponds to the velocity in the aneurysm, and here is the wall shear stress; and these are two cases with different flow rates. So this is more or less what we can get out of the simulation. You see that when peak systole arrives, the velocity is maximum in the artery, but the flow takes some time to fill the aneurysm. We were curious about this effect, and what we observed is that the flow is very slow. You can see on the top the wall shear stress: the first image on the top left is the time instant when the flow rate is maximum in the artery, and the next image is when the wall shear stress is maximum. What this is telling me: imagine again the exercise of the room. I am at the entrance at maximum speed and I enter the room; I need to come here, touch the wall, and then leave. The time that it takes, and the energy that I carry, have been delayed because I need to cover all this distance. Then imagine someone much faster than me running: that would be the second scenario, someone so fast they can almost instantly touch the wall and produce the maximum wall shear stress, and this is what happens here. So what we report is that the wall shear stress and the peak velocity depend on the size of the room, but also,
as you see by the dots (bigger dots mean bigger aneurysms), on the flow rate in the main artery. As a conclusion: previously in this field we had been stressing the importance of the morphology, whether you have a bleb or a small aneurysm, which has an enormous impact on the hemodynamics, but we were really forgetting the flow rate, the arterial flow rate. This adds an additional dimension, but we cannot avoid it, because this is a biological problem. The other important thing: peak systole, which we believed was the time when the maximum stresses and velocities occur in the aneurysm, is not necessarily that time; it might be the case, but not always. So we need to revise the literature that has been produced over the years. And the main conclusion for us, as an industrial application, is that including pulsatile flow in our considerations is very complex, so we went to something much simpler: time-averaged results. We included a few more cases, so now we have 15 aneurysms, the same setup but time-averaged, and what we observed is something very interesting. Here on the horizontal axis we have the flow rate, and on the vertical axis the velocity and the wall shear stress. The interesting thing is that we can draw a line for the velocity and analytically represent the relationship between the arterial flow rate and the velocity inside the aneurysm, and likewise for the wall shear stress and the pressure. On top of that, we showed that no matter what the shape of the waveform is, this law, this relationship, is preserved. So as a conclusion of this study, the velocity, the wall shear stress and the pressure can be characterized with a linear or quadratic relationship, beyond any patient-specific flow rate measurement; because even if I am able to measure the arterial flow rate of a patient at a certain point in time, that measurement is time-dependent, and it is not going to be the flow rate of that patient later.
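The kind of characterization described above can be sketched as follows: intra-aneurysmal velocity scaling linearly with the arterial flow rate Q, and wall shear stress scaling quadratically. The coefficients `a`, `b`, `c` below are purely illustrative placeholders, not values fitted to the study's data.

```python
def aneurysm_velocity(q_arterial, a=0.12):
    """Linear law: v = a * Q (Q in mL/s; coefficient illustrative)."""
    return a * q_arterial

def aneurysm_wss(q_arterial, b=0.02, c=0.05):
    """Quadratic law: WSS = b*Q^2 + c*Q (coefficients illustrative)."""
    return b * q_arterial**2 + c * q_arterial

# The point of such laws: one curve characterizes the aneurysm over the
# whole physiological range, beyond any single flow-rate measurement.
flow_rates = [1.0, 2.0, 4.0, 6.0]                     # arterial Q [mL/s]
velocities = [aneurysm_velocity(q) for q in flow_rates]
```

Once the two or three coefficients are identified for a given aneurysm, any later change in the patient's flow rate (coffee, exercise) maps onto the same curve, which is what makes cases comparable.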
Again, if I drink coffee, more blood will go to my brain and that flow rate will change. So this curve goes beyond any patient-specific flow quantification, nicely describes the whole hemodynamics, and allows me to compare between cases. The third thing we did was to introduce the stent, the flow diverter. We now have 25 aneurysm models and 9 pulsatile waveforms; do the math, that creates quite a lot of simulations. In order to be able to do this amount of simulations, we took an implicit approach for the stent: instead of the classical approach of defining where the wires are located, we used a porous medium. Thanks to that, as a result, you see on top different flow rates for two cases, before the device is placed and then after the porous medium is turned on. You see that the vortices in the aneurysm are reduced and the flow is not entering as much as before. We evaluated this again, this time focused just on the velocity, because it is actually what I can measure, not the wall shear stress or the pressure, and this law also applied in the stented condition. Having that, we defined the efficiency of the flow diverter with this equation, basically the reduction in the velocity. As you can see in the first plot on the left, this efficiency depends on the mean flow rate; at low flow rates the efficiency of the flow diverter is going to be higher, but when I move towards physiological flow rates, which are around four milliliters per second in the internal carotid artery, this variation due to the flow rate disappears. And this is what we bring, using numerical modeling, to the tool that we developed. So this is more or less the understanding of the physics, and on top of that, we produced some virtual angiograms: we set a given position and orientation of a virtual X-ray source and produce a virtual angiography that goes into our optical-flow technique. I am not going to talk about that, just to
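The efficiency definition mentioned above, the reduction in velocity, can be sketched in a few lines. The talk only states "the reduction in the velocity", so the exact normalization below is an assumption; the variable names are mine.

```python
def flow_diverter_efficiency(v_untreated, v_treated):
    """Relative reduction of mean intra-aneurysmal velocity:
    1.0 means the device fully suppressed aneurysmal flow,
    0.0 means no effect (assumed form of the definition)."""
    if v_untreated <= 0.0:
        raise ValueError("untreated velocity must be positive")
    return 1.0 - v_treated / v_untreated

# Example: mean velocity drops from 0.30 m/s to 0.12 m/s after stenting,
# giving an efficiency of about 0.6 (illustrative numbers).
eff = flow_diverter_efficiency(0.30, 0.12)
```

With the linear velocity law, both velocities scale with the arterial flow rate, which is consistent with the observation that the flow-rate dependence of the efficiency fades near physiological flow rates.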
save time, but basically with this set of data we can evaluate where this tool is capable of being accurate or not. Here is one more example: because it is a 3D object, we can orient it how we want; on the left untreated, on the right with the flow diverter, for two different flow rates. So the question is, okay, can we compare? Well, directly, no; we need to compensate based on the arterial outflow. And the end product, which now actually has FDA approval, is as follows. We have the untreated condition on the left; we do two measurements, one in the artery and one in the aneurysm. After the treatment we do the same quantifications in the same regions, and for each of them we define the MAFA, the mean aneurysm flow amplitude. It is not the velocity, because again it is the projection of all this contrast inside; we normalize it by the arterial outflow rate. There are now clinical studies, in collaboration with physicians, pointing out that if this MAFA ratio is below 1, it means the flow has been reduced; if it is above 1, it means I am actually increasing the flow, which can happen. After having had this tool for a while, we are identifying that below a certain point, I think at a MAFA ratio of 0.6, we can expect occlusion of the aneurysm at six months. To summarize, this is a message that I would also like to give to you: this is how I see numerical modeling and the gap with respect to the physician. We have the patient; the doctor does certain measurements, and where those are not sufficient, asks for images, and of course he understands the images. We are also getting information from the images, and we are doing our modeling; but if I do this, I need to teach the physician to understand what it is all about, to understand pressure, to understand all these vectors and colors. I need to teach, I need to train, and
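The MAFA-ratio logic described above can be sketched as follows. This is a paraphrase of what was said in the talk, not the exact definition used in the clinical tool: the function names, the normalization and the example numbers are all assumptions.

```python
def mafa(aneurysm_flow_amplitude, arterial_outflow_rate):
    """Aneurysm flow amplitude normalized by the arterial outflow rate
    (assumed form; the arterial flow may change between acquisitions,
    which is exactly why the normalization is needed)."""
    return aneurysm_flow_amplitude / arterial_outflow_rate

def mafa_ratio(mafa_pre, mafa_post):
    """Ratio < 1: flow reduced by the device; > 1: flow increased."""
    return mafa_post / mafa_pre

pre = mafa(2.0, 4.0)     # before treatment (illustrative values)
post = mafa(0.8, 4.0)    # after treatment
ratio = mafa_ratio(pre, post)
# Threshold reported from clinical follow-up in the talk:
occlusion_expected = ratio <= 0.6
```

The normalization is the key step: it compensates for the fact that the arterial flow itself may have changed between the pre- and post-treatment acquisitions, so the ratio reflects the device and not the patient's momentary physiology.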
that is expensive, and you heard from other presenters that it is not straightforward. Physicians ask, could you make 4D MR look like color Doppler, because that is what they are used to; maybe we have to do the same for modeling. So what we are trying to do is this: don't say that we are doing modeling, don't show all these complex equations, but just give the information back in the image, so the physician is still able to understand what is going on. Summarizing this very quickly: as I showed, I use an Eulerian approach to understand hemodynamics; we provide ground-truth data for testing an image-based tool for blood flow quantification and aneurysm flow assessment; and for that, something very important, we needed to perform hundreds of simulations, and still we are not capable of giving those tools to the physician. My second example is more ongoing work; I am quite new to the cardiac flow field. The idea is that now we have different types of images, especially ultrasound, sometimes CT and MR, as we have seen, and we would like to analyze, for example, vortical structures, and quantify bulk recirculation, these kinds of things. This is our organ; we know it quite well, and we have these kinds of images, for example 2D flow. The important thing here, when we do these kinds of images for cardiac assessment, is that this is a highly dynamic problem, and the spatial resolution, whether in 3D or 2D, is not as good as in the previous case, so making an accurate segmentation of the wall is not as easy as before. This is a snapshot of the Philips heart model, capable of providing up to four chambers, as you can see on the right, but it needs to make certain assumptions regarding, for example, the trabeculae and the chordae, or the position of the valves, because it is based on model-based segmentation. Having that, as a community, again we like to use numerical modeling to see what is going on inside the cardiac
cavity. What I am going to show you is how Eulerian and Lagrangian methods have been used to understand cardiac flow. This is not my work; as I said, I am new to this field. Basically, what we would like to have, again, is flow characterization: vortical structures, mitral regurgitation, for instance. It is a more complex problem, highly dynamic, where we integrate fluid, solid and eventually electromechanics or electrical activation. We have mainly ultrasound, sometimes CT when a treatment needs to be planned, and rarely MR; and for the patient data, we believe we have the 3D anatomy, and we believe we also have 2D flow quantification if we use ultrasound. As I said, this is not my work, but it is interesting and important to highlight. For example, with a numerical solver using the Eulerian approach, so a fixed mesh, the group of Vedula and Mittal proposed using the immersed boundary method to consider the deformation of the, sorry, of the left ventricle, in order to simulate the flow inside. What they say, for example, is that if we consider the trabeculae, they observe that the trabeculae help the flow go deeper into the left ventricle; this is what we see during diastole in the third image on the top right, and at the bottom is without the trabeculae. Again, we can simulate unfeasible conditions. And during ejection, the trabeculae also help to squeeze the flow out of the heart; this is what you see in the first row. The second approach uses, again, an immersed boundary for the valve leaflets, and an Arbitrary Lagrangian-Eulerian approach to consider the wall deformation, and the conclusion is mainly that wall deformation is important and we can get more physiological flows, as we observe in reality. The third example we would like to bring is a true Arbitrary Lagrangian-Eulerian finite element approach, from a Japanese team that is able to combine cell activation of the heart and the flow
inside the chambers, and what they say is that this tool can be powerful for establishing a link between molecular abnormalities and clinical disorders at the macroscopic level. This is not an animation, it is a very nice simulation that you can find on YouTube; it is quite impressive what they are capable of doing. More recently, in this map I showed you from the patient to the doctor, there is an effort to reduce the computational cost. Two review papers suggest that in order to have a ready-to-run CFD simulation for cardiac modeling, we require around 20-30 hours of work. That is quite expensive, considering that we need to run thousands of simulations if we really want to understand what is going on in these chambers. So one of my students, Alexander, proposed this pipeline to speed things up and reduce these hours to a couple of minutes: by doing a non-rigid registration, you take a generic mesh and adapt it. This is what you can see here: a generic mesh, take the ultrasound data, combine both, and then perform the simulation. Here you see how this mesh was deformed, now taking into account the ultrasound data, which is patient-specific. On top of that, we can run the flow simulation; this is published, and it will be shown at the next FIMH conference. Now some more examples, in this case of a combined finite element and SPH approach. They compare what happens to the stresses and the deformation of the valve when the flow is included. The usual assumption is: I impose pressure on the faces of the leaflets, and since I do not know the distribution of the pressure, I apply the same constant value over the whole surface. But if I have the flow, I can have a better distribution of this pressure along the surface. They compared that, using SPH for the flow, and they conclude there is a substantial difference between using finite elements alone and using this combination. And finally, just recently, for
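The mesh-personalization step in the pipeline above can be illustrated with a toy warp: a generic mesh is deformed toward patient data by interpolating known landmark displacements. The real pipeline uses non-rigid registration; here I only sketch the idea with simple inverse-distance weighting, and every name and value is illustrative.

```python
import numpy as np

def warp_mesh(vertices, src_landmarks, dst_landmarks, eps=1e-8):
    """Move each vertex by an inverse-distance-weighted average of the
    landmark displacements (a stand-in for non-rigid registration)."""
    disp = dst_landmarks - src_landmarks          # landmark displacements
    out = np.empty_like(vertices)
    for i, v in enumerate(vertices):
        d = np.linalg.norm(src_landmarks - v, axis=1) + eps
        w = 1.0 / d**2                            # closer landmarks weigh more
        out[i] = v + (w[:, None] * disp).sum(axis=0) / w.sum()
    return out

# Toy example: a 3-vertex "generic mesh" in 2D, and two landmarks that the
# patient data shifts uniformly by +0.1 in x.
generic = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
src = np.array([[0.0, 0.0], [1.0, 1.0]])
dst = src + np.array([[0.1, 0.0], [0.1, 0.0]])
warped = warp_mesh(generic, src, dst)
```

Because the warped mesh inherits the connectivity and labeling of the generic mesh, the flow simulation can be set up on it immediately, which is where the hours-to-minutes saving comes from.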
example, a couple of years ago, people coming from the animation field integrated pure SPH to see the flow in the left ventricle; in this case the data comes from CT, and they claim they are now capable of simulating one cardiac cycle in 30 minutes. So again, coming back to this diagram: I think we are far from teaching the physician to use these models, and for cardiac flow modeling I think we are not even in the state of providing data back through the image, because the simulation is still too complex and too expensive to compute. As a conclusion of this second part: cardiac modeling is very, very complex; we have flow dynamics and solid mechanics; we need to make simplifications in order to speed up while still providing reasonable results; the optimization of the pipeline, and the new numerical methods that are starting to pop up in the biomedical field, are interesting options, especially for cardiac flow using ultrasound. For me the key message is: what is your question, what do I want, and especially, where is your input data and where is your wall? With that, thank you very much.

Thank you very much for this nice overview of different methods for different applications. I think it is really important to choose the right method for the right question. Now, you also say one of the big problems is this large computational time. Do you think it is related to the way we do it? When you go to faster methods, do you lose something in accuracy or in flexibility? What are the trade-offs? Is it just that we have not been using the right methods, or do we have to choose what is important and what we can sacrifice?

I think that, as engineers, we want the highest accuracy, especially in academia; we really make efforts to guarantee that the numerical scheme we choose is capable of providing the best. But in industry, the accuracy required is not the same accuracy as we
want in academia. So I would say it is because we are pushing methods towards the highest accuracy, accuracy that tries to be comparable with the analytical solutions we can provide, but not the accuracy that is required for a clinical application. That is why, for example, the use of SPH for flow is very attractive to me: it provides an overview of the whole flow, but it is weaker in terms of accuracy near the wall, or if I want to calculate the pressure drop accurately.

And when you look at an industrial, commercial model to do these things, what do you think is best: to simplify it in such a way that a clinician can run it on an ultrasound machine or on a computer, or rather, as some people are doing, a service that uses high-performance computing, where you submit images and get the results back?

I think the first option is the way to go, though even that should not require user interaction. The other one, let's say sending the data to a cloud service where there is an engineer behind it running the simulation for you and sending it back, is nice but dangerous, because there can be a misunderstanding or a lack of communication between the two parties. Take the case of Philips trying to predict the position of a TAVI device to replace a calcified aortic valve: if the physician considers two positions, they need to provide both to the company, the company needs to test both and come back, and then the physician says, ah, but actually I made a mistake with the first one, I cannot do that in reality, I need to modify it, so could you do it again, and again, and again. Each simulation takes time, and the engineer might not even be in the same time zone, so it can be even more complicated than that.

Any other questions? There is a question.

Thank you so much for this very nice talk. Following on the previous question, I was thinking: of course,
once you have everything set up, the simulation just runs and gives you the same result if you use the same parameters. But I guess there is some sort of user-dependent part, which is how you generate your anatomical model from the image. You showed, for example, this nice tool that, semi-automatically or automatically, does a segmentation of the heart. What do you think would be the impact of different users producing this data, or of the data coming from different images, because I guess different clinicians will also acquire the images in different ways? Do you think this variability in the input data can produce significant differences in the output of the simulation, or does CFD have some mechanism to compensate for that?

Well, I completely agree with you: if my segmentation tool is user-dependent, the simulation is going to give me different results, and that is actually my argument for using less accurate models for flow, because knowing where the wall really is has not been solved yet. For brain aneurysms, I would say we are in the best situation, because we have the best spatial resolution and the segmentation algorithms work quite well, some of them automatically. But it has been reported that even with different image modalities you get a slightly different anatomy, and then, even if you do a mesh independence analysis and get very nice wall shear stress, the error comes from something you cannot control, which is the image acquisition. So whether you use MR, CT or 3D rotational angiography really introduces some error in where the wall is, and we are talking about modalities with higher spatial resolution than ultrasound. That is why, although before I was working a lot on wall shear stress, because, as we heard in a previous presentation, it is very important for understanding atherosclerosis, for example, and from it people derive the OSI, from an
engineering point of view that is fine, but really, when you talk with the person just before you in the chain, the one who provides you the image, then you start realizing, and this is what I mean, I only realized this in the last years because I am collaborating with people who work on image processing, that the output of that is not perfect, not a unique solution. Thanks.

Any other questions?

In your experience so far, do you think that right now we are at a point where, as researchers, as engineers, we are obtaining too much data, too much information from the simulations, and it is a question of the clinicians starting to understand how they can benefit from it? Or are there still some important things that the clinicians are asking for but we are not there yet, important questions that we still have not addressed with simulations?

You make a good comment. As engineers, we believe, for example, that wall shear stress or pressure are what produce the rupture or failure of a tissue, and there is no proof, as far as I know, that those are the biomarkers required to establish the initiation, development or rupture of an aneurysm. So indeed, we are providing more data than they can digest, without the proof that those biomarkers are relevant. And I forgot the second part of your remark.

Whether there is still a very important question, a very important doubt, that the clinicians have, for example in stent placement, where we are not there yet: we are able to generate a huge amount of data, but we still cannot give an answer to that particular question, so it requires more work.

There are many questions. I would say it depends on the physician, first of all. You have to be careful, especially in industry, because you collaborate with a limited number of people, and those people have a certain expertise. So, for example, you go to France, where people are real experts in placing flow diverters, and they want endovascular
tools that help them to predict the positioning; whereas if you go work in Finland, they open the head and then put a clip there. So it is a different question and a different interest; you need to be careful, try to be as general as possible, and not be biased by the recommendations of a few physicians.

I think this is a very interesting issue; we have already touched on it a couple of times, because one of the problems is that, as a simulation person, you can indeed start from the physics and provide all the things that you think are relevant, while the clinician will tell you, I am using this parameter, make it easier for me to measure that parameter. But neither what the engineer does on their own, nor what the clinician is doing, might be the best way. So this interaction, a very close interaction where the knowledge of each of them is put together, and then supported by evidence, because in the end you need to provide the evidence, I think that is the way forward. I think there is still a huge number of questions which have not been answered; the problem is also that these questions have not even been formulated in some ways, and it is by very, very close collaboration, as Javier Bermejo also said at some point: you really need to go as a modeler into the clinic. Unless you develop methods, which is something different, when you want to go to biomedical applications you really have to collaborate closely. It is not enough to do a simulation and show it to the clinician, or to have the clinician say, simulate this for me. It really has to be a very close interaction: you have to understand the biology, understand the question, why is the clinician asking this, and then also try to explain to them, probably in an easy way, by integrating it in images or whatever, what the added value of these kinds of modeling technologies is. And this is something which I think is still lacking and is very, very crucial for our community.

Any other questions? Okay, thank you very much; then that finishes
today's morning session. Thank you.