So now we go one step further towards the cardiac models, and how cardiac, and especially multi-scale, models can work on this. And unfortunately we also asked her to include machine learning, hoping that this would be the only talk with machine learning, but as we know, reality... Chrissy?

So thanks, Bart, for this nice introduction and for the invitation to come here and talk today. I've been kind of curious to hear how Bart would introduce me for this talk, because I was asked to give a lecture about cardiac multi-scale models coupling multi-physics and machine learning. But to me these are three topics, so to really scratch the surface we'd need one lecture for each of them. Well, this is the task that I've been given, so I'm going to try and condense these three topics into one talk. To be able to cover everything, I'm going to focus on just providing an intuition of what we're working on in the field, and why. Since there's not much time, I'm not going to go too much into the how of what we're doing, so if you're interested we can discuss later over our beers.

I just wanted to start with how I got here today. Bart and I were at a meeting earlier this year in Nice, sometime around St. Patrick's Day, and Bart asked me, he said: Chrissy, can you come to the VPH summer school and give a lecture on cardiac multi-scale modeling? And I was thinking, okay, but you know, cardiac modeling is not really my area of expertise, because I did my PhD in image processing and statistical analysis, and since then I've been working on translating those kinds of methods into more clinical applications. But, you know, beer tasting in Barcelona sounded pretty nice.
So of course I said yes, but I just wanted to change the title of my talk to: the multi-scale heart, from models to clinical decision support. The reason for this is that in my lab we've been working a lot on applying modeling and analysis to clinical problems, so that we can not only validate the kinds of models that are developed within the VPH community, but also so that we can better communicate these nice new methods to the clinical community. I think this is a really important task; like Alberto said yesterday, it's completely useless if we have awesome tools that are not getting used in clinical practice. And a big challenge that we face with converting these academic developments into clinical practice is that clinicians don't want to use a tool that they don't understand. If they don't know how we come to our recommendations, they're not going to have faith in what we're suggesting to them. And also, you know, clinicians struggle to change their habits, so we really need to provide them with methods that are understandable from their clinical perspective, at least intuitively, maybe not in all the details. We need to work together, I think, as a community, to ensure that we're converting more academic research to clinical problems.
I think as a community to ensure that we're converting more academic research to clinical problems So I just wanted to start today by mentioning that I feel like the heart is really underappreciated organ If you consider that the heart beats on average 80 times per minute And if we think about the valves that are controlling the one-way blood flow through the the chambers these valves are as thin as a piece of paper So these tiny thin pieces of paper are maintaining this function over Three billion times during a lifetime so it's pretty Easy to understand or actually it's hard to understand how the heart can maintain this kind of function over a lifetime even under healthy conditions But it's pretty easy to appreciate that even small changes in the function can have significant impacts in a patient's health so the way that the heart sustains this function is by a complex structure made up of four chambers the ventricles in the atria and Valves controlling the blood flow through these chambers and a fiber network That's allowing electrical propagation through the heart to initiate pumping motion ensuring blood flow through the heart and into the body and This action is driven by mechanical function that depends on tissue properties and as well as profusion and metabolism So I'm just going to present today how we're working in the VPH community to better understand these aspects So that we can help to improve diagnosis and therapy planning so if we're interested in Understanding a patient what's going on with an individual patient we can derive mathematical formulas of cell tissue and organ-level dynamics, which if we bridge these scales and incorporate Multiphysics we can use these kinds of models to help with guiding diagnosis and with therapy planning But on the other side this patient is a part of a group of patients that fill it fit into a population coming from certain species, so if we also Look at what's going on in these groups of patients. 
We can understand what's going on with a patient by learning from what happens within this population or group. So on the one hand we have modeling, which helps us to understand the mechanisms behind heart disease, and on the other hand we have analysis, which helps us to learn from the outcomes of patients that are similar to the patient we're interested in. These two aspects are in some way interconnected because, as Miguel was saying, there are a lot of parameters in the models. We can use analysis to identify which parameters are of interest, and the sensitivity and confidence we have in these parameters; and on the flip side, we can use modeling to provide more data that we can use for analysis purposes to learn the outcomes. So the modeling more or less answers the question of why, and the analysis answers the question of what: what's happening with the patients.

As I mentioned, modeling is not my background, so I'm going to make use of some work from a colleague of mine, Hermenegild Arevalo, and then I'll try to quickly touch on how we can combine these two aspects at the end of the talk, by taking the best of both worlds. I'm going to avoid any equations or any discussion of specific methods, because there are a lot of methods in both of these areas: there are a lot of different models available, and a lot of different types of analysis that we can perform. The point is not to provide a review of the methods that are available, I think there's a lot of literature you can find on that, but more to give an intuition of the kinds of problems we're interested in.

So cardiac modeling is inherently a multi-scale approach: we're starting from the nanometer level of the protein and going up to the micrometer level of the sarcomere, and so on, from the cell, tissue and organ level all the way up to the torso level. But the heart is a dynamic organ, so we're also
looking at a multi-scale approach in terms of time, starting from the milliseconds of the action potential, up to the second level of the cardiac cycle, and even short-term remodeling that occurs acutely within hours, for example after a myocardial infarction, and long-term remodeling that occurs over years. Just for the sake of time, I'll focus on the cell-to-organ spatial scale, and then talk about how modeling addresses the millisecond-to-second temporal scale, and how we can use analysis to model long-term remodeling in these patients.

The modeling pipeline generally looks something like this. We have data, and we have a lot of patient data available: metadata, the treatment history of the patient, family history, genetic information, medical images, blood tests, physical examinations; we really have a rich range of data available. We can use this data to model the biophysical processes, and these models of the biophysical processes can be used to guide therapy planning: for starters, to better understand the diagnosis of the patient, to find the optimal treatment option for that patient, and then subsequently to actually guide the physical therapy. Of course, once a patient receives therapy, the data is again updated.
So this is actually a continuous process, where we need to update the models as well, and then perhaps modify the diagnosis or re-plan the therapy.

The heart is a multi-faceted organ, so I'm going to try to jump between some of the different types of models that we're interested in. We have, for example, models of the electrophysiology of the heart, of the structure, of the hemodynamics, and of the mechanics. I'll talk a little bit about each of these aspects, and I'm sorry for the chaos of jumping between them; we can blame Bart for giving me this topic.

But let's start by looking at the electrophysiology of the heart. It's been found that patients that are at high risk for arrhythmias are also at high risk for sudden cardiac death. In these patients, typically an ICD, an implantable cardiac defibrillator, would be implanted, and this reduces the risk of sudden cardiac death. For the sake of those that are not working on the heart, I'll just show this quick video; I hope there's audio. [Video: ...in a coordinated way, at a rate between 60 and 100 beats per minute, to pump blood to your body and brain. However, heart disease can cause your heart to suddenly start beating very fast and quiver instead of pumping blood effectively. If left untreated, this rapid heartbeat can cause sudden cardiac arrest and may lead to death in minutes.]

So as the name suggests, sudden cardiac arrest can occur without any previous symptoms, and it can occur in patients that are healthy, young athletes. It's a public health concern, because we need to better understand which patients are at risk of sudden cardiac arrest so that we can prevent these unnecessary deaths. The way that risk stratification for sudden cardiac death is currently performed clinically is using one measure.
That's the measure of ejection fraction, which was introduced on Monday: the amount of blood pumped out of the ventricle divided by the total amount of blood in the chamber. This is a very coarse measure, and it's been found to be highly nonspecific and insensitive in risk-stratifying patients. The current cutoff is set to an ejection fraction of less than 35 percent; these patients are automatically prescribed an ICD. The problem is that for every patient that receives a life-saving shock from an ICD, 17 other patients have to undergo the cost, the risks and the physical trauma of having this device.

What's been found in observational and clinical trials is that electrophysiological studies can be used to better risk-stratify patients for an ICD. This is an invasive procedure, where a catheter is inserted into the leg through the vein and up into the heart, and within the heart cavity a stimulus is applied to assess whether the patient will have a re-entry wave of electrical propagation. This inducibility for arrhythmia has been found to be associated with increased risk of sudden cardiac death. But this clinical electrophysiological study method has not been translated so much to clinical practice, due to concerns about its invasive nature, the risk to patients, and the low negative predictive value.

So what's been proposed as an alternative is to perform virtual electrophysiological studies, and these are, in contrast, non-invasive, safe and effective. They're non-invasive because we're relying just on image data; it's safe because we can stimulate the heart as aggressively as we want when we're using a virtual avatar instead of the actual patient's heart; and, as I'll show, it's been proven to be effective in determining arrhythmia susceptibility. So the pipeline looks like this: we start with MRI images of the patient.
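As a quick aside for those who haven't seen it, the ejection fraction mentioned above is a one-line computation from the chamber volumes; the volumes below are made up purely for illustration.

```python
def ejection_fraction(end_diastolic_volume_ml, end_systolic_volume_ml):
    """Fraction of blood pumped out of the ventricle per beat:
    stroke volume divided by the total (end-diastolic) volume."""
    stroke_volume = end_diastolic_volume_ml - end_systolic_volume_ml
    return stroke_volume / end_diastolic_volume_ml

# Hypothetical patient: 160 ml at end-diastole, 110 ml at end-systole.
ef = ejection_fraction(160.0, 110.0)
print(f"EF = {ef:.0%}")   # prints: EF = 31%  (below the 35% cutoff)
```

So a single coarse number like this drives the current ICD decision, which is exactly why it ends up so nonspecific.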
These are contrast-enhanced images, and we segment the ventricle in the images to define the borders of the tissue. Thanks to the contrast agent, which perfuses more slowly through dead tissue, we can extract and characterize the different tissue as normal myocardium, gray zone, or scar, according to where these hyper-enhanced pixels are present in the images. The gray zone has been identified in several recent studies to be important: even though there are some healthy cells in there, the progression towards dead tissue is causing problems in these areas. Then from these segmentations we create a 3D model; if we look at it transparently, we can see the scar and the gray zone in this model. And then we extract fiber orientation using rule-based methods; we need this fiber orientation because the electrical activation propagates along these directions.

The electrophysiology simulations are run like this. Experimental lab procedures, voltage clamp studies of isolated cells, are used to extract the membrane potential of different types of cells, so we have cells coming from healthy tissue, gray zone tissue, or scar. This membrane potential is embedded into a cell model, where we change the ion channels according to these different membrane potentials. Here I'm just showing one example of a cell model; there are lots of different examples that you can find in the literature.
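As an aside, the way a cell model gets coupled into tissue can be illustrated with a toy sketch: in tissue, the cell kinetics become the reaction term of a reaction-diffusion equation. Here the simple FitzHugh-Nagumo model stands in for a detailed ionic model, on a 1-D cable instead of a 3-D ventricle, with explicit Euler stepping; all of this is a simplification for illustration only, not the actual formulation used in the study.

```python
import numpy as np

# Minimal 1-D stand-in for the 3-D reaction-diffusion system:
# du/dt = D * d2u/dx2 + f(u, w), with FitzHugh-Nagumo kinetics f
# playing the role of the (much more detailed) ionic cell model.
n, dx, dt, D = 200, 0.5, 0.01, 1.0
u = np.zeros(n)                  # membrane potential (dimensionless)
w = np.zeros(n)                  # slow recovery variable
u[:10] = 1.0                     # stimulus applied at one end of the cable
act_time = np.full(n, np.nan)    # local activation times

for step in range(40000):
    lap = np.empty(n)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2  # diffusion term
    lap[0] = 2.0 * (u[1] - u[0]) / dx**2                  # no-flux boundary
    lap[-1] = 2.0 * (u[-2] - u[-1]) / dx**2
    du = u * (1.0 - u) * (u - 0.1) - w                    # fast excitation
    dw = 0.005 * (u - 3.0 * w)                            # slow recovery
    u += dt * (D * lap + du)
    w += dt * dw
    just_activated = np.isnan(act_time) & (u > 0.5)
    act_time[just_activated] = step * dt

# The excitation wave sweeps down the cable, so activation time grows with
# distance from the stimulus; a scar region would be made non-conducting
# (e.g. D = 0 there), and re-entry is what the full 3-D protocol tests for.
print(f"far end activates at t = {act_time[-1]:.1f}")
```

A hypothetical sketch like this also shows why the gray zone matters: slowing the kinetics or the diffusion in a patch delays and distorts the wave, which is the substrate for re-entry.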
There's lots of different examples that you can find in the literature And the cell model is embedded into a tissue model which describes the cell to cell propagation of the electrical current And this is a 3d reaction diffusion system that we're solving and again There's many different methods each lab kind of has their own favorite So we have some very complex biophysical methods that are a number of equations solving a PDE Solution and then there is simplified models as well So this tissue model is embedded in an organ model where we have then the tissue characterization and the fiber orientations So these patch clamp studies give us information about the membrane potential So the normal myocardium is modeled as human ventricle action potential and the gray zone is modeled by a longer action potential duration with decreased Excitability and decreased conduction and the scour is modeled as non-conducting because the tissue is dead So then a simulation protocol is is applied. So here we're showing the left and the right ventricle so 17 regions of the left ventricle are Applied the stimulus is applied in these 17 regions and as well as two regions of the right ventricle at the base and at the apex and then the simulation protocol is applied so we have The simulation protocol here with five seconds of applying stimulation and then waiting two seconds to determine if That the propagation terminates or if it reenters So this recent study from my colleague was applied on 32 Patients and these patients also underwent clinical electrophysiological study so we could compare the the results from the two different methods virtual versus clinical So this was the largest scale electrophysiology study to date and this is just showing that the the models for these patients And then the results found that the first showing the first 22 here these 22 patients were inducible for arithmias and the last 10 were non-inducible. 
There's no wave propagation shown here, because the electrical current is terminated. To show what this looks like over time: here we've got one patient model, and this is the point where the stimulation was invoking an arrhythmia. If we show this movie now, the stimulation is propagating from this point, and while it should be terminating when there's no more stimulus, there's a re-entry wave occurring, and this is leading to arrhythmias. You can also see these patchy areas where the propagation is not flowing through the scar.

Looking at the results, comparing the survival over years of the inducible patients: for the virtual electrophysiological study there is a statistically significant difference between those that were found to be non-inducible and those that were found to be inducible, and in contrast, for the clinical electrophysiological study there was no statistically significant difference; really, you can see there's no difference there. If we look at some numbers quickly: the virtual study was found to be more sensitive, more specific, and to have a higher positive predictive value, a higher negative predictive value and a higher overall accuracy than the invasive clinical electrophysiological procedure. What I want to conclude from that is that multi-scale modeling has the opportunity to provide a non-invasive, safe and effective alternative to invasive clinical procedures, and using these kinds of methods we can better determine which patients are suitable for therapy.

Now I just want to quickly mention how we can integrate mechanics into these kinds of models, so that we better understand the multi-physics of what's going on. What's been found recently is that the mechanical properties within the healthy tissue are not the same as the mechanical properties within the gray zone tissue. However, in the current electrophysiological formulation there is no treatment of the mechanical function at all, so
this difference in mechanical properties would not be captured. The electrophysiology actually feeds into the mechanics via excitation-contraction coupling, and conversely the mechanics feed back into the electrophysiology via mechano-electric feedback. This occurs both at a cellular level, where changes in the action potential affect changes in the twitch, and at an organ level, where changes to pressure influence changes to the ECG. Just to quickly mention: if we're considering multi-physics, it's also a multi-scale approach. The mechanics at the organ level are described by tissue-level continuum mechanics, and at a cell level by cellular myofilament models, and these can be combined to capture this excitation-contraction coupling via the calcium channels here. That's all I really want to mention, because we don't have much time, but just to say that it's important to model the multi-physics, because it provides a more comprehensive insight into the cardiac mechanisms, knowing that the heart is a multi-physics organ, and it also helps to provide ways to better understand the regulatory processes within the heart.

Now I want to switch gears and talk about how we can look at the different temporal scales by using analysis to model changes that happen over hours, or even over years. If we come back again to this example of risk stratification for sudden cardiac death: if we have a population of patients, we may know which of these have had a history of arrhythmias, and what we could do is look at these patients in more detail and investigate what common manifestations are present in them. So are they exhibiting similar
abnormalities in terms of mechanics, or structure, or whatever. If we perform analysis of this group, it can help provide insight into which kinds of patients are suitable for an ICD.

We can use machine learning for this, and this is Bart's fault, because he mentioned it in the title of my talk, so I have to talk about it. We start with some data and we pre-process this data; we train a model based on the data, we test this model on new data, and then we validate it. For the sake of time, let's just pretend we're in an ideal world where we already have lots of data; unfortunately, that's not really the case. And let's just assume we have some validation protocol, so that we can concentrate on looking at the models. The way we normally do things is we take some input, we apply some feature extractors to get some features, we feed these features to our machine learning algorithm of choice, and this gives us some output. Of course, as you all know, and I can't avoid mentioning it, the big hot topic in the field these days is to just skip these middle parts and go straight for a deep learning approach to get some output. It's been shown in many different publications recently that deep learning is really, really useful if, for example, we want to classify a pixel of an image, for example for segmentation: we don't care why that pixel was defined as being in the myocardium, we just want to know that it's in the myocardium.
So we don't need to know the why. But if we actually want to understand better what's going on, and provide the doctors with something that they can understand, then we need to use the first class of methods. And then the biggest challenge there is how we define the features that we can use for the learning.

Now I'll take a look at some structural analysis. If we're interested in the structure of a group of patients, we might want to know: if this group of patients has the same disease, which structural features are related to clinical outcomes, or what abnormal features are present in this population? What we can do is compute the average shape in the population, and then, using this average shape, we deform that shape to each patient's geometry. This essentially gives us a description of each patient's shape as a deformation of this average shape. Of course, this is just one way to represent the shapes; there are lots of different ways to do it, and everyone has their own preferred way: Auckland does it their own way, Pablo does it his own way, though of course ours is the best way to do it. So we take all these deformations and we gather them into some arrays so that we can perform the analysis. This is just the xyz coordinates of the atlas and the deformation of those coordinates, and we collect that for each subject. Once we have this nice array, we can just apply matrix manipulations to extract the most important features within this matrix.
Again, there are lots of different ways to do that; I'll just mention the way we've been doing it, with principal component analysis. We compute the eigenvalues, which are essentially the loadings or the scale, and the corresponding eigenvectors, which are more or less the scores or the direction. So each PCA mode, mode as in common feature, is just this direction times the scale.

Just to make sure that everyone's awake, let's do a little quiz: who can tell me what's the most common shape in this set? No one's awake? It's not a trick question. Anyone? Yes, well done, it is a square. Like I said, not a trick question; or a four-sided object. This is exactly what we're doing, basically. It's a bit more complex because we're dealing with 3D shapes, but we're trying to answer the same thing: when we have a set of shapes, what's the most common one?

We've been looking at this group of patients, which are arrhythmogenic right ventricular cardiomyopathy patients. As you can see, these are two patients, and they're very different. This patient has a very high risk of arrhythmias, or actually they have already had arrhythmias, and this patient, on the other hand, is at low risk. We can see that there are some abnormal shape features, if you're used to looking at MRI, that is: the right ventricle here is significantly dilated. So what we can do is model how these abnormalities in shape are related to history of arrhythmias, for example. We applied that to datasets of these patients; I'm showing here, in the left column, just the normal control group, for comparison's sake, and the diseased patients in the right column.
Here I'm showing plus and minus one standard deviation. We can see that the first mode corresponds to dilation of the ventricle; here I'm just showing the right ventricle, because otherwise the left ventricle is in the way. These labels were defined by the clinicians that we're working with, just by intuitively looking at the shape, and that's kind of the beauty of this method: there is an object they can understand when they look at it, and they can interpret it in a meaningful way. The second shape mode was elongation of the RVOT, the right ventricular outflow tract, and you can see here that it's really elongated. The third was some weird behavior between the left and the right ventricle, where the left ventricle is tilting towards the right ventricle and vice versa; the fourth was lengthening of the ventricle; and the last one is elongation at both the inlet and the outlet. These five modes capture the majority of the variance of the population.

So now we have all of our patients described by these five shapes, and now that we've put names on them, the clinicians can understand: oh, okay, the patient has a lot of global dilation, for example. For any new patient that we have, thanks to linear algebra, we can project this patient into this reduced subspace, what we call a latent variable subspace. We can find, for example, how much global dilation the patient has, maybe five times RVOT elongation, four times septal tilting, three times lengthening, and five times inlet and outlet elongation. So now we have these five understandable shapes, and five numbers that correspond to these shapes and describe this specific patient. We've basically just collapsed down this complex 3D object into five numbers, and now we can use these five numbers for learning. And that's what we did.
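The PCA-and-projection pipeline just described can be sketched in a few lines; the "deformations" below are random stand-in data, since the point is only to show the mechanics of extracting modes and projecting a new patient onto the first five.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: 30 subjects, each shape described by the xyz displacement
# of 100 atlas points relative to the average shape (flattened to 300 values).
n_subjects, n_coords = 30, 300
deformations = rng.normal(size=(n_subjects, n_coords))

# PCA: centre the matrix, then take the singular value decomposition.
mean = deformations.mean(axis=0)
centered = deformations - mean
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

modes = Vt                             # eigenvectors: the shape "directions"
variances = s**2 / (n_subjects - 1)    # eigenvalues: variance per mode
scores = centered @ modes.T            # loading of each subject on each mode

# Any new patient is projected into the reduced subspace the same way,
# keeping only the first few modes (five, in the talk):
new_patient = rng.normal(size=n_coords)
five_loadings = (new_patient - mean) @ modes[:5].T
print(five_loadings.shape)             # prints: (5,)
```

So the whole 3D geometry really does collapse to one matrix product per patient, which is what makes the downstream learning so cheap.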
This is just showing the loadings for each of these five shapes, for the subjects in our population, and what we're interested in is the patients having the most extreme values of these loadings. For example, showing here the patient that has the most global dilation, you can see that, yeah, the RV is severely dilated. So the results of these loadings have some kind of interpretation.

Then we used these loadings: we took the shape loadings combined, so we took the absolute values and just summed them up, compared a control group, in blue, to our diseased population, in red, and ran a simple machine learning classification algorithm, K nearest neighbors: finding, okay, for this subject, what are the closest three values to this value? These are all red, so I'm going to assign that patient red as well. When we plot the receiver operating characteristic curve for this classification, you can see here is where it would be if the classification was just by chance, and here, along this axis, if it was perfect. So we're getting pretty good results for this.

Then we had another group of patients that we were interested in. Can anyone guess which patient was born with a structurally abnormal heart, so the aorta where the pulmonary artery needs to go and vice versa, and which patient is normal? Can anyone guess? I hope you don't say the right one, because that's my heart, and I think I'm healthy. As you can see, the images are really hard to distinguish; this patient looks pretty healthy. The thing is that the surgery that these patients received was pioneered only 30 years ago, so these patients are now starting to reach adulthood, and it's only starting to become clear how the patients will evolve. It was considered at the beginning that once the patients had the surgery,
they're just going to be normal afterwards, and they'll be fine. But when we did this structural analysis, what we found was a complete split between age-matched controls and the patients, and as you can see here, this is the perfect, best-case scenario for the classification: there's a perfect split between the groups. We don't yet know why this is the case, but it suggests that we need to do some more investigation in these patients, because they may start to develop problems later in life.

We also looked at correlating the shape features to clinical outcomes, so we can then draw some conclusions if a patient has a certain shape feature. For example, shape 4 is moderately correlated to history of ventricular arrhythmias, so for a new patient with a lot of this shape, we may want to follow up and do some more examinations. We looked at this for history of arrhythmias, abnormal ECG, abnormal signal-averaged ECG, presence of fibrosis, history of the patient fainting, and increased heart rate.

The thing is that if we want to predict response to therapy, it's based on mechanical properties, and also on biological properties, on genetic factors, and also on electrical factors, and ideally we need to actually combine these if we really want to provide a good prediction of the response. There are different ways to do that. For example, and this is again really simplifying the problem just for the sake of providing the intuition, we can take features, run our favorite classifier, and get our predicted response; we can do this for a number of features coming from different data, for example, and then we can combine these predicted responses by a voting method. As some of you probably know, these are ensemble methods. Alternatively, what we can do is take the features and describe them in a latent variable subspace.
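Going back a step, the K-nearest-neighbors classification and ROC curve described earlier can be made concrete with a minimal sketch; the values here are synthetic stand-ins for the summed absolute shape loadings, and the area under the curve is computed with the rank-statistic identity rather than by tracing the full curve.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in 1-D feature: sum of absolute shape loadings per subject.
controls = rng.normal(2.0, 1.0, size=20)   # blue group in the talk
patients = rng.normal(5.0, 1.0, size=20)   # red group
values = np.concatenate([controls, patients])
labels = np.concatenate([np.zeros(20), np.ones(20)])   # 0=control, 1=patient

def knn_score(x, train_values, train_labels, k=3):
    """Fraction of 'red' labels among the k nearest training values."""
    nearest = np.argsort(np.abs(train_values - x))[:k]
    return train_labels[nearest].mean()

# Leave-one-out scores: each subject classified by its remaining neighbours.
scores = np.array([knn_score(v, np.delete(values, i), np.delete(labels, i))
                   for i, v in enumerate(values)])

# Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
# AUC = P(score_patient > score_control) + 0.5 * P(tie).
pos, neg = scores[labels == 1], scores[labels == 0]
auc = (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()
print(f"leave-one-out AUC = {auc:.2f}")
```

With well-separated groups, as in the ARVC study, the curve sits well above the chance diagonal.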
That's what I was doing with the shapes. We do this for the different features, and then we can combine these latent variable features and perform the classification in that space to predict the outcome. On the other hand, we can take the features, apply kernels to the different features, combine these kernels, and then run a support vector machine algorithm to get our predicted response. Of course I had to mention this method, because it's Bart's favorite method of the month or something; keep him happy. This is not an exhaustive list of methods, but here we have three ways to combine different features: combining models through voting, combining latent variables, or combining kernels.

If we look first at combining the latent variables, again looking at shape: here we have the shapes of the same patients that I showed earlier, and we combine the computation of these shapes with clinical measures. This is actually the shape in the population that most captures variance in right ventricular end-diastolic volume, the volume of the right ventricle when the heart is most relaxed, and the scores of this shape highly correlate with end-diastolic volume; so we're kind of capturing two features within the same model there. When we did this, we were looking at the correlation with history of arrhythmias, and what we found is that when we look at just the measures alone, the ejection fraction in the right ventricle was not statistically significant; and then right ventricular end-diastolic volume, the volume at the most relaxed phase of the cardiac cycle.
This was moderately correlated with history of arrhythmias, and similarly at the most contracted phase; and in the left ventricle no statistically significant correlation was found. Whereas when we add the measures plus the shapes, what we found is that now we have a correlation with ejection fraction in the right ventricle, we've improved the correlation with the volumes in the right ventricle, and now we have a strong correlation with left ventricular end-diastolic volume. So we take these shapes, we describe the shapes with some descriptors, we perform this latent variable analysis, or model reduction, and we compute the modes that are relevant to clinical parameters. We have some clinical parameters, and what we get out is some pathological shape patterns. In terms of the dimensionality of the problem: we start with models with parameters on the order of hundreds, so more than 500, maybe more than a thousand parameters, depending on the parameterization of the geometry. We reduce this to, in our case, five parameters, and then we reduce this even further to just one mode that's correlated to some outcome. The great thing about this magic number is that it's a number that doctors can actually interpret, because they can visually see what the shapes correspond to. It's also convenient for us: it provides more robust results, we get greater statistical power, and now we have the possibility of coupling complex models, because we have fewer parameters to deal with.

If we look at combining kernels, I have to give an example of this as well, so let's just quickly mention what we're talking about. We have an input space, and this might be for two different groups of patients: you can see there's a separation between these groups, but it's not linearly separable.
We cannot draw a line there. So we apply some function to project the data to a higher-dimensional space in which the two groups are linearly separable by a hyperplane, and this is what defines the kernel. Of course, the appropriate kernel will change depending on the data we have: for data coming from ECG, for example, the appropriate kernel will be different than for data coming from images. The advantage of this multiple kernel learning framework is that we can combine sources. For example, in a recent study we took cine MRI images, extracted the shapes and the scar extent, computed descriptors for each, and then combined these different types of data in the kernel learning space by building multiple kernels and performing the learning on them. The beauty of this method is that the weights of the kernels are automatically determined within the learning algorithm, so everything comes together for free. When we did this in that particular work, we got 100% classification accuracy between two patient groups that were not that easy to separate, so we were happily surprised, and it was a significant improvement over ensemble methods and over running an SVM on one source or the other. So what can we use machine learning for? To combine data from different sources, which is important because we need to make better use of the rich range of data we have available, and to predict outcomes as well. Now I just want to quickly mention how we can combine these. As I showed before, we were able to explain the 'what' by correlating a history of arrhythmias with information that we derive from image data. So this is a shape mode, or shape feature.
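The multiple-kernel combination just described can be sketched as follows. This is a minimal illustration on synthetic data: one Gaussian kernel per data source, combined with fixed weights (in real multiple kernel learning the weights are optimized jointly with the classifier), and a closed-form kernel ridge classifier standing in for the SVM used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data (all values synthetic): two descriptor sets per
# patient, e.g. shape descriptors and scar-extent descriptors, plus
# a binary group label (+1 / -1).
n = 40
shape_feats = rng.normal(size=(n, 10))
scar_feats = rng.normal(size=(n, 4))
labels = np.sign(shape_feats[:, 0] + scar_feats[:, 0])

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# One kernel per data source, combined with fixed weights. In true
# multiple kernel learning these weights come out of the optimization;
# they are hard-coded here purely for illustration.
weights = (0.6, 0.4)
K = (weights[0] * rbf_kernel(shape_feats, shape_feats, gamma=0.1)
     + weights[1] * rbf_kernel(scar_feats, scar_feats, gamma=0.1))

# Kernel ridge classifier in closed form, a simple stand-in for the SVM.
ridge = 1e-6
alpha = np.linalg.solve(K + ridge * np.eye(n), labels)
predictions = np.sign(K @ alpha)
accuracy = float((predictions == labels).mean())
print(f"training accuracy: {accuracy:.2f}")
```

The point of the sketch is the structure: each data source contributes its own kernel matrix, and the classifier only ever sees the weighted sum.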
We know that this shape feature is highly correlated with a history of arrhythmias, but we don't know why. What we could do, and we haven't done it yet, is to perform modeling on this geometry to see if we can find answers that explain why this feature leads to arrhythmic events. This kind of approach has recently been applied by some collaborators in London in a different way, but it's still very much upcoming research, and I think a direction we need to go in more. We can also use multiple kernel learning for this. So if we have cine MRI images, we can get the shapes again, or we can derive models from them: for example, here I'm showing models of the blood flow through the pulmonary artery. We can extract latent variable descriptors of each of these, so the shape modes that I showed, and in some work a few years ago with a lab from Inria in Paris we also extracted modes of the blood flow solution, here showing the velocities and here the pressures. We use these kinds of models because, although we saw a lot of nice talks yesterday about how we can image blood flow, thanks to simulations of blood flow we can explore different scenarios: we can implant devices in the models and investigate which devices are optimal in terms of flow efficiency, and so on. Then we can combine the kernels of these kinds of descriptors and again apply learning. This is something we haven't done yet, and again something I think we should do more of in the future. The reason is that if we really combine these three aspects, multi-scale modeling with multi-physics and machine learning, we can get everything: we can provide safe and effective alternatives to invasive clinical procedures, we can better stratify patients for treatment, we can better understand the regulatory processes.
We can predict long-term outcomes, we can combine data from different sources, and, most importantly, we can gain a more comprehensive understanding of cardiac mechanisms. This is something I've been working on thanks to a collaboration I've been part of for the last four years, in which those of us at Simula Research Laboratory work on the modeling and analysis together with industrial partners, for example Medtronic, GE, and CardioSolv, and in collaboration with clinicians. So we have the best of all worlds there, and we can combine these three aspects and hopefully provide better tools for diagnosis and therapy planning. With that, I thank you for your attention, and I'll take some questions. Thank you very much; I think you perfectly fulfilled the title that we asked of you. It was a challenge, but I also think it's extremely complementary to what we looked at today. We started with, in brackets, a kind of simple tissue with complex modeling, to try to understand what happens in lab situations, which gives us insight and questions to look further. Then we went a bit more complex, towards arteries, where validation is already much more difficult, and then towards cardiac modeling here, which at this moment is almost impossible to get completely right, but where on the other hand there are clinical questions that need to be answered. One of the things I liked very much, which you said at the end, although I don't completely agree with going all the way to one parameter, is dimensionality reduction. I think it's crucial, because with our models, our machine learning, all of these things, we have so much information from measurements and we can do so much modeling, but in the end, if we provide all this information to the clinicians, nobody's going to use it, because it's
impossible to do something with, and they're definitely more used to making a decision based on yes/no, easy guidelines, that kind of thing. So finding ways, based on models and data and machine learning, to get to dimensionality reduction, I think that's really crucial; this is something that our community really has to work on, and I think you addressed it well. This is a very important message. I don't know if there are any questions. Yes, thank you for your presentation. Just a question about the method you used to generate the mean shapes: you used the Deformetrica software, if I'm not mistaken? It has one limitation, which is that the deformation it generates does not correspond to mechanical landmarks of the heart, let's say; it generates a deformation that satisfies certain mathematical properties. You had very beautiful results on the classification, so it does work, but did you expect any problems with the algorithm's influence on the mean shapes through the deformation? Was I clear? In this type of work, what we're mostly interested in is providing a way to parameterize the patient shapes, and that's mostly what we're using the atlas for. In this case the atlas is the average of the population, but we could actually use another geometry as well; the average is just a statistically cleaner way of doing things, so we're not that bound by the geometry of the atlas. What I meant was that the deformation you use to go from one patient to the atlas, which you fill into your matrix to generate the modes, is generated by algorithms that have some influence on it. So when you extract the modes, they also reflect the influence of the Deformetrica software, let's say, on the deformation, right?
Yeah, but what we were mostly looking at is making sure that the deformed geometries correspond well to the true geometry, so we were just making sure that the fit was good. And in my opinion it's pretty hard to really look at any specific landmarks in the geometry anyway; there are so many tricky steps along the way already. We're not really sure about the segmentation, for instance, so there's bias coming in at every stage. We just try to make sure that everything is at least consistently performed, and that we're consistently representing the shapes so that they correspond well. Awesome work, thank you. Hi, thank you for the presentation. I just wanted you to comment on the generalization of your models. For example, in the first model that you showed, you used 32 patients, because for those you had both the images and the electrophysiology. So how biased are your models toward the patients used in training, at the moment when you want to apply those models at population scale? Yeah, that's a great question; it's the golden question in modeling, I guess. In this case, 32 patients were used in that particular study because those were the patients who also underwent clinical electrophysiology studies, and it's not that easy to get large numbers of patient data for this. That's the kind of challenge I think we all face in the community, and something we're all trying to push to resolve, so that we can be more sure. And also, right now,
we're trying to avoid saying 'if this, then this,' because we cannot yet really generalize with such small numbers of patients. Sorry, I'm asking just because we're always talking about big data and combining different sources of data, but actually, at the end of the day, what you want is to tailor your model using very personalized sources of data, which are very difficult to obtain. Yeah, so for sure, when it comes to the biophysical models, for example, there are so many different options for the cell models and for the tissue-level models and so on that it's a trade-off between finding the parameters in those models that lead to the highest accuracy and finding which models are a good trade-off with computation time as well. But I would say in this case that the models are each individually applied to the patient-specific geometries, of course using the parameters that we've derived from a group, but at least in that way I think we're capturing the differences that are in each of the patient-specific geometries.
I think it's actually a really interesting question in some ways, also how many patients you use, because one thing is that you start from MRI images, but as we said on Monday, MRI covers an extremely small percentage of patients, and real clinical data is mainly ultrasound-based, for example. One of our students, Gabrielle, is trying to predict subtle shape remodeling based on this, and there it seems that it's not the number of patients that limits you for the shape; the problem really is the noise on the measurement, which determines the limits. So whether you have a hundred patients or ten thousand patients makes no difference, because your measurement is not accurate enough to begin with. This is another thing we have to consider at some point, to see which modalities we need. Have you been playing with ultrasound data, for example? I have been curious for a long time to know whether the kind of coarse geometry you can extract from ultrasound is sufficient to also get these kinds of results. But the challenge has always been that I work predominantly on diseases that affect the right ventricle, and as for getting the right ventricle in ultrasound, well, good luck. We just don't have the images of the cavity that we need to study, so we haven't done it so far. But of course, if you're only interested in the left ventricle, intuitively I guess it would also work. Yes, good question. Did you try deep learning at your institution? Do you have any experience with it, are you planning to try it, and how do you think clinicians will respond to this sort of technique, which is powerful but difficult to interpret? I really feel the clinicians are not ready for that yet. They're already skeptical of modeling and analysis; some clinicians don't even believe in ultrasound.
So trying to bring in new concepts to the clinics, they really need to understand what's going on; we can't just give them a black-box tool. But I think they will get there. We just need to work more on clinical applications and on getting more of our kind of work into the clinical community, so that they understand better what we're doing, and when they understand, they'll be convinced. I presented some of this work on the shape analysis at a clinical conference last year, and the clinicians were like, 'Yeah, this is great, I'm really excited, I want to apply it to my data. Where can I download the software?' And I was like, we don't have software; it's code written on my computer. But I guess that's what they would like: to actually be able to play with these tools themselves and apply them to their data. And we need to make more of a push to get these nice methods out there so that we can, yeah, use them. Thank you very much, I really enjoyed the talk, and I really agree with the vision you provide. I would also like to raise a bit the question of what the challenges are to really fulfill this vision. We have touched on a couple of them, but the one I would like to highlight is reproducibility, which you also touched on a little. We are working with a modality that is really great, MRI, but then you have gaps between when you acquire it, when you segment it, when you measure it, and when you make the analysis. And I have the feeling that, working with these small-sample studies of 50 or 100 patients, we might be getting great results now, but when it comes time to reproduce those results, playing with exactly the same numbers and no longer tricking the learning,
we're going to face some surprises, with things not performing so well. So I'm answering my own question a little bit, but: in your mind, in order not to fall too much into that initial excitement, and to make steady progress towards our goal, what do you think the challenges are, and what are the right opportunities we have to solve them? Yeah, this is something that was actually asked during my PhD defense: how many patients do we need to actually be able to generalize the results? I didn't have the answer to that question then, and I don't have it now, four years later. But I think when we have more data available to the community, we'll be able to answer these questions. It's also a matter not just of having the data but of having the tools to process the data, because we all know that segmentation is biased: I'll do it differently than you do. It's not that easy to do these things automatically, but the community is getting there, and once we have these tools we can really find out what interesting things we can pull out. I think this is indeed relevant, and one of the other things that maybe we as an engineering or modeling community do wrongly, or at least not well enough yet, is that, as in what you showed, we compare different pathologies, which are in a way extreme: an ARVC is very different from a normal ventricle. But if you start to look at larger populations with more complex or more mixed diseases, then you start to see that everything becomes much more fuzzy. If we look at a hundred patients with heart failure compared to normals,
it's very easy to find a machine learning model that solves it; if we look at 5,000 patients with heart failure, it suddenly becomes a mess, because the population is much less homogeneous. And I think that's where some of our questions go: okay, how do we deal with this much fuzzier data, rather than with the clear cut-offs that we like, diseased or not, very diseased or not? That's what we have actually been working with, so I think that's indeed another challenge we need to face. The numbers themselves will not tell the whole story; it will also be about the mix of patients we look at. So at some point towards the end of the talk, and during the talk, you presented the possibility of simulating the electromechanical processes in the heart as a way to improve the stratification of patients, and I really like this idea. But my question is: if you do that, computer simulations can give you a virtually infinite number of descriptors, and a lot of them will be noise, so you then add the difficulty of choosing the proper simulated descriptors. I would like to know whether, in your opinion, this also poses a new challenge for machine learning algorithms, for which new solutions should come up in order to deal with this potentially massive amount of additional descriptors. Yeah, so I think we can use these kinds of models to boil things down to very simple descriptors. That's what my colleagues are doing: they just take reentry waves as one descriptor of this complex model that's described by millions of parameters.
So there they've taken just one parameter, and that parameter could be used for the machine learning. But it is a challenge: when we have to tailor the features for the machine learning algorithm, we need to also incorporate some biological information. We need to talk to the doctors and know what's interesting for them, because we could find something and say, 'Oh, this is really discriminative,' and they'd reply, 'Yeah, we know that; we don't need that.' So I think it's an interdisciplinary approach that we need to take, really working together across the different domains. That's indeed where, as I said, dimensionality reduction, done in a clever clinical and physiological way, I think is crucial in order to get on with this. Okay, thank you very much. I think it was a very interesting morning, so we go for lunch now. Afterwards there will be the hands-on again, at about the same time as yesterday, in the same place. Keep in mind also that this afternoon at six o'clock, here, there will be the talk by Tina Morrison from the FDA, which I think is very interesting for seeing how modeling will contribute to, or be used for, the validation and regulation of devices and therapies. So please be back here, and after the talk at six o'clock, around seven, we will have the beer tasting on the square above. So please come back later on. Thank you very much.