Okay, I'll get our CSDMS, the Community Surface Dynamics Modeling System, webinar of today going. I'm Irina Overeem, I'm at the CSDMS Integration Facility, and we're hosting this together with Sam Harrison as a hybrid US/European event, so I'll hand the mic to him in a second. But I'm very delighted to have Guillaume Jouvet present today. He's one of the first people who, for me personally, introduced me to the world of neural networks in modeling: not just in data or image manipulation, but really taking those methods into physics-based modeling and physics-informed machine learning. All the better that it was in a glaciological model, so we'll hear some about glacier dynamics as well. With that, I'll hand it over to Sam, who will talk a bit about Euro CSDMS.

Yeah, thanks very much. Some of you might have spotted that in the full webinar series this year we've been running two webinars labeled as Euro CSDMS webinars, and the goal there is to align the topics, and the time zone, slightly more favorably towards Europe. This is based on a little bit of funding I've got at my institute (I'll say who I am in a second) which is hoping to bring together different communities, and we have a bit of a community over here. I'm based at UKCEH, the UK Centre for Ecology and Hydrology, up in Lancaster in the UK, and we have a community here called CEEDS, the Centre of Excellence in Environmental Data Science, which is a joint institute between UKCEH and Lancaster University. We do a lot of similar work to what CSDMS does, and we thought this was a really good opportunity, and I don't want to use a buzzword but I'm going to use it anyway, to bring out some synergies by working a bit more closely together.
So, long story short, we've got a couple of these webinars this time round that are, as I say, targeted slightly more at the European side of things. I also just wanted to flag that at the end of next year we're going to be holding a Euro CSDMS workshop over in the Northwest of the UK, and we're really hoping to bring folks from the European community together there. It's going to be a workshop with seminars, tutorial-type sessions, and maybe some interactive sessions. The scope hasn't been fully decided yet, as you can imagine, but one of the themes is going to be what the next generation of environmental modelling will look like, and what are the tools that can bring us there. That's actually a really nice segue into Guillaume's talk today, because reading through his abstract there is material in there that I think is absolutely part of that next generation of environmental modelling: we're talking about emulators, neural networks, and GPU computing, so I'm really excited to have him here today talking about those things. So without further ado, I'm going to hand you over to Guillaume. Do you want to get your slides up and check we're all okay? Yes. Perfect. So yeah, over to you, and we'll wrap up at the end with some questions, so be thinking of questions on the way through. Cheers, over to you.

Thanks a lot. First of all, thanks a lot, Sam and Irina, for organizing this. I'm very happy and very honoured to give this presentation for the CSDMS community, which I have been following with great interest.
Okay, so I'm Guillaume Jouvet, I'm a professor at the University of Lausanne, and for about three years now I have been working on the glacier evolution model IGM, which is named the Instructed Glacier Model because it relies on deep learning. I'm using deep learning mostly to make the model computationally more efficient, but it has some other advantages as well. My talk has the following structure. First, I'll give a bit of background on glacier evolution models and some rationale: what are the main limitations of today's models, and why we would like to take some new approaches. Then I'll go into more detail about the emulation strategy for ice flow modelling, which is really the core of IGM. I'll also talk about inverse modelling, because this is a very important area in glacier modelling; it has everything to do with data assimilation and how we can integrate not only the physics but also the data within this framework. And then, depending on the time I have, I would like to make a short demo of IGM: the idea is that we can select a glacier and run a short 100- or 200-year simulation of it. So if anyone listening has a glacier that he or she really likes, don't think twice: all glaciers have an ID in the Randolph Glacier Inventory (RGI), and if you copy-paste that RGI number into the chat, we will model that glacier at the very end. Let's go to the main picture of glacier evolution models.
So a glacier evolution model has basically two different components. On the one hand, we have the exchange with the climate and how this translates into a surface mass balance, which can be positive at the top of a glacier and negative at the bottom. On the other hand, we have the ice flow, which drains ice from the top to the bottom. A glacier evolution model simulates the ice thickness evolution, accounting for ice flow and surface mass balance through the mass conservation equation, and there are a couple of things we can add to the model, whether we want to model thermodynamics, glacier hydrology, calving, and so on. This is really the main picture; the animation was just an example of an evolution of the Aletsch Glacier in Switzerland, nothing more.

The rationale for building a new approach to glacier evolution modelling is that there is obviously an increasing demand for models with more physics, higher resolution, and longer time scales, especially if we want to do paleo-glacier modelling. And there is something that often happens with these models: if we think, for instance, about paleo-glacier simulations, we often simulate the same or similar states. Think about a glacier which, during the last glacial cycle, advanced 50 times and retreated 50 times: we are in fact recomputing the same thing many times. That brings us to the idea: instead of re-solving and re-investing computational time, could we remember, or store, the information we already computed, and then reuse it? And of course, a major point is that parallel computing is key to achieving significant speed-ups, especially GPUs, because the computational
world is now migrating towards GPUs. Just for a small illustration: a CPU we can picture as a handful of very fast cores, but only a few of them, and that's not much if we want to deal with very large grids, because everything has to go through those few cores, even if they are large pipes. A GPU, on the other hand, has a lot of processors, but each one is much slower; that is a way to picture the difference between CPUs and GPUs. And in recent years there have been impressive improvements in GPUs. They are cheap and affordable, because they are also used by the gaming community, and they have made tremendous improvements in terms of number of cores and power. The computational capability of GPUs has reached a level which is pretty impressive, and it is still increasing very quickly, which means that for everything we develop on GPU, we can expect the same code to run perhaps a factor of two faster in two years, while on CPU you won't gain much.

I do have an animation; let's try it and hope it doesn't crash. Okay, it's fine. So this is basically a game where I take the topography of the Alps and just play at decreasing and increasing the equilibrium line altitude, letting glaciers grow, advance, and retreat. It's a 1,000-year simulation, so it doesn't intend to reproduce any kind of reality; it's just an exercise. But it is something that took about an hour on the GPU of my laptop. And here is an example where I played the game of seeding: I additionally included some particles and tracked them passively. It's just post-processing, but it's also a big advantage of using the GPU, because the computation of each individual particle can be done in parallel,
so that we can really afford doing millions and millions of particles. And again, everything I am showing you is not high-performance computing; it was done on the GPU of a good laptop. The key point is that if we are capable of moving our computation to the GPU, we can afford to handle very large arrays, and possibly to capture very high resolution if that's what we want. The main problem is how to vectorize, or parallelize, the operations of a glacier evolution model while keeping the modelling framework simple, and my presentation, and this glacier evolution model IGM, is about this.

If we think about the simulation I showed you before, we have different components: surface mass balance, the interaction from climate to the glacier, mass conservation, and the ice dynamics. The ice flow modelling is usually the computational bottleneck; in a high-order model this is what takes 95 if not 99 percent of the time. So if we want to reduce the bottleneck, this is the component we have to attack first. The whole strategy of IGM is, instead of solving the partial differential equation associated with the ice flow, to emulate this part: to replace the traditional solver by a machine learning model, here a convolutional neural network trained from high-order models. The main advantage is that once this is trained, it is very efficient, and on top of that it runs on GPU very well, because those neural networks are extremely compatible with GPU computation, in the sense that GPUs can not only train them but also evaluate them very efficiently, since parallelism is really at the base of the
neural networks themselves. So what I am going to do is really to substitute this part by a convolutional neural network. The idea is to use a deep learning surrogate model instead of solving the physical model, which here would be the Stokes equations that model the dynamics of the ice. Stokes is one possibility, not the only model; there are simplifications of it, but this is usually the one we like, because it is complete in terms of the mechanical components. So instead of using this model, we want to go the other way and use this deep learning surrogate model, which is our convolutional network. The main challenge is how to train the convolutional network to do, not exactly what the physical model does, but at least something very similar.

Here we aim to train a convolutional network which maps input fields to an output field. What we want to predict is the ice flow velocity, and we want to do this on a raster grid: you can think of the horizontal components in the two horizontal directions, and the layers correspond to the third dimension, the vertical elevation. And we want to do this from input fields which include, for instance, the ice thickness and the surface topography, plus some parameters controlling the ice viscosity and the sliding. Basically, we put in all our predictors: the parameters that would go into the Stokes solver if we were using a solver, but we are not doing that. Instead, we stack our inputs that way, because the main interest of working with a raster grid is that, first of all, we want to use these convolutional networks, which work on raster grids, and that way we represent all our
data on those raster grids. Convolutional networks are a very standard tool in machine learning, applied especially for image analysis: all the classification problems, like "is that a cat, is that a dog", and a bunch of other examples in image analysis and image labelling are based on convolutional networks. These convolutions are spatial operations which are good at tracking spatial patterns, and that's exactly what we want to use, because in a way we want to build a neural network which is solving a PDE, we want to substitute the solving of the PDE, and the PDE itself is made of derivatives. So it makes sense to include these convolutional operations, which are in fact there to learn the spatial derivatives of the PDE. Mathematically speaking, the CNN is nothing else than a sequence of compositions of linear and non-linear functions. The linear functions are trainable, in the sense that they have weights, and those weights are what we optimize to fit some data. That's really the basics of the usage of CNNs.

When I started to apply something like this to ice flow modelling, I took an approach which was very similar to, let's say, a database approach; a very pragmatic and very easy approach. What I did first was to produce a certain amount of data: I used an external ice flow model and simulated a certain number of glaciers; here you can see ten of them. Those glaciers don't actually exist; I just took random topographies and applied a certain climate just to produce some plausible glacier geometries. I really wanted to build a catalog, to get some large glaciers, some thin, some thick, fast ice, slow ice, a bit of everything; the goal is to get a certain diversity, and
on top of this, once I had my database, I tried to train the convolutional network to fit those data as closely as possible. So you do some runs with an external simulation tool (here I used an external high-order ice flow solver, and also the Parallel Ice Sheet Model, to produce a certain amount of data), and then I trained my convolutional network to minimize the misfit to those data. I fed this neural network with a lot of data, with the goal of saying: okay, this is a database; get as close as possible to the modelled data that came out of those ice flow models. This was the former approach, which was nice, but the problem is that it did not generalize very well. Of course, if you train on mountain glaciers, don't expect it to work for an ice sheet or an ice shelf; that's another dimension, the ice flow is very different. It works only if you are modelling something that resembles what you have in your training dataset. In general it did a rather good job, but as soon as we went outside the envelope defined by the database, it was hardly generalizable.

That is the reason I came to another approach, which is much more recent; the paper came out this year. The idea was, instead of training the convolutional network to fit some external data, to train it to minimize the energy which comes with the ice flow model, because that is actually a common point of all ice flow models: they can all be written in the form of an energy. Here it's not the Stokes model but the so-called Blatter-Pattyn model, a small simplification of Stokes. It's not the most common way to write the ice flow equations, it's rather a niche way to write it, but in fact, for any ice flow model, what
we don't always realize is that the solvers that solve those PDEs are actually also optimizers, minimizing this energy, even if this is not very visible. For instance, the energy which comes with the Blatter-Pattyn model is a sum of different terms: one corresponds to the shearing, one to the basal friction, and one to the gravity. So basically we are minimizing a certain balance of forces. I'm not going to go too much into the detail, but the only difference with this physics-informed deep learning is that, instead of minimizing the proximity to some data which was generated a priori, externally, offline, this time we are training our neural network to minimize the energy associated with the physical system. This gives us a generic approach, which is much friendlier if we are treating any kind of ice flow, any kind of geometry, any kind of spatial resolution. And most importantly, we have a training which is completely glacier-customized, unsupervised in the machine learning jargon, and done on the fly, which means that in our time loop, at any time step, we evaluate the emulator and then we retrain the emulator, because we can. You can think of it this way: when I say it's an emulator, it's something which is capable of reproducing what the physical model would do, but only this and not more, and only for that glacier, not more. That's what I mean by emulator: it is very efficient at doing what we are doing at time t, but don't expect it to be good at doing something very different. That's the whole idea of having something which is really completely customized. The point is that we are retraining this on the fly, such that we are re-adapting the
weights while we are actually computing. So we basically have these two different steps: in each time step we train the CNN and then we evaluate the CNN. It's a bit of a trade-off, how much we want to train; we have to evaluate anyway, but the training is more expensive than the evaluation. So we want to train enough to be accurate, but we don't want to do it too much, because it's expensive; there is always a trade-off between the two. It's efficient because we are taking advantage of the automatic differentiation tools, which use the GPU; I'm using the TensorFlow library, all of this is based on TensorFlow. And the most important point of this approach, if you ask yourself how this is different from using a solver, is that the convolutional network has a memory capability: the training we do is something we can reuse afterwards, which can reduce the amount of computation we have to do later on. I will explain what I mean by this memory capability in a moment.

First, I just want to show an example of the fidelity of the result compared to a normal solver. Here I'm showing the velocity field of the Aletsch Glacier, just one snapshot, which is more or less realistic. This is the one I would obtain by traditional solving, and this is the one I would obtain using the emulator trained with this neural network. As you can see, we have the capability to approximate the ice flow solution very, very accurately using these convolutional neural networks. So the fidelity is not really the problem; it's rather the cost we have to invest in the retraining which really matters, because that's what we want to mitigate. And
to show this memory capability, I'm going to do a simple experiment (hopefully it doesn't crash). I take a glacier that I like, the Aletsch Glacier in Switzerland, and I play the game of raising and lowering the equilibrium line altitude, so that the glacier grows and shrinks, again and again; again, it's just a modelling experiment. So here is my forcing: the ELA goes up and down. The reason I do this experiment is the following. In the first stage, I do some intense retraining, meaning at each iteration I retrain my convolutional neural network, and by doing this, as you can see, the error between the emulation and the solving stays very low, which is not surprising because I invest some time in retraining. Then, after doing one pass, I play the game of releasing the training. If I release the training entirely, you will not be surprised that the error goes up. But the main point is that I could still continue with a very light retraining, say every 10 iterations, and maintain the error very low; so I could maintain a very high accuracy at a rather small computational effort. The reason is that the CNN has learned this during the first pass; we would not get this result without having done the first pass.

Okay, I think I will skip this slide. What I wanted to summarize, without going into the detail, is that the approach I am showing can really be seen as a merging point between traditional numerics, where we do finite elements, and machine learning. In fact this has a name in the literature, and the literature is very recent; some people call it the Deep Ritz method, or
variational physics-informed neural networks. Actually it's not easy to find the name for this, because all the literature on it is extremely recent, at least when I did this, which was about a year ago. Okay, let me just check the time, that I'm not completely off... okay, good.

I would like to talk quickly about inverse modelling, because inverse modelling is something that is very important for glacier modelling. Inverse modelling, which we could also call data assimilation, is about initializing simulations. Typically, if we would like to model the glaciers we care about over the next 80 years, until 2100, the first thing any model has to do is to estimate some parameters which are not observable (the red parameters here) in order to fit the variables which are observable (the blue ones). Basically, we can observe some surface ice velocities, some ice thickness profiles, and the surface elevation; but the ice thickness is not something we can observe everywhere, and the parameters controlling the ice viscosity or the sliding are things we need to estimate but have no direct access to. This inverse modelling is really all about finding this ice thickness and these sliding parameters such that the emulated ice flow fits these blue variables as well as possible. And this is a very important advantage of the convolutional network: first of all, it can accelerate the whole optimization process significantly compared to traditional methods, and on the other hand, it can really take advantage of the machine learning machinery, by which I mean automatic differentiation, to do the inversions. I'm going to explain this on another slide. Basically, what I have been presenting so far was the upper
part of this slide: we have been enforcing the physics by finding the weights of the neural network which minimize, in the first approach, the misfit to some data computed externally; and you can replace this by the physical energy, that's something you can do. That was the way to enforce the physics in the system. Now we can use exactly the same methodology for enforcing the observations. Enforcing the observations is about inverting: trying to find the inputs that minimize the misfit to some data. As you can see, enforcing the physics and enforcing the observations are done in a very similar manner, because in the end both come down to solving an optimization problem. And to me that's the beauty of replacing the physical model by a convolutional neural network emulator: the two operations, enforcing the physics and enforcing the observations, are now based on very similar methodology, and in fact all the tools we have been using for enforcing the physics can be reused for enforcing the observations. Practically speaking, if you look at the code, the two would look very similar.

Let me show a very simple example. I don't show all the data; assume we are seeking the ice thickness, the red variable, which produces the velocity we observe at the surface, because we now have pretty impressive maps of observed surface velocity worldwide; I'm thinking about projects like ITS_LIVE, or the recent paper by Romain Millan from this year. The data assimilation here consists of finding the ice thickness such that the emulated ice flow, which is the image of this convolutional network, fits the observed surface velocities as well as possible. The main advantage, again, of
having this in the form of a convolutional network is that, once it is trained, we don't only have the model, we also have all the gradients. And all the gradients are extremely useful, because that's exactly what we need for this optimization process: we start from a first ice thickness h0, we get a first velocity which will have a certain bias, and we want to correct it, to find a new ice thickness that reduces this bias. To get this new ice thickness we absolutely need the gradient of the misfit function, and this gradient comes essentially for free, because (again, I'm using the TensorFlow library) all the operations from the ice thickness to the misfit are TensorFlow operations, and then I can use all the machinery and the optimizers that come with this library. Which means that, technically and practically speaking, this optimization is not that difficult. We can also go for something more generic, with a certain number of controls: now we don't want to optimize only the ice thickness, we also want to optimize some ice flow parameters, and then we have some more observations; but everything I have been describing still applies, and we can do multi-variable control optimization. And again, if we were doing this with a physical model, the tricky and potentially expensive part would be to compute the descent directions, especially using adjoint models, because that's what is done in general. Instead, we use this deep learning surrogate model, and these directions are, as I said before, found by automatic differentiation, which is technically very simple and computationally inexpensive.
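As a toy illustration of this gradient-based inversion loop (a minimal sketch, not the actual IGM code): assume a made-up shallow-ice-like relation u = c·h⁴ standing in for the trained emulator, and recover the ice thickness from "observed" surface velocities by gradient descent on the misfit. In IGM the gradient of the misfit would come from TensorFlow automatic differentiation; here the derivative is written by hand so the sketch has no dependencies beyond NumPy.

```python
import numpy as np

# Hypothetical stand-in for a trained ice flow emulator: a shallow-ice-like
# relation mapping ice thickness h (m) to surface speed u (m/a), u = c * h**4.
# In IGM the emulator is a CNN and its gradient comes from TensorFlow autodiff;
# here it is hand-written to keep the sketch dependency-free.
C = 6.25e-8

def emulator(h):
    return C * h**4

def emulator_grad(h):
    return 4.0 * C * h**3

h_true = np.array([150.0, 200.0, 250.0, 300.0])  # "unknown" thickness per cell
u_obs = emulator(h_true)                          # observed surface velocities

# Gradient descent on the misfit J(h) = 0.5 * sum((emulator(h) - u_obs)**2)
h = np.full_like(h_true, 100.0)                   # first guess h0
lr = 0.02
for _ in range(3000):
    residual = emulator(h) - u_obs                # bias of the current guess
    grad_J = residual * emulator_grad(h)          # dJ/dh, cell by cell
    h -= lr * grad_J                              # descent step

print(np.round(h, 2))  # thickness recovered from velocities alone
```

With a real emulator the same loop applies unchanged; only the gradient computation is delegated to the autodiff framework, which is why adding more controls (sliding, viscosity) costs almost nothing extra.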
Okay, let me just illustrate this with an example of an inversion, here on the Rhône or Aletsch Glacier. You can see our observations: some profile lines where we know the ice thickness, and the observed surface velocities; those are the data we want to fit. What you see here is the progression of the optimization through the iterations: we start from zero ice, and we converge towards something; this is the ice thickness we obtain at convergence. And of course the standard deviation of the misfit to the observed surface velocity is reduced and finally gets to small numbers, such that we are happy it fits the data well; sorry, that was for the ice thickness, and for the observed velocity the standard deviation is also dropping. Again, just to illustrate how efficient it is: this took about one minute on the GPU of my laptop, so it is really not computationally demanding, as long as we have a good GPU.

Okay, let me check the time... I think I still have another 10 minutes, so I would like to give an overview of IGM, and then hopefully make a small demo. Practically speaking, IGM is a 3D glacier evolution model. It is written in TensorFlow and Python, and it has compatibility with other libraries, especially OGGM, which I like a lot because many tools are already included in OGGM, especially for data handling. It has a module-wise structure, to facilitate coupling and customization, and here I will try to draw a parallel with CSDMS. It is based on a high-order 3D ice flow model, and there are also other components I did not detail, for instance a model for ice enthalpy, so there is not only the mechanical model but also the thermal part of the model. So IGM is a truly
thermo-mechanical model, in the sense that we have this enthalpy, which is actually temperature and water content: the temperature affects the dynamics, and the dynamics affect the temperature and the water content. There is also a very minimal subglacial hydrological model. But this is basically the main picture: it is computationally efficient, especially on GPU, and the main philosophy of IGM is that it is vectorial, at least in the horizontal directions. All the operations we do in the horizontal directions, which are supposed to be the large ones, are vectorized, because we would like to treat very large arrays, assuming we want to model large areas or high resolution, especially if we are dealing with complex topography. It takes advantage of automatic differentiation, which facilitates inversion and data assimilation. And as I said before, since I am using these convolutional neural networks, all the data live on 2D raster grids; I am not doing any irregular grid, so this remains rather simple.

Everything is on the GitHub repository: the documentation, the code, examples, and installation instructions. We can also install it via PyPI or through GitHub, but it's probably better to do it from GitHub because the code changes quite a lot; there are many updates. In general the installation should be rather simple; the only thing which is usually a bit trickier is to install the GPU drivers and get compatibility with the GPU, and this is also pretty OS-dependent, but a lot about this is written on the wiki.

Now here is something I would also like to detail a bit, and which I think is very similar to the CSDMS framework. Actually, I was very interested when I went through the CSDMS paper; I found quite a lot of inspiration for designing IGM, because what
What IGM does, actually, is that 99 percent of the code is modules: the core part of IGM is just an empty shell, and everything else is modules. To give a bit of a taxonomy, there are pre-processing modules; for instance, there is an oggm_shop module which downloads and sets up all the data we want given the RGI ID of a glacier, or we can read our own data, or do the optimization step, the inverse modelling I described. Then there are processing modules, basically one module per physical component: one that computes the ice flow with this neural-network emulator, another one taking care of climate, another one that produces the surface mass balance, and so on, and they all work together. And there are some post-processing modules as well.

A module itself is one Python file which includes a function for the parameters, one for initializing, one for updating, and one for finalizing. In this, it is very similar to the CSDMS framework. IGM comes with a set of existing modules, but users can write their own, and that's actually not very difficult.

Okay, now I will switch to my terminal. I prepared everything, but I had to reboot, so this should not take me too long. Did anyone write an RGI ID in the chat that we can try? I'm checking... no, there are no suggestions, so we can go together to the GLIMS map; if someone has a spontaneous idea of a glacier they like, we can take it now, otherwise I will pick a glacier. I've asked you about Argentière before, right? Argentière, let's go to Argentière. And the reason I ask is that there is a subglacial observatory there. I see, okay, perfect. So I take the RGI ID of Argentière here, and I will just activate my environment.
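Since a module is described as one Python file with a parameters function plus initialize, update, and finalize, here is a minimal sketch of what such a file and the "empty shell" driving it could look like. The function names follow what is described above, but the params/state objects, argument names, and the toy mass-balance logic are illustrative stand-ins, not IGM's real API.

```python
# Sketch of an IGM-style module: one Python file defining four functions.
# All names below (smb_rate, thk, dt) are illustrative, not IGM's actual fields.

import argparse
import types

def params(parser):
    # Declare the parameters this module needs on the shared parser.
    parser.add_argument("--smb_rate", type=float, default=0.5,
                        help="toy uniform mass-balance rate (m ice / yr)")

def initialize(params, state):
    # Set up state variables before the time loop starts.
    state.thk = 100.0          # toy scalar ice thickness (m); IGM holds 2D fields

def update(params, state):
    # Advance this component by one step; dt is assumed to live on the state.
    state.thk = max(0.0, state.thk + params.smb_rate * state.dt)

def finalize(params, state):
    # Clean up / produce final outputs at the end of the run.
    print(f"final thickness: {state.thk:.1f} m")

# How the "empty shell" core would drive such a module:
parser = argparse.ArgumentParser()
params(parser)                               # 1. collect parameters
args = parser.parse_args([])                 #    (defaults only, for this demo)
state = types.SimpleNamespace(dt=1.0)
initialize(args, state)                      # 2. initialize
for _ in range(10):                          # 3. time loop: ten one-year steps
    update(args, state)
finalize(args, state)                        # 4. finalize
```

With several modules, the shell would simply hold a list of them and call each module's four functions in turn at the corresponding stage, which is what makes swapping or adding components cheap.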
I'm not going to install IGM right away, so I'm cheating a bit: here I'm already in the IGM Python environment, so everything is installed. All I'm going to do is igm_run, and I will add just one OGGM option, the RGI ID, with this value. Can you see my screen? Yep, we can. Okay, perfect, now it's running.

You can see the Argentière glacier here. I did a simulation from 1700 to 2000; as the glacier is rather small and the resolution is 100 metres, it's actually pretty quick (the glacier I selected before was slightly larger). What you can see is basically how the glacier reacts to the climate that is prescribed here, and on top of this I also played the game of doing some seeding, so that we can see the trajectories of some particles, and what is nice is that we can almost see a sort of Little Ice Age moraine.

Now there is a tool where I can take this and put it in the browser. What is nice is that someone from the community made a tool to visualize the result of the simulation directly in a browser: you can see the Argentière glacier here, and we can play the movie, so we see the animation, and the velocities as well. That's pretty awesome. Yeah, absolutely, and I was very impressed how quickly this person did it. We are really thinking about building a more generic tool with much more control for analyzing the simulations. I find it very handy to have it in the browser, because then you don't need to install many tools yourself, and it works across platforms as well.

Now I just want to show what I did. The run created a lot of files, but before I started there was only one file, so I will clean my directory. When I arrived there was only this one file, a parameter file, so let's look at what is in it. In this parameter file, basically, I say which modules I am calling. I call the pre-processing module oggm_shop; that's the one that, when I gave it the RGI ID, downloaded all the data I wanted for this run. Then it calls a certain number of modules: one is generating a climate (this is actually also supported by OGGM), and one is computing the surface mass balance from the climate, with parameters calibrated by OGGM. Then there is the module that computes the ice flow, the one that updates the time, the one that updates the thickness, and the one that computes the particles; those are basically my processing modules. And there are some post-processing modules, which write NetCDF files, do the live plot, or permit showing this visualization in the browser.

Of course, there are a couple of parameters you can change here, saying from when you want to run the simulation until when. And this is basically how we control the climate: the climate here is historical data, but then we may want to apply some shift, some delta temperature, if we want to make the climate cooler or warmer; here, for instance, applying plus four degrees by 2100, or plus eight in the case of an extreme scenario. And then of course we can change all the other parameters here.

Basically, when I do igm_run, it will call all those modules in turn: it will take the parameters and initialize all the modules, then, using a time loop, it will update all the model components, and finally, for the post-processing, it will finalize.

So I think I will stop here. All the code and the documentation are on the GitHub repo; we also have a Discord chat to help the community discuss and get support; there is also a video tutorial; and then there are some references. Thanks a lot, and I'm happy to take any questions.
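Going back to the climate control shown in the parameter file above: the delta-temperature shift applied on top of the historical climate is easy to sketch. This toy version assumes a linear ramp from present day to 2100; the function name, the ramp shape, and the start year are illustrative assumptions, not IGM's actual parameterization.

```python
def delta_temperature(year, delta_2100=4.0, year_start=2020, year_end=2100):
    """Toy warming shift added to a historical climate: 0 degrees at year_start,
    delta_2100 degrees C at year_end, then held constant. Illustrative only."""
    if year <= year_start:
        return 0.0
    frac = min(1.0, (year - year_start) / (year_end - year_start))
    return delta_2100 * frac

# Plus four degrees by 2100, or plus eight for the extreme scenario:
mid_century = delta_temperature(2060)                   # 2.0
extreme_end = delta_temperature(2100, delta_2100=8.0)   # 8.0
```

A climate module would then add this shift to each historical temperature field at the current model year inside its update step, so cooler or warmer scenarios need nothing more than changing one parameter.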