So, does this work? Can you hear me now? Thank you. Hi, everyone. I work at CWI in Amsterdam, and I want to present the superparameterization project we are working on. I work on this together with Gijs van den Oord at the Netherlands eScience Center, Pier Siebesma at KNMI and the Delft University of Technology, and Daan Crommelin at CWI.

To introduce superparameterization: say we have an OpenIFS model running at a resolution of about 40 kilometers; this is the T511 grid. Now we want to couple this model with a different model on a much smaller scale. So one column of OpenIFS we couple with a large eddy simulation model, with a grid resolution of roughly 100 meters. For this we use a model called DALES, the Dutch Atmospheric Large Eddy Simulation. On this very small scale we can resolve clouds and convection in a way that is not possible at the large grid size.

So our idea is to use the large eddy simulation to study smaller features that are not explicitly resolved on the large scale. In OpenIFS, the things you can't resolve, you have to parameterize. There is, for example, a cloud parameterization and a convection parameterization, which estimate what these processes do in each column. If, instead of using these parameterizations, we actually simulate these processes, then hopefully we can be more accurate than the parameterizations. And this is what we want to do. We are going to pick many of these columns and couple every one of them with its own large eddy simulation box. The important part is that we couple in both directions: the large eddy simulation gets input from OpenIFS, but OpenIFS also gets input back from the large eddy simulation. So we are effectively going to replace the cloud and convection parameterizations by the output from the large eddy simulation. This is superparameterization: instead of parameterizing, you actually simulate on a smaller scale, and then you don't need to parameterize, because everything is resolved. Of course, this is going to be computationally expensive, so we hope to get something back as well.

Why do we want to do this? And especially, why do we want to bring in another model, instead of simply increasing the resolution, which might seem the obvious answer? First of all, clouds represent one of the biggest, perhaps the biggest, uncertainties in climate simulations, because clouds feed back on the climate through their effect on radiation. If the climate changes, the clouds will change, and then the radiation will also change. So getting the clouds right will be very important for accurate climate simulations. That is one reason. And if we try the superparameterization approach, we can also compare with the existing cloud parameterization to see how well it does; it is one more comparison that becomes possible. As for why superparameterization instead of increasing the OpenIFS resolution: that is going to be done anyway. We just heard that the goal is nine kilometers in the future. But at nine-kilometer resolution you still can't resolve clouds, so there is still some distance to go before you actually see the clouds in OpenIFS itself. So this is one way to increase the resolution. It is also a possibility to increase the resolution in just one place: we don't have to put these LES models everywhere; we can put them in just one region and see how this region reacts.
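To make the replacement concrete, here is a minimal Python sketch of the idea: where OpenIFS would normally apply its cloud and convection parameterizations, the superparameterized column instead takes a tendency derived from the horizontally averaged LES state. All names, shapes, and the nudging form are illustrative assumptions, not the actual model code.

```python
import numpy as np

def subgrid_tendency(profile, les_field=None, dt=600.0):
    """Sub-grid tendency of one quantity (say, humidity) in one column.

    Without an embedded LES this is where the cloud and convection
    parameterizations would act; with superparameterization, the
    horizontally averaged LES state takes their place. Everything here
    is an illustrative sketch, not the real OpenIFS or DALES code.
    """
    if les_field is None:
        # Placeholder for the parameterized estimate of the unresolved
        # processes (the real schemes live inside OpenIFS).
        return np.zeros_like(profile)
    # Average the 3-D LES field over the horizontal directions, then
    # return the tendency that relaxes the column toward that average.
    slab_mean = les_field.mean(axis=(1, 2))
    return (slab_mean - profile) / dt

# Example: a 4-level humidity profile and a 16x16 LES slab per level.
profile = np.linspace(0.010, 0.002, 4)
les = profile[:, None, None] + 1e-4 * np.random.randn(4, 16, 16)
print(subgrid_tendency(profile, les))
```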
And it might also be more computationally efficient to have one large eddy simulation per column, instead of filling the globe with these large eddy simulations. This is because of how the models are coupled to each other: we don't need to couple all the large eddy simulations to each other, we only couple them to the columns where they are located. Of course, we make some kind of approximation here, but on the other hand we save a lot of communication between the large eddy simulations. So it is an approximation, but it might actually gain us something. Also, in this way we couple two well-tested models. OpenIFS is very well tested; it is known how it is supposed to behave. If you were to change its resolution without thinking about it any further, many of the parameterizations would probably no longer be valid, because you are looking at a completely different scale. So we leave OpenIFS as it is, and instead couple it with a well-tested large eddy simulation, which is really tuned for these small grids. This is easier than building a very high-resolution global model from scratch.

There has been a lot of previous work on superparameterization. The main originator is probably Wojciech Grabowski, who suggested this approach for atmospheric simulations. He also suggested a coupling scheme, the way you couple the large eddy simulation to the global model, and we are actually using his scheme as it is. Then there is another pair of people, David Randall and Marat Khairoutdinov, who already implemented superparameterization in OpenIFS. So in the code there is already something for superparameterization. What exists, though, is uniform: every grid box in OpenIFS gets a large eddy simulation embedded in it, and usually a 2D one, because if you do this in every grid box, the simulation time becomes enormously long if you spend much time on each one. Therefore they embed a very lightweight model, and a 2D model is more lightweight. They already see benefits even from this 2D model, so there is potential in superparameterization. What we want to do differently is to not embed an LES in every column, but to select a region we care about and embed a higher-resolution, 3D model in those selected columns.

So what do we want to do with this? First, as a test case, we want to put a few LES columns over the Netherlands. The Netherlands is here. We put just as many as we can afford over the center of the Netherlands, just to see how this works. In the center of the Netherlands there is an observation site called Cabauw, with a tower carrying measurement instruments. We want to compare our results with these Cabauw measurements and see if we might do better than plain OpenIFS, or at least verify that we are not breaking things badly. So this is the first test case; it is still very much in progress. Once we have it set up, we want to go for a cold air outbreak, where polar cold air comes down over the ocean and causes cloud formation. For that we want a region of maybe 1,000 by 1,000 kilometers covered by these DALES simulations, so somewhere between 100 and 1,000 columns coupled to an LES model. That is then a rather large-scale simulation. We are not there yet, but this shows the few grid columns over the Netherlands; one way to pick such a set of columns is sketched below.
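Selecting the superparameterized region can be as simple as masking the grid-column coordinates with a bounding box. This is a minimal sketch; the coordinate arrays and the box around the Netherlands are illustrative assumptions, not the project's actual selection code.

```python
import numpy as np

def select_columns(lats, lons, lat_range=(51.0, 54.0), lon_range=(3.0, 8.0)):
    """Indices of the global-model columns that get their own LES instance.

    lats, lons : 1-D arrays of grid-column coordinates, in degrees.
    The default box roughly covers the Netherlands, where the Cabauw
    observation site is located.
    """
    mask = ((lats >= lat_range[0]) & (lats <= lat_range[1]) &
            (lons >= lon_range[0]) & (lons <= lon_range[1]))
    return np.where(mask)[0]

# Usage with a small fake grid; in reality the coordinates would come
# from the OpenIFS grid definition.
lat2d, lon2d = np.meshgrid(np.arange(50, 56, 0.4), np.arange(2, 10, 0.4),
                           indexing="ij")
idx = select_columns(lat2d.ravel(), lon2d.ravel())
print(len(idx), "columns selected for superparameterization")
```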
Now I want to say a few words about the coupling scheme, that is, how you couple a large eddy simulation to a global model. The main idea is this. Take any quantity you want to couple; it can be temperature, humidity, or the horizontal velocities. In the large-scale model you have one grid column, which we couple, so it is just one value at each height, and we get a vertical profile throughout this column. In the small-scale model there is a horizontal slab corresponding to the same height. We want the slab average in the small-scale model to match the value of the large-scale model, and we want this to hold throughout the simulation. So we start the systems in this way: we initialize OpenIFS, then we ask for the vertical profiles, and we initialize the LES in that column to match that vertical profile. Then, to keep this relationship throughout the simulation, we force both models towards each other. You can think of it as a relaxation: we relax one model towards the other, and the other model back towards the first, so we keep them together always, and the slab-average relationship continues to hold.

When we time step, we step OpenIFS once; this is a 10-minute step. Then we compute the forcings on all the DALES instances to make them go towards the OpenIFS state. Then we step all the DALES instances, many steps, because it is a small model with a small time step, so they catch up. And then we compute forcings on OpenIFS to make it match the DALES instances; these forcings are applied in the next OpenIFS time step. Of course, since these models are separate, we can't synchronize them more tightly than this; they will be lagged by one time step. It seems this is the best we can do, so we just leave it lagged and hope for the best. A sketch of this loop follows below.

I also want to say a few words about the technicalities of the coupling, because this might be useful for others of you. We use a framework called AMUSE. It comes from astrophysics, and it gives you a Python interface to any legacy code you care to couple to it, so Fortran code or C code, whatever you have. For us, that is DALES and OpenIFS. You can define Python functions which couple to this code, and we have now made a Python interface to both OpenIFS and DALES. The scheme we use is to have these two separate models as libraries, with a Python interface on top of each one, and then a coupling code in Python calling both of these libraries. So our Python code sits at the top, calling these two libraries for time stepping, setting forcings, and so on.

Another way to couple the two models would be to just embed DALES into OpenIFS: in place of the physics parameterization in OpenIFS, you could insert the large eddy simulation. This would be easier, with less coupling, less Python, less everything. But it has the big drawback that if you want a DALES in only some of the grid columns, you break the load balancing. As long as you treat all the columns the same way, you are fine, but if you want to choose a region for special treatment, this becomes very difficult to balance. For this reason, we went for the library approach. As a byproduct, there is now a Python interface to OpenIFS and one to DALES. DALES is open source, so that one you can have immediately, and the Python interface we want to share; if you are interested, please talk to me about this. The OpenIFS one we are also willing to share, but since OpenIFS is not completely open, we don't know how to do this yet. Again, if you are interested, let me know.
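Here is a self-contained toy version of the lagged coupling loop described above, in the Python-on-top style we use. The "models" are plain arrays with nothing but the nudging terms (no real dynamics or physics), vertical interpolation between the two grids is omitted, and all names and numbers are illustrative assumptions, not the actual AMUSE-generated interfaces.

```python
import numpy as np

# Toy stand-ins for one OpenIFS column profile and one DALES box of the
# same quantity (e.g. humidity), on 4 levels with a 16x16 LES slab each.
levels, nx, ny = 4, 16, 16
ifs_profile = np.linspace(0.010, 0.002, levels)
les_field = ifs_profile[:, None, None] * np.ones((levels, nx, ny))

dt_ifs = 600.0   # one OpenIFS step: 10 minutes
dt_les = 10.0    # DALES takes many small steps per OpenIFS step
ifs_forcing = np.zeros(levels)

for step in range(3):
    # 1. Step OpenIFS once, applying the forcing computed at the end of
    #    the previous iteration: the coupling lags by one OpenIFS step.
    ifs_profile = ifs_profile + dt_ifs * ifs_forcing

    # 2. Relaxation forcing that pulls the DALES slab averages toward
    #    the OpenIFS column profile.
    slab = les_field.mean(axis=(1, 2))
    les_forcing = (ifs_profile - slab) / dt_ifs

    # 3. Step DALES with many small steps so it catches up with OpenIFS.
    for _ in range(int(dt_ifs / dt_les)):
        les_field = les_field + dt_les * les_forcing[:, None, None]

    # 4. Forcing on OpenIFS toward the new slab average, to be applied
    #    in the next OpenIFS time step.
    slab = les_field.mean(axis=(1, 2))
    ifs_forcing = (slab - ifs_profile) / dt_ifs

print("final mismatch:", np.abs(les_field.mean(axis=(1, 2)) - ifs_profile).max())
```

In the real setup, steps 2 to 4 run for every selected column with its own DALES instance, and the Python layer only exchanges profiles, slab averages, and forcings with the two model libraries.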
We are very happy with this approach so far, because it is easy to use Python as the top-level coupling system; it is very nice for prototyping, and so on. Besides, Python was invented at CWI, so we have a slight bias towards using it whenever we have an excuse. This is the stage we are at now. We hope to get out of the engineering phase, actually run it, and get some results out, but we are still testing; it breaks from time to time. Usually DALES crashes if you force it too hard. Once it is stable, we want to go for the cold air outbreak. What I especially want you to remember is AMUSE, which is a very nice toolbox if you want Python interfaces to anything; the Python interfaces that already exist; and the superparameterization approach, which we hope will bring nice things.

I am going to stop with the slides, and I will show you a video of the coupled system, which I have here; let me make this full screen now. I hope you can see something. These boxes are the DALES instances. They are shown on top of the OpenIFS cloud field, the liquid water path, and I also show the wind at about one kilometer height. You can see that where OpenIFS has clouds, the DALES instances will typically also have clouds. Now you see clouds appearing in both models. Neighboring boxes also have similar conditions, so you see that the models communicate with each other through the OpenIFS layer. I have to say these boxes are somewhat exaggerated; our DALES instances are more like this size if you plot them to scale. So this is how it looks now. Thank you.