Thank you very much, Alec, for the introduction. I'm going to share the screen — I hope you can see it in full-screen mode now. OK, welcome, everybody, and good morning in case you are in Europe. As Alec said, my name is Arnau Folch, and I am a professor at Geosciences Barcelona, which belongs to the Spanish National Research Council. I also have a double affiliation, so I am also based at the Barcelona Supercomputing Center. Today it is about volcanoes, and I'm going to start with this series of two talks on the data assimilation of volcanic clouds. This is the plan for today: this morning, two 45-minute talks with a break of 10 minutes, and then after lunch, or in the afternoon, we're going to have another two talks on lava dynamics. [Interruption] I am very sorry, I don't want to interrupt you, but yesterday this talk was given as a replacement for Vasili Titov, who couldn't make it, and today it will be Vasili Titov instead. — Sorry, I didn't know, because I was in the field until yesterday, so I'm sorry about that. — No problem. OK, so these are the contents of the two talks. I think it's worth doing a fairly long introduction on volcanic clouds first, because I guess some of you may not have a strong background in volcanology or in volcanic clouds. So I think it's worth spending some time showing what volcanic clouds are, how we model them, how we forecast the trajectories and the evolution of volcanic clouds, how operational models are set up, and what role data assimilation is playing nowadays in all these setups. We will also see a few slides on the different observation mechanisms that we have for volcanic clouds. In particular, I will focus on satellite sensors, because these are the most used, but we will also see others. I think that will take me around these 45 minutes, so that's probably part one of the two talks.
Then, in the second block, we will go straight to the point and see the different data assimilation strategies. In particular, we will see what the community has developed in the last decade, which is essentially three different strategies: data insertion, inversion of the source term, and then a more, let's say, standard ensemble-based sequential data assimilation. And then I will finish with the conclusions. So let's start with this introduction, and I will try to explain briefly what volcanic clouds are. The first thing we have to know is that volcanic clouds form during explosive volcanism, and explosive volcanism is characterized by the fragmentation of magma. So we have this situation here. Typically, during an eruption, we have a deep or a shallow reservoir which contains magma. Magma is a silicate melt that also has volatiles dissolved in it. An important point is that the solubility of these volatiles in the magma depends on pressure. So in the reservoir, pressures are high and the volatiles are dissolved. When we have an eruption and this magma starts to ascend through a dike or through a conduit towards the surface, it may reach a point that we call the exsolution surface, at which the melt becomes saturated and gas bubbles start to form. This is a phenomenon similar to what happens when we open and depressurize a champagne bottle: we decrease the pressure, that saturates the volatiles, and it produces the exsolution of gases. During the ascent, this mixture of magma with gas bubbles reaches shallower levels, where the pressure is lower and lower. The bubbles expand and grow up to a level where a phenomenon called magma fragmentation occurs — the fragmentation level. And this is a dramatic change, because when the magma fragments, it converts into, let's say, a gas with pyroclasts of different sizes.
This process of magma fragmentation produces what geologists call pyroclasts — essentially this broken magma, of many different sizes. Regardless of the size, they are globally known as volcanic tephra, but we have names for these particles depending on the size. The largest particles are called volcanic bombs; by definition, they are more than 6.4 centimeters, so 64 millimeters, in diameter. Then the particles that are larger than 2 millimeters are named lapilli. And finally, the finest component resulting from this fragmentation is what we call volcanic ash: all the particles that are less than 2 millimeters in size. When all this material — essentially gas with all these particles dispersed in it — reaches the vent, it forms an eruption column, or a volcanic plume. And this is how it looks. These columns, when they are emitted into the atmosphere, are, as I said, a mixture of gases and tephra, and they can inject huge amounts of material into the atmosphere at high velocity. The basal region is momentum-driven: there is an excess of momentum, so the ascent of the mixture there is essentially driven by this excess of momentum. But then, because this mixture is very hot — much hotter than the surrounding air — it starts to entrain air, and it heats that air, and this produces a convection mechanism in what we call the convection region. That drives the ascent of the mixture up to a neutral buoyancy level in the atmosphere. The most energetic eruptions can put this material up to many kilometers in the atmosphere, but it essentially depends on the energy that is available. In any case, a level of neutral buoyancy is eventually reached by all this material that forms the eruption column.
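The size classification just described can be summarized in a few lines of code. This is only an illustrative sketch of the definitions given above (bombs/blocks above 64 mm, lapilli between 2 and 64 mm, ash below 2 mm); the function name is mine, not from any volcanology library.

```python
def classify_tephra(diameter_mm: float) -> str:
    """Return the tephra class for a particle diameter in millimeters,
    using the size boundaries given in the talk."""
    if diameter_mm > 64.0:
        return "bomb/block"   # > 6.4 cm
    elif diameter_mm > 2.0:
        return "lapilli"      # 2 mm - 64 mm
    else:
        return "ash"          # < 2 mm
```

For example, a 30-micron particle (0.03 mm) classifies as ash.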
There, the winds aloft start to, let's say, affect all this material and form what we call the volcanic cloud. So I want to make a distinction between what we call the column, which is the ascending part with vertical velocity, and the volcanic cloud, which is driven by passive transport. It is passively transported by winds, and it's a mixture of entrained air with volcanic gases and particles dispersed in it. When these particles and gases are injected, they can be transported very large distances before the particles settle on the ground. They are passively driven by several physical mechanisms. The most dominant is advection by wind, but we also have turbulent atmospheric diffusion. And of course, because these particles have a certain size, they do settle down, so we also have particle sedimentation. All in all, depending on the injection rate and on the wind speed, these clouds can travel from 100 to 1,000 kilometers from the volcanic source. We can see here a couple of images of how these clouds look in the visible, with satellite images from Cordón Caulle and from one of the eruptions of Mount Etna. An important point is that the settling velocity is very much dependent on the particle size. As I said, we have particles of different sizes. Bombs and blocks, with sizes larger than 64 millimeters, typically have fall velocities of the order of 100 meters per second. It means that the residence time of these larger particles in the atmosphere is very low — a few seconds. As a result, they fall out very close to the volcano; typically, they don't reach large distances. They fall out with a ballistic trajectory a few kilometers from the volcano.
For the other types of particles — lapilli, coarse ash, and fine ash — these are the orders of magnitude: the settling velocity is in the range of 10 meters per second for lapilli, of the order of 1 meter per second for coarse ash, and even centimeters per second for fine ash. As a result, again, the residence time varies a lot. Lapilli typically stay in the atmosphere for minutes, which means they reach medium distances, tens of kilometers typically, whereas coarse ash and fine ash can travel long distances, from 100 to 1,000 kilometers; the fine material can even reach several thousands of kilometers. This has implications, because we have to distinguish between what we call the proximal clouds — the clouds very close to the vent, which are more concentrated — and the distal clouds. The cloud at proximal locations has all these particle sizes within it. But as I said, the coarse particles fall out very soon, producing volcanic fallout, whereas the fine particles can travel larger distances. So if we look at the cloud very far away from the source, what we will see in this distal cloud is that it is much more dilute — not concentrated — and it essentially carries micrometer-sized particles. The proximal clouds, in contrast, are much more concentrated, optically thick, and carry particles of all sizes. An important thing is that, apart from this passive transport by wind advection, particle sedimentation, and turbulent diffusion, some other phenomena may also occur within a volcanic cloud. Probably one of the most relevant is the aggregation of particles. This is a phenomenon that depends on many parameters, and it's a phenomenon by which fine ash particles can aggregate and form much larger particles.
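The residence-time argument above can be made concrete with a back-of-the-envelope estimate: residence time is roughly the injection height divided by the settling velocity, and the travel distance is that time multiplied by the wind speed. This is only an order-of-magnitude sketch, not a dispersal model, and the numbers in the comments are illustrative.

```python
def travel_distance_km(height_km: float, settling_velocity_ms: float,
                       wind_speed_ms: float) -> float:
    """Crude travel distance: wind speed times (height / settling velocity)."""
    residence_time_s = height_km * 1000.0 / settling_velocity_ms
    return wind_speed_ms * residence_time_s / 1000.0

# A bomb (~100 m/s settling) injected at 10 km with 20 m/s winds: ~2 km.
# Fine ash (~0.01 m/s settling) under the same conditions: ~20,000 km.
```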
This is essentially something that is enhanced by the presence of water, which acts as a binding mechanism for these particles: the water or ice coating allows the particles to stick together and form much larger particles. You can see here, in a microscope image, how these aggregates look. For example, these aggregates are composed of thousands of small ash particles, and they have a millimetric size. Obviously, the settling velocity of the aggregate is much, much larger, because it has a much larger size. It means that when aggregation occurs in the volcanic cloud, it causes a premature removal of mass: the aggregates form, they settle down much faster, and we end up with clouds that are depleted in fine material. So it is important to include aggregation in the models. But one of the problems we have nowadays is that we still don't have a fully comprehensive aggregation model. This is very difficult to model. We know how to do it — we know that we have to solve the so-called Smoluchowski equation — but the problem is that doing this with all the particle sizes and all the degrees of freedom that we have is really hard from the computational point of view. It is very, very expensive. So we have to make some assumptions that in most cases work, but in some cases do not work well. When we have the fallout of this material, we have a lot of impacts on the ground. I just put here some pictures of what happens when all this material falls to the ground and impacts the infrastructure. For example, what you see here on the top left is a picture of a town in Argentina close to the Chaitén volcano in Chile.
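For reference, the Smoluchowski coagulation equation mentioned above can be written in its standard continuous form (this is the textbook formulation, not any particular model's implementation):

```latex
\frac{\partial n(m,t)}{\partial t}
  = \frac{1}{2}\int_0^{m} K(m-m',\,m')\,n(m-m',t)\,n(m',t)\,dm'
  - n(m,t)\int_0^{\infty} K(m,m')\,n(m',t)\,dm'
```

Here $n(m,t)$ is the number density of particles of mass $m$ at time $t$ and $K(m,m')$ is the aggregation kernel. The first term creates aggregates of mass $m$ from smaller pairs; the second removes particles of mass $m$ that stick to others. Discretizing this over all particle size bins at every grid point is what makes the problem computationally expensive, as the talk notes.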
Of course, if the ash fallout is substantial, we can even have the collapse of roofs of houses or infrastructure, because ash is quite heavy, particularly if it rains after the fallout — wet ash becomes very heavy, and if you don't clean it, it can cause a roof collapse. The fallout also impacts agriculture and livestock, because ash particles are fluorine-rich, and when animals eat grass that has ash on it, it can produce fluorosis, apart from, let's say, abrasion of the teeth, irritation of the eyes, et cetera. It also causes impacts on the transportation network, for example on the roads: ash is very slippery, so you cannot drive; you have to clean the roads, et cetera. But probably the highest impact is on air navigation, and on air quality as well. So we have a lot of impact on civil aviation, and there are two main reasons for that. One is that ash particles are very angular in shape and very abrasive. Again, this is an electron microscope image of a typical ash particle, just for reference — this one is 30 microns in size. As you can see, they are far from spherical; they have a lot of holes and pores, and they are very abrasive. And you can imagine what happens when we have a cloud with these particles suspended and an airplane traveling at, let's say, 900 kilometers per hour impacts these particles. It produces a lot of abrasion on many components of the aircraft: on the windscreens and on the blades of the turbines. It produces erosion; it affects the fuselage of the aircraft and the navigation instruments. If there are also volcanic gases present, for example sulfur dioxide, they produce a lot of corrosion on the metallic components of the airplane, et cetera. But apart from all these problems, we also have the potential for engine stall.
Because when these particles enter the combustion chamber, they can melt, form a glass, and clog the cooling system of the engine, producing an engine stall. This is something we have to avoid, and it is one of the main reasons why we do this operational modeling of the clouds: to prevent encounters with aircraft. Apart from that, of course, the fallout also impacts airports and disrupts their normal operation, because you have to clean the runways, and this is a procedure that is quite expensive and time-consuming. These are pictures that I took in La Palma a few days ago of the cleaning operations at the airport. You can see that people have to clean everything, with machines and manually. Every time there is fallout, you have to clean it to resume the normal operations of the airport. So this is what volcanic clouds are. Now the question is: how do we model them? The objective, as I said, of modeling and forecasting volcanic clouds is to obtain where they are and where they will be in time and space — the trajectory — and how the concentration within these clouds will evolve with time. We have many families and many types of models in general, but the important point is that all of them have three components. On one side, we first need a meteorological, or numerical weather prediction, model. It essentially tells us the state of the atmosphere in the future — so this is a forecast. We need to know the wind field and the properties of the air, like density, viscosity, temperature, moisture (if we want, for example, to include aggregation), and precipitation rate (for wet deposition), et cetera. So we need several meteorological parameters that are furnished by the meteorological models. These can act on several scales: we can have global-scale models, but we can also have regional-scale models.
Or even, if we want to do very high resolution simulations, we might need to use more local-scale meteorological models. This can be, let's say, the typical forecast if we want to simulate what will happen in the future; or, if we want to simulate what happened in the past, we can rely on any of the reanalysis datasets that exist. The second component is the dispersal model per se, in which we model how the particles — and eventually also the aerosols, essentially the SO2 — are transported. That includes the different physical phenomena that occur during transport: the effect of wind advection, the turbulent diffusion, and the particle sedimentation, which, as I said before, is very strongly dependent on the size of the particle. So we need to parameterize the settling velocity of these particles depending on their properties: the shape, the size, the density, et cetera. Some of the models also take into account wet and dry deposition mechanisms. Wet deposition can be in-cloud or below-cloud. In-cloud deposition occurs when, let's say, water droplets coat volcanic particles and make them stick. Below-cloud wet deposition is the effect that when it rains, the raindrops can drag the particles down, which also produces some premature removal of mass from the cloud. Some of the models also contemplate the eventual occurrence of particle aggregation. And of course, if we have, for example, SO2, we also need some chemical model, or we need to account for the different chemical reactions, or even, in some cases, the different phase changes that might occur in these aerosols. And then we have a third component, which is the so-called emission model, in which we need to define what the source term is.
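The transport processes just listed — advection, turbulent diffusion, sedimentation, and emission — are typically combined in an advection-diffusion-sedimentation equation for each particle class. Written in a common textbook notation (not exactly as in any particular operational model), it reads:

```latex
\frac{\partial c}{\partial t}
  + \nabla\!\cdot\!\left(\mathbf{u}\,c\right)
  - \frac{\partial\!\left(v_s\, c\right)}{\partial z}
  = \nabla\!\cdot\!\left(\mathbf{K}\,\nabla c\right) + S
```

where $c$ is the particle concentration, $\mathbf{u}$ the wind field from the meteorological model, $v_s$ the size-dependent settling velocity, $\mathbf{K}$ the turbulent (eddy) diffusivity tensor, and $S$ the emission source term. Wet/dry deposition and aggregation, when included, enter as additional sink and exchange terms.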
By source term I mean essentially the volcanic plume: the properties of these particles, the mass flux, and the evolution of the mass flux in time and space. This can be an independent model, but typically nowadays, in most of the models, the emission model is directly embedded within the dispersal model. So actually, we just have two components: the meteorological model and the dispersal model, which has an emission model embedded within it. The important point is that this emission model is characterized by what we call the eruption source parameters, which we will see in the next slides. Of course, we have uncertainties in all the components. We have uncertainties in the meteorological forecast. We have uncertainties in the dispersal model, which, for example, might come from the different physical parameterizations that we use — some of them are not perfect; for example, the sedimentation velocity relies on experiments and on, let's say, analytical relationships that give us the settling velocity, et cetera. So there are uncertainties, but by far the most uncertain part of all these components is the emission term. This is very challenging, and it is the main reason why data assimilation is needed. So what are these eruption source parameters that we have in the emission model? First, we have to give the model the start and the end time of each of the eruptive phases: when the eruption starts and when it ends, or, if the conditions of the eruption are far from steady and vary with time, we need to give a time series, and we typically distinguish different eruptive phases, so we have to know the starting and the end time of each. The most important parameter is at which height we put this material — at which atmospheric levels we inject it. This is the injection height, or the eruption column height.
Then we also have to specify how the mass is distributed, or released, along this eruption column. And of course, the emission rate, that is, the mass flux: the total mass that is released into the atmosphere per unit time. If we look at the level of uncertainty, we might say that the starting and end times and the cloud height — I put them in green — are often, though not always, observable, even just by visual observation. And for the vertical distribution of mass, we more or less know how it distributes in the vertical based on what we know from past events. So I would say that these have a moderate level of uncertainty. In contrast, the emission rate is quite uncertain. And this is very important, because it tells us the amount of mass that we are releasing into the atmosphere, and in the end, this dictates the concentration within the cloud. All these models are based on the mass conservation equation, and with the exception of some terms or some phenomena, like wet deposition or aggregation, most of the processes are linear. It means that if we double the mass flow rate, we double the concentration values: if we make an error of 100% here, we make an error of 100% in the estimation of the concentration that we model — assuming that everything is linear. We also know that this emission rate depends heavily on the column height, but it's very difficult to quantify in real time, and this is the main source of uncertainty in volcanic cloud forecasting. On the other hand, apart from characterizing the emission term, we also need to characterize the properties of the particles that we are injecting into the atmosphere. And that includes the particle size distribution, the density, and the shape factor.
OK, so the message is that at the time of forecasting, we will always have aleatoric and epistemic uncertainties. One important thing is that this emission rate is proportional to roughly the fourth power of the injection height, the column height. And this has implications. First of all, on the model errors: even if we make small errors in the determination of the column height, these amplify and translate into large errors in the concentration. And not only that — we will see in the second part of this talk how this is important for data assimilation, because the distribution of errors is not Gaussian anymore, and this will have implications. For now, just remember this. OK, so there are several, or many, models that are operational in different parts of the world. Some models are Eulerian, some are Lagrangian. These models are used by the so-called Volcanic Ash Advisory Centers and other institutions at the national level to forecast volcanic clouds whenever an eruption occurs somewhere in the world. Before the eruption of Eyjafjallajökull in Iceland in 2010, these forecasts were more qualitative, in the sense that they were just interested in saying whether there is ash or no ash, because the mandatory regulation at that time was just to delineate zones in which there is some ash; the models didn't have to tell us how much ash, just yes or no. This paradigm was heavily criticized in 2010 because of all the disruption that this eruption produced in European airspace. And new guidelines were adopted, based more on quantitative criteria, on concentration thresholds. For example, three levels were defined in Europe at the time, based on the concentration values of 0.2, 2, and 4 milligrams per cubic meter.
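The fourth-power dependence mentioned above is usually expressed through empirical fits. One widely used example is the Mastin et al. (2009) relation, H = 2.00 · MER^0.241 (H in km above the vent, MER in kg/s), which inverts to MER ∝ H^(1/0.241) ≈ H^4.15. The sketch below uses those published fit coefficients to show how a modest height error amplifies into a large emission-rate error; the function name is mine.

```python
def mer_from_height(height_km: float) -> float:
    """Mass eruption rate (kg/s) from column height above the vent (km),
    inverting the empirical fit H = 2.00 * MER**0.241 (Mastin et al., 2009)."""
    return (height_km / 2.00) ** (1.0 / 0.241)

# A 10% error in the observed column height:
# mer_from_height(11.0) / mer_from_height(10.0) = 1.1**(1/0.241) ~ 1.49,
# i.e. roughly a 50% error in the emission rate — and, by linearity,
# in the modeled concentrations.
```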
And this has important implications for operational systems, because if you have to deliver a forecast with quantitative values, then you need to estimate the emission term with much better precision. This was the main weakness that we had at that time. So the question that the scientific community posed in 2010 was: how can we constrain the source parameters and the related uncertainties? Several strategies have been developed since then. The first one is based on the use of ensemble runs. So, OK, we have something that is uncertain — take, for example, the eruption column height. We do an ensemble of simulations exploring the possible parameter space. And with that, we can do two things. We can do a deterministic forecast, as we did before, but, for example, taking the ensemble mean, the ensemble median, or any combination of the ensemble members. The advantage is that now we have uncertainty quantification metrics, because, for example, based on the ensemble spread, we can have an idea of the uncertainty of the forecast. But with these ensemble-based runs, we can also produce probabilistic forecasts, for example by counting how many members verify a certain condition — a concentration value, or a detection threshold, et cetera. This, let's say, has been one family of modeling strategies. The other one is data assimilation. One could also consider in situ monitoring to quantify the source term, but this is quite difficult: it essentially means putting instrumentation at the volcano to quantify the source term in situ. In many cases this is not feasible, because we have more than 1,000 active volcanoes worldwide, some of them in very remote places, and logistically it is very difficult to have all of them well monitored. So it's more practical to rely on distal cloud observations — observations that are made away from the volcano.
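The ensemble post-processing just described can be sketched in a few lines: given N member forecasts on the same grid, we take the ensemble mean as a deterministic product, the spread as an uncertainty proxy, and the fraction of members exceeding a threshold as a probabilistic product. This is a minimal illustration, not any operational VAAC code.

```python
import numpy as np

def ensemble_products(members: np.ndarray, threshold: float):
    """members: concentrations of shape (n_members, ...) on a common grid.
    Returns (ensemble mean, ensemble spread, exceedance probability)."""
    mean_forecast = members.mean(axis=0)                  # deterministic forecast
    spread = members.std(axis=0)                          # uncertainty proxy
    exceedance_prob = (members > threshold).mean(axis=0)  # probabilistic forecast
    return mean_forecast, spread, exceedance_prob
```

With, say, the 2 mg/m³ threshold mentioned earlier, `exceedance_prob` gives at each grid point the fraction of members predicting contamination above that level.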
And then we can also have, of course, a combination of ensemble-based runs and data assimilation, as we will see later on. So let's spend a few slides now on the observation of volcanic clouds. Actually, we can observe volcanic clouds using many different types of instrumentation — for example, satellite-based sensors, with passive monitoring or active monitoring (for example, using lidars on board satellites). But we can also observe volcanic clouds from ground-based instrumentation networks. For example, what you see here on the right is an image of EARLINET, the European network of lidars. In some cases, we can use lidars that are typically deployed for other purposes — more for the study of the planetary boundary layer, et cetera — but that can also be used to observe volcanic clouds; the same holds for other lidar networks. We also have, of course, air quality stations. And then we can also measure, let's say, the concentration in volcanic clouds using research aircraft equipped with, for example, particle counters. Of all these options — satellite-based, ground-based, and aircraft-based instrumentation — the most used one by far is satellites, for several reasons. They give much higher resolution and coverage density: consider that satellites now give kilometer-sized pixels, whereas the density of a station network like EARLINET is the one you see here — compare the density of this network with a grid at one kilometer resolution. Apart from that, satellites also have global coverage, and that includes the oceans: we don't have ground-based networks over the oceans, but satellites give global coverage. So I would say that for operational use, satellites are the only choice, but the other two instrumentation types are also very useful for validation purposes.
Typically, we use ground-based networks — and, if we have them, data from aircraft — to validate our simulations a posteriori. So one important question is how we can discriminate, from a satellite, volcanic ash clouds from meteorological clouds or other clouds, for example dust clouds. This is a MODIS true-color image. Here we can see clearly how an ash cloud looks in the visible; then we have lots of meteorological clouds; and then we have some parts in which it is not clear whether this is a meteorological cloud or an ash layer. This is very typical. A very important step forward was made in the late 80s and during the 90s with, let's say, the development of ash detection techniques using satellites, based on passive sensor monitoring. The idea is very simple. We have the Earth, which is emitting in the thermal infrared. This radiation passes through a cloud — a cloud whose nature we still don't know. The important point is that the absorption of this radiation depends on the wavelength and on what is in the cloud, and this is then detected by the satellite. There was a very important, let's say, seminal work by Prata in which the so-called reverse absorption technique was applied to detect volcanic ash. The idea is very simple: he realized that silicate particles are more absorbent at shorter wavelengths, whereas water droplets in meteorological clouds have the opposite behavior. So if we take the brightness temperature at two different wavelengths, for example 10.8 and 12 microns, and we take the difference, for volcanic ash this difference is negative, whereas for meteorological clouds — water or ice clouds — it should in principle be positive. So by doing this dual-channel difference, in principle we should be able to discriminate one from the other. So this is how it works.
We have here a MODIS true-color image in which we see meteorological clouds, but we can also guess the presence of volcanic ash. When we do this in the thermal infrared and take the brightness temperature difference, we get something like this. So this is what we get when we apply the reverse absorption mechanism in its most basic form. It is far from perfect, because you see there is a lot of noise here and in many other parts. So actually, what we do is not just this dual-channel difference; we apply many other filters. For example, we can apply a surface mask, and this eliminates a lot of the noise in this part. We can also correct for high satellite zenith angles — for example, in this case the image was very close to the edge of the satellite view — and apply other corrections. The important point is that, in the end, after applying all these filters, in principle one should be able to clearly discriminate the volcanic ash from the other clouds. As we will see later, this is far from perfect, but for now, let's say that we have a mechanism to separate, or to highlight, volcanic clouds and distinguish them from other types of clouds. But this is not a retrieval: this detection algorithm just tells us yes or no — there is ash or there is no ash — and we also want to know how much ash there is in the cloud. For that, we need a step further, and that step forward is a retrieval, in which we want to get the column mass. So we have to combine this detection algorithm with more, let's say, complex radiative transfer models that can tell us the properties, or the effective size, of the particles in the cloud, and the mass per unit area. For example, this is a further step on the image that we saw before, when we do a retrieval and we can retrieve the mass loading.
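The basic dual-channel (reverse absorption) test described above can be sketched as follows. This is a toy version: real detection schemes add surface masks, zenith-angle corrections, and other filters, and the threshold value here is illustrative, not an operational one.

```python
import numpy as np

def ash_mask(bt_108: np.ndarray, bt_120: np.ndarray,
             btd_threshold: float = -0.5) -> np.ndarray:
    """Flag pixels as ash where the brightness temperature difference
    BTD = T(10.8 um) - T(12 um) falls below a negative threshold.
    Ash gives negative BTD; water/ice clouds give positive BTD."""
    btd = bt_108 - bt_120
    return btd < btd_threshold
```

For example, a pixel with T(10.8) = 260 K and T(12) = 262 K (BTD = −2 K) would be flagged as ash, while a pixel with BTD = +1 K would not.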
The mass loading is the vertical integration of mass. This is a zenithal view, and what we see here is the mass per unit area: we are summing over the whole vertical column of the atmosphere, and that's why the unit is mass per square meter. So we have many, many sensors on board several platforms — this is just a non-exhaustive list of the platforms carrying different types of sensors. With geostationary satellites, for example, we can have global coverage by combining different satellites. It's important to distinguish between geostationary observations and polar-orbiting ones, which normally give us a couple of passes per day. So, what are the pros and cons of observing volcanic clouds from satellite? The advantage is that nowadays we have this latest generation of geostationary satellites that give us high frequency — let's say images every 10 to 15 minutes, typically — with very high spatial resolution, one to a few kilometers pixel size, and they have global coverage. These are the advantages; this is clear. But this is far from perfect, for many reasons. The first one is that the ash cloud can be obscured by overlying meteorological clouds. Remember that we are considering passive sensing here — it is not the active sensing that we typically lack on board geostationary satellites. And the absorption signals can be masked by the presence of ice: for example, if ash particles are coated by ice, they can be hidden and we cannot detect them. So we see areas of the cloud that go undetected by the satellites. These detection algorithms can also fail in detecting ash clouds that are optically thick. And then another thing to which I want to draw your attention is that even if everything goes very well, with satellites we always have a 2D view.
So, with passive sensors, we don't have any vertically resolved information about the cloud: we don't know the cloud thickness, and we don't know the concentration. It means that if we use this type of observation, we need to make some additional assumptions — for example, about the thickness of the cloud — to infer the concentration. The retrieval gives us the mass per unit area, and we need to assume a cloud thickness, or to collocate these observations with other observations, in order to have a 3D picture of the cloud. This is how collocation looks. You see here, for example, two retrievals of the cloud from the Cordón Caulle volcano. What we get from the geostationary satellite are these ash mass loadings — but as I said, this is a 2D image of the cloud. We can collocate them with other observations: for example, we had here the pass of a polar-orbiting satellite, and what you see here, the green line, is its track. Combining both geostationary and polar-orbiting observations is really the only way to get the 3D structure of the cloud, and with it we can have an idea of the cloud thickness and the average concentration within the cloud. The situation is similar for SO2; the only difference is that SO2 is actually easier to observe than volcanic ash, though there are also, let's say, some problems sometimes with the detection and retrieval of SO2. I think I'll leave it here. Maybe we can take a five or ten minute break and then we will go straight to the data assimilation. — Yes, you're correct. We will take the 10-minute break in the program and reconvene at 9:55. By the way, you can look in the chat: there are a few questions, but you don't need to answer them immediately; there will be time during the question-and-answer session, if you wish to answer them then. — Yeah, so there is time. I'll do another 40, 45 minutes, and then...
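The thickness assumption described above amounts to a one-line conversion: the mean in-cloud concentration is the column mass loading divided by an assumed (or collocated) cloud thickness. A minimal sketch, with illustrative numbers:

```python
def mean_concentration_mg_m3(mass_loading_g_m2: float,
                             cloud_thickness_m: float) -> float:
    """Mean in-cloud concentration (mg/m^3) from a satellite column
    mass loading (g/m^2) and an assumed cloud thickness (m)."""
    return mass_loading_g_m2 * 1000.0 / cloud_thickness_m

# e.g. a retrieved loading of 2 g/m^2 spread over an assumed 1 km thick
# cloud gives a mean concentration of 2 mg/m^3 — right at the intermediate
# regulatory threshold mentioned earlier. Halving the assumed thickness
# doubles the inferred concentration, which is why collocation with
# vertically resolved observations matters.
```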
Yes, after that, we will have some 15 or 20 minutes for questions and answers. — Excellent. Now it's a break; enjoy the break.