I belong to the Research Center of Post-Mining, which is part of a technical university in Bochum, Germany. We deal with everything that comes up when mines are closed. Stepping aside a little from the technical discussions we had before, I would like to show you a project funded by the RAG-Stiftung. We are dealing with the post-mining dewatering of so-called polder areas, the subsidence areas that remain after the coal has been dug out. There is a lot to do there. We call these, let's say, eternity tasks: you have to keep doing them for the rest of eternity, I guess. Which means post-mining really is a nice job; I cannot lose it anymore. So I would like to tell you about this project, which we call MUSE. I will show you how we attempt the data fusion, and then come to some results and a conclusion. I am sorry that I have to present this way, but I cannot see the monitor, so I do not know which slide is showing. Do you know this map? Has anybody seen it? Not really, no. These are the subsidence areas of the Ruhr area, only the Ruhr area, and the values here reach more than 20 meters of subsidence in places, caused by digging out the coal. That leads to a problem, summed up in a number you have probably also never heard: 800 million cubic meters. That is the water we have to pump out of these areas every year. Actually, I would like to stop pumping, but then we would have a Masuria lake district in the Ruhr area; everything would be flooded. Nobody would like that, because people want to live there. It is a really huge amount of water, and it costs a lot of money. And this is such an area. Looks pretty nice, no? What you can actually see is the sink edge of one of the subsidence areas. Back there, where you see the chimneys, everything is fine; that is the situation before mining. But here, in this place, there should actually be a river, the Boye. And the Boye is now, in effect, underground.
Behind me there is a pumping station, and this pumping station keeps the area dry so that people can live there and we can do agriculture and forestry. The funny thing is that some brooks flow in exactly the wrong direction: they no longer flow toward the river, they flow toward the pumping stations. That is our problem in this area. We have this project, funded by the RAG-Stiftung, about dealing with the brooks and rivers in these areas. There was a huge effort to bring the so-called Emscher River back to nature; I don't know exactly, three or five billion euros were spent on it. And now we have climate change, and with it droughts and heavy-rain events, and we have no idea how these areas will develop. That is what I would like to talk about. We need to keep pumping these areas out, and we are going to check what in-situ sensors, drones (UAS or UAVs) and satellites can contribute to that. Only one aspect of this huge project is how these brooks, and this area, can develop close to nature. Our first thought for the approach was that we should detect drought conditions in these areas using satellite imagery: fly a satellite over the area and see where the problems are. But it is not that easy. The problem is that we need to understand the process in situ: we need to go into the field, dig in our sensors, and then see what the satellite can tell us. That was the idea. We started with a very simple hypothesis: high soil temperatures, caused by heat during long drought periods, should cause damage to the plants, which in turn should be visible in multispectral data by calculating vegetation indices such as NDVI or GNDVI and several others.
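As a quick illustration (my addition, not the project's actual code): the two indices just mentioned are simple band ratios of near-infrared against red or green reflectance. The band arrays and values here are made up:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids division by zero

def gndvi(nir, green):
    """Green NDVI: (NIR - Green) / (NIR + Green), more sensitive to chlorophyll."""
    nir, green = nir.astype(float), green.astype(float)
    return (nir - green) / (nir + green + 1e-10)

# Toy 2x2 reflectance rasters; healthy vegetation gives values near 1,
# stressed or bare pixels drop toward 0.
nir = np.array([[0.50, 0.45], [0.40, 0.10]])
red = np.array([[0.05, 0.08], [0.10, 0.09]])
print(np.round(ndvi(nir, red), 2))
```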
That is what we thought, and we tried to test it in a fairly simple way. Our approach is to capture data very locally, but then you quickly run into a big-data problem, because the areal level is completely different from the data you can get from soil sensors. And when you check the satellites, that again is a completely different level. The soil sensor sits at one spot and measures every second, every minute, as often as you like, but the value holds only for that spot. The drone then offers a first possibility of extrapolating this data: it gives you a first areal view at high resolution, let's say five centimeters on the ground. And then comes the satellite; Sentinel-2, for example, passes over this area every six days, but with pixels of 10 by 10 meters. That is what we need to bring together, and we do it in a central GIS, in what we call n-dimensional data cubes, where everything I just explained is addressed. That is our approach. And it relates to this very old graphic from one of our soil-science books. You can see the soil types: sandy, silty or clayey soils, and our soils lie in between. The plant is able to extract water in a certain range. Over here there is too little adhesion; the water drains down to the groundwater and the plant cannot get it. In this range the roots can get it, but there is a point, at pF 4.2, where the plants stop working: the adhesion is so strong that the roots can no longer pull the water out of the soil. That is the point where drought sets in. And this point we should actually see in the multispectral data, because the chlorophyll, which, let's say, brings the energy to the plant, is then no longer working, and the reflectance is completely different from before.
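As a side note (my addition, not from the slide): pF is the base-10 logarithm of the soil-water suction expressed in centimeters of water column, so pF 4.2 corresponds to a suction of roughly 15,850 cm, about 1.5 MPa, the conventional permanent wilting point. A minimal sketch of the conversion:

```python
import math

def pf_to_suction_cm(pf):
    """Convert pF (log10 of suction head) to suction in cm of water column."""
    return 10.0 ** pf

def pf_to_kpa(pf):
    """Suction in kPa: 1 cm of water column is about 0.0980665 kPa."""
    return pf_to_suction_cm(pf) * 0.0980665

wilting_point = 4.2
print(round(pf_to_suction_cm(wilting_point)))  # ~15849 cm
print(round(pf_to_kpa(wilting_point)))         # ~1554 kPa, i.e. ~1.5 MPa
```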
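The scale gap described above (a point sensor, 5 cm drone pixels, 10 m Sentinel-2 pixels) can be sketched as a block aggregation of the fine raster onto the coarse grid. This is an illustrative toy, not the project's data-cube implementation; sizes and values are made up:

```python
import numpy as np

def aggregate_to_grid(fine, factor):
    """Block-average a fine-resolution raster onto a coarser grid.
    factor = coarse_pixel_size / fine_pixel_size (e.g. 10 m / 0.05 m = 200)."""
    h, w = fine.shape
    assert h % factor == 0 and w % factor == 0, "raster must tile evenly"
    return fine.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# One 10 m Sentinel-2 pixel spans 200 x 200 drone pixels at 5 cm,
# so a 400 x 400 drone raster covers a 2 x 2 block of satellite pixels.
drone = np.random.default_rng(0).uniform(0.2, 0.8, size=(400, 400))
coarse = aggregate_to_grid(drone, 200)
print(coarse.shape)  # (2, 2)
```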
That is the idea behind it. So we put our sensors in the field, different types of sensors, but usually small soil sensors that we can easily read out, or that transmit via various types of networks. It looks like this: little weather stations with soil sensors, and sensors that we read out with RFID and NFC technology. We also do grain-size analyses; the hard part is digging the holes for that, but the positive side is that you can see the soil types and how they develop across these areas. So that is what we have on the ground, and now we go one step further, into the air. We use the DJI Phantom 4 Multispectral, because, and I think this is quite important to know, the bands of the DJI sensor and of the Sentinel-2 satellite lie pretty close together. So we expect that what the drone shows us, we can find again at the satellite level. That is really interesting, because there are not many sensors that are so close to Sentinel-2, so for us it was a good decision. From the drone images, here is just an example, we produce orthophoto mosaics, but we also calculate the vegetation indices. In the GIS it is only a button, and you get 10, 20, 40 indices. Very interesting. Here I just show you the NDVI, the Normalized Difference Vegetation Index, and the GNDVI. We then take this composite into the field and do a mapping, or try to. The mapping follows an old method, the stratified unaligned systematic sampling method; you can see it is already from 1989, when field work was still common. Today it is mostly computer work, but we use this method to calculate a systematic grid, and within this grid the computer places random points, so that you avoid systematic errors. That grid was then pushed into a mobile GIS.
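The sampling idea can be sketched as follows. Note this simplified version draws one independent random point per grid cell (plain stratified random sampling); the actual stratified unaligned systematic method additionally constrains how the random offsets are reused across rows and columns, which I omit here:

```python
import numpy as np

def stratified_points(n_rows, n_cols, cell_size, seed=42):
    """One random sample point per grid cell.
    Returns an (n_rows * n_cols, 2) array of (x, y) coordinates in meters."""
    rng = np.random.default_rng(seed)
    pts = []
    for i in range(n_rows):
        for j in range(n_cols):
            # random offset within cell (i, j)
            x = (j + rng.random()) * cell_size
            y = (i + rng.random()) * cell_size
            pts.append((x, y))
    return np.array(pts)

pts = stratified_points(4, 4, cell_size=10.0)
print(pts.shape)  # (16, 2)
```

Each cell contributes exactly one point, so coverage stays even while the random offsets avoid the systematic errors a perfectly regular grid can introduce.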
We used Survey123 for that, from ArcGIS, and that environment was pretty easy. Using GNSS you can find these points in the field and take your measurements there. That is, yes, let's say, work. The measurements were done with these RFID/NFC sensors: you push one into the soil, and within a minute you get a soil-moisture value, which you read out with an app, like the one shown here, and write into your GIS. But then you only have points. To compare them with, and integrate them into, the drone images, you need an areal picture, and we produced that with standard kriging methods. We calculated a semi-variogram, which looks roughly like this (that should be another discussion), and from it you get an interpolation. This interpolation was then compared with the drone images. That was the first level. And as you can see on this scale, some of these results are good and some are not. For the GNDVI, for example, we get a correlation of 0.66, with a quite high, what is the word, I am missing the vocabulary, R squared, the coefficient of determination. So that coefficient is quite high. But you can see it does not hold for every index, and especially not for the temperature: the soil temperature and the drone images do not fit together very well, and we thought about why. But this is the actual result of the project: we now know that the GNDVI calculated from the drone fits very well with the field work, and that gives us the confidence that, next, we can see it at the satellite level as well. This kind of understanding is how we do the integration. But we also found that the protective mechanisms of the plants might be a problem, so you will not see the drought immediately.
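The semi-variogram step can be illustrated with the classical Matheron estimator: for each distance bin, gamma(h) is half the mean squared difference over all point pairs whose separation falls in that bin. A toy sketch with made-up coordinates and moisture values (not the project's data):

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_width, n_bins):
    """Matheron estimator: gamma(h) = sum (z_i - z_j)^2 / (2 N(h))
    over point pairs whose separation falls in distance bin h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)  # count each pair once
    dist, sqdiff = d[i, j], sq[i, j]
    gamma = np.full(n_bins, np.nan)
    for b in range(n_bins):
        mask = (dist >= b * bin_width) & (dist < (b + 1) * bin_width)
        if mask.any():
            gamma[b] = sqdiff[mask].mean() / 2.0
    return gamma

coords = np.array([[0, 0], [1, 0], [0, 1], [2, 2]], dtype=float)
vals = np.array([1.0, 1.2, 0.9, 2.0])  # toy soil-moisture readings
print(np.round(empirical_semivariogram(coords, vals, bin_width=1.5, n_bins=2), 3))
# [0.023 0.475]
```

A variogram model fitted to these empirical values is what the kriging interpolation then uses to weight nearby measurements.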
So when a drought starts, the plant does not react immediately. It takes a while, because plants can close their stomata and keep the water inside, so it takes maybe two days, three days, a week, something like that. And the odd thing is that we have no correlations, or no good correlations, to the temperature. We think that was a problem in our setup: during the mapping in the field, the sun was moving around, so you get shadows and so on. But the conclusion of this part of the project is that it actually works, and now we are confident that we will understand the satellite level as well. Thank you very much.