I work for the government, so be critical about everything I say. I was asked to talk about the open data endpoints we have available, so I'm not going to talk about air quality as such — it is a very complex thing and I have 15 minutes — but if anyone has questions I will stay here a bit after the presentation; I'm very keen to discuss.

In Belgium everything concerning the environment has been regionalized, so we are a sort of permanent cooperation agreement, because air of course moves over linguistic borders, and at an alarming pace, so we need to cooperate anyhow. There are three regional monitoring networks in Belgium, and a couple of bread-and-butter things we need to do, like aggregating all the data from the regional networks for the European level. But since we have all that data, we can also start manipulating and redistributing it — for example continuous forecasts, which are used for the smog alert or the information threshold, a recent addition to that alerting system. We are also responsible for trying to enforce a common scientific base between the regional monitoring networks.

Just to sketch the context: many of you probably know our website, where we provide continuous real-time data but also, for example, scientific reports on air quality and a section with frequently asked questions — we often get the same questions about certain phenomena. That's our communication channel, but behind that website we have a whole infrastructure of different data services. In general we always make a distinction between validated and non-validated data. It is very easily done to mix the two, but it is a dreaded thing; they should be kept clean. Our solution is to have two separate databases. On top of that we have a service layer of viewing services like WMS, the Web Map Service, and more download-oriented services like WFS, the Web Feature Service, WCS, the Web Coverage Service, an SOS, the Sensor Observation Service, and a REST API for faster access to time series data. There is documentation — well, maybe, maybe not; if something is missing, please send us a mail and we will try to include it. We do our best to document it all, but time is limited and we don't have a huge team.

The Sensor Observation Service is sort of a standard for exchanging time series data. It is a bit cumbersome at times, because it is XML, so you have a lot of markup overhead which you don't really need, and it is also a bit more difficult to parse into another system. Hence we looked more at the REST API — actually part of that same service — which provides JSON and is a lot easier to integrate into other infrastructures; some external users pull from it on a continuous basis. That creates an environment where you can collaborate more easily. The REST API has standard queries, like give me all the stations, or a specific time series, but you can also search for stations or phenomena — a "phenomenon" is what a pollutant is called in the sensor-web lingo, so a phenomenon can also be temperature, or wind direction, or something else you need in a calculation around air quality.

Something we use heavily is the openair package for R, which is extremely handy for advanced analysis of the data. If you do it right it is pretty fast, and it is very flexible; an especially nice thing is that you can make Shiny apps.
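To make the "standard queries" idea concrete, here is a minimal Python sketch. The base URL, paths and payload shape are placeholders I made up for illustration — check the actual service documentation for the real endpoints.

```python
import json
from urllib.parse import urlencode

BASE = "https://geo.example.be/rest"  # hypothetical base URL, not the real endpoint

def stations_url(phenomenon=None):
    """URL listing all stations, optionally filtered by a phenomenon id."""
    if phenomenon is None:
        return BASE + "/stations"
    return BASE + "/stations?" + urlencode({"phenomenon": phenomenon})

def timeseries_data_url(series_id, start, end):
    """URL for raw values of one time series over an ISO-8601 interval."""
    return f"{BASE}/timeseries/{series_id}/getData?" + urlencode(
        {"timespan": f"{start}/{end}"})

# Parsing a simplified, made-up JSON payload of the general shape such APIs return:
sample = '{"values": [{"timestamp": 1546300800000, "value": 23.5}]}'
values = [(v["timestamp"], v["value"]) for v in json.loads(sample)["values"]]
```

Because the responses are plain JSON, pulling them into R, Python or any other stack is a one-liner, which is exactly the integration advantage over the XML-based SOS.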
With a Shiny app you can share your results, so even someone who doesn't know anything about R can enjoy playing with the different parameters. That's maybe a bit too much detail, but you can also integrate other REST APIs into your own stack, so from a system architecture point of view this is actually an interesting example if you want to look a bit further. At the moment you can only get the last five values from that one, and besides that there is an archive, but getting a longer time series out of it is a bit harder — so maybe an interesting thing to improve.

Just some examples of Shiny apps: pollution roses, for instance. Depending on the wind direction, the impact on your concentrations at a given place will be different, and that can be calculated as long as you have wind direction and speed. These are just some examples; there are so many possibilities as soon as you get your data into R. There is some documentation if you want to have a look.

Then there is the mobile app. The previous versions maybe did not feel that native, but the concept was above all to have some open source code which can be reused for other purposes easily, and that was in any case achieved. We are working on a new version based on Ionic, which feels a lot more native — same philosophy: try to develop an open source code base which can be reused for different purposes. So feel free to adapt it any way you want; you then have straight away a framework where you can easily integrate our open data, or other external open data if you want to look at it in context.

The high resolution models we do — these are still annual means, so don't worry, it's probably much better outside at the moment. What is inside: you have a lower resolution background model that uses CORINE land cover to make an intelligent interpolation using land use classes. To put it simply: if you have a measuring station, and to the east you have industry and to the west a big forest, you can expect the air quality to differ according to the land cover — that's what that model does, on a four by four kilometre grid. We also have a way of scaling it down to a one by one kilometre grid, but actually that does not improve the data that much. To really improve it you need emission inventories. That's a very hot topic: we used to have very good emission inventories as input, but you see more and more, with austerity, that these are among the first things that get cut at government agencies. Counting the streams of traffic was also regionalized: we used to have a very good federal inventory of traffic counts, there was a huge network, but somebody left for his pension, no one picked it up, and the three regions started evolving totally differently. Now even the region with the most advanced system is also stopping with the counting. We are working on that problem, but it's not easy. Without emission inventories, all environmental impact assessments about changes to mobility become an intelligent guess at best: if you don't know where cars are driving, you cannot estimate the effect of changes in circulation in towns. At the moment the most accurate way to estimate that impact is via emission inventories, because the problem with measurements — even good quality measurements — is that one year is not the other: you might just be measuring the differences in meteorological conditions between one year and the next.
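The land-cover-based interpolation idea can be caricatured in a few lines. This is emphatically not the actual model — stations, classes, values and weights below are all made up — it only illustrates the principle that stations sharing the target cell's land-use class should count for more in an inverse-distance interpolation.

```python
import math

# Toy station records: (x_km, y_km, land_use_class, measured_ug_m3).
# All numbers and classes are invented for illustration.
stations = [
    (0.0, 0.0, "industry", 38.0),
    (5.0, 0.0, "forest",   12.0),
    (0.0, 5.0, "urban",    27.0),
]

def interpolate(x, y, land_use, same_class_boost=3.0):
    """Inverse-distance weighting that boosts stations sharing the target
    cell's land-use class -- a caricature of land-cover-aware interpolation."""
    num = den = 0.0
    for sx, sy, cls, value in stations:
        d = math.hypot(x - sx, y - sy) or 1e-6   # avoid division by zero
        w = (1.0 / d**2) * (same_class_boost if cls == land_use else 1.0)
        num += w * value
        den += w
    return num / den
```

The effect is that a grid cell classified as forest is pulled towards the forest station's low value, even when an industrial station is almost as close — which is the "east is industry, west is forest" intuition from the talk.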
So you cannot really substantiate the claim that there was an impact, negative or positive, of a certain change in mobility in a town.

We also now have a real-time version of the high resolution maps, and together with that, based on that, we do population exposure calculations. We do it with a one by one kilometre grid of population, which may seem much too low a resolution, but we compared it with data at address level and actually the difference is not that big. Another reason is that this is a data set where we have the same data for the whole country, which is also why we prefer it; and for the areas where we have better data, the difference is not that big. There is a certain scale at which it no longer matters much whether you really do it on a person-by-person basis. Of course, for an individual it makes a huge difference whether you yourself take a certain route or not; but statistically, over a whole population, that averages out, so it doesn't make much difference any more whether you do your calculation on an individual basis — it just takes a lot longer to compute.

As part of that we have the Web Coverage Service. The nice thing there is that you can do both a trim and a slice, as it is called in the jargon: you can get a time series out of it, or you can get one single hour out of it. That makes it useful, for example, together with low-cost sensor results, to have a look at what is actually happening. But you can also, for example, start calculating the healthiest route, and that is something we are currently working on to make more user-friendly.
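Here is a sketch of the trim/slice distinction and of what a healthiest-route evaluation could look like on top of one sliced grid. The endpoint, coverage id, time-axis label, grid values and routes are all placeholders, and the exact WCS subsetting syntax depends on the server.

```python
from urllib.parse import urlencode, quote

WCS = "https://geo.example.be/wcs"   # hypothetical endpoint

def getcoverage_url(coverage_id, t0, t1=None):
    """WCS 2.0-style GetCoverage: with t0 and t1 a time *trim* (an interval,
    i.e. a time series); with t1 omitted a *slice* (one single instant)."""
    subset = f't("{t0}","{t1}")' if t1 else f't("{t0}")'
    return (WCS + "?" + urlencode({
        "service": "WCS", "version": "2.0.1", "request": "GetCoverage",
        "coverageId": coverage_id}) + "&subset=" + quote(subset))

# Given one sliced grid (made-up concentrations per cell), a toy
# "healthiest route" evaluation just sums exposure along each route.
grid = {(0, 0): 35.0, (1, 0): 20.0, (0, 1): 18.0, (1, 1): 22.0}

def route_exposure(route):
    """Total exposure proxy: sum of grid-cell concentrations along the route."""
    return sum(grid[cell] for cell in route)
```

A routing application would then compare `route_exposure` for its candidate routes and prefer the lowest total — the evaluation side, not the route calculation itself.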
At the moment it is already possible — we have already achieved pulling the data in from the WCS, doing the calculation and coming to a result — but it needs some smoothing and some more documentation and so forth.

Citizen science initiatives we really want to engage with, and not only on the Belgian level; the JRC, for example, is very active in that — I don't know if you are aware of the AirSensEUR website — and, connected to that, there is a lot of information about the pitfalls of measuring. They have done a lot of testing, especially of gas sensors, until now. It is very clear that the big struggle is the interference of ozone with NO2: they have a similar response signature, very difficult to take out. Especially in the intermediate periods, as we call them — autumn and spring — where you already have high levels of ozone but maybe still some temperature inversion, so that NO2 remains close to the ground, your results will differ significantly. Some interesting sensors we discovered: for example the Winsen ZH03A, and also the SDS011 used by Luftdaten. They have pretty usable results. There are certainly some pitfalls, but it is becoming interesting, and certainly the data infrastructure approach of Luftdaten is really very good.

So, Thursday night: this is what you were measuring — all is good. Now we add the stations from the regional networks, from us, and this is what the situation actually was. For people who think the government is trying to hide something: no, it's not that. It is more a matter of accepting that measuring air quality is a difficult thing. And then there is the discussion about moving around with a sensor versus keeping it stationary.
With a stationary sensor, there is one thing: achieving good measurement quality — that is already a struggle. But as soon as you start moving around with your sensor, you enter a totally different realm of complexity. To give you an example: if you have a stationary sensor that sometimes measures a little too high, a little too low, then by aggregating the data from every five seconds to every ten minutes, or every half hour, you already correct a lot of the measurement uncertainty, because sometimes you are too high and sometimes too low. But when you are moving around, you think you are measuring an extremely high concentration in this street — yet what you didn't see was that a truck passed just a little bit earlier, and after that there is no traffic any more. On a daily average, maybe the air quality in that street is much better than where I live, because the neighbours are all burning wood to heat their houses at the moment. You really enter a different complexity.

I'm not saying citizen science measurements are not useful, also for us. Look at the CurieuzeNeuzen project: there we are really getting data out. It's maybe not as sexy, because it's a passive sampler, a Palmes tube — a technology from the old days, it has been around for the last 30 years. You can't do it in real time and you can't label it internet-of-things or anything, but it does produce robust results: you get a monthly average you can trust, and we can use it to refine our high-resolution models. Those models of course give you no idea about some event like a traffic jam in the street; that we cannot predict.
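The aggregation point — sometimes too high, sometimes too low, so averaging cancels much of the error — can be demonstrated with simulated data; the noise level here is invented, not measured from any real sensor.

```python
import random
import statistics

random.seed(42)
TRUE_LEVEL = 25.0  # pretend the real concentration is constant, in ug/m3

# Simulate 10 minutes of noisy 5-second samples: 120 readings that are
# randomly a bit too high or a bit too low around the true level.
samples = [TRUE_LEVEL + random.gauss(0, 5.0) for _ in range(120)]

# Any single 5-second sample can be several ug/m3 off, but the 10-minute
# mean sits much closer to the true level: the random errors average out.
ten_min_mean = statistics.mean(samples)
```

This is exactly why the same sensor that looks erratic second-by-second can give a usable half-hour value — and why a moving sensor, which gets only one noisy sample per location, loses that benefit.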
If we had access to the traffic data in real time, we could — but that is something which is totally privatized now: TomTom and Google and so forth charge huge amounts of money if you want to access that data, and they all have their pitfalls too. So it's a matter of not having access to the right data to really integrate it into the models.

Besides that, about the problem from Thursday night — I want to be fair to you: on Friday night, for example, the problem was not there any more; you were measuring more or less the same thing as us. Thursday night the air was very dry, and that is a known issue with low-cost sensors. There may be possibilities to correct for this; we need to investigate. For high relative humidity I know the correction is a lot more complex, but low relative humidity is maybe easier. Anyway, that is a field where a lot of innovation is still possible, and it is especially by having open data services that we can look at this together. It's not about pointing fingers, saying the one or the other is better or worse: you can do things which we cannot do, and we can do things which you can't, simply.

I talked about the problem with gas sensors. Even if we did manage to get to levels which are okay with the European directives, NO2 in the intermediate season is still an issue. The best low-cost sensors are probably the Alphasense ones, at about 20 euros per sensor, but then you still need to integrate them into a shield and so forth. It's not impossible — certainly for winter periods; in summer there is of course always too much ozone, but in typical real winter situations it is maybe also possible.
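On the humidity issue: one family of corrections in the low-cost PM literature divides the raw optical reading by a κ-type hygroscopic growth term. Whether and how that applies to any given sensor is exactly the open question from the talk, and the κ value below is purely illustrative.

```python
def rh_corrected(pm_raw, rh_percent, kappa=0.4):
    """Scale a raw optical PM reading down by a kappa-type hygroscopic
    growth term: at high relative humidity, particles swell with water and
    the sensor overestimates the dry mass. kappa is aerosol-dependent
    (roughly 0.1-0.5 is reported); the term diverges near saturation,
    which is one reason the high-RH case is the hard one to correct."""
    aw = rh_percent / 100.0          # water activity, approximated by RH
    if aw >= 1.0:
        raise ValueError("correction undefined at saturation")
    growth = 1.0 + kappa * aw / (1.0 - aw)
    return pm_raw / growth
```

At low RH the growth term is close to 1 and the correction barely changes the reading, matching the remark that the low-humidity side is the easier case.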
I think I have said everything on this slide, so I want to close off by drawing your attention to a couple of European projects we are busy with — for government agencies, European projects are sort of the way to still do a little bit of innovation even when budgets are cut. There is the BE-GOOD project, where we are developing the healthiest route API further. At the end there should be an API which can be integrated into any routing application. It's not that we are calculating the routes ourselves; it's more about evaluating a route which we get from one or another app — but it needs to be fast enough, accessible, and well documented. There is another EU project where we try to make our open data endpoints more user-friendly — comments there are also very welcome. We are also building viewers for comparing data you collected with reference data, though I don't think that is really a big issue any more, since you have your own way of pulling the data into your systems anyway — but we could still look for synergies. We are also evaluating a short list of sensors, and the SDS011 is in that evaluation; it will probably be a year's worth of data, collected at Borgerhout in Antwerp. The interesting thing would be if you have sensors close to Uccle too, because that is a different PM measurement method: a TEOM FDMS, where a little vibrating element measures the mass. In Antwerp, by contrast, you have the Fidas, which actually does a particle count — you count the number of particles, an optical method like the SDS011 uses, just with a bigger device. So the comparison would be really interesting, to see whether you get a different agreement, because particle mass is one of the most difficult parameters to measure.
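The count-to-mass step behind any optical method is just geometry — spheres are assumed, and the density below is a typical assumed value, not a measured one — and it shows why bigger particles dominate the calculated mass.

```python
import math

def mass_ug(count, diameter_um, density_g_cm3=1.65):
    """Mass of `count` spherical particles of a given diameter.
    Volume scales with the diameter cubed, so the same counting error
    costs far more mass for big particles than for small ones.
    The density is a commonly assumed value for ambient aerosol."""
    volume_um3 = count * math.pi / 6.0 * diameter_um**3
    return volume_um3 * density_g_cm3 * 1e-6   # 1 um3 * 1 g/cm3 = 1e-6 ug

# Miscounting a single 2.5 um particle costs (2.5/1.0)**3 = 15.625 times
# the mass of miscounting a 1 um particle:
ratio = mass_ug(1, 2.5) / mass_ug(1, 1.0)
```

This cubic sensitivity is the reason slight conversion inaccuracies above 2.5 µm blow up the mass estimate, while counting in the 1 to 2.5 µm range stays comparatively well-behaved.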
There are always differences — I could talk about long-term comparisons between the British network and the German network, where you had differences of 60% between reference monitors standing next to each other. Can we then not say anything about PM? We can say something, but keep that in mind: even with big devices, PM is a very difficult parameter. Now, PM2.5 is probably the best fraction to take. Why? Because as soon as a particle is bigger than 2.5 micrometres, slight inaccuracies in your conversion have a much bigger impact: every particle is bigger, so it also has a bigger impact on the mass you calculate. The SDS011 will typically measure the range between 1 micrometre and 2.5 — they have a lower threshold, of course, on the size they can actually observe — and within that range the particle number gives you a good indication of the mass. Hence they actually perform pretty well.

In these projects everything is developed the open source way, so let's look for synergies. We are not here to take over; we are here to support and facilitate. For example, with Luftdaten we are seriously looking at the possibility of investing in their infrastructure rather than building our own — it then remains a totally government-independent citizen project, but we financially support fixing certain technological deficiencies we observe at the moment, to improve it to a level where it can be useful for us and useful for you. There we go — thanks for your attention.

Thank you, Olaf, it's really nice that you say things as they are, and I think they are also very much in line. Can you maybe stay two seconds? I'm going to ask for constructive questions. Is there any question for Olaf, or any comments?
I have one question — thank you for the presentation. You spoke a little bit about the data in Belgium, as you are representing the administration. What are your relations with the neighbouring countries? Do you have some network to exchange the data, since the air does not stop at the borders?

That is correct. To give you an idea: 60% of the air pollution measured here actually comes from outside of Belgium, but at the same time we are still net exporters of bad air. So there is a lot of exchange, and in smog episodes it is indeed also essential for us to see what is actually coming in, especially with continental wind. There are very active European expert networks; we meet on a regular basis. With some countries we have a better relationship than with others, also because of proximity, but there is a lot of exchange. In other countries this is changing: there used to be a lot more reluctance to provide access to non-validated data, as a legal thing. Our approach was always: publish that data with a disclaimer saying it has not been validated — problem solved. That doesn't stick in Germany, for example. But in general there is a lot of exchange on how we engage; the Joint Research Centre of the Commission is also very active on these things. The Commission is, for example, jumping on CurieuzeNeuzen and looking at how it can be used as an example for other European countries.
Belgium and the Netherlands are always a bit ahead of the pack, because of course we are living in one of the big air quality hotspots — though our air quality is certainly better than the Po Valley in the north of Italy, to give you an idea, and certainly better than Poland. Now I will stop mentioning names, but that is well known: you have these three big areas, and our hotspot actually extends to the Ruhrgebiet in Germany, so it is one big agglomeration. Especially with the Netherlands we have very active contact; they are partners in two of the projects I talked about, and these European projects are often also a way to collaborate, to get to know each other and exchange know-how. So yes, there is a lot of collaboration.

About satellites — Sentinel-5, dedicated to air quality: can we expect more accurate, high-resolution data from this kind of tool? Not high resolution, but in the past week I saw a presentation from someone working with TROPOMI which looked very interesting, so there is certainly progress. You still have the fundamental problem of cloud cover: you cannot always use the data, so whatever approach you take, you will always need a backup to calculate things some other way. The cloud cover issue is not to be underestimated, because there are many high pollution episodes where everything is covered — and that is often exactly why the pollution is at the level it is: it makes us breathe our own rubbish. Another thing is that a satellite measures the whole column, so you need some strategy to estimate the levels we are actually breathing, which might be very different: big plumes coming out of a chimney 80 metres high or so have a huge impact at a higher level but actually little impact on the vicinity. They can come down, of course, and there satellite data is again very interesting. What I find most interesting about satellite data is to improve our knowledge about long-range import and export, and to use in training forecast models, because you get a better idea of what is actually being imported. But for local air quality, my personal opinion is that the high-resolution maps are the best option, maybe in combination with local sensors — local sensors are especially good for identifying hotspots, so that is certainly an interesting synergy. But you will always need a network of stationary, high quality, accurate measurements which you can use as a backup and as a calibration tool. Thank you.

Okay — if you don't mind having lunch at one instead of half past twelve, it's fine with me.

A kind of triple question. Firstly, how many high quality sensors do you have in the Brussels region, and where are they? There are nine measuring stations in the Brussels region, and if I'm not mistaken — off the top of my head — they don't all measure all the pollutants. There are a couple of super sites, if you want, which have pretty much all the parameters, and others are a bit more limited in what they measure, but NO2, for example, is measured in every station. NO2 is an interesting proxy parameter for traffic emissions. PM is of course more detrimental to your health — or NO2; it is still up for debate which is the bigger issue. But to give you an idea: there are nine stations in Brussels, about 80 stations in Belgium, and the stations outside of Brussels also produce useful data for Brussels.
Secondly, how well do other pollutants correlate with PM2.5? That is to say, if you have a PM2.5 reading for this building, to what extent can you deduce NO2 or ozone? It depends on what pollution sources are close by. Something very important when you measure is to think about the representativeness of your measurement: is it representative of a garden situation, or of a street canyon situation? So it depends on the surroundings, I would say. To give you an idea: traffic particle emissions are generally more noticeable in your PM2.5 than in PM10, because they are mostly smaller particles. PM2.5 is also not bad for combustion in general, because those are also the smaller particles, but it will miss quite a lot of the wood combustion, because wood smoke particles are even smaller. There you would rather look at black carbon as a measurement method. But there are different ways of measuring black carbon, at different wavelengths, and you get a better response at wavelengths which are, for example, not measured by the AE51 — the little device you were talking about — because those devices were developed for traffic emissions. For the wood burning part you need a bigger device, the AE33, which measures at seven wavelengths, and then you have a formula to deduce the wood burning fraction. As I said, in terms of mass the contribution of wood burning is pretty low, even if it comes to 50 percent of the PM2.5 during winter periods, which is a lot — but it has a much bigger impact on black carbon, where you can see 80 percent wood burning and so forth.

And finally: what improvement would you most like to finance in the Luftdaten network? Maybe the data infrastructure — although I feel that is already at a level that is really good; the approach that was chosen is very interesting. I mentioned the AirSensEUR project, where the shield is pretty expensive, and that is something Luftdaten solves brilliantly by using Wi-Fi networks — though of course in certain situations you would also need an extension with a GPRS receiver, for example. The other possibility mentioned I'm not aware of — it's the first time I hear of it this week — so I'm very interested to learn about that.