Okay, I just want to say a thank you to Fred and also to Gabriella from ICTP for helping to organize this meeting. It was a bit stressful at times, but I'm glad to see that everybody's here, and thank you for making the journey; some of you, I know, have had longer journeys than others. Anybody who's not got a bag: there are some spare bags at the front with some goodies, a cap and a USB stick, and you may use the USB stick in the practical exercises.

On the practical exercises: I'll leave Erland to talk about OpenIFS and just mention the practical exercises, which will be in the Info Lab, which is across the hall. This afternoon, I think, we'll mainly just be getting set up. There are several computer accounts that we need to get you logged into, both here and at ECMWF. The plan is that you will be able to log into the ECMWF computers and actually run an ensemble of OpenIFS yourself, I hope, and then process the results and retrieve them back at ICTP for reviewing and comparing with some of the other experiments. There will be more details about the experiment this afternoon. There's also the opportunity to run SPEEDY, I think, Fred, which some people may want to do, to compare with a better model?
Yes. Okay, but that's the general idea; if anybody has questions, please ask. I think today it will largely be just getting everything set up and making sure everybody can log in correctly, and then tomorrow we can start the experiments proper. You will be able to form groups and decide what kind of experiment you want to do. It largely revolves around modifying sea surface temperatures and sea ice fields for the El Niño of 2015 and 2016. We've pre-run some experiments, but then you can modify the SST fields yourself based on what kind of experiment you prefer to do. That's the outline: there will be teams, the teams will create plots during the week, and then we'll have some discussion about the results at the end of the week. So that's the general outline. I think that was everything.

Perhaps a special mention to Paolo, who's hiding somewhere; he's over there. Paolo was actually the instigator of this meeting, because he's done a lot of work in promoting OpenIFS at L'Aquila and also with other Italian groups, and it was his suggestion originally to come to ICTP, which I'm pleased to say Fred agreed to. So it's a great pleasure to be in Italy, and thank you again, Fred.

The program starts with Erland Källén, who is our Director of Research at ECMWF. Erland is actually due to leave ECMWF soon, after six, seven years, eight years, I beg your pardon, eight years at ECMWF. OpenIFS is still a relatively new project at ECMWF, and I never had any doubt that it would be of interest to people outside the centre. What concerned me more was the support inside the centre for the project, but I'm very pleased to say that Erland supported the project wholeheartedly, and this is my chance to thank you, Erland, for your support and encouragement of the project. So without any further ado, I'd like to hand over to Erland to give the first presentation. Thank you very much, Fred.
Thank you very much, Glenn, for these introductory words. Good morning, everybody. My name is Erland Källén and I'm the Director of Research at ECMWF. As Glenn just told you, my term finishes this summer, so I'm actually leaving ECMWF in August this year and going back to Stockholm University in Sweden, where I've spent most of my career.

First of all, I'd like to say that I am very happy that OpenIFS has taken off in the way it has; you can see here today that we have a full room of people who are interested in OpenIFS. OpenIFS started about five or six years ago, when we decided at ECMWF that we wanted to make a version of our model available which could be used more widely in the research and academic teaching community. That's why we set up OpenIFS, and we hired two people to do this: one was Glenn Carver, and the other was Filip Váňa, who will also be speaking later today. Glenn and Filip have really done a marvelous job, both in streamlining the IFS so you can use it even if you're not a technical specialist on the ECMWF systems, and in promoting a community around OpenIFS. I think these OpenIFS workshops are a very important way to foster that community. This is the fourth one; if I remember correctly, we had one in Helsinki, one in Stockholm, and one in Toulouse, right?
And then the fourth one here in Trieste. So even if I'm leaving ECMWF, I sincerely hope, and I think, given the way OpenIFS has taken off, that this will continue into the future, because there is definitely a strong need in the research community for a model like this, and also from the ECMWF side; I'll come back to that later. Of course we do this from ECMWF because we think we can gain something from it, and I will tell you a bit more about that.

So I have 40 minutes this morning to give you an overall presentation, and I know that there are quite a few of you here in the room who maybe don't know that much about ECMWF. So I will actually start by describing what ECMWF is, and then I'll talk a little bit more about what we do, some recent research results, and also the forecast products that we make available to our member states and to the rest of the world. Then I'll say a little bit about future prospects at ECMWF.

Starting with ECMWF: we are a European intergovernmental organization, and we were set up in the mid-1970s, 1975 to be exact. Our role is to address the critical and most difficult research problems in medium-range numerical weather prediction that no one country could tackle on its own. In the early 70s you could do forecasts two or perhaps three days ahead, but not much more than that, and a number of European countries decided to join forces to buy a big computer and to hire the best staff they could get hold of, to construct a new model, a new numerical weather prediction system, aimed at improving forecasts in the three- to ten-day range. It was quite bold to say in the early 1970s that you could make ten-day forecasts, and there were actually some countries in Europe who said that this is ridiculous, you'll never be able to do that.
"We will not join ECMWF because we don't believe in it." But there were enough countries who signed up, and this is a picture from somewhere in the early 70s, one of these official meetings of some European intergovernmental body, signing the agreement between the member states of ECMWF.

If you look at the geographical distribution of ECMWF member states, this is the current map. The dark blue states are full member states, and the light blue countries are so-called cooperating states, which means that they can make use of most of our products and also have access to many of our facilities, but not everything; they pay a reduced fee, and in addition they do not have a seat at the council table. As you can see, most of the Western and also Central and Eastern European countries are member or cooperating states of ECMWF.

The mission of ECMWF is to develop a model system to provide medium-range forecasts, and then operationally to deliver these forecasts to the member states every day. We deliver medium-range forecasts up to ten days, but we also deliver other forecast products, monthly forecasts as well as seasonal forecasts, and I'll come back to that. We have been in existence for over 40 years, and during this time ECMWF has of course developed a lot. If we look at some numbers, today we have 34 member and cooperating states. We are about 340 staff working in Reading; our site is at Shinfield Park in Reading, but because we have grown a lot, especially over the past three years, we also have some office space at the University of Reading, which is nearby, about two kilometers from Shinfield Park. The staff come from about 30 countries; not all member and cooperating states are represented. We also have research, and actually also operational, partnerships all around the world. We are both a research institute and an operational producer of weather forecasts.

So if we look at the deliverables: on top is research in earth system
modeling, that is, to develop a forecast model and to continue to improve the forecast quality. The forecast system is both the actual earth system model itself and the data assimilation system, to make use of all the observational information. We use a global model for global numerical weather forecasts; if you want to do forecasts beyond three days, you have to use a global model.

We have a big supercomputer, presently a Cray XC40, and of course the supercomputers have evolved over the years. In the very early days of ECMWF the first computer was a Cray-1A. It was bought in 1978, and it was a fantastic computer at the time. I was actually there, working at ECMWF in the late 70s and early 1980s, and there was this huge computer hall with a little computer in the center; it wasn't much bigger than this. And it was considered the most powerful computer in the world at that time. Of course, the computing power you have in your mobile phone today is much, much bigger than was available in the Cray-1A in those days. That Cray-1A lasted until somewhere in the early 1980s, and then it was upgraded to newer Cray systems, and the computers have been changed roughly every four or five years. The latest computer, which we have now, the Cray XC40, was installed in 2015, and we expect to buy a new computer in 2020, so all the time we're upgrading to stay on top of the computer development.

We also have extensive data archiving. Coupled to the supercomputer is a huge archive; I think we have the biggest meteorological data archive in the world, with forecasts as well as observations. And that one is available both to our member states and to the research community around the world. The operational forecasts are only delivered to the member states and some commercial customers; the operational forecasts have a commercial value.
So we don't give those out freely. But the research results, the different data archives, the TIGGE archive, the reanalyses and so on, are freely available to everybody around the world. So we are really there also to serve the global research community.

We also have an education and training program. We run a series of NWP training courses every year, we run an annual seminar, we run workshops, and we have user courses: forecast user courses and also computing courses. All the time we have some activity going on with external participants at ECMWF, which I think is another very important function of ECMWF, to be a focus center for numerical weather prediction in Europe.

When ECMWF was set up in the 1970s, the European Union didn't exist. There was something called the EEC, which was only, I think, the four or five biggest countries in Europe who had a collaboration. So ECMWF was not set up under the EU flag; it was set up under an organization called the OECD, which goes back to the 1950s, and we are still an OECD organization, not an EU organization. But of course we are very, very reliant on the EU for a lot of our funding, and in particular in recent times we have been given the responsibility to run a number of Copernicus services. The Copernicus program itself is a big EU program, mainly for funding new satellites, the Sentinel satellite series for Earth observation. To make sure that the European community at large can make use of all this satellite data, the EU has also funded dedicated Copernicus services, and we are running two of these services: the Atmosphere Monitoring Service, which is about atmospheric composition, and the Climate Change Service, which covers climate reanalysis as well as seasonal forecasting and climate projections into the future.
We don't do all the work ourselves within these services; we also outsource part of the work to other research institutes in Europe. But we are responsible for delivering all the data of the Copernicus services, and it is a huge undertaking. When it's fully spun up, in one to two years' time, roughly half of our budget will come from the Copernicus services. So it is a big thing for ECMWF, and this is directly funded by the EU, as are many of the research programs and projects that we are involved in.

And I'm sure many of you wonder: okay, so what happens with Brexit then? We're in the UK, and the UK is going to leave the European Union. We don't know; we have no idea yet exactly what will happen. But it is important to remember that we are not a UK organization, we are a European intergovernmental organization, and in principle we can still have access to EU funding. But of course, with such a large part of our budget coming from the EU, one wonders if the EU will be prepared to pay all that money into the UK if the UK decides to leave the EU. That still remains to be determined; we don't know yet.

Okay, so that was a little bit about the background, the organizational background. To do weather forecasts, of course, you need observations, and we use all the observational data we can get hold of. This just shows you, in pictures, some of the observation platforms: commercial aircraft, different types of satellites, ships, radiosondes (weather balloons), radars, surface stations, the lot. We use them all and continuously receive information from all these observations.
That's fed into our computers. Now, if you look at the global distribution of all these observations, they are of course very unevenly distributed. At the top left you have the surface stations, you know, these little white huts that you see out in the countryside, measuring temperature, humidity, surface pressure and wind. There are lots of those, and we make some use of them, especially the surface pressure observations. But we have some difficulties in making use of all the surface observations, because they really represent fairly small scales that we are not able to model very well with the global forecast systems we use.

We also use the radiosonde balloons. There are not that many of them, about 600, and they are only launched twice per day, at midnight and 12 noon GMT, and they are mainly over northern hemisphere land areas. But they are really a backbone of the observing system: they provide us with in situ information on the basic variables that we need to initialize our forecasts. So even if it's a very limited amount of data relative to all the other data sources, it is absolutely vital that we can use these radiosondes and that they continue to exist.

We then have a number of different satellite systems.
We receive data from about 70 different instruments, and it's about 40 million observations every day that come from the satellites. You see the satellite tracks here for some of the observing systems, the polar infrared and the polar microwave, and at the bottom right are also some of the geostationary satellites, which are fixed over the equator. All this satellite data is processed in our data assimilation system and translated into model variables to initialize our forecasts.

We also use data from commercial aircraft, which is the middle bottom plot there. It looks like fairly decent coverage, but if you look at it in a bit more detail, it's of course mainly over inhabited land areas and also over well-trafficked ocean areas, and most of the aircraft data is at about 10 kilometers height, so the vertical coverage is not that great. Where the aircraft take off and land you get vertical profiles, but those mostly coincide with the positions of the radiosondes, so that is not so much extra use. The aircraft data is still important, but it is not as extensive as you might think when you look at this map.

So how do we insert all this data into our assimilation system?
Well, I think one of the pioneering developments at ECMWF has been the so-called variational data assimilation. What you basically do is take a short-range forecast, compare that short-range forecast with the observations, and adjust the initial state of the short-range forecast to fit the observations as well as possible over a certain time interval, normally 12 hours. This plot here tries to show you a further development of this variational data assimilation system, using an ensemble method.

Now, when you compare the observations with a short-range forecast, what you really want to determine is the relative errors of the observation versus the short-range forecast. If the observation has a small error and the short-range forecast has a large error, you want to rely more on the observation. But there are also instances where the observations have large errors and the short-range forecast is actually quite good, and then you should put less confidence in the observations and more confidence in the short-range forecast. That's the way it's done in our assimilation system, and to estimate the error of the short-range forecast we produce an ensemble of forecasts over a 12-hour interval, here from 9 in the morning to 9 in the evening. The different blue curves on the left-hand side show you trajectories of the model in this short-range forecast, with a wide range indicating the uncertainty of the short-range forecast. Assume now that you have two observations.
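The error-weighting idea just described can be sketched for a single scalar variable. This is a hedged illustration (a simple one-point optimal-interpolation update, not ECMWF's actual variational code), and all numbers are invented:

```python
import math

# Sketch (not ECMWF code): combine a background value (short-range forecast)
# with one observation, weighting each by its error variance.
def analysis_update(background, obs, sigma_b, sigma_o):
    """Return the analysis value and its error standard deviation."""
    # Gain: fraction of the innovation (obs - background) added to the background.
    # Large background error / small observation error -> trust the observation more.
    k = sigma_b**2 / (sigma_b**2 + sigma_o**2)
    analysis = background + k * (obs - background)
    sigma_a = math.sqrt((1.0 - k) * sigma_b**2)  # analysis error < either input error
    return analysis, sigma_a

# Accurate observation (0.5 K error), uncertain forecast (2 K error):
# the analysis is pulled strongly towards the observation.
temp_a, err_a = analysis_update(background=285.0, obs=287.0, sigma_b=2.0, sigma_o=0.5)
```

Applied iteratively over a 12-hour window to millions of observations, with flow-dependent background errors estimated from the ensemble, this same weighting idea is what the ensemble of data assimilations described here implements.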
We have 40 million observations over this time interval in reality, but just assume here that we have two, and they actually fall slightly outside the range of this short-range ensemble, though there is an overlap if you look at the error bars of the observations and the short-range ensemble. Well, you then adjust the initial state of the short-range ensemble, and you also tighten the short-range ensemble with this observational information to produce a new first-guess ensemble. And then you do this iteratively: you use the observations over and over again to try to tighten your short-range forecast ensemble as much as possible, to get an accurate estimate of the initial state and also an estimate of the uncertainty of the initial state. Then that is used to produce a forecast for the future, which is both a central forecast, the most likely development, and a possible range from an ensemble, given both the uncertainty of the initial state and the uncertainty of the model.

So all of this goes into our forecasting system, and it's run continuously, 24 hours a day; we run this ensemble assimilation and prediction system, and we produce 10-day forecasts every 12 hours. Actually, for some member states we produce them every six hours, but they pay a bit extra for that. So the standard products available are forecasts every 12 hours.

The model we use is a spectral model, which was designed a long time ago, in the 1990s, and that's the IFS, the Integrated Forecast System. But as you all know, in a spectral model you also have to have a grid representation, and every time step you switch between spectral space and grid-point space. One of the more recent innovations is the development of the so-called octahedral grid, which is depicted in this picture. It is a very special grid which makes as efficient use as possible of the grid-point information, given a certain spectral truncation. The highest resolution we have at the moment in our forecast system is nine kilometers
horizontally, with 137 levels in the vertical, and we compute wind, temperature, pressure, humidity, clouds and precipitation at all these grid points. Of course, the picture I have here is with a much coarser grid than the nine kilometers; if I showed the nine-kilometer grid, it would be just black on the picture. So this is just to show you this octahedral grid and its general properties.

Now, the virtue of this octahedral grid is that you actually get a very nice reproduction of the atmospheric motion spectrum. These curves here show you the wind spectrum at the operational resolutions we have in our spectral model. You have on the horizontal axis the wavenumber and on the vertical axis the spectral energy density. It's surface wind speed over a particular area of the ocean where we have very good measurements from altimeters, which can give us a very accurate description of the surface wind field over ocean areas. The black curve shows you the observed spectrum of this ocean wind field, the red curve is the spectrum of the highest-resolution, nine-kilometer model, and the purple curve is the resolution of the ensemble system. What I just wanted to highlight here is the slope of these curves: in the long-wave range of the spectrum we have this minus-three slope, and then the observations show how the minus three transitions into a minus-five-thirds slope, which it should do according to theory. Now, numerical models have a hard time actually getting this minus-five-thirds slope, but with the cubic octahedral grid at the highest resolution we are starting to resolve part of this minus-five-thirds range of the spectrum. This is a particular feature of the cubic octahedral grid that we were not able to get with the linear grid we used before we introduced the cubic octahedral. So it is a much more efficient use of the spectral resolution to have a cubic octahedral grid in our forecast
model. Now, we say that the resolution is nominally nine kilometers, but that's a nominal resolution: it's the distance between the grid points. Of course, to really capture a phenomenon you must have four or five grid points across it. With the spectrum you can see where the model's spectral energy density curve starts to deviate from the observations, and you can then determine the actual effective resolution of the model for a given spectral truncation. Saleh Abdalla has done this a little more systematically, to see the effective spectral resolution for different resolutions, both with the linear and with this cubic octahedral grid. If you look at this curve, where you have the nominal grid resolution on the horizontal axis and the shortest resolved scale on the vertical axis, the blue curve is eight times the grid distance and the green curve is six times the grid distance, and you see that most of the linear-grid models fall along the blue curve, while the cubic octahedral falls along the green curve. So the effective resolution of this cubic octahedral grid, for a given spectral resolution, is much better than we had with the linear grid. I think that's another efficiency aspect of the IFS: we have a spectral model with a dual grid which is very, very effective in representing the kinetic energy spectrum of the atmosphere.

Of course, we also parametrize physical processes in the ECMWF model. This is just a cartoon showing a number of different physical processes. We know that a lot of physics happens on scales smaller than the resolved scales of the model; with a nine-kilometer grid you have an effective resolution of maybe something like 30 kilometers, and you can have lots of clouds within 30 kilometers. So you definitely need to parametrize the clouds, the convective clouds as well as the stratiform clouds. You need to parametrize the radiation. You need to parametrize the turbulence, the boundary layer processes,
gravity waves, and we even have some parametrization of chemistry in the atmosphere. There is a continuous development of the parametrizations; they are made better and better. The convective parametrization has been improved in several stages; there was a new convection scheme introduced, I think in 2007 or 2008, which markedly improved the forecasts in the tropics in particular. We are introducing a new version of the radiation parametrization, which will be both more accurate and more efficient. We are constantly working on the boundary layer, and in particular the stable boundary layer, to improve its parametrization. And we have become increasingly aware that the parametrization of orographically generated gravity waves is quite crucial, both for a weather prediction model and for a climate model. I'm a bit sorry to say that this has been a neglected area of research and development at ECMWF over the past ten years at least, but it is an area where we are putting in new effort, because we realize that how we parametrize these orographic gravity waves has a huge influence not just on the short-range weather predictions and the local effects around the mountains, but actually on the mean state of the whole model; the zonal mean flow, for instance, is very sensitively dependent on how you parametrize the orographic gravity wave drag. We need to put more effort into improving that in the future, and that is going to happen.

Then of course we have interactions within the earth system. I mentioned the chemistry, but you also have the interaction with the ocean surface: the latent heat flux and the sensible heat flux, as well as the momentum flux, coupling the atmosphere and the ocean in our forecast systems. Only about six or seven years ago,
we only had an atmosphere model for medium-range forecasts; we used fixed sea surface temperatures and fixed properties of the ocean and the sea ice. We have realized that also for medium-range weather forecasts it is actually important to have a coupled ocean-atmosphere model. So since a few years back we have, in our ensemble prediction system, a coupled ocean-atmosphere model, and we also initialize the ocean. It turns out that coupling the ocean with the atmosphere from day zero in the ensemble prediction system has a clear positive effect on the forecast scores, in particular in the tropics. And we are moving towards a fully coupled ocean-atmosphere system also at the highest resolution in the medium range; it's coming, I think next year, if I remember correctly.

Okay, so OpenIFS. All of this is a very complex system: couplings to the ocean, a model for the ocean waves, the surface waves, coupling to the land surface, and then the data assimilation system. OpenIFS has been produced to make the model more easily usable for the research and academic teaching community. It is at the moment confined to the atmospheric and land surface components of the IFS; it is not coupled to the ocean, and it does not include data assimilation at the moment, I should say. My hope and expectation, and I can say that because I'm leaving ECMWF, is that there will soon also be a data assimilation version of OpenIFS, but it's not there yet. But having the model available is, I think, still quite important for the research community, both to use an up-to-date NWP system to look at specific processes, for instance if you're interested in parametrization, because I think it's very good to have a model which includes all the other things that you need to test your parametrization schemes; or, if you're interested in predictability in general, to have a numerical laboratory where you can experiment with the atmosphere in the form of a numerical weather prediction model.

Another thought, of course, is
that if we put the IFS out there in the research and teaching world, people will become familiar with, maybe even dependent on, the IFS, which means that they can work with nothing else for the rest of their careers. So they will hopefully continue to use it in their jobs, or even come to ECMWF, take a job, and work with the IFS. I think it is a way of preparing the community in general for using a system like the IFS, because even if we try to make it as easy to use as possible and simplify it, it is still quite complex; you need to put a bit of effort into learning how to use it. But it's designed in a way that this should be gradual: you can start using it more or less as a black box, and then gradually become more and more familiar with the details.

We also, of course, want the OpenIFS community to give us feedback, to show how new ideas can be used in the IFS. And if you are using OpenIFS, then of course you have prepared your code so it can actually interact with the IFS, which makes life much easier for us if we want to use that piece of code. There are a couple of examples where people have used OpenIFS and we have had direct benefit. I think the prime example is the single-precision coding that has been done at the University of Oxford. This is going to be introduced operationally into the IFS; Filip has put a lot of effort into that, in collaboration with the Oxford group and Peter Düben in particular. It will give an efficiency improvement of something like 40 percent, if I remember correctly. That's a huge improvement; think of all the computing resources you can save with a 40 percent improvement in model efficiency. Single precision is something that's been around for a long time, but out of, I don't know, conservatism or something like that, people have used double precision in the IFS from the very beginning. But it's not needed: you don't need that very high precision in most of the model code. You need it in some places, but not
everywhere, and this I think is a very important development.

Another example is super-parametrization of convection. It's a new way of designing a convective parametrization, which is awfully expensive, as opposed to single precision, which makes things much cheaper. The question is: does it give better forecasts? Well, some people claim it does; some people claim it gives a big improvement in the tropics. But I think the jury is still out on that one; we are not sure yet. We are not going to introduce it operationally in the near future, maybe in the more distant future if it continues to develop.

Another area is planetary boundary layer research, looking at the stable boundary layer. That's also an area where research groups in the OpenIFS community are using both the full OpenIFS and the single-column version of the IFS to better understand how you can improve parametrization of the planetary boundary layer.

And then, of course, NWP training at universities and met services. The idea is that you can use the IFS as a numerical laboratory, and that will be the purpose of the exercises you do here at this OpenIFS workshop as well: to understand how you can set up the model in an easy-to-use environment, make experiments, and use your imagination to design experiments to understand how the atmosphere works; to understand how a particular parametrization interacts with the rest of the atmospheric dynamics, for instance; or to understand how the atmosphere behaves in terms of predictability. That last point, the training at universities and met services, is I think a very important part of OpenIFS, and where we really provide a service to the academic community in general, which is also one of the purposes of ECMWF.

Now, how good are the forecasts? What's the quality of the IFS model?
Well, this shows you the time evolution of the forecast skill of the IFS model, looking at snapshots in the years 1990, 2000 and 2010. It shows the root mean square error of the 500 hectopascal geopotential height field in the northern hemisphere as a function of forecast lead time, from 0 up to 10 days. You can clearly see how the forecast quality has improved: the errors have decreased over this 20-year period. The decrease is due both to a very dramatic reduction of the initial state error, through the development of the data assimilation (the initial state error has gone down by almost a factor of three over this 20-year period), and of course also to an improvement of the model: the model error growth has been limited, and the model has become more accurate, in particular with higher resolution and better physical parametrizations.

Now, assume that you have an acceptable error level, put here at 60 meters. You can then determine at what forecast day these error curves go beyond this acceptable error level. In 1990 this happened after about four days, in 2000 after about five days, and in 2010 after about six days; and now we are up to roughly seven days, which I'll show you in the next plot. We are advancing at roughly one day per decade, improving the predictable range by one day per decade, and that's the pace at which ECMWF has developed over a long period of time.

If you look at these advances in more detail, year by year, you get these curves that we always like to show, where you have on the vertical axis the number of days of useful predictive skill, from five and a half days in the early 2000s, 2002-2003, up towards seven days in the most recent versions of the forecast model. The red curve here is for ECMWF, the blue curve is for the UK Met Office, the purple curve is for the Canadian Meteorological Centre, the green curve is for NCEP, the US weather service, and then you have the orange curve, which is for the Japanese Meteorological Agency,
And I'm happy to say that over this time period the red curve is on top of all the others; we can claim to be the world leader in medium-range numerical weather prediction. We have improved the forecast skill from about five and a half days in the early 2000s to close to seven days in the most recent forecast systems.

You also see that there is a large variability from year to year; it's not just a straight increasing curve. For instance, there was quite a dramatic increase in forecast quality between 2009 and 2010. Incidentally, I happened to join ECMWF in 2009. However, this had nothing to do with me coming to ECMWF; it didn't even have anything to do with an improvement in the model system. It was just a feature of the atmosphere: the atmosphere happened to be more predictable in 2010 than it was in 2009, and that was the reason for this jump. You can see a similar jump in all the other curves as well; everybody experienced that in the NWP world. Then it seemed to go down a bit again, then it picks up to a maximum, goes down again, and then up again. So there is this year-to-year variability, which is an inevitable feature of the atmosphere: the atmosphere is chaotic, sometimes more predictable, sometimes less. And that's why we have the ensemble system: because we have the ensemble system, we can also determine the expected accuracy of the forecast in a way that depends on the weather situation.

We can also look at the dotted curve here, which uses a frozen forecast system roughly corresponding to the operational system in 2002-2003. Those are forecasts derived from the reanalysis, ERA-Interim. The dotted curve is not just a straight line.
It has a slight tilt upwards, probably mainly due to improvements in the observing system, and then it has this huge year-to-year variability; in particular, you can see the rise between 2009 and 2010 in the dotted curve as well. So what really matters is the distance between the red curve and the dotted curve: that's what gives you the real improvement of the forecast system, and that one is increasing more monotonically, improving with time.

So the forecast products we offer cover a wide range of time and space scales. This is just a cartoon of different plots that you can get from our operational forecast system: medium-range meteograms, as well as monthly forecasts, seasonal-range forecasts, atmospheric composition forecasts, and also reanalyses. All of this is available to our member states, and some of it to the world community at large.

The seasonal forecast system we are running has been in operations for quite a while, since the 1990s, and it has also been developed, though not with upgrades as frequent as for our operational medium-range system; every five or six years or so we introduce a new seasonal forecast system. Franco will give you much more detail about the seasonal-range forecast system. I just wanted to show you some properties of the recently developed seasonal forecast system 5, which will go operational towards the end of the year. One big improvement in seasonal forecast system 5, compared to the previous version, the present operational seasonal forecast system 4, is the drastic reduction of the systematic temperature errors over the El Nino regions in the Pacific, and you can see that here in the plot.
It shows you, as a function of the calendar month, the absolute SST: the dotted curve is the observations, the red curve is the time evolution of the old seasonal forecast system, and the blue curve is the new one. And you clearly see that the red curve has a big systematic deviation from the observations, while the blue curve is much, much closer to the observations. So we have dramatically reduced the mean error, the bias, in the seasonal forecast system. We have also improved, at least a bit, the anomaly correlation up to seven months of forecast time. And if you look at the RMS error, you can see a clear improvement: the red curve, the old system, is above the blue curve, the new system. The dotted curve here, I think, is persistence forecasts as a basis for comparison.

So at the same time as we develop the operational medium-range forecast system, we also continue to develop the extended-range systems, the monthly and the seasonal-scale forecast systems. I think I will skip that in the interest of time and maybe just show you atmospheric composition. Let's see if this one works.
Yep, it does. We also do forecasts of atmospheric composition, and this is a particular case: the carbon monoxide concentration at 500 hectopascals for a time period in July 2015, when we had some quite intense forest fires over northwestern Canada and Alaska. This shows you how the plume of CO, carbon monoxide, spread from the source regions in Alaska and northwestern Canada across the Arctic and the North Atlantic and all the way into Europe. What I find particularly beautiful here is that you can see this sort of filament structure in the atmospheric concentrations: because you have a strong elongation component in the flow field of the atmosphere, you get these streaks of very high concentration that then propagate with the mean flow. And those are of course the important ones to predict. If you have a volcanic eruption, you want to find out where you have these streaks of high concentration, and that's what you are able to predict by having a combined atmospheric composition and weather forecast system working together; that's what we achieve in the atmospheric composition version of the IFS. Just to show you this once more. Okay.

Another aspect that we have become more and more involved in is climate monitoring. Reanalysis has been done at ECMWF since the very start of ECMWF in the early 1980s. Reanalysis means taking old observations and passing them through a modern data assimilation system to get as good an estimate as possible of the state of the atmosphere, and we actually do it also for the ocean. We have extended the reanalyses back in time to the 1970s, the beginning of the satellite era, and more recently we have also done a reanalysis starting from the early 20th century. So we can now also produce a 20th-century surface temperature record, and then compare that with the standard surface temperature records that you get from the University of East Anglia or from the United States.
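The core idea of data assimilation mentioned here can be illustrated in the simplest possible case: blending one model background value with one observation, each weighted by the inverse of its error variance. This is a toy scalar sketch with made-up numbers, not the actual 4D-Var system used operationally, which does this for billions of state variables:

```python
def assimilate(background, obs, var_b, var_o):
    """Minimum-variance blend of a background estimate and an observation.

    background, obs : the two estimates of the same quantity
    var_b, var_o    : their respective error variances
    Returns the analysis value and its (reduced) error variance.
    """
    gain = var_b / (var_b + var_o)            # Kalman gain for a scalar
    analysis = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b              # analysis error variance
    return analysis, var_a

# Illustrative numbers only: a background temperature with 2 K^2 error
# variance blended with an observation with 1 K^2 error variance is
# pulled two-thirds of the way towards the observation.
analysis, var_a = assimilate(background=271.0, obs=274.0, var_b=2.0, var_o=1.0)
print(analysis, var_a)   # analysis lies between 271 and 274
```

The analysis error variance is always smaller than either input variance, which is why adding even sparse Arctic observations to a dynamical model improves the estimate there.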
We know that there are uncertainties in determining even such a basic feature as the globally averaged temperature, and depending on how you do the analysis of all the old observations you get slightly different results. They are within the uncertainty margins of each of these estimates of the surface temperature. But we believe that using the reanalysis technique, where you actually have a global model and genuinely full global coverage in your temperature assimilation, you get a better estimate of the globally averaged surface temperature, in particular in the Arctic regions.

This is the average surface temperature for the year 2016, shown as the temperature difference with respect to a 30-year average, the WMO reference period, and you very clearly see this massive warming in the Arctic region compared to the rest of the globe. It's in the Arctic that all this warming really matters, and a data assimilation system of the type we have, even if you don't have that much data in the Arctic, the fact that you have temperature observations and a dynamical model to connect the whole Earth system, we believe, gives you a better estimate also of the Arctic temperature change.

So, computers, to finish off. As I told you, we have been buying new computers all the time, trying to keep up to date. Now, if we look at the challenges for the future: today's computers have something like a hundred thousand processors in each of them.
We have two of these systems, with, I think, a hundred and twenty thousand CPUs in each. The computers of tomorrow will have even more CPUs, hundreds of thousands, maybe millions. The big challenge for the future is to design a code which can efficiently make use of all these parallel CPUs, and that's the scalability challenge; it applies both to handling the observations and to all the degrees of freedom in the model.

What you see here is a rough estimate of where we are today in terms of the volume of observations: it's about 40 million per day, as I said, and 98% of these 40 million come from 60 different satellite instruments. If we look at today's models, we have about 10 million grid points, about a hundred levels and 10 prognostic variables, which all in all is 10 to the power of 10 degrees of freedom in our highest-resolution systems today. And we model physical parameters of the atmosphere and surface waves, and also the ocean as well as the land surface properties.

If we try to make an estimate for where we will be tomorrow, let's say 10 years from now, in 2025: we know quite well where the satellites will go and how much the satellite data will increase, and we expect there to be something between one hundred and two hundred million observations per day coming from the satellites, so between one and two times 10 to the power of 8 observations from satellites. We still believe that about 98% of the observations will come from the satellites. If we look at the models and the direction we hope to go, we will have models with something like 500 million grid points, 200 levels and maybe up to a hundred prognostic variables. That will be an increase from 10 to the power of 10 to 10 to the power of 13 degrees of freedom. So while the observations will increase by a factor of five, the degrees of freedom in the model will increase maybe by a factor of a thousand, and that's a huge challenge for the scalability, to make this possible on the computer architectures of the future.
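The degrees-of-freedom figures quoted here are just the product of grid points, vertical levels and prognostic variables, and the arithmetic checks out:

```python
# Today's highest-resolution system: 10 million grid points,
# 100 levels, 10 prognostic variables.
dof_today = 10_000_000 * 100 * 10
print(f"today:    {dof_today:.0e}")            # 1e+10

# Projection for ~2025: 500 million grid points, 200 levels,
# up to 100 prognostic variables.
dof_2025 = 500_000_000 * 200 * 100
print(f"2025:     {dof_2025:.0e}")             # 1e+13

print(f"increase: x{dof_2025 // dof_today}")   # x1000
```

So the model state grows by three orders of magnitude while the observation count grows only fivefold, which is why the scalability effort is concentrated on the model and its parallel decomposition.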
We have put a dedicated effort into this scalability challenge. We have a scalability programme within the Centre, and we have a number of EU projects in which we are collaborating with the research community in Europe, because everybody is facing the same challenge here. The IFS code will evolve and become more and more scalable in the future, so by using OpenIFS you will also benefit from the scalability developments we are pursuing at ECMWF.

We have recently adopted a strategy for the next 10 years, in which we say that we want to continue with the Earth system approach to modelling and analysis. We want to have ensemble prediction at a global resolution of five kilometres by the year 2025. That's a very bold goal, but we hope we can reach it. And we want to have scalability across the whole NWP chain, from the handling of the observations to the dissemination of the forecast information to our member states. This is a huge challenge, but if we are given the appropriate amount of resources, we feel that the science is there for us to accomplish it: if we get big enough, powerful enough computers, and if we are able to maintain and increase the staff we have, then we will be able to achieve this, for the benefit of our member states, producing even higher quality weather forecasts.

So, to conclude: weather forecasts require observations, data assimilation and Earth system modelling. OpenIFS is an important part of our research and development programme, and ECMWF is a world-leading provider of weather and environmental forecasts based on research developments. Thank you.