Our next speaker is Joe Tribbia. Joe gave a talk in the colloquium as well, so thanks, Joe, for that. When you are ready, you can share your screen.

Okay. Can everybody see my full screen? Yes. So, once again I want to thank the organizers for giving me the opportunity to speak. I'm going to talk about a rather unorthodox idea as to why we don't do as well as we think we should in the subseasonal range: the atmosphere and the weather events within the subseasonal time range may be unpredictable for a reason that I don't think has been explored very carefully at this point. So let me go on to my talk.

As I said, the subseasonal range beyond week two is a very challenging problem for prediction, as you can see from this figure from ECMWF in 2012. We are looking at the anomaly correlation of the 500-hectopascal geopotential height field. By 2012 we could predict day-seven 500-hectopascal height anomalies fairly well; not perfectly, but well enough. But as you can see from this diagram, the skill drops off rather rapidly after seven days, and by day 10 there is, on average, no useful skill in the anomaly correlation of something as benign as the 500-hectopascal height field.

So why can't we do better than this? The traditional answer is a turbulent cascade of error from small scales to large scales. In 1969, Lorenz proposed a diagram that looks somewhat like the one on the left, in which smaller-scale disturbances and errors traverse the spectrum in less time than larger-scale ones: very small-scale errors could propagate up to the 100-kilometer range in about an eighth to a quarter of a day. From this perspective, weather prediction becomes very difficult after about one week, because after about one week there is almost no skill left in the prediction, and any skill that does exist is all in the large, planetary-scale waves.

This version of the argument used a three-dimensional turbulence closure scheme to estimate predictability. But the atmosphere on large scales behaves much more like a two-dimensional turbulent flow, and in fact it remains quasi-two-dimensional down to the subsynoptic scale. Below the subsynoptic scale, the kinetic energy spectrum follows a minus-five-thirds power law, which is exactly the energy spectrum of a three-dimensional turbulent flow. From this perspective, small-scale errors can propagate up to the subsynoptic scale pretty quickly, in about a day or so, and then the two-dimensional inverse cascade takes over and allows us to predict for perhaps a week, perhaps a little longer, out to 10 days, but not to week three, which is the first goal of subseasonal prediction.
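To make that scaling argument concrete, here is a minimal back-of-the-envelope sketch (my own illustration, not a calculation from the talk) of why a minus-five-thirds spectrum implies a finite predictability horizon while a steeper minus-three, two-dimensional spectrum does not. It uses the standard estimate that the eddy turnover time at wavenumber k scales as [k^3 E(k)]^(-1/2), and assumes error takes roughly one turnover time to move up each octave in scale.

```python
# Lorenz-style upscale error cascade: total time for an error to climb
# from the smallest resolved octave to the largest, for spectral slope n.
import numpy as np

def horizon(n, octaves):
    """Cumulative cascade time over `octaves` octaves, in units of the
    turnover time at the largest scale, for E(k) ~ k**n."""
    k = 2.0 ** np.arange(octaves)        # one wavenumber per octave
    tau = (k**3 * k**n) ** -0.5          # eddy turnover time at each octave
    return tau.sum()

for octaves in (5, 10, 20, 40):
    print(octaves,
          round(horizon(-5/3, octaves), 2),   # converges: finite horizon
          round(horizon(-3.0, octaves), 2))   # grows: horizon extendable
```

With the minus-five-thirds slope the series converges, so pushing the initial error to ever smaller scales buys essentially no extra lead time; with a minus-three slope each octave costs the same time, so the horizon keeps growing as the initial error gets smaller.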
Okay, so I'm going to propose an alternative to this inverse cascade of error, something motivated and inspired by a surprise we had in some fraternal, not identical, twin experiments. In a fraternal twin experiment, the model being treated as truth is not identical to the models that are trying to predict it. In the early 2000s, Dave Baumhefner and I ran some experiments using a high-resolution version of the NCAR model, T170, as ground truth, and tried to predict it using various coarser-resolution versions of the NCAR model: T106, T63, and T42. We ran all those experiments without any topography, so as not to worry about the complication of topographic forcing, and to look only at the inertial effects on predictability: the cascade of errors from unresolved scales into resolved scales.

What we found was that things behaved much as we expected: it took no more than half a day for error in the resolved scales to start growing, and once it started growing, it grew exponentially, much as you'd expect in a quasi-geostrophic or quasi-two-dimensional turbulent regime.

However, one thing we noticed was a surprise to us, because we had tried very carefully to give the resolved scales of all four model integrations the exact spectral initial conditions: we put the T170 initial conditions into the T106 model, and likewise down to T63 and T42, and started. You would therefore expect that at t = 0 there would be no resolved-scale error. So how did error get into the integrations at t = 0, when we put in exactly the initial conditions that the T170 model saw, truncated to T106, T63, and T42 resolution?

The answer was that before integrating, the model checked for static instability, and in fact spectrally truncated initial conditions can quite commonly have a different static stability at a point, when you think about how a spectral model behaves. So this was the T106 temperature error at t = 0, the temperature difference at 500 hectopascals, that we saw using the T170 initial conditions and allowing the system to do its dry convective adjustment before it started integrating. At 500 hectopascals you also see this difference in the height field. What I want to point out is that this is the initial error caused by the convective adjustment, and there is quite a bit of energy at both large and small scales; large scale and small scale here are divided at wavenumber 15. So there are large-scale errors in there, and if you go back to the previous slide, you can see that this error energy spectrum at t = 0 is essentially flat.

This is the interesting point I want to make: the instability that the dry convective adjustment responds to is a threshold nonlinearity. At some point you violate the threshold condition, and that violation is localized in space; but precisely because it is localized in space, it is very broad in wavenumber space. So you put in a delta-like perturbation in the initial condition, caused by violating the threshold at a particular point, and the threshold nonlinearity spreads that influence out over all spatial scales.
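As a toy illustration of this point (a minimal sketch of my own, with made-up fields and a made-up threshold, not the actual NCAR experiment), here is what happens when a field is spectrally truncated and a pointwise threshold "adjustment" then fires wherever a local criterion is violated: the adjustment touches only a handful of grid points in physical space, but its spectrum is broad, nearly flat, across wavenumbers.

```python
# Toy sketch (hypothetical fields and threshold, not the NCAR setup):
# spectral truncation followed by a pointwise threshold "adjustment".
import numpy as np

N = 256
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
truth = np.sin(x) + 0.3 * np.sin(9 * x)      # stand-in for the T170 state

# Truncate to a coarser spectral resolution (keep wavenumbers <= 6)
spec = np.fft.rfft(truth)
spec[7:] = 0.0
truncated = np.fft.irfft(spec, N)            # stand-in for a coarse model

# Threshold "adjustment": wherever a local criterion is violated, nudge
# the field (a crude stand-in for dry convective adjustment at a point)
grad = np.gradient(truncated, x)
adjusted = truncated + 0.1 * (grad > 0.999)

# The adjustment is local in x but broad in wavenumber
err_spec = np.abs(np.fft.rfft(adjusted - truncated))
print(int((grad > 0.999).sum()), "grid points adjusted")
print(err_spec[[0, 2, 5, 10, 20, 40]].round(3))  # comparable at many scales
```

The contrast is with the truncation itself, which removes energy only at the truncated wavenumbers: the pointwise threshold puts error back in across the whole spectrum at once, which is the flat t = 0 error spectrum seen in the experiments.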
So it acts not as a rather slow inertial cascade from small scales to large scales; it acts as an instantaneous cascade, contaminating both small scales and large scales at once.

Okay, so a possible example of this is the April 10, 2011 forecast dropout, or bust, which was studied in the TIGGE framework; Mark Rodwell and a host of other people did a very nice study of what was going on in forecast dropouts over the European domain. Up above you see the rapid decline around April 10 in the 500-hectopascal anomaly correlation skill score. What you see below is the kind of verification pattern that is common for European-sector busts, at least in the ECMWF system: the composite 500-hectopascal anomaly that comes out of about 30 years of bust cases. What I want to point out is that associated with the height anomalies in that composite, there is also a CAPE anomaly. In the particular case of April 10, the CAPE initial conditions are shown here, and you can see there is quite a strong CAPE anomaly both around the upper Midwest and on the east coast of the United States. This leads to precipitation anomalies, both forecast and observed, and the observed precipitation anomalies are in fact quite a bit stronger than the forecast ones. Precipitation, condensation, is exactly this kind of threshold nonlinearity, and condensation at a point, in a localized region, can lead to anomalous errors not just in the region where the anomalous precipitation errors are, but downstream and upstream, because it causes this instantaneous cascade from small scales to large scales.

So now I just want to ask whether we can see this in a simplified setting. [Moderator: Joe, maybe a minute.] Yeah, I'll be done in a second. We did some experiments with the moist shallow water equations, which have an on-off threshold instability, and they showed that you do in fact see this instantaneous cascade in the moist spectral evolution; you don't see it in the dry spectral evolution.
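The moist shallow-water results themselves are on the slides, but the mechanism can be caricatured in a few lines (a hypothetical toy of my own, not the actual model): compare how a smooth quadratic nonlinearity and an on-off threshold nonlinearity spread a tiny small-scale perturbation after a single step.

```python
# Toy contrast (my own caricature, not the moist shallow-water model):
# one step of a smooth quadratic nonlinearity versus an on-off threshold.
import numpy as np

N = 512
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
base = np.sin(3 * x)                  # large-scale base state
pert = 1e-3 * np.sin(80 * x)          # tiny small-scale initial error

def dudx(f):
    # spectral derivative on the periodic domain
    k = 2 * np.pi * np.fft.rfftfreq(N, d=2 * np.pi / N)
    return np.fft.irfft(1j * k * np.fft.rfft(f), N)

def quadratic_step(f, dt=1e-2):
    # smooth advective (quadratic) nonlinearity, one explicit step
    return f - dt * f * dudx(f)

def threshold_step(f, crit=0.9):
    # condensation-like switch: act only where f exceeds the threshold
    return f - 0.1 * np.maximum(f - crit, 0.0)

for step in (quadratic_step, threshold_step):
    diff = step(base + pert) - step(base)        # error after one step
    spec = np.abs(np.fft.rfft(diff))
    print(step.__name__,
          "largest-scale error:", f"{spec[1:20].max():.1e}",
          "small-scale error:", f"{spec[60:100].max():.1e}")
```

The particular numbers don't matter: after one step, the quadratic term has left the error confined near the perturbation scale, while the threshold, because it acts only in spatially localized regions, has already deposited error at the largest resolved scales, which is the instantaneous cascade seen in the moist runs.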
I'm going to skip this next part, because it's a little too hard to explain in a short period of time, and move to my conclusions. My conclusion is a somewhat unorthodox explanation, because history and tradition view the quadratic nonlinearity of hydrodynamics as the root cause of chaos and unpredictability. In reality, that mechanism can be quite slow when compared to the threshold nonlinearity inherent in the phase changes of water. This is particularly so when one examines interactions across spatial scales, where the threshold gives an instantaneous cascade. And this may in fact limit accurate subseasonal forecasting: some forecasts of weather in the subseasonal range may be only forecasts of opportunity, rather than reflecting a more general ability to forecast at the subseasonal range. I'll end there and take questions.

Okay, thanks a lot, Joe. Great points and a great talk. I don't see any questions or hands raised yet, so I had a question myself, in terms of being able to reduce this error that comes from the moist physics: do we need to get into the convection-resolving regime for models, or is it at all possible with stochastic physics and parameterized convection?

I think all of these will have some limitations associated with them, in my opinion. The fact is, even in convection-resolving situations, we will be limited by how well we can resolve the exact location of convection. When we do mesoscale prediction at convection-resolving scales, we oftentimes allow ourselves the opportunity to say, well, the convection is not quite in the right spot, or it's delayed a bit in time. Those errors in time and space will lead, once again, to this kind of instantaneous cascade, as will errors in parameterized convection and errors input by stochastic parameterization. So I don't think there's an easy way around this problem; it's the fundamental nature of the high-order nonlinearity associated with these kinds of threshold nonlinearities.

Thanks a lot again for the talk, Joe, and we'll have more questions during the interactive networking sessions later in the week.