OK, so I think this is a perfect first session to follow Ms Schnabel's very interesting remarks, and hopefully we can digest some of what we just heard in this session as well. So let me open the session by introducing Giovanni Nicolò. Giovanni is an economist at the Federal Reserve Board, and I understand he was also at the ECB for some months, so I'd like to welcome him back and thank him for being here today. As you can see, Giovanni will be presenting on inflation and real activity over the business cycle, and this is joint work with Francesco Bianchi and Dongho Song at Johns Hopkins University. So once Giovanni is ready, he has 25 minutes.

Thank you, also on behalf of my co-authors, and thank you to the organizers for including our paper in the program. Today I will present joint work with Francesco Bianchi at Johns Hopkins University and Dongho Song at the Carey Business School of Johns Hopkins. I'm Giovanni Nicolò and I work at the Federal Reserve Board, so the usual disclaimer applies. So prior to the recent inflation surge, there was a revamp of theories seeking to explain business cycle fluctuations in the US over the entire post-World War II period. On the one hand, some theories argue that real activity and inflation are disconnected. These papers point to the main drivers of business cycle fluctuations as shocks that look like demand shocks but, importantly, have no inflationary effects. Other theories instead completely abstract from the implications of their business cycle theories for inflation. An important empirical foundation for these theories has been offered recently by Angeletos, Collard and Dellas in their recent AER paper. In that work, the authors look at US data for the entire post-World War II period and find that business cycle fluctuations are mainly driven by a main business cycle shock that drives real activity but is disconnected from movements in inflation.
So what we do in this paper is provide an empirical study of the nexus between real activity and inflation at business cycle frequencies for the US post-World War II period. What we argue is that we need to explicitly account for movements in inflation and real activity at frequencies other than the business cycle. In particular, we provide a framework that helps reconcile the differences between our results and those in previous studies that point to a disconnect between real activity and inflation over the business cycle. So in our paper, we adopt two empirical approaches. The first one is based on a simple motivating analysis. In particular, we apply a band-pass filter to US data on measures of both real activity and inflation, extract business cycle measures, and provide motivating evidence of a relationship at business cycle frequencies between real activity and inflation. Motivated by this simple evidence, we then adopt a more rigorous approach that consists of three steps. The first step consists of estimating a trend-cycle VAR model on macro and expectations data. This framework was pioneered by Stock and Watson and has been more recently adopted by Del Negro and co-authors, as well as Johannsen and Mertens and Ascari and Fosso, among many others. The second step consists of applying a max-share identification strategy to the latent cycles. This approach was pioneered by Uhlig and has been recently adopted by Angeletos, Collard and Dellas in the paper that I mentioned earlier, as well as by Basu and co-authors when looking at the drivers of risky business cycles. Within our framework, we apply this strategy to identify a shock that explains the maximum share of the volatility of cyclical unemployment. The third and final step is then to evaluate the contribution of the identified shock to movements of inflation over the business cycle. So our paper provides four key results.
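As a rough illustration of the first, motivating step, here is a minimal sketch (my own toy code on simulated data, not the authors' code; I use an ideal FFT-based band-pass filter as a stand-in for a standard business-cycle filter such as Baxter-King, keeping periodicities of 6 to 32 quarters):

```python
import numpy as np

def bandpass(x, low_period=6, high_period=32):
    """Keep only fluctuations with period between low_period and high_period
    (in quarters), using an ideal frequency-domain filter."""
    T = len(x)
    fx = np.fft.rfft(x - x.mean())
    freqs = np.fft.rfftfreq(T)                      # cycles per quarter
    keep = (freqs >= 1.0 / high_period) & (freqs <= 1.0 / low_period)
    fx[~keep] = 0.0
    return np.fft.irfft(fx, n=T)

rng = np.random.default_rng(0)
T = 400  # quarters
# Toy series: a persistent common cycle plus independent stochastic trends.
common = np.zeros(T)
for t in range(1, T):
    common[t] = 0.9 * common[t - 1] + rng.normal()
activity = common + np.cumsum(rng.normal(scale=0.2, size=T))
inflation = 0.5 * common + np.cumsum(rng.normal(scale=0.2, size=T))

corr = np.corrcoef(bandpass(activity), bandpass(inflation))[0, 1]
print(f"business-cycle correlation: {corr:.2f}")
```

Here the correlation is high by construction, because both toy series share a cycle; on real data the interesting question is precisely how large the analogous correlation between filtered activity and inflation turns out to be.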
First, we find that over the business cycle there is a strong empirical connection between real activity and inflation. Specifically, the shock that we identify explains nearly 30% of the movements in cyclical realized inflation and nearly 50% of the movements in cyclical inflation expectations. Second, we argue that it is crucial to extract business cycle measures of real activity and inflation to properly address the question at hand. If we instead adopt an approach that doesn't explicitly account for movements in these variables at frequencies other than the business cycle, then the shock identified at business cycle frequencies only explains 8% of the volatility of cyclical inflation. Third, we show that standard VAR models have difficulties in capturing this business cycle relationship. This result holds even when we impose long-run priors that seek to capture cointegrating relationships in the US data. The last and final result, which I will not have time to cover in detail during this presentation, provides a theoretical reconciliation between our results and those of previous studies. Specifically, we show a mapping between trend-cycle VARs and standard VARs, and we show that, for the question at hand, standard VARs are mis-specified and point to a disconnect between real activity and inflation because of this mis-specification problem. So let me then dive into the details of our data and method. Starting from the data, we consider four series which are standard and taken from the St. Louis Fed database. We consider the growth rate of real GDP per capita, the unemployment rate, the effective federal funds rate, and the inflation rate measured from the GDP price index. In addition to these four series, we also consider three expectations series. The first one is a measure of one-year-ahead inflation expectations taken from the SPF, and therefore starting from the early 70s.
The second is a measure of longer-run inflation expectations for CPI, which we adjust to account for the historical difference between CPI and GDP price inflation measures. These first two measures are useful in order to robustly extract a latent trend for inflation movements. And finally, following the same logic, we also consider a measure of the one-year-ahead unemployment rate expectation, which is also taken from the SPF and starts in 1968. Given these data, we consider the period between 1955 and 1959 as a pre-sample period to set initial conditions and priors, and the period between 1960 and 2019 as the estimation period of interest. So given these observables, the decomposition that we assume for each of them is presented in this slide. At the top, we look at the growth rate of real GDP per capita, and we assume that it follows a trend plus a cyclical component, measured as the first difference of the log level of output. Then we consider the two unemployment rate measures: the realized unemployment rate and the one-year-ahead unemployment rate expectation. Here the key assumption is that the two share a common trend, tau u, but each of the two measures follows its own cyclical component, c u and c u with the superscript e. Then we consider the three inflation measures: realized inflation and two inflation expectations measures. Here we also assume that the three share a common trend, tau pi. We then assume that realized inflation follows its own cyclical component plus a negative moving average component for the measurement error. This we derive by assuming that the measurement error enters the log of the price level, so when constructing the measure of inflation we obtain the first difference of the measurement error.
The one-year-ahead inflation expectation follows the common trend as well as its own cyclical component, while the longer-run measure of inflation expectations follows the same trend and can load on the common inflation expectations cyclical component by means of a loading, delta, which we assume to be smaller than one. For any discrepancies, we allow for a measurement error in the longer-run inflation expectations. And finally, as is standard in the literature, we assume that the effective federal funds rate follows the trend for the real interest rate, tau r, as well as the inflation trend, tau pi, plus a cyclical component. Once we assume this decomposition, we can derive a measurement equation. We denote by z t our vector of observables, and once we define the vector of latent trends, tau t, cycles, c t, and measurement errors, eta t, we can derive the measurement equation as the first equation in blue, where the observables collected in the vector z t are related to the latent variables collected in the vector x t by means of the matrix lambda. What we still need to specify the model is a state transition equation. As is common in this literature, we assume that the trends follow a unit root process, and each trend is subject to its own shock, which we collect in the vector epsilon tau. For the cycles, we assume that they follow a standard VAR process of lag order p; in our baseline we consider two lags. Finally, we can construct the resulting transition equation, which is presented at the bottom, and we assume that the shocks to the cycles, the trends and the measurement errors are all mutually uncorrelated. At this point, it's important to pause and see what the advantages are of using a trend-cycle VAR model relative to standard VARs to address the question at hand.
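To fix notation, the decomposition and transition structure just described can be summarized as follows (a sketch in my own shorthand, not the paper's exact equations; superscript e denotes expectations and LR the longer-run measure):

```latex
\begin{align*}
\Delta \log Y_t &= \Delta\tau^{y}_t + \Delta c^{y}_t
  && \text{(output growth: trend plus cycle)} \\
u_t = \tau^{u}_t + c^{u}_t, \qquad u^{e}_t &= \tau^{u}_t + c^{u,e}_t
  && \text{(common unemployment trend)} \\
\pi_t &= \tau^{\pi}_t + c^{\pi}_t + \eta_t - \eta_{t-1}
  && \text{(first-differenced measurement error)} \\
\pi^{e}_t = \tau^{\pi}_t + c^{\pi,e}_t, \qquad
\pi^{e,LR}_t &= \tau^{\pi}_t + \delta\, c^{\pi,e}_t + \eta^{LR}_t,
  \quad \delta < 1 \\
R_t &= \tau^{r}_t + \tau^{\pi}_t + c^{R}_t
  && \text{(federal funds rate)} \\[4pt]
z_t &= \Lambda x_t, \qquad x_t = (\tau_t',\, c_t',\, \eta_t')'
  && \text{(measurement equation)} \\
\tau_t &= \tau_{t-1} + \varepsilon^{\tau}_t, \qquad
c_t = \Phi_1 c_{t-1} + \Phi_2 c_{t-2} + \varepsilon^{c}_t
  && \text{(transition equation)}
\end{align*}
```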
Specifically, our framework allows us to automatically separate trends and cycles without the need to take a specific stance on the length of the business cycle. Moreover, cycles and trends are explained by different sets of parameters, which will be important when identifying a shock driving business cycle fluctuations. And finally, our framework allows for the possibility of talking about the relationship between cyclical inflation and movements in the output or unemployment gap, as these are the latent cycles in these variables. So this slide shows the output of our estimation procedure. Here I'm showing seven panels, one for each of the observables that we consider in our specification. In each panel, the red line corresponds to the data, while the dashed blue lines correspond to the 68% posterior density interval. As we can see, our procedure delivers trends that display well-known facts about the US economy. We see the trend of real GDP growth per capita falling, and being especially low in the last two decades. The trend unemployment rate rises in the 70s and peaks in the 80s before falling, and then increases again during the great financial crisis. You can notice that the trend of the unemployment rate is common between the two measures of realized and expected unemployment. And when we look at the three panels at the bottom, we can see that, as is well known, trend inflation rises in the 70s and peaks in the early 80s, when the appointment of Paul Volcker led to a decrease in inflation and an anchoring of inflation expectations, which kept inflation down over the most recent period. From the estimation procedure, we also extract latent cycles, which likewise display well-known features.
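To make the state-space structure concrete, here is a minimal toy simulation (my own sketch with made-up parameter values, not the estimated model): random-walk trends, a VAR(2) for the cycles, and a measurement equation z_t = Lambda x_t in which realized and expected unemployment share a common trend but load on separate cycles.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 300

# Illustrative (made-up) VAR(2) coefficients for two latent cycles.
Phi1 = np.array([[0.9, 0.1], [0.0, 0.8]])
Phi2 = np.array([[-0.2, 0.0], [0.0, -0.1]])

tau = np.zeros((T, 2))   # trends: unit roots, each with its own shock
c = np.zeros((T, 2))     # cycles: stationary VAR(2)
for t in range(2, T):
    tau[t] = tau[t - 1] + rng.normal(scale=0.05, size=2)
    c[t] = Phi1 @ c[t - 1] + Phi2 @ c[t - 2] + rng.normal(size=2)

# Measurement: realized and expected unemployment share the trend tau[:, 0]
# but each loads on its own cycle, mirroring the common-trend assumption.
Lam = np.array([[1.0, 0.0, 1.0, 0.0],    # u_t   = tau_u + c_u
                [1.0, 0.0, 0.0, 1.0]])   # u_t^e = tau_u + c_u^e
x = np.hstack([tau, c])                  # stacked latent state x_t
z = x @ Lam.T                            # observables z_t = Lambda @ x_t
print(z.shape)
```

The design point is that trends and cycles are driven by separate parameter and shock blocks, which is what later makes it possible to identify a shock on the cycles alone.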
As you can see, the latent cycles for realized and expected unemployment tend to rise during recessions and gradually decline during recoveries, and similarly for the inflation measures: inflation tends to decrease during recessions and then slowly increase during the recovery. Using the latent cycles just shown, we identify a shock by targeting the latent cycle of the realized unemployment rate. After identifying the shock, we look at its contribution to each of the variables of interest, in particular the latent cycles. In this table, I'm showing the key result of the paper. The shock that we identify by construction explains a large fraction of the variability of cyclical realized unemployment, which is the first column in the table. As expected, and in line with other papers, we can see that this shock also explains a relatively large share of movements in other real variables, such as output and the expected unemployment rate. What is new relative to previous studies is that this shock explains about 30% of the volatility of cyclical realized inflation and nearly 50% of the volatility of expected inflation. In this slide, I'm reporting the impulse responses that we obtain from the identified shock. As you can see, the shock resembles a demand shock that has, however, implications also for inflation, both realized and expected. And this is in contrast with the recent papers that argue in favour of a disconnect between real activity and inflation. The second important finding of our paper is that it is crucial to explicitly account for movements of these real activity and inflation measures at low frequencies. In order to show this, we estimate our trend-cycle VAR specification but assume that the trends are constant, so that we get close to a VAR specification.
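The max-share step can be sketched in a generic time-domain form (an illustrative implementation for a simple VAR(1) with made-up parameters; the helper name `max_share_shock` is my own, and the paper applies the idea to the latent cycles of the trend-cycle VAR): pick the orthogonalized shock that maximizes the forecast-error variance of a target variable over a set of horizons, which reduces to a principal-eigenvector problem.

```python
import numpy as np

def max_share_shock(A, Sigma, target=0, H=32):
    """Shock maximizing the forecast-error variance (FEV) of `target`
    in a VAR(1) x_t = A x_{t-1} + u_t with Var(u_t) = Sigma."""
    n = A.shape[0]
    P = np.linalg.cholesky(Sigma)   # any factorization of Sigma works
    S = np.zeros((n, n))
    C = np.eye(n)                   # MA coefficient C_h = A**h
    for _ in range(H + 1):
        row = C[target] @ P         # target row of C_h @ P
        S += np.outer(row, row)
        C = A @ C
    # The FEV of `target` due to rotation column q is q' S q, and the total
    # FEV is trace(S), so the maximizer is the principal eigenvector of S.
    vals, vecs = np.linalg.eigh(S)  # eigenvalues in ascending order
    return vecs[:, -1], vals[-1] / np.trace(S)

# Illustrative (made-up) reduced-form parameters.
A = np.array([[0.9, 0.2], [0.1, 0.7]])
Sigma = np.array([[1.0, 0.3], [0.3, 1.0]])
q, share = max_share_shock(A, Sigma)
print(f"max FEV share of the target explained by one shock: {share:.2f}")
```

With the maximizing shock in hand, its contribution to the variance of the other cyclical variables (e.g. cyclical inflation) can then be read off the same variance decomposition.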
In this case, we see that even though we identify the shock on the latent cyclical component, which here corresponds to the demeaned realized unemployment rate, and we identify the shock at business cycle frequencies, the shock still explains a large fraction of the volatility of the real variables, but it explains a relatively small portion of the movements in realized and expected inflation. These magnitudes are in line with the results of Angeletos, Collard and Dellas, who indeed use a standard VAR model to obtain their results. The third key finding of our paper consists of showing that VARs have difficulties in uncovering this empirical relation between real activity and inflation at business cycle frequencies with finite data. In order to check this, and to explain the differences between our results and other findings, in one case we impose alternative long-run priors, which seek to capture cointegrating relationships in the US data, and alternatively we consider sub-samples that are less subject to low-frequency variation. In this presentation I will only focus on the first case. In this table I'm reporting the results that we obtain by estimating standard VAR models in two cases. The first row is the case in which we estimate the standard VAR model simply with standard Minnesota priors, while in the second case, in addition to the Minnesota priors, we also impose long-run priors, à la Giannone, Lenza and Primiceri. As you can see, in both cases the shock still explains a large portion of the movements in the real variables. But for inflation, adding the long-run priors to the standard Minnesota priors improves the results, yet the magnitudes are still well below those that we uncover by means of our flexible specification. So I would just like to conclude. These are the main findings that we have in the paper.
We present evidence of a strong empirical nexus between real activity and inflation, and we argue in this paper that it is crucial to control for low-frequency movements in these variables. In particular, we argue that trend-cycle VAR models are better suited than standard VAR models to capture this business cycle relationship between inflation and real activity. This is because, while trend-cycle VAR models can accommodate movements of these variables at other frequencies, standard VAR models suffer from a mis-specification problem, which ultimately points to an apparent empirical disconnect between real activity and inflation. In the paper, there are clearly more results, and in particular we also focus on the fourth and final result about the theoretical mapping between trend-cycle VARs and standard VARs. We show that, with finite data and a finite number of lags, a standard VAR model has difficulties capturing this relationship because of an underlying mis-specification problem. So thank you for your attention.

Thanks a lot, Giovanni, for a great and very interesting first presentation. Now let me give the floor to Marta Bańbura, who is a lead economist in our forecasting and policy modeling division in DG Economics at the ECB. Marta will discuss the paper before we open the floor for questions. So Marta, you have 10 minutes.

Yes, thank you. Good morning. I would like to thank the organizers for giving me the opportunity to discuss this very interesting, well-executed and comprehensive paper. Before I continue, I would like to mention that the usual disclaimer applies: these are obviously my own views. OK, so this is the outline of a short presentation. I will start by phrasing the problem and in particular pointing out some differences relative to the approach of Angeletos et al., which is, in a way, the starting point for this paper.
Then I will offer some remarks and conclude by proposing some further references to other work, connecting this work to the more empirical forecasting literature. All right. So the question is relatively straightforward: is there a strong relationship between unemployment, or activity, and inflation at business cycle frequencies? And I guess by now we all know, given also the previous presentations, that this is a very topical question. There is an important debate on this, so it's important both for theoretical macro, for empirical macro, and importantly also for policymakers, when we talk about the steepness of the Phillips curve, sacrifice ratios, and so on. As I mentioned, the paper starts from the AER paper of Angeletos et al., where the answer to this question is no. What they do is fit a VAR and then identify a shock to unemployment, essentially the shock that moves unemployment at business cycle frequencies, using frequency domain techniques. And they find that this shock does not explain much of the variability of inflation at business cycle frequencies. What this paper says is: well, actually, the answer changes if you are more careful, or use more sophisticated models, to explicitly account for low-frequency movements in the variables of interest. The debate has been there for long; later I will point to a paper that is essentially 30 years old, so this is basically a decades-long debate, and the paper offers some important insights. Overall, I agree with the approach and I find the results convincing. However, I have some questions on the interpretation of the difference with respect to Angeletos et al.
So I don't yet find enough evidence for the interpretation that it is indeed the specification of the model, in particular the low-frequency part, rather than the fact that you are after different objects of interest. To illustrate this better, I have to talk a bit more about the methodology. Basically, what Angeletos et al. do is estimate a VAR for their vector of observables x t, then invert it to get the moving average representation, where the epsilon t are orthogonal, so you can think of them as structural shocks. This is the moving average based on the Cholesky decomposition, and Q is the identification matrix. Then they do partial identification in the frequency domain: they search for a column of Q, which I denote by lowercase q, that maximizes the movements at business cycle frequencies for unemployment. And they do it using frequency domain techniques. Frequency domain techniques rely on the idea that each stationary series can be expressed as an aggregation of an infinite sum of waves of different periodicities. These waves have random amplitudes, which are summarized by an object called the spectral density. Once you have the spectral density, which shows you which periodicities are important for your series, you can actually go back and derive the autocovariances of the series. So they use this insight. What you see here is essentially the spectral density of this object, and in order to identify q, they don't take the spectral density over the entire domain but focus on the business cycle frequencies when maximizing over q. All right. And basically, what this paper says is that this is too simplistic.
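The frequency-domain idea above can be illustrated with an AR(1) (a sketch with illustrative parameters, not the computation from any of the papers discussed): its spectral density shows where in the frequency domain its variance lives, and integrating over the 6-to-32-quarter band gives the business-cycle share that this kind of criterion focuses on.

```python
import numpy as np

def ar1_spectrum(phi, sigma2, omega):
    """Spectral density of x_t = phi x_{t-1} + e_t with Var(e_t) = sigma2."""
    return sigma2 / (2 * np.pi) / np.abs(1.0 - phi * np.exp(-1j * omega)) ** 2

omega = np.linspace(1e-4, np.pi, 20001)   # frequency grid on (0, pi]
dw = omega[1] - omega[0]
f = ar1_spectrum(0.9, 1.0, omega)

# Integrating the spectrum over (-pi, pi] recovers the variance; by symmetry
# this is twice the integral over (0, pi].
total = 2.0 * np.sum(f) * dw              # ~ 1 / (1 - 0.9**2)

# Business-cycle band: periods of 6 to 32 quarters.
band = (omega >= 2 * np.pi / 32) & (omega <= 2 * np.pi / 6)
bc_share = 2.0 * np.sum(f[band]) * dw / total
print(f"business-cycle share of AR(1) variance: {bc_share:.2f}")
```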
So you shouldn't do it for x t, but should further decompose x t into tau t, which is the trend, c t, which is the cycle, and, for some series, some measurement errors, and you should maximize only on the c t. So basically, forget about tau and eta and focus on c. And this is an important difference, as the authors argue. My question is: is it really the mis-specification of the low-frequency movements, or is it that you are optimizing over, or have in mind, different objects? In particular, in the framework of Angeletos et al., they do not discard tau t and eta t, and there could also be business-cycle-frequency waves in there. Nothing in your estimation excludes the possibility of some business cycle movements in eta and tau, and I will give an example in a moment. So in the paper, you basically define the cycle differently from Angeletos et al. when you focus on c t. And maybe a side comment: you use "business cycle" and "cycle" interchangeably, and sometimes it's a bit confusing because, as I said, c t could also contain different waves, not necessarily only business cycle ones. So the question is whether, in the object that Angeletos et al. optimize, the parts corresponding to tau and eta are almost zero. Because if they are not, then Angeletos et al. will also use those parts in their optimization, whereas your procedure abstracts from them. And there is a bit of tension between the frequency domain and time domain definitions, because Angeletos et al. use the frequency domain definition, whereas you focus on the time domain definition. This is also relevant for your Monte Carlo experiments, in particular when you simulate the tau.
And in the simulation, there is nothing to prevent business cycle waves from appearing there as well. So if they are important there, then the procedure of Angeletos et al. would not focus on c t, right? So it's a bit unfair to measure how well it does for c t if it's not focusing on c t. OK. And I'll give you a simple example. Forget about the low-frequency movements: I have here a simple model where the unemployment rate is given just by the cycle, and inflation is given by the cycle plus some measurement error, which is white noise. In your framework, inflation is perfectly explained by the cycle, right? It's just the same variable. However, if I apply the Angeletos et al. approach here, one has to remember that white noise also has power at business cycle frequencies, so the object being optimized will not be zero over the business cycle frequencies. If you follow the Angeletos et al. approach here, the part connected to eta will appear in your denominator. So if there are important measurement errors, in Angeletos et al. you will get a low explanatory share of c t for inflation. But this has nothing to do with the low-frequency trends; it's just that you are after different objects. So, as I said, I agree with your approach, but on the interpretation I think one would need some more analysis, and also some more clarity on the definitions. Then I have another remark on the model specification. When I read your VAR, I was wondering what the role of the expectations is, in particular whether they help you pin down objects like trends, cycles and measurement errors.
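The white-noise example above can be put in numbers (a sketch under the stated toy model, with illustrative variance values of my own): white noise has a flat spectrum, so a fixed fraction of its variance falls inside any frequency band, and that fraction enters the denominator of a frequency-domain variance share even though in the time domain inflation is just the cycle plus noise.

```python
import numpy as np

# Business-cycle band: periods of 6 to 32 quarters.
lo, hi = 2 * np.pi / 32, 2 * np.pi / 6

# White noise has a flat spectrum, so its in-band variance share is just the
# width of the band relative to [0, pi]: about 27%.
wn_band_share = (hi - lo) / np.pi
print(f"white-noise variance share in the band: {wn_band_share:.2f}")

# Toy model: u_t = c_t and pi_t = c_t + eta_t. In the time domain the cycle
# explains pi_t one-for-one (up to noise), but in the frequency-domain share
# the in-band variance of eta_t enters the denominator and deflates it.
var_c_band = 1.0   # illustrative in-band variance of the cycle
var_eta = 0.5      # illustrative total variance of the white-noise error
freq_share = var_c_band / (var_c_band + var_eta * wn_band_share)
print(f"frequency-domain share for inflation: {freq_share:.2f}")
```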
Or are they there in their own right, because you are interested also in the transmission to those variables? Then, for example, why do you use short-term expectations for unemployment and longer-term expectations for inflation to pin down the trends, rather than doing it the same way for all the variables? Why do you have only two measurement errors, for two specific variables? Why do you consider these particular variations of the model? This was a bit difficult to understand, at least from the version of the paper that I got. And a general question: do we need such a complicated model? It seems that your results are robust to a simpler one, so why not use a simpler one? Then I have some smaller remarks. One refers to the prior for the trend. The prior for the trend is very tight, with a very small variance, and here you follow Del Negro and co-authors; the question is whether you really need such a tight prior. You usually need it if you let the trend vary freely, but since you pin it down here with expectations, maybe you don't. I was also wondering how dogmatic this prior is, so looking at priors versus posteriors would be interesting. Then, some of your simulation results, in particular case two, where you vary how important the trend in unemployment is, go a bit against the results in Angeletos et al., who try this robustness of taking the unemployment gap versus the unemployment rate. And I was wondering whether this is related to the fact that, when you do the simulation, you might actually have some business cycle elements in your tau t. One question that I have is how, in general, this method works for non-stationary data. Okay, that's okay, maybe we can talk about this later. The very old paper that I wanted to mention is actually Stock and Watson in 1994, which used methods very similar to those you study in one of your initial sections.
They also have the decomposition into trend, business cycle, and irregular components, and they have similar insights. So I guess it would be nice to refer to this paper, basically noting that you get similar results with different methods. And finally, this idea of modeling the inflation gap, or of modeling the trends and cycles separately, has been around for a while in the forecasting literature, and now you also show that it's important when looking at inflation drivers. I give just a couple of examples. There are also some semi-structural models that show the importance of this separation. And then there are two interesting papers looking more at these relationships at different frequencies that you might want to look at. Sorry, I'm over time.

So thanks a lot, Marta, for that very interesting discussion. Just to remind our participants online, we're monitoring WebEx in case any questions come in. To give Giovanni time to answer adequately, I'd like to take maybe one or two questions from the room, or from WebEx if there are any, to kickstart. Yes, we've got two questions here.

I was just wondering, because it's possible for inflation and activity to be strongly correlated and still for the slope of the Phillips curve to be small: could you back out from your estimation something that would inform the slope of the Phillips curve, and would that be very different from estimates obtained by non-VAR methods?

Another question here at the front of the room. So, I mean, the Phillips curve is a conditional correlation, conditional on one shock, a demand shock, as opposed to a supply shock. The way I understood Angeletos is that he says: let me take a linear combination of, let's say, demand and supply shocks, the one that explains the most while being consistent with the covariances in the data.
And so it's perfectly normal for it to explain zero of inflation and yet tell me nothing about the Phillips curve. It just says: imagine a world in which 50% are demand shocks and 50% are supply shocks; that explains 100% of output, and it happens to cancel out exactly on inflation, so it's 0% there. So unconditionally, you can get any of them. Should I interpret your result as saying that in the long run there are mostly supply shocks that are uncorrelated with inflation, but in the short run, or at business cycle frequency, we have some demand shocks? And how do I link that with, let's say, Tenreyro and others' work noting that central banks' demand shocks are themselves a response to supply shocks, and that therefore you're telling me something about how effective central banks are in the long run versus the short run, which combines both what they can do and what they have tried to do? Thanks a lot.

Maybe I'll give the floor then to Giovanni, to give him enough time to respond.

All right, thank you. First of all, thank you so much, Marta, this was a very insightful discussion. There were quite a few points; I tried to take some notes, but I think the two key ones concerned the interpretation of what we consider cycles, so what business cycles are in Angeletos et al.'s framework and in ours, and whether it's a fair comparison. The way I have been thinking about this is that, to a certain extent, our framework is more flexible and could potentially be thought of as a generalization of a VAR model. So within our specification, there is room for obtaining their results, that is, room for basically the same interpretation of what a cycle is. However, what we find is that once you allow for these low-frequency components, the data do prefer that specification: they do find these latent trends in the variables, and that turns out to be relevant for the latent cycles that we extract.
And I think this bridges nicely with the second point, on whether business cycle frequencies may still be present in the trends or the measurement error parts. I think we may do more work on that. For now, the way we have been thinking about it was the opposite concern: whether the cyclical components that we extract are actually reflective of business cycle frequencies, and whether low-frequency movements or measurement errors were still part of the latent cycles that we extract. For this reason, in the paper we have some extra analysis that seeks to address this point by identifying the shock not on the entirety of the frequencies of the latent cycles that we obtain; we also do the analysis by identifying a shock after removing the high-frequency component from the cycle part, and we see that the results are still valid. But we can certainly do a bit more work on figuring out whether some business cycle frequencies are still part of the trends as well as the measurement errors. On the priors, they are tight. Clearly, this has been a topic of debate: with this kind of model specification, one of the first concerns raised is indeed about the identification of these latent cycles and latent trends. So we do have robustness analysis with alternative priors on the trends, and we find that for what we consider reasonable sets of priors, the results are robust. And that relates to the second key result of our paper: if the priors on the shocks to the trends are too tight, in the sense of assuming an almost constant trend, that's when we go back to the world of a disconnect between real activity and inflation.
On the point of making the model simpler, I welcome it. We have received this comment while presenting at different conferences this summer, so I think we can certainly try to make the framework a little simpler. And for the last questions from the floor, I can probably combine the answers, because they seemed similar. Fair point: the shock is identified as a linear combination of the shocks. It is, however, still interesting that when we look at the impulse responses, the identified shock does look like a demand shock: when you have a shock that strengthens economic activity, you also see inflation rising. That was the sense in which we were considering and labeling it a demand shock, and hence this narrative about the Phillips curve. More work could be done there: different sets of additional restrictions, maybe sign restrictions, could be a way to help discriminate how we identify this shock and give it a bit more of a structural meaning as a demand shock, and therefore link better to the implications for the theories and models that could be built on this empirical work. But thank you very much for all the suggestions; this has been very helpful. Thanks a lot, Giovanni, and also Marta, for the nice discussion. Next, I'm delighted to welcome Gauti Eggertsson to the stage. Gauti is a professor of economics at Brown University, and this morning he is presenting a paper with a rather ominous title: It's Baaack: The Surge in Inflation in the 2020s and the Return of the Non-Linear Phillips Curve. This is joint work with Pierpaolo Benigno of LUISS. So, Gauti, you have 25 minutes and the floor is yours. Okay, well, thank you very much for the invitation to come here. 
And let me just acknowledge right up front that the title is stolen from Paul Krugman: he had a paper called It's Baaack, referring to the liquidity trap, in '98. Here we are playing on that, because we are suggesting that the original Phillips curve, as I'll explain in a second, is back. So what's the motivation for this talk? Well, the motivation is pretty straightforward. It is that a bunch of us, myself included, were spectacularly wrong in the spring of 2021. This was right after the Biden stimulus, which some, including my co-author and friend Larry Summers, warned would cause very high inflation, and I said no. I predicted no. And even worse, in the fall, if you recall, there was a run-up in inflation over the summer and Larry was running victory laps. And I was still strong on team transitory, saying no, this is all much ado about nothing, it's going to come down. I even wrote an op-ed about it; fortunately it was only published in Japanese, so nobody has seen it, saying that it was going to come down. But then later events, of course, showed that I was spectacularly wrong. There were some on team transitory who tried to twist themselves into pretzels to say, well, it's sort of temporary-ish, it's coming down now. But I think at the end of the day, no, it was really a first-order mistake. So I guess, like Keynes said when confronted with having made a wrong prediction: when the facts change, I change my mind. What do you do, sir? This paper is about how I have changed my mind about inflation dynamics after making a first-order error in forecasting, because in the back of my mind at the time was just a standard New Keynesian model. 
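For reference, the benchmark he says was in the back of his mind is the textbook New Keynesian Phillips curve, in standard notation (my rendering, not anything from the slides):

```latex
\pi_t = \beta\,\mathbb{E}_t \pi_{t+1} + \kappa\, x_t + u_t
```

with inflation \(\pi_t\), discount factor \(\beta\), a slack or output-gap measure \(x_t\), and a cost-push term \(u_t\). With the consensus estimate of a very small slope \(\kappa\), even a large positive gap moves inflation little so long as expected inflation stays anchored, which is exactly why the 2021 surge was hard to rationalize inside this linear benchmark.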
Okay, so the summary of this paper: we're going to propose a non-linear Phillips curve, and it turns out that this is actually much more in line with Phillips's original Phillips curve than with what we later came to know as the Phillips curve. We're going to argue that this non-linearity kicks in at what we're going to call a labor shortage; that's just a word, meaning some threshold for vacancies over unemployment, V/U. In the first version of the paper we emphasized the unitary value, but as I was saying, there's nothing particular in the theory that says the threshold needs to be one; it could vary over time, it could differ between countries and even regions, and I'll comment on that a little, although one seemed to work remarkably well in the aggregate in the U.S., as you're going to see in a minute. Okay, so we're going to talk about this non-linearity kicking in when the labor market is sufficiently tight, and by tight, importantly, I mean V over U; I'm going to show you why that is important, as opposed to measuring tightness only by unemployment. The structure of the talk is that I'm first going to provide some evidence, then a theory, explaining inflation due to labor shortage. That's going to trigger inflation, and it's going to trigger inflation beyond just increasing marginal cost as measured by wages. There's a recent paper by Blanchard and Bernanke that attributes most of the inflation surge to supply factors, but in their paper the labor market is forced to operate in the Phillips curve only through real wages; there's no independent mechanism through which labor market tightness can have an effect. In our model, it is going to have an effect independently of real wages. 
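One way to see how an input-cost shock can become more inflationary when labor cannot be substituted for, the complementarity mechanism the talk turns to next, is through the unit cost dual to a CES production function over labor and an intermediate input. The parameter values below are illustrative, not the paper's calibration:

```python
import numpy as np

def unit_cost(w, pm, a=0.7, sigma=0.2):
    """Unit cost for a CES technology over labor (price w) and an
    intermediate input (price pm); sigma is the elasticity of
    substitution. Parameter values here are purely illustrative."""
    r = 1.0 - sigma
    return (a**sigma * w**r + (1 - a)**sigma * pm**r) ** (1.0 / r)

# Double the intermediate-input price and compare the cost increase when
# the inputs are complements (sigma = 0.2) vs. substitutes (sigma = 2).
rise_complements = unit_cost(1.0, 2.0, sigma=0.2) / unit_cost(1.0, 1.0, sigma=0.2)
rise_substitutes = unit_cost(1.0, 2.0, sigma=2.0) / unit_cost(1.0, 1.0, sigma=2.0)
```

With substitutable inputs, firms shift away from the dearer input and unit cost barely moves; with complements they cannot, so the same price shock feeds through to marginal cost, and hence inflation, much more strongly. That is the sense in which a supply shock hitting an economy with a binding labor constraint is "all the bigger."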
More importantly, perhaps, and this is actually key, something we have emphasized more as we've been rewriting the paper (it's continuously, hopefully, converging on something), is that in an era of labor shortage, when V/U is above some critical value, a key prediction of the model is that supply shocks are much more inflationary. We're going to show that in the data, and it's going to be true in the model. In the model it's going to be very simple: you have two inputs, labor and some intermediate input, and they are imperfect substitutes. So if you have tightness in labor and an increase in the cost of the other input, then, given that it's hard to substitute, the effect of the increase in the price of the intermediate input is going to be all the bigger. So our story is going to be labor tightness, but also supply shocks: the supply shock in combination with labor tightness, and we're going to be able to decompose these in the data. Okay, policy implications: as was pointed out in the nice introductory speech we had here, a soft landing is very much possible. So I decided even to stick my neck out and predict a soft landing a few months ago, maybe just to be invigoratingly wrong again, which would have led to another paper where I'd have to change my mind yet again; but I think I've been inspired enough to start making more predictions. Okay, now, what you see here is actually the original Phillips curve, from his paper in the Economic Journal in 1958, and what you notice immediately about it is how nonlinear it is. In fact, the ending sentence of the first paragraph of the paper says that that's the main point: it is very nonlinear. And the argument he makes is that when the labor market is very tight, so towards zero unemployment, then 
firms start bidding against each other to raise wages, while when the labor market is slack, when there's high unemployment, workers are going to be very reluctant to accept cuts in wages relative to prevailing wages, due to wage norms or downward rigidities in wages. That's an idea we're going to leverage, and it actually harks back to the original Keynesianism, the "crude Keynesianism" as Alan Blinder calls it in his new book, which was basically the idea that prices are just fixed in the short run. That was the Keynesian economy, and here I've compared it to this estimated Phillips curve, just in one minus u; this is Phillips's actual curve, re-plotted. The point of the crude Keynesianism was that you operate in the Hicksian world with fixed prices until you hit the wall: when factories are underutilized you can expand production, but once you have hired all the workers there's only so much you can do; you run out of workers, you hit a wall, and you enter a new classical world. That was sort of how they were thinking about it, and when you look at Phillips's original estimation, it kind of has that feature, sort of a backwards or inverse L, and that's the type of Phillips curve we're going to derive. Okay, so the thing is, though, when Phillips writes his paper in '58, what really made the Phillips curve a household name, not a household name but an economics household name, is when the Phillips curve came to America, in Samuelson and Solow's 1960 AEA paper, the famous paper about trade-offs between inflation and output, and that found its way into all the textbooks. And you can kind of see, okay, maybe there's something in nonlinearities, but that's not anything they talked about; it looked sort of linear-ish. And what happened next was not that people started talking about nonlinearities; no, it was the fact that this was a static relationship in their paper, and then, you know, 
what came next. It looked really good for the first decade after their paper, but then in the 70s expectations started moving all over the place, and we all know this picture, where this led to the breakdown of Keynesian economics and the rational expectations revolution. What killed the Phillips curve, in the official narrative, was that expectations became unanchored. Okay, so the pre-collapse (this is a typo on the slide), the pre-collapse Keynesian consensus was that what was needed was to augment the old Keynesian Phillips curve, the Samuelson and Solow Phillips curve, with a measure of lagged expectations and some supply shocks. That's sort of where we were. And I think, importantly, the consensus prior to the run-up, and this is one example, from one of the panelists here, Jonathan Hazell's paper with Herreño, Nakamura and Steinsson, which is well representative of that consensus, was that this number here, kappa, was very, very low. The important thing to notice is that the sample they were using was 1978 to 2018; keep those dates in mind, because their estimate is going to be completely consistent with what I'm going to say here, it's just going to be on the flat part of the Phillips curve. So as an example: you increase unemployment by 1%, and according to their estimate inflation just goes down by a third of a percentage point, so it's very costly to bring down inflation, just as was being discussed before. And this was perfectly consistent with the stories of the 70s, because the stories of the 70s, according to this narrative, had nothing to do with the output gap; it was all about expectations. This here shows you actual inflation in the 70s, and in blue, inflation expectations as measured by the Livingston survey, which goes pretty far back, and you can see that expectations really became unanchored during the 70s. But the thing is, that's why a lot 
of us, like me, in March and again in the fall, thought that this would be a temporary phenomenon, because our frame of thinking was that in order to get inflation really going, in a way that's not just some temporary supply shock, you really need expectations to go. And this is just one-year-ahead expectations; if you looked at the five-year forward in the fall of 2021, they were doing nothing, or, you know, you could talk about, as Ricardo will, maybe some tails, but compared to the 70s it was really not doing a whole lot. Okay, so that led me and Pierpaolo to think: okay, what are we missing here? We have an inflation; it's not expectations that are driving it; it doesn't look like anything like the 70s, our last great inflation. So maybe we should look a little more broadly and think about the great inflations excluding the 70s: what do they have in common? So let's just take a look at that. What do the five inflation surges have in common, excluding the 70s? Well, what you see here, and I actually got this from a paper by my former colleague Pascal Michaillat, and they're not talking about inflation in their paper, they're talking about the efficient amount of labor market tightness, with an efficiency criterion that you can see here. One thing I found curious was that when this measure goes above one, which is here post-COVID, and then here in the late 60s, when you had the run-up of the Vietnam War and the Lyndon Johnson tax cuts, you also see spikes in inflation. And if you go further back, the other great examples of very high inflation are World War II, where this measure shows a very tight labor market, and World War I; the series doesn't go quite that far back, but that is another such period. Now, there are some blips here that one could talk about, but these are the five big ones: basically World War I, World War II, the Korean War, 
and here is another spike, although we didn't include that one in our sample because of price controls, and we didn't include this one either, because of price controls. One thing I've realized in this paper is that, awful as they may be, and I lived with price controls growing up in Iceland, they're not a good thing by any means, they do seem to work to some extent on headline inflation; the rationing then leads to all sorts of inefficiencies, and they're a terrible idea for other reasons, but they do seem to work on headline inflation. Okay, so the key idea is going to be the following, and I would maybe even make this flatter, this is just drawn by hand: the hypothesis is that we have been in this region, in the sample for example that Jonathan and co-authors estimated, since basically the early 1970s, and we haven't been in this other region except in these five major episodes. So you have to go back to the late 60s to find another example where labor tightness becomes such that you enter this non-linearity. Labor tightness here I'm defining as the number of vacancies, that is, the number of jobs that firms are trying to fill, relative to the number of people looking for those jobs, and in the U.S. it turns out that a good cutoff is going to be one: there are more firms looking for workers than workers looking for jobs. That can change over time and across areas and countries. Okay, so let me get right ahead, but before that I do want to briefly explain why we are emphasizing V over U as opposed to the more traditional U metric. We are not the first to do this; we've started seeing a bunch of papers emphasizing that this is a better measure in the Phillips curve than unemployment, and the reason is basically this inflation surge. If you look at unemployment, unemployment was still above its pre-COVID level; this here is March 2021, the month after Biden 
passed his stimulus package, and this here is March 2022, when the Fed starts hiking rates. I should have had inflation here, but you know roughly how it looks: between these two dashed lines it just spikes; it starts running up here and then peaks when the Fed starts hiking. So according to this metric, according to U, there was still slack in the labor market. If you look at labor force participation, and the favorite of the Fed was the prime-age measure, prime-age workers relative to population, because it doesn't have the demographic trends, it was also showing some slack. On the other hand, V over U was blinking red. It seemed to be picking up the signal much more strongly, and that is different from what we had seen in the recent past, where U and V/U seemed to be pretty closely tied together. But here we saw that V/U was giving, I think, more accurate information about how tight the labor market was than only looking at these traditional measures like U or employment over labor force participation. Okay, so let me get to the empirical results and then the model; I'm afraid I've spent too much time. So this is just data: here in blue are all the observations from 1960 to today; this axis is the log of tightness, and this one is inflation. And it seems to kink at log theta equal to zero: you can see that when theta goes above one and you get the pink dots, something seems to be going on. These here are from the late 60s, and these here are basically from the inflation surge. So, in other words, studies focusing on data from the 70s all the way up to, let's say, 2012 would just be looking at the blue dots, where you don't see a very big slope; you really see the slope when you get a very tight labor market. That's the point of this, and we can formalize it in a regression, and that's in fact what we do next. We tried to be as uncreative as we could, in the sense of taking a regression that 
many people have run before, with lagged inflation, and Larry Ball has a similar regression, and here we have measures of inflation expectations; we experimented with a great number of them. Our only innovation, our only innovation, is to add a dummy when theta goes above a critical value. We have some tests of what this critical value is, and it is not rejected that it is just one, so that's what we stick with; you can see in the figure that it seems to match relatively well, although we don't want to tie our hands and take that as the ultimate truth. What I want to point out to you, because we have limited time, is that the slope, this is the later sample, the slope here of the Phillips curve is relatively flat, but it becomes a lot steeper when you add the dummy; that's just formalizing what you saw in the scatter plot, but adding these controls here. The second thing is that here you have this cost-push shock: there, too, you get a huge extra bump. In fact, the effect of the cost-push shock is not statistically significantly different from zero except when you add the dummy. It has the right sign over the whole sample without the dummy, but when you add the dummy you get, especially in this last period, this big bump. Okay, so, five minutes. One thing is that there were very moderate supply shocks in this period, so it's kind of: how could supply shocks play such a big role? These are the traditional measures of supply shocks, import price shocks and so on, and this is the first principal component, and this doesn't scream at you that there were huge supply shocks. It only does so if you interact it with this labor shortage, and that's what we do here, taking it from the regression. You see, where there are bars, these here are the cost-push shocks interacted with tightness; that's the red 
with the lines, and then you also see labor market tightness interacted with theta being greater than one. So you see that the nonlinearity is really doing all the work here: the actual cost-push shock on its own, the point estimate was negative, although not statistically significant, wasn't doing really anything; it's the combination of the two. So it looks like I only really get to talk about my empirical results here and don't get to the model; such is the excitement of the theorist finally having empirics in a paper. We also did a Kalman filter, and we see that the coefficient really ramps up when we run this kind of regression, and we do a bunch of robustness checks. Let me just, I certainly don't want to end this talk without mentioning a major motivation for it. I have a brilliant student, Julia, who by the way is on the market next year, so I'll put an explicit advertisement at the bottom of the slide in a second. This is a picture from her paper with Corrado, across metropolitan statistical areas, and you can see, this is just unemployment, some pretty clear evidence of nonlinearity; this is from 2011 to 2023. So this is at the MSA level; she just recently got V over U measures, and you can see similar patterns there. Okay, so here is just my advertisement that my student is going to be on the market, but this was actually the first time I started thinking about this, when she was showing me these results, and then that, combined with what Pascal was working on, led to this. Okay, the model is going to be done very quickly. The model is just going to be a model with endogenous labor force participation; I'm going to flip through these slides. The key thing is that at the beginning of each period the household chooses labor force participation, and there's going to be a number of people employed, unemployed, 
and there are going to be some vacancies, and there's going to be labor market tightness. So the labor force F just divides into employed and unemployed. How are these determined? Well, at the beginning of the period there's going to be a fraction of people that are attached to firms and belong to the set of employed workers; in a standard New Keynesian model that would be s equal to zero, everybody is attached to a firm, while if s were equal to one, everybody would have to search in each period. So this nests the two cases nicely. Then there's a standard matching function here, and this is total labor supply, composed of those that are attached to firms and those who successfully searched and matched with a firm. Okay, so this is the household's problem, and you're just going to get a labor force participation decision here; that's the new thing relative to what we usually see in this model. The firms are just going to be doing Calvo pricing, not Rotemberg pricing, for simplicity, and the only quote-unquote new thing, which is not really new, is that you're going to have an intermediate input apart from labor, supplied inelastically but with some exogenous price; that's going to be one of the cost-push shocks. Think about oil: it's an endowment, and OPEC just decides the price, and that's going to trigger the cost-push variations that show up in the non-linear Phillips curve. So what we're going to get then: vacancy creation. First let me talk about a hot market; that's what Phillips had in mind, when there's very low unemployment. In that case wages are flexible all the time. When we get to a normal market, there's going to be a wage norm, because wages are rigid; there's strong evidence that wage behavior is very sluggish, what we call the norm, and we model it in this way. Which one applies is 
basically going to be the max of these: people have to get at least the norm, they would never accept anything lower, but wages can be bid above it, and the result is going to be a non-linear Phillips curve. I'm out of time, so this is terrible time management. I'm just going to end by amplifying the point made in the opening speech: the 70s were a period of shifting expectations, where it was very costly to bring down inflation because you were on this part of the curve; now we have the prospect of being on this part of the curve, on the steep end, where it may be a lot cheaper to engineer a drop in inflation, provided we are here, meaning that there are prospects for a soft landing. So, I'm out of time; I'm sorry that I didn't get to go through the model, but I assure you that it is very interesting and you can read it in the paper. So: a new framework for understanding the inflation spike, replacing the New Keynesian Phillips curve with sort of an inverse-L Phillips curve with theta appearing, evidence, and policy implications; I've just touched on a few of those issues. Thank you. Thank you very much, Gauti, for a very interesting and timely presentation; I hope this time your prediction is right and that we'll have a soft landing. To discuss the paper we now have, joining us online, Antonella Trigari. Antonella is a professor in the economics department at Bocconi University, and I see her there online, so welcome, virtually, to the room. Antonella, you have 10 minutes; please go ahead. Thank you, thank you very much. Can you hear me well? We can indeed. Very good. So let me share the screen; can you see the slides? Yes, all clear. Okay. So, thanks, thank you to the organizers for inviting me to this very interesting conference. I am very sad I cannot be there, but I thank you for giving me the possibility to discuss this paper, and to do it online. Let me start by giving you some, at this point well-known, context on where the paper contributes to 
the debate on the drivers of the COVID-era inflation surge, and in particular it tries to answer the questions: what caused it, and why did most economists fail to predict it? Now, the possible drivers that have been extensively debated, pre-inflation surge and post-inflation surge, are essentially three. First, demand stimulus coming from COVID-era fiscal and monetary policies, possibly causing excessive labor market overheating; within that driver, the debate has really focused on the most appropriate measure of labor market tightness, in particular V/U, the ratio of vacancies to unemployment, as well as on the slope of the Phillips curve, and in particular on the possibility of non-linearities in the relation between tightness and inflation. Second, supply disruptions. And third, de-anchoring of inflation expectations. Now, the debate is often framed within a conventional New Keynesian Phillips curve, and the pre-surge view was, I would say, mostly optimistic: the idea being that even if policies and the rapidity of the recovery were to cause a large overheating of the labor market, this would not cause a large increase in inflation, for two main reasons. The first was the low sensitivity of inflation to slack, a small slope coefficient in the Phillips curve; and the second was that expectations were likely to remain anchored, given the behavior of inflation in the previous decade. Now, there is a current debate on the drivers of inflation; many papers are being written on what the most important drivers are and on the relative contributions of the various drivers I listed in the previous slide. The debate is not settled. This paper concludes that a tight labor market is a key driver, and the mechanism is going to be via a non-linearity in the Phillips curve; but there are different views in the literature, and let me just cite a recent prominent paper by Bernanke and Blanchard, a paper that reaches a quite opposite conclusion in terms of the role of labor markets in driving inflation. 
Now, what does this paper do? First of all, it measures slack with labor market tightness, the V/U ratio. Then it presents evidence of non-linearity in the Phillips curve, and the starting point is to document an exceptionally high post-pandemic labor market tightness, as many others have done as well, but it also connects it to labor market tightness in wartime. Those levels have only been seen post-pandemic (actually even pre-pandemic, during the Trump presidency, for a very short period just before the pandemic) and in wartime: World War I, World War II, the Korean War and the Vietnam War. It then connects this to inflation surges, and estimates a higher slope coefficient when labor market tightness is indeed high, specifically above the threshold, defined to be one, which is taken to define a labor shortage. And finally, and this is one of the most important contributions of the paper, they develop a model of a non-linear Phillips curve. The non-linearity arises from asymmetric wage setting, and the model, as I will comment, is what I would call a non-conventional search and matching model, with labor supply and non-forward-looking vacancy posting. Then they use the model to explain the great inflations of the past and the current surge; this surge has been extensively compared to the Great Inflation, and so, I think importantly, they make the point that the paper can explain both episodes. In particular, when it comes to explaining the pandemic-era inflation, the model says that the inflation surge is caused by an exceptionally tight labor market, a labor shortage, with that tight labor market putting the economy on the steep segment of the Phillips curve. And finally they derive policy implications: a soft landing is indeed possible, because a steep Phillips curve implies that small increases in economic activity can lead to inflation surges, but at the same time there is also a good side to it: it also implies an easier disinflation, where a small V/U reduction 
will be able to bring inflation under control. Now, Gauti had quite some time to go through the empirical part of the paper, and he also mentioned that Michaillat and Saez first documented these labor shortages in wartime and pandemic times. So what this paper adds is to connect labor shortages to inflation surges, and then to propose a theoretical mechanism. Now, not everything can be done in a single paper, but there is a case for strengthening the empirical results, and in particular they could move along the lines of the recent literature and use geographical variation to test exactly their hypothesis and overcome the well-known identification issues in estimating the slope coefficient of the Phillips curve. They could also go back in time, and I think they have that in mind, and include more than just the two labor shortage episodes they now use in their empirical evidence. And finally, and this is something I've been thinking about over the days preparing this discussion, it would be interesting to ask whether a labor shortage is really the same phenomenon in wartime and in post-pandemic times. For example, there is quite some evidence that during COVID reallocation rose and matching efficiency decreased, or, put differently, that the natural rate of unemployment increased. Is that the same during wartime? Are all labor shortages created equal? Now, the key mechanism they put forward is what I would call a wage-setting asymmetry, and they refer to Phillips's 1958 original statement: with very few unemployed, we should expect employers to bid wages up quite rapidly, whereas when unemployment is high, workers are reluctant to offer their services at less than the prevailing rate, so wages fall only very slowly. The way they formalize it is through what I would call two wage-setting regimes: one is normal times, where they assume rigid wages, and the second one is a labor shortage regime, 
where instead wages are flexible. Now, they discuss this wage setting by referring to downward nominal wage rigidity, but with downward nominal wage rigidity, strictly speaking, wages would be rigid downward and flexible upward, whereas here in normal times wages are rigid both upward and downward, and in labor shortages wages are flexible both downward and upward; so that's a bit different. And when I was listening to the introductory remarks by Isabel, I noticed that the evidence she discussed, that firms are more likely to pass on to customers increases in marginal cost than decreases in marginal cost, that asymmetry in the elasticity of inflation to marginal cost, is very similar: labor market conditions are more likely to affect wages, according to what Phillips is saying, when unemployment is decreasing and labor market conditions are improving than when unemployment is increasing. But that, too, is a bit different from what they are formalizing. So, independently of the exact form of wage rigidity, I think it would be really important to provide some micro evidence on wage setting, possibly through a survey, and I don't know if such surveys exist. Now, a non-conventional search and matching model: the model they put forward is different from a standard search and matching model, and it took me some time to understand why they do it. I have my own interpretation, which they can confirm. Essentially, they have a standard labor supply decision: if you look at the second equation and you abstract from the red component, this just equates the marginal disutility from supplying labor, at the labor supply, the labor force participation, to the wage; and that labor supply is equated, in the first equation, to a labor demand coming from monopolistically competitive firms. What they do is add, in a somewhat ad hoc way, a role for search and matching, assuming that a fraction 1-s of the labor force is employed every period and a 
fraction s is going to be unemployed and searching for a job, finding one at a job-finding rate which I call f. Now, the other unusual feature (sorry, I am told I am almost at time; okay, thank you) is that the vacancy-posting decision is not forward-looking. They then have a conventional matching technology that determines this job-finding rate, which enters the vacancy-posting condition of the employment agency and the labor-supply condition of households. What are the implications of these assumptions? First, employment is not a state variable; the unemployment rate is actually constant; job creation is not forward-looking; and existing matches do not carry rents. I think that last implication is the one they were looking for, because this is the benefit of simplifying the modeling of wage rigidity: they do not need to take a stand on a rent-sharing mechanism. At the same time, there are costs. First, rent sharing and forward-looking hiring can be relevant mechanisms. But more importantly for the debate, I do not think they have a standard Beveridge curve within the model, so it is very hard to frame some key ongoing debates, in particular the fact that the Beveridge curve is indeed informative about the likelihood of a soft landing. There is some recent research, Blanchard, Domash and Summers versus Figura and Waller, where the argument goes as follows: disinflation will require some reduction in the V/U ratio, but a soft landing means more than that; it means that V/U is reduced mostly through lower V rather than higher U. To know how the reduction in V/U will come about, you need information from the Beveridge curve; in particular, you need to know whether there has been a shift in the Beveridge curve through reallocation or through matching efficiency. So can the model accommodate this configuration? I have a couple of slides and then I'm
done. So, recalls. Recalls are extremely relevant for COVID: most of the increase in unemployment at the onset of the COVID recession was through temporary layoffs, and the measure of searchers they use includes workers who are on temporary layoff, but those workers do not really need to go through a search-and-matching process to go back to employment. So I wonder whether the empirical results would be robust to excluding workers on temporary layoff from the measure of searchers and from their measure of tightness, the V/U ratio. Let me conclude. The paper, as Gauti said, is work in progress: they are working on assessing the impact of supply shocks and on the exact way they want to introduce wage rigidity. But in general, the main conclusion of the paper is really important for policy prescriptions for monetary policy, and in particular when it comes to assessing the recent reviews of monetary policy strategies, especially in the US, which put much greater emphasis on the employment mandate relative to the inflation-stabilization mandate. Non-linearities in the Phillips curve imply that the inflationary risks from running the economy hot are greater than was previously established, so the policy prescription that an unemployment rate well below the natural rate is acceptable and even desirable, because of aspects related to the distribution of unemployment, might have to be reconsidered. And finally, the mis-measurement of slack becomes a more serious issue if the Phillips curve becomes steeper when unemployment falls below the highly uncertain natural rate. So let me conclude with this slide, and thank you for listening.
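The search-and-matching objects that the discussion above keeps returning to, labor-market tightness V/U, the job-finding rate, and the vacancy-filling rate, can be sketched with a generic Cobb-Douglas matching function. This is a textbook illustration, not the authors' actual specification; the matching-efficiency parameter m and the elasticity alpha are purely illustrative values.

```python
# Generic Cobb-Douglas matching function M(U, V) = m * U**alpha * V**(1-alpha),
# a textbook illustration of the objects discussed above (not the paper's own
# specification; m and alpha are illustrative values).

def job_finding_rate(u, v, m=0.7, alpha=0.5):
    """f = M(U, V) / U: probability a searcher finds a job this period."""
    tightness = v / u                       # labor-market tightness V/U
    return m * tightness ** (1 - alpha)

def vacancy_filling_rate(u, v, m=0.7, alpha=0.5):
    """q = M(U, V) / V: probability a vacancy is filled this period."""
    tightness = v / u
    return m * tightness ** (-alpha)

# A soft landing in this language: V/U falls mostly through lower V rather
# than higher U; either way, the job-finding rate falls with tightness.
f_tight = job_finding_rate(u=0.04, v=0.08)   # V/U = 2.0: hot labor market
f_loose = job_finding_rate(u=0.08, v=0.04)   # V/U = 0.5: slack labor market
```

A shift in the Beveridge curve through lower matching efficiency would correspond to a fall in m here, reducing the job-finding rate at any given tightness.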
Thank you very much, Antonella, for the very rich discussion. I just want to give time to respond, so maybe I will first take any burning questions; I think we have one on the panel. Yes, please. Thanks for a great paper, Gauti. I wanted to ask a question related to one of Antonella's. An alternative narrative is that labor-supply shocks from search frictions are very inflationary. You can imagine that the Phillips curve is relatively flat and a big supply shock comes along from search frictions specifically, something like mismatch, that really makes firms want to hire a lot relative to the unemployed, and that is why V over U goes up so much. You can imagine this being sectoral reallocation after COVID, and you can imagine a lot of sectoral reallocation during wartime too, into a war economy. Then the primitive shock driving things is more like a search-friction shock, and when it dissipates, inflation falls. That is a very different narrative with a very different set of implications for policy, and I was wondering about your thoughts on that, whether or not you can rule it out in your model in favour of your non-linear story. Thanks. Well, thanks a lot to the discussant for an excellent discussion, and let me also apologize to her, because the paper is a bit of a moving target: I sent her a revised version just last week in which some of the issues she raises are no longer there; for example, the unemployment rate is no longer constant. It is true that the search-and-matching model is somewhat non-standard, and in particular it is period by period, not forward-looking bargaining. We were trying to get as simple a model as possible that would nest the standard New Keynesian model, and we felt this was a reasonable simplification for what we had in mind.
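The wage-setting asymmetry discussed in this exchange can be illustrated with a stylized two-regime rule: wages adjust flexibly upward when the market-clearing wage exceeds a norm (a labor-shortage regime) but resist falling below it (normal or slack times). The max-operator form and the lagged-nominal-wage norm follow the verbal description given in the session; the exact functional form below is an illustrative assumption, not the paper's specification.

```python
# Stylized sketch of the wage-setting asymmetry discussed above: workers accept
# wages above a norm (here, last period's nominal wage) but resist anything
# below it. Illustrative only, not the paper's exact specification.

def set_wage(w_flex, w_lag):
    """Wage under the asymmetric rule: flexible upward, rigid downward."""
    w_norm = w_lag                  # wage norm: the lagged nominal wage
    return max(w_norm, w_flex)      # shortage: w_flex > norm, wages jump up;
                                    # slack: norm binds, wages do not fall

# Labor-shortage regime: the market-clearing wage exceeds the norm.
w_shortage = set_wage(w_flex=1.10, w_lag=1.00)   # 1.10: wages rise rapidly
# Slack regime: the norm binds and wages do not fall (a partial-adjustment
# norm would instead deliver the slow declines Phillips described).
w_slack = set_wage(w_flex=0.90, w_lag=1.00)      # 1.00: downward rigidity
```

With this rule, a hot labor market generates rapid wage inflation while a cooling one generates almost none, which is the non-linearity at the heart of the debate above.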
Now, one thing I would say there, which we have not yet had time to flesh out and which relates also to Jonathan's question: the model does have an underlying Beveridge curve that we should have a section on. I did not have time to really go through the model, but there are a marginal benefit and a marginal cost of posting vacancies for the matching agency, so we are just in the middle of analyzing how to reconcile the behavior of the Beveridge curve in the model with what we observe in the data. I think the model is rich enough that it can speak to some of those questions; the matching frictions are a reduced-form representation of sectoral shifts. I have to say that we have been looking at the sectoral-shift story by looking at vacancies over employment across sectors, and I was expecting some big asymmetries there, but we have not actually found them, empirically speaking; that is something we are looking at at the moment. Now, why do we emphasize just these two episodes and not go further back to the Korean War and World War II? Well, we only have part of the data for the Korean War, and the big problem with World War II is price controls: as you saw from the arrows in my slides, they become very binding, so we would need to enrich the model with a theory of how price controls affect inflation, and we felt that might just complicate things a lot. I agree that looking at cross-state variation is important, and in fact that is what my student Julia Getty is doing; she is going to write a much better paper than the one I presented today, which is why you should all hire her this fall. One more thing to say about the wage setting, which I guess was not made clear enough in the presentation, partially because it went very fast, is that the wage
specification does encompass, as a special case and for particular parameters, purely downward wage rigidity, because there is a max operator there: workers are willing to accept wages higher than the wage norm but not anything below it, and the wage norm is the lagged nominal wage. So if you choose the parameters in a particular way, you get simple downward wage rigidity, but we have a more general specification that is just meant to capture the inertia in wage behavior that we see. Still, I think it is a point well taken that we should try to connect it a little better with the data. In the end, what we get is a Phillips curve that is relatively conventional in the aggregate, even if the micro data are not necessarily standard; it is more like a specification that delivers a conventional-looking Phillips curve. Great, thank you very much, Gauti. I believe we have some questions online, but to keep to time, maybe we can follow up bilaterally on those afterwards. That brings me to the last paper of this session, which will be given by Luca Gagliardone, a PhD candidate at New York University. He is going to present a paper on the anatomy of the Phillips curve, which uses micro evidence to draw macro implications; this is joint work with Mark Gertler and Simone Lenzu at NYU and Joris Tielens at the National Bank of Belgium. You are very welcome, Luca; you have 25 minutes and the floor is yours. Okay, thank you so much for inviting me; I am pretty excited to be here. The usual disclaimer applies, here and at the National Bank of Belgium. Today I am going to discuss the estimation of the slope of the New Keynesian Phillips curve, a key equation, as we know from the previous presentations, because it is informative about our ability to achieve a soft landing in the first place, in terms of implications for monetary policy, and because it is key for our understanding of the drivers of inflation and
how the nominal side of the economy relates to measures of real activity. In its conventional form, the one popularized by the Galí and Woodford textbooks, the Phillips curve relates inflation to either the output gap or the unemployment gap as a measure of real activity, where the output gap is the difference between output and its natural level, the one that arises under flexible prices; then there is the usual forward-looking term and, potentially, cost-push shocks. The kappa on the first slide is the slope of the Phillips curve, the key object we are interested in today. If you look at evidence from the last couple of decades, and there is an argument for saying that nothing has changed since, estimates seem to suggest that kappa is quite small, implying a weak link between the nominal side of the economy and the real side. Of course, identifying kappa is well known to be a daunting problem; let me just sketch a few of the identification issues that have been discussed in the literature. One is the endogeneity of the output gap arising from the reaction function of monetary policy: when inflation is high, the monetary authority slows down the economy, and this generates a negative correlation between inflation and the output gap, which, if not addressed, leads to a potential downward bias in the estimate of kappa. Similarly, measurement error is a difficulty that one needs to address to identify the slope, in particular because we do not observe the natural level of output and have to resort either to proxies or to model-based measures of it, and this might complicate the identification of the slope. The third point, which is the starting point for the analysis of this paper and which sits between measurement and theory, is that if you take seriously the micro-foundations of
the New Keynesian model, then under fairly weak assumptions there is an aggregate relationship between inflation and the real marginal cost, which we call the primitive form of the New Keynesian Phillips curve. A different story altogether is the link between the real marginal cost and the output gap: if you open Galí's textbook and follow the steps, the proportional relationship between marginal cost and the output gap can be derived only under very special circumstances; for example, in Galí's textbook one needs households to be on their labor-supply curve with flexible wages, which we know is often not the case in the data. So we start from this observation, and our goal is to estimate a Phillips curve that is marginal-cost based. For that, we look at quarterly panel data on prices and costs and see what we can learn. We adopt a bottom-up approach: instead of first aggregating and then estimating the Phillips curve, we go the other way around. We look at firm-level pricing equations and identify from the firm-level evidence the structural parameters that determine the slope of the Phillips curve; these parameters are the degree of price rigidity and the strength of strategic complementarities in price setting, so basically the market structure, in a way. With those, we retrieve the slope of the Phillips curve. Our main finding is that if you measure real activity with the real marginal cost rather than the output gap or the unemployment gap, the slope of the Phillips curve is much larger, 3 to 10 times larger than common estimates using the output gap. We reconcile our finding with the evidence based on the output gap by arguing that there is a missing link between the real marginal cost and the output gap, even at the firm level. I will also discuss some advantages of using the marginal-cost Phillips curve over the output-gap Phillips curve; in particular, if you
look at supply shocks, and the relevant one for the recent debate is oil shocks, it is much more natural to use a marginal-cost-based Phillips curve, because we do not directly observe the natural level of output, which is directly impacted by the shock. Okay, I will skip the literature review in the interest of time, so let us delve into the model. I will just sketch the main equations and then see whether the intuition goes through. We try to stay as general as possible within a simple framework. We assume that there is a potentially finite number of firms operating in an imperfectly competitive environment within an industry, and they face a quite general demand function: demand depends on the firm's price, the price index of the industry, a vector of demand shifters, and the demand of the industry. Firms choose prices under nominal rigidity, where theta is the probability of not being able to adjust prices: if the Calvo fairy comes and tells you that you cannot adjust, you keep the price of the previous period; otherwise you reset the price to P0. The reset price P0 is chosen in the standard way, to maximize the present discounted stream of future profits, where profits are revenues, the price times demand, minus a potentially general total cost function; this maximization takes into account the probability of not being able to adjust prices in consecutive periods. This leads to a standard-looking first-order condition relating the price the firm sets to the nominal marginal cost and the markup, where the markup is given by the usual Lerner index with the demand elasticity denoted by epsilon. So far this is pretty general; we now impose a few assumptions to get to the data. First, we log-linearize the model around the symmetric steady state. This leads to a forward-looking first-order condition in which the price the firm is setting, in
log deviations, is equal to the discounted present value of the markup over the nominal marginal cost. In the paper we spend quite some time showing that, under several models of competition, the markup in deviation from steady state is proportional to the relative price the firm is charging, where the relative price is the difference between the reset price and the price index of competitors. This is true, for example, under monopolistic competition, but it is also true with Kimball preferences, in static oligopoly settings as in Atkeson and Burstein, and in dynamic oligopoly as in Wang and Werning, so it is pretty general. Replacing the markup in the first-order condition, we obtain the key equation of the theoretical section: the reset price can be written as a function of two present values, one of the nominal marginal cost and one of the price index of competitors, plus shocks, of course. Omega here captures the degree of strategic complementarities: when omega equals 1, only the price index of competitors matters; when omega equals 0, there are no strategic complementarities and only my own marginal cost matters. Putting the reset-price equation together with the log-linear price index, we derive the primitive form of the New Keynesian Phillips curve, which relates inflation to the real marginal cost. Here lambda, the slope, is a function of three parameters: theta, the degree of price stickiness; beta, the discount factor; and omega, the strength of strategic complementarities. In the paper we calibrate beta to a standard number, 0.99 at quarterly frequency, and we estimate theta and omega, pinning down the slope of the Phillips curve from the estimates of the structural parameters. Okay, data. We use Belgian data, arguably the best data in the world for studying the pass-through from marginal cost into prices at
the firm level. We use a data set that runs from 1999 to 2019, micro data at the quarterly frequency. This data set is very rich because we have all the information we need to construct prices: we have domestic sales and quantities, and dividing domestic sales by quantity gives us prices, unit values. We have almost the universe of domestic firms, but also almost the universe of foreign competitors, so the definition of a market is really precise and it is really accurate for constructing a price index of competitors within a market. We also have very detailed information from the tax records on total variable cost, and we use it to construct a measure of marginal cost in the standard way: we take total variable cost, divide it by output to get average cost, and proxy for marginal cost using average cost. One key feature of this data set, which I mention because we worked really hard for it, is that we can track a firm for more than 10 years on average, so the time-series dimension within firms is very long. This is extremely important for our analysis because we make extensive use of time-series techniques, so the time-series span within a firm matters a lot. Econometric framework: we map the theory directly into the data in the following way. Under Calvo pricing, the observed price, the price that a firm effectively charges, is in expectation a linear combination of the reset price and the price the firm had yesterday, where the relative weights are given by the probability of being able to adjust. This conditional expectation is the basis for OLS. We can plug into the reset price the formula the model gives us, in terms of the present values of marginal cost and of competitors' prices, and bring this directly to the data. So we replace the reset price with that formula, and again we express the realized
price as a function of the present value of marginal cost, the present value of competitors' prices, and the lagged price. This gives a lot of robustness when we go to the data, because this specification with lagged prices is very robust in the data, and you see that the coefficient in front of the lagged price is theta, which identifies the degree of price stickiness in a very general sense; it also accounts for menu costs if the underlying DGP is menu costs. Once we have theta, we can recover omega from the coefficients on the present-value terms, and it is over-identified because we have two variables for one parameter. We also include firm fixed effects and sector-by-time fixed effects; there is a long discussion in the paper of the reasons for including them, and in particular we include sector-by-time fixed effects to absorb much of the variation coming from sectoral shocks, so basically we estimate the parameters leveraging the cross-section. I will discuss alternative specifications in a second. In the estimation we need to address the endogeneity of the price index of competitors and measurement error in the marginal cost. We do that using instruments: we estimate the parameters with nonlinear least squares and GMM, with the moment conditions on the slides, and we have a number of robustness checks using different instruments, oil shocks, money shocks, and so on and so forth. In the baseline we start from lags, which is why the time-series dimension comes in handy: we choose lags because we want to leverage variation coming from both supply and demand, and this identifies the parameters, and therefore the slope of the Phillips curve, as an average across booms and busts over the cycle. We spend some time in the paper showing that the instruments are valid: we take lags from two years back, so these are pretty far back in time, and tests of over-identifying restrictions
strongly indicate that our instruments are valid. The instruments are also very powerful because, at least at the firm level, marginal cost and competitors' prices are highly serially correlated, so we do not run into the standard weak-instrument issues that have been documented in the literature for quite some time. Results. Let us start from model A, the one I discussed on the slide. We get an estimate for theta of 0.7, very close to the number one would calibrate using external evidence, implying between 3 and 4 quarters of price stickiness. The estimate of omega is quite high, suggesting an important role for strategic complementarities and market structure, but it is in line with previous estimates by Amiti, Itskhoki and Konings: half of the variation in prices actually comes from strategic complementarities, which is quite remarkable. Models B, C and D: model B is the same as model A but replaces the sector-by-time fixed effects with industry-by-time fixed effects; these are narrower and absorb the price index of competitors entirely, so we do not have to worry about its endogeneity. Models C and D are the same as models A and B but make assumptions on the dynamics of marginal cost and the price index of competitors: I assume they are AR(1) and estimate the parameters of these dynamics as well. You see that the results are consistent across all these specifications. Implications for the slope: the slope we estimate is quite high, a little above 5%, and this is on the conservative side; in robustness checks we actually find larger numbers, between 6 and 7%. These are substantially larger than estimates based on the output gap in the previous literature: Rotemberg and Woodford, and Jonathan here, have very influential papers with much smaller numbers. In the paper we do a battery of robustness checks that I will mention quickly: we explore other instruments extensively, high-frequency
monetary shocks in the Gertler-Karadi style and oil shocks in the Känzig style; we explore decreasing returns to scale and pricing with menu costs, and the results are robust. Aggregate implications: we assume that the aggregate marginal cost follows a random walk, which is consistent with the data, and we show it empirically. Under this assumption we can solve the model analytically, which means we can express inflation as a function of the real marginal cost, where lambda-tilde takes into account not only the slope of the Phillips curve but also the persistence of the shocks to marginal cost; the last term is a cost-push shock that we are going to ignore. The difference between what we plot on the next slide as data and as model can then be interpreted as the role of cost-push shocks. So here goes the plot: black is data, red is the model. The R-squared of a regression of the model on the data is 50% and the correlation is 70%, so we are explaining quite a bit, half of the variation of inflation, which is rather remarkable; if you do the same exercise using the output gap, the red line is entirely flat, so this is interesting to me. We repeat the same exercise sector by sector: on the top left is transportation equipment, the sector with the steepest Phillips curve, and on the bottom right food and beverage, the one with the flattest Phillips curve. You see that as the Phillips curve flattens, the share of inflation explained goes down, but across the board there is quite a general positive correlation between model and data. Okay. We then interpret our results in comparison with what the literature has been finding. We follow the same theoretical steps one would take when unable to observe the real marginal cost at the aggregate level: we take the micro data and postulate an approximate relationship linking the real marginal cost to the output gap, where sigma is
the elasticity of marginal cost with respect to the output gap and epsilon is an approximation error, coming from wage rigidity or anything else that would break the perfect proportionality between the two variables. We assume that the natural level of output is composed of two terms, an industry trend and a firm-specific supply-side factor that is independent of demand and of the demand shock used as the instrument; here we use money shocks. Then we go back to our main regression, replacing marginal cost with the output gap, and this is what we find. First, when you run the regression with marginal cost, now instrumenting with money shocks, the slope goes a little higher. On the other hand, when we proxy for the marginal cost with the output gap, the slope is much smaller; funnily enough, this is exactly the number that Rotemberg and Woodford find for the slope, and it implies an elasticity between marginal cost and output that is quite low, 20%, much smaller than what one would calibrate following the textbook. One might worry that by including industry-by-quarter fixed effects we are abstracting from general-equilibrium forces; we take care of that by replacing industry-by-quarter with industry-by-three-years fixed effects, so that we are not absorbing business-cycle-frequency variation, and the results are robust. Before concluding, we can use this framework to assess the effects of supply shocks. In particular, we run a panel local projection at the firm level using as a shock the Känzig shock, an anticipated increase in oil prices, which is inflationary: it raises the real marginal cost and leads firms to push up prices. We then take this path of the real marginal cost and feed it into our model, calibrated with our slope of the Phillips curve, under perfect foresight, so that firms perfectly anticipate what happens to the real marginal cost, and
this is what we get in terms of the price response when we calibrate the slope of the Phillips curve to 6%: we are always within the 90% confidence bands. To conclude: if you measure the Phillips curve using marginal cost instead of the output gap, the slope is larger than we thought, and this comes from a key role for price stickiness and a key role for market structure, the strength of strategic complementarities. We rationalize the difference between what we find and what the previous literature has found through the weak connection between marginal cost and the output and unemployment gaps. This can come from many different sources, but one that comes to mind is wage rigidity: in the presence of wage rigidity, the link between the two variables is not obviously proportional. Finally, this missing link suggests an interesting difference between demand and supply shocks in their inflationary effects: demand shocks move the output gap, but for them to affect marginal cost it has to be through wage determination or through some general-equilibrium feedback, so if the link between the output gap and marginal cost is weak, demand shocks may have a weaker inflationary effect. Supply shocks, on the other hand, enter marginal cost directly, which suggests there might be a difference between the two. That is what I have; thank you. Thank you very much, Luca, for that very clear presentation. Let me now invite to the stage Jonathan Hazell from the London School of Economics, where he is an associate professor. Jonathan, you have 10 minutes; the floor is yours. Thank you very much. Thanks, Luca, for a great talk, and thank you for the invitation to let me discuss this really interesting, well-done and hopefully soon-to-be-influential paper. Okay, so what are we talking about today? We are talking about the Phillips curve, and in particular its New Keynesian formalization, which as we
all know relates inflation today to inflation expectations in the future, the current output gap or some other measure of slack or aggregate demand, and supply shocks. There are really two objects of interest to study here. The first is what people call the slope of the Phillips curve, which here I have labeled kappa: how much does a fall in unemployment, that is, an increase in the tightness of the economy, raise inflation? The second is the expectations coefficient beta: how forward-looking is inflation, how much does inflation today react to inflation expectations about the future? What did we know before this important paper by Luca and company? There was a growing consensus, at least before the Covid pandemic, about the slope coefficient kappa: a growing consensus that it was relatively flat, at least in US data but also around the world. That was captured very well in the paper by Stock and Watson and also in some earlier work of mine with Juan, Emi and Jón. Perhaps this has changed after the pandemic, as Gauti showed recently, but at least before the pandemic there was an emerging consensus that the slope of the Phillips curve was relatively low. What there was much less consensus about is the expectations coefficient, which I called beta; no consensus at all, really. Looking back over past work, one can make a good case for beta being 0, beta being 1, or beta being something in between. To quote from the influential paper of about 10 years ago by Mavroeidis, Plagborg-Møller and Stock, whose conclusions I think still resound today: the identification of the New Keynesian Phillips curve is too weak; we think it will be more fruitful to explore fundamentally new sources of identification, such as micro or sectoral data. The fundamental issue with estimating beta, which I will touch on later in my discussion, is weak instruments in the time series. I am emphasising this because, as much as we have talked about the slope of the Phillips curve today, obviously very
important, I think the quote-unquote holy grail of Phillips curve estimation is this coefficient beta: how forward-looking are inflation expectations? This is obviously the thing that animated how we think about the Phillips curve going all the way back to Phelps and Friedman, and the earlier debate between the classic view of Samuelson and Solow and its more modern incarnation, even going back to the 1960s. This paper: Luca and company ask what we can learn about the aggregate Phillips curve from micro data, and the answer is a tremendous amount. Before I get into my comments, I am going to unpack exactly what this tremendous amount is and how they are able to do it, and then I will talk about three things: I will say, in encouragement to these authors in current and perhaps future work, that this is a very powerful machinery with which to also estimate the coefficient beta, which I think is very important and hopefully you do too; I will talk a little about how to think about identification; and I will mention some alternative models out there that are also very sensible, which I hope the authors can do more to disentangle from the model we have right now. Okay, the approach. There is a theory piece, which Luca already mentioned, so I will go through it briefly. They write down the standard New Keynesian model, the Calvo model: the firm's price today, PFT, is related to an outer product containing the Calvo parameter, the degree of price stickiness theta, and some terms relating to the discount factor of the firm, beta. Then there is an expected present-value term, the expected present value of the firm's marginal costs, pre-multiplied by a term omega reflecting strategic complementarities: to what extent firms set prices in response to how other firms set prices. There is a final term, theta times PFT-1, lagged prices: how much prices today respond to prices in the past. In their model that maps directly onto the degree
of price stickiness, for the fairly obvious reason that if you can't adjust your price today, PFT, then it is going to be the same as your price yesterday, PFT-1. One thing I also want to emphasise is the interpretation of the final term, epsilon FT, which is much like a regression residual: you should think of it as all of the other factors that can affect prices other than marginal costs, other firms' prices, and expectations about those things in the future. The main interpretation that I think one should bear in mind, other than perhaps expectation errors, is what I think of as idiosyncratic demand shocks. Holding fixed the firm's marginal costs, why would its prices rise today? Because there is an idiosyncratic increase in the demand for the product of that firm; that is the natural way to think about it. The authors show that this micro equation implies a standard aggregate Phillips curve, and that means one can estimate the Phillips curve using parameters from microdata. That is the key theoretical innovation of the paper, the key insight. They also show that this holds very generally: for most of the models that we write down, even with menu costs, even with strategic-complementarity models that I, at least before reading this paper, thought were very rich, perhaps too rich to be analytically tractable, the authors show that there is still going to be a similar micro-to-macro mapping. So that's the theory side. On the measurement, they estimate the equation that I just showed you by the generalized method of moments. They use Belgian data for the task at hand because it is essentially comprehensive: as Luca said, you can see prices, you can see competitors' prices, you can see marginal costs, you can see everything that you need to see. They calibrate beta, the forward-looking parameter on inflation expectations, and they estimate theta, the degree of price stickiness.
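To fix ideas, the micro pricing equation described above can be written schematically as follows. This is my own notation and grouping of terms (with PFT written as p_{f,t} and competitors' prices as p^c), a sketch consistent with the verbal description rather than the authors' exact specification:

```latex
p_{f,t} \;=\; \theta\, p_{f,t-1}
\;+\; (1-\theta)(1-\beta\theta)\,
\mathbb{E}_t \sum_{k=0}^{\infty} (\beta\theta)^k
\Big[ (1-\omega)\, mc_{f,t+k} \;+\; \omega\, p^{c}_{t+k} \Big]
\;+\; \varepsilon_{f,t}
```

Here theta is the Calvo stickiness parameter multiplying the lagged price, beta is the firm's discount factor inside the expected present value, omega captures strategic complementarities (the weight on other firms' prices), mc is marginal cost, and epsilon is the residual collecting idiosyncratic demand shocks and expectation errors.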
In particular, they use instruments: the instrument for marginal costs is the eight-year lag of marginal costs, and the identification assumption is that this is orthogonal to the current regression residual, to current things such as idiosyncratic firm-level demand shocks. Their main results (I'll go through this quickly before my comments) are that lambda, the slope of the Phillips curve with respect to marginal cost, is relatively high, but additional evidence suggests that kappa is relatively low. So the slope with respect to marginal cost is relatively high and the slope with respect to the output gap is relatively low, and that is because a parameter at the bottom of the slide that I've called phi, the mapping between output and marginal costs, is relatively low. Some quick implications: this is going to be important discipline for future models, and it matters a lot for the propagation of supply versus demand shocks. But I want to get on to my comments, and my first main comment, the one I want you to take away, is that this is a great paper: an important question, interesting results, a general and tractable model, painstaking empirical work. I think it is going to be highly influential, and that is the one thing that I really want you to bear in mind from my discussion. I want to make three comments. The first, perhaps not for this paper, which is already packed to the brim, but for these authors in future work or for other authors, is that I think this is a great framework to estimate beta. Remember, again, beta is how forward-looking inflation expectations are. I gave you this motivating point at the start where I said, you know, we have relatively little information about this most important of parameters for thinking about inflation. Why not? Because we have this weak-instruments issue in the time series. And then, ten years ago, people started saying: well, maybe we can go to the microdata, maybe we can get much more variation with which to estimate beta.
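As a toy illustration of the lagged-instrument logic described above, here is a minimal two-stage-least-squares sketch on simulated data. The data-generating process, persistence values and variable names are my own illustrative assumptions, not the authors' data or code:

```python
import numpy as np

rng = np.random.default_rng(0)
T, lag = 20_000, 32            # quarterly observations; eight-year lag instrument
rho, beta_true = 0.97, 0.5     # illustrative cost persistence and true slope

# Persistent (log) marginal cost, scaled to unit variance.
mc = np.zeros(T)
for t in range(1, T):
    mc[t] = rho * mc[t - 1] + np.sqrt(1 - rho**2) * rng.normal()

eps = rng.normal(size=T)       # idiosyncratic demand shock (the "residual")
x = mc + 0.5 * eps             # measured marginal cost, contaminated by demand
y = beta_true * mc + eps       # price: responds to cost AND to the demand shock

y, x, z = y[lag:], x[lag:], x[:-lag]   # instrument z: deep lag of marginal cost

beta_ols = (x @ y) / (x @ x)   # biased upward: eps enters both x and y
beta_iv = (z @ y) / (z @ x)    # IV: valid because eps is serially independent

print(f"OLS: {beta_ols:.2f}  IV: {beta_iv:.2f}")
```

If the demand shock eps were itself persistent over many years, the deep lag would correlate with the current residual and the IV estimate would drift back toward the OLS one, which is exactly the identification concern raised later in the discussion.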
These authors have precisely that very rich variation. However, for now they calibrate beta: in their empirical exercise they pick a value of beta equal to 0.99. That seems very reasonable; it is what I have done and what others have done in past work. But I don't think that is necessarily the final word for empirical work here, and in particular I hope that in future work they or others can estimate beta. Now, of course, one issue is that beta and theta (theta, remember, is the price-stickiness parameter) might potentially be poorly identified from each other. But we already have lots of evidence on theta: we have lots of evidence on the slope of the Phillips curve, which is related to theta, and we have lots of evidence on the frequency of price change, including some assembled by the authors. And so, because we already know quite a bit about the slope of the Phillips curve but relatively little about how forward-looking inflation is, I hope this is a machinery that can be put to work to estimate that latter thing. Like I said, there is no consensus on the value of beta, on how forward-looking inflation is; I think of this as one of the holy grails of Phillips curve estimation. And to the more policy-minded in the room: of course, as we heard in the excellent speech at the beginning, it is important to think about the slope of the Phillips curve, but it is also important to think about how forward-looking inflation is going to be, how much inflation-expectations management is going to matter for inflation today. My second comment is about identification. Coming back to the main estimating equation, think of it heuristically as a regression, and think of epsilon, that regression residual, as picking up the other factors affecting prices, for instance idiosyncratic firm demand shocks: again, holding fixed marginal costs, your prices move around because the demand of the firm is moving up and down. The identification assumption
of the authors is that lagged marginal costs, as well as lagged prices, are orthogonal to firm-level demand shocks, and I am not sure whether or not this is plausible; I don't really have a strong prior here. I think one can tell stories in which it isn't plausible: imagine very persistent firm-level demand shocks. You know, I'm a manufacturer, and over the course of five or ten years people really start to want my goods more and more; my prices rise today, my prices rise in two years' time and in five years' time, and so do my marginal costs. That is obviously going to be a big threat to the validity of the instruments. This is something that I think the authors could help us understand better, and perhaps they could pursue alternative instruments in order to fortify and enhance what they are doing. The two instruments that I came up with (I'm sure they can come up with others) are, firstly, something using the foreign component of marginal costs (there is a famous, influential paper by Amiti, Itskhoki and Konings that has done that), or perhaps some kind of shift-share instrument; there are some exercises with oil shocks that go in this direction, and perhaps the authors can do more of this. The point about a shift-share instrument is that one could plausibly build the case that it is orthogonal to the idiosyncratic component of firm-level demand shocks, which plausibly is very important and perhaps a confounding source of variation for what they are doing. I think I have one minute left, so the final thing I'll say is this. We have this model by Luca and co-authors; heuristically, in this model, regressing current prices on lagged prices is going to identify nominal rigidity. But, perhaps poking fun at one of the co-authors who is not here, there is an extremely influential paper by Jordi Galí and Mark Gertler who suggested a different model, a model of inflation with backward-looking price setters in which past inflation has a direct effect on current inflation.
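For reference, the backward-looking alternative mentioned here, the hybrid Phillips curve of Galí and Gertler (1999), takes the following schematic form (the coefficient names are mine):

```latex
\pi_t \;=\; \gamma_b\, \pi_{t-1} \;+\; \gamma_f\, \mathbb{E}_t \pi_{t+1} \;+\; \lambda\, mc_t
```

The backward-looking weight gamma_b reflects the fraction of rule-of-thumb price setters. When gamma_b > 0, a coefficient on lagged prices in the micro regression mixes price stickiness with backward-lookingness, which is the interpretation issue raised in this comment.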
Others in the audience, including Ricardo, have developed similar, or not so different, models with some similar time-series properties. Of course, that model is going to have very different aggregate dynamics: for instance, the sacrifice ratio is going to be much bigger, because inflation is backward-looking. Now, if the Galí-Gertler (1999) model is true, regressing current prices on lagged prices is going to have a very different interpretation: it is not just going to capture price stickiness, it is also going to be something to do with how backward-looking inflation is. In that case, the interpretation of the authors' mapping between the model and the data is going to be somewhat different from what they have. So I am hoping to nudge the authors slightly towards considering that alternative model, which was obviously at the top of Mark Gertler's mind at least at one point, versus the current model, which is quite a different model with different interpretations and different policy implications too. With that, let me wrap up. The main thing is to say that this is a great paper because of its general and tractable modelling, its careful empirical work and its important results. In terms of comments, I would like to see more work done to estimate beta in this and other papers, it would be useful to know more about identification, and it would be good to think about some alternative models that might fit the data too. Thank you very much. Thank you very much, Jonathan, great discussion and great suggestions. Maybe I can just see if there are any questions, any burning questions, in the room or online. Frank, yes, please. The question is how much firm-level variation there is in the degrees of price stickiness and complementarity, also across sectors. Luca, as we are at time, I'll give you five minutes to respond to Frank's question and the discussion. OK, thank you. So, across sectors there is quite a lot of
variation. In fact, the slope of the Phillips curve for transportation, which is the steepest, is about 0.3, and for tobacco, food and beverages it is about 0.03, so there is quite a lot of variation between the two. Depending on the sector, this comes from either different degrees of stickiness or different degrees of complementarities, so there is a lot of variation in both. Regarding Jonathan's comments, thank you for the discussion, it was very useful and very insightful. Definitely, on the comment on beta, we are going to work on that. I have explored it a little bit; I remember finding numbers in the neighborhood of one, but then we didn't go in that direction, so maybe in future work we are going to think about that. On inflation inertia and stickiness, so the paper by Mark with Galí: from Mark's perspective, not having the lag was a great advantage of this setup. In Mark's eyes it was: we don't need the lag to explain quite a bit of inflation, and that is a great advantage of the framework that we have here. Potentially, if we also include the lag, we are going to improve the fit of the model versus the data, the black line versus the red line, because inflation is quite persistent, so including the lag might play in our favor. But we wanted to show that, without the lag, we could still do a good job, explaining 50% of the variation in inflation. It would definitely be useful, though, to use this framework to try to disentangle whether or not we need such a lag. And regarding identification, we have been thinking about that for a long time. Basically, if shocks, even at the firm level (demand or supply, it doesn't really matter), are correlated over time but kind of disappear within a couple of years, then everything that we have done here goes through without any problem. The type of firm-level shocks that might create issues are those that are correlated over time and highly persistent at the firm level.
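Luca's point about shock persistence can be checked with a back-of-the-envelope calculation: for an AR(1) firm-level shock, the correlation with its own eight-year (32-quarter) lag is rho to the power 32, so shocks that die out within a couple of years are essentially invisible to a deep-lag instrument. The persistence values below are my own illustrative choices:

```python
# Autocorrelation of an AR(1) shock at the eight-year (32-quarter) horizon:
# shocks that fade within a couple of years pose no threat to a deep-lag
# instrument; near-permanent shocks do.
for rho, label in [(0.5, "dies out within a year"),
                   (0.9, "dies out within a couple of years"),
                   (0.99, "near-permanent")]:
    print(f"{label:32s} rho={rho:.2f} -> corr at 32 quarters = {rho**32:.3f}")
```

At rho = 0.9 the 32-quarter autocorrelation is already near zero, while at rho = 0.99 it remains above 0.7, which is the intermediate case (persistent but not permanent) that Luca flags as problematic.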
Such shocks are absorbed by neither the firm fixed effect nor the industry-by-time fixed effect. For example, firm quality would be absorbed into the firm fixed effects, because it is permanent or approximately permanent, while anything with an industry component is absorbed by the industry-by-time fixed effect. So they really have to be firm-specific shocks which are not permanent but are highly correlated over time. In a way, we do use shift-share instruments in the robustness sections, because the oil and monetary shocks that we construct have firm-specific exposures, so they can be interpreted as shift-share instruments of a sort, and they attenuate this type of concern. We also tried to use the Amiti-Itskhoki-Konings type of instrument with exchange-rate shocks, but unfortunately it is not powerful at quarterly frequency: they have yearly frequency, we look at quarterly frequency, and it seems not to work. So, yeah, thanks for the discussion and thanks for the question. That's great, thank you very much, Luca. That brings us to the end of our first session, so I'd just like to thank all of the panel members and discussants very much; I think this was a really great start. I'd also like to thank again our Executive Board Member Isabel Schnabel for being with us and for her great comments. We'll start back at 12 sharp, so we have a short break now of just over 15 minutes, and then we'll resume with the keynote address by Ricardo Reis. Thank you to everyone in the room and to those online, and see you back at 12.