Thank you, Luc, and welcome to all of you for being willing to participate in our annual research conference, which, as you know, is our flagship research event, bringing together academics and central bank researchers working at the cutting edge of economics. We value research because it contributes to shaping the intellectual framework that we use to understand economic developments and to take policy decisions. We are especially keen to keep abreast of developments at the research frontier because they address the new challenges in understanding the economy's behavior, which is vital for devising appropriate policies.

The Great Recession, in the words of Larry Christiano, was a macroeconomic earthquake to which the field is still adjusting. Christiano highlights three aspects in particular: the need to recover the Keynesian view that demand shocks and the paradox of thrift can be important for economic performance; the notion that the economy is not quickly self-correcting and requires public policy intervention; and, finally, that the financial sector can endogenously generate imbalances with significant consequences for the real economy. With hindsight, it is surprising that these points were neglected by mainstream economics for so long. Several other aspects can usefully be added to that list.

First, stabilization policies are crucial, not just growth policies, as significant fluctuations leave behind permanent losses. Related to this is the notion that demand shocks can affect the supply side via hysteresis effects on labor supply and on the capital stock via decelerating investment. The distinction between the short and the long term is necessary for theorizing and teaching, but it is often not useful for policy analysis models.

Second, the heterogeneity of agents, particularly of consumers, is important for understanding aggregate behavior, in view of indebtedness, credit restrictions, and the composition of income and wealth.
Third, behavioral economics has cast doubt on the full rational expectations paradigm as too demanding a hypothesis about the cognitive powers of economic agents, especially over long horizons. In this context, the myopia assumption explored, for instance, by Gabaix is a welcome development.

Fourth, agents' heterogeneity relates to the issue of distribution, which had been neglected, but which the use of heterogeneous agent New Keynesian models has helped to bring to the fore, as exemplified by the recent contribution of Ahn, Kaplan, Moll, and others at the NBER Macroeconomics Annual conference, titled, precisely, "When Inequality Matters for Macro and Macro Matters for Inequality". It shows how significant the feedback loop is and how models with realistic household heterogeneity better fit empirical consumption dynamics.

Fifth, the assumption of a unique steady state is now challenged by the consideration of multiple equilibria, particularly some without full employment, as the work of Roger Farmer and collaborators shows. More controversial is, of course, the idea of abandoning the notion of aprioristic theoretical equilibria in favor of the pure interaction of heterogeneous agents with behavioral rules, as in agent-based models.

A final point refers to the question of whether conventional monetary policy is as powerful as portrayed in mainstream DSGE models via Euler equations. The protracted recovery seems to lend support to the old view that monetary policy effectiveness is asymmetric and weaker in recessions. There is justification for rethinking a more active role for fiscal policy, following, for instance, the recent paper by Auerbach and Gorodnichenko. A reconsideration of the effectiveness of both macro policies has become all the more necessary in view of the two major problems that central banks now confront.
First, the lingering low inflation, associated with flatter Phillips curves that impair policy transmission; and second, the need for policy instruments to deal with the next recession, even if it turns out to be a mild one. The various points I have just listed are some of the relevant aspects of the ongoing revision of macroeconomics, and they justify the point recently made by Blanchard that we need different types of models to understand, to forecast, and to analyze the economy and the policies necessary to address its shortcomings. In that spirit, I will concentrate my remaining remarks on some ongoing developments in the specification of macroeconomic models at the ECB.

Macroeconomic models can be used for a variety of purposes in central banks. They help articulate relationships between variables of interest in a systematic fashion while ensuring that resource constraints are respected. They provide input to the complex process of macroeconomic forecasting, and they can be used to conduct scenario analysis and to study policy counterfactuals. To perform these tasks effectively, a model should satisfy two very simple criteria.

First, many policy questions are inherently quantitative in nature. A useful model must fit the data reasonably well and should be able to produce reliable economic forecasts. Model-based counterfactual analysis will only serve as a credible benchmark for policy discussion if the results are quantitatively plausible in practice. This criterion has two implications. On the one hand, the model should incorporate realistic elasticities. For example, the dynamic effects of changes in monetary policy interest rates should be consistent with the available reduced-form evidence. In the euro area, the model should provide a reasonably good account of the inflationary impact of national fiscal expansions or of developments in national wage negotiations.
On the other hand, the model should provide a credible narrative for observed economic developments. The second criterion is really an implication of the first. Partly due to new research findings, partly as a result of puzzling economic developments, we constantly update our beliefs about the key economic mechanisms that are necessary to fit the data. The model should be flexible enough to adapt to a changing economic and policy environment and to speak to current policy questions. The financial crisis is a case in point. The macroeconomic models maintained by central banks in the early 2000s were not equipped to speak to all the questions arising in the aftermath of the crisis without further adjustments. From today's standpoint, these adjustments are simply inescapable. We would like the model to provide a reasonable account of the dynamic effects of non-standard monetary policy measures. With policy rates at the effective lower bound, we really need the model to provide realistic implications for the impact of forward guidance. For me, as a policymaker, it is of key importance that our models can be adapted fast enough to address newly emerging questions in a timely manner. Of course, in order to reap the benefits of a flexible modeling framework, it is equally important to have expert staff using and enhancing the models in a practical and innovative manner.

These considerations played an important role at the ECB when we were recently faced with the decision of how to enhance the multi-country model of the euro area: which paradigm should we adopt? One option was to remain within the DSGE framework. For over ten years, DSGE models have been the key tool used for policy analysis in many central banks. This has also been the case at the ECB, where, by the way, some of the initial development of estimated DSGE models took place. DSGE models are typically estimated and thus consistent with the data.
They often reproduce the dynamic effects of changes in monetary policy interest rates that are observed in identified VARs. This is also the case for the DSGE model developed in the Directorate General Research, the New Area-Wide Model, which is regularly used in counterfactual policy analysis at the ECB, and it will continue to be used. At the same time, a good fit of the data is, to some extent, accomplished by DSGE models through persistent shock processes, which calls into question the empirical validity of the models' intrinsic propagation mechanisms. There would be much to say about the ad hockery going on in DSGE models as they are used in practice. More importantly, DSGE models do not always provide a plausible story for observed economic developments. For example, so-called technology shocks, especially negative ones, tend to play an overwhelmingly important role in accounting for the evolution of GDP, even when external data show no evidence of such technological innovations. Moreover, DSGE models can be slow to adapt to a new policy environment. The requirement of full internal consistency makes the incorporation of new features, be it a more granular financial sector, household heterogeneity, or stronger non-linearities, very demanding in that context. And indeed, the New Area-Wide Model has been going through all these transformations in order to respond better to developments in the environment. Enhancements come with long gestation periods, sometimes limiting the ability of DSGE models to speak to newly emerging policy questions in a timely manner.

In designing the new multi-country model, the ECB-MC, we have therefore started from the premise that, in the words of Olivier Blanchard, "policy models cannot be expected to have the same tight structure as theory models", end of quote. We have decided to adopt a semi-structural approach inspired by two guidelines.
First, to include financial frictions or financial mechanisms that allow monetary policy shocks to be transmitted via channels that were absent before the crisis; and second, to adopt a more flexible and empirically driven approach. The emphasis is on equation-by-equation fit, while cross-equation constraints are mostly disregarded, since doing so does not impinge on the model's ability to provide sound quantitative predictions. When introducing financial frictions, we have relied on a reduced-form representation that is consistent with different theoretical micro-foundations. This more flexible, semi-structural approach allows us to model a wide range of banking and financial variables, from bank lending spreads to term premia, without taking a stance on the exact theoretical mechanisms linking them to the macroeconomy.

At its core, the new ECB-MC model is designed along the lines of the well-known Federal Reserve FRB/US model. The behavioral decision rules of private agents are based on optimization, and in the long run the model boils down to a neoclassical growth model. In the short run, however, it is assumed that agents face adjustment costs, which implies staggered adjustment of actual to desired levels.

I believe that the design of the ECB's multi-country model will increase the robustness of our model-based policy analysis and strengthen our capability to address newly emerging policy questions in a timely manner. The model development, which has been led by the Directorate General Research, is a joint effort of economists from a wide range of policy areas inside the ECB, colleagues from national central banks, and academic consultants. I am confident it will soon be part of the ECB's toolkit. However, it is easy to forecast that further refinements will prove necessary in the future for the model to continue being a valuable policy tool.
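As a stylized illustration, and not the ECB-MC's actual equations, the staggered adjustment of actual to desired levels described above can be written in standard partial-adjustment form:

```latex
x_t = x_{t-1} + \lambda \left( x_t^{*} - x_{t-1} \right), \qquad 0 < \lambda \le 1 ,
```

where \(x_t^{*}\) is the desired level implied by optimization and \(\lambda\) governs the speed of adjustment: \(\lambda = 1\) delivers immediate adjustment, while for \(\lambda < 1\) the actual level closes only a fraction of the gap each period and converges to the desired level in the long run, consistent with the model's neoclassical growth core.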
I specifically see four areas where significant progress has already been made but further improvements are likely to be necessary.

The first area is the modeling of aggregate consumption. I share the concern of John Muellbauer that the standard DSGE framework imposes unrealistic micro-foundations on the behavior of households, as embodied in the rational expectations permanent income model of consumption. In typical representative agent models, consumption behavior is captured by an Euler equation, an intertemporal optimality condition that links today's level of consumption to expected consumption in the next period and further into the future. In its linearized form, it envisages neither that consumers face idiosyncratic, household-specific, and uninsurable income uncertainty, nor that such uncertainty interacts with credit or liquidity constraints. This is in stark contrast to recent research that emphasizes the importance of precautionary saving, liquidity constraints, leverage, and heterogeneity, including heterogeneity in marginal propensities to consume.

Compared to simple representative agent models, the ECB-MC clearly marks an improvement. The consumption function is explicitly affected by agents' wealth holdings. Agents have shorter average horizons than presumed under the textbook permanent income hypothesis, and the model further allows for the presence of agents that do not optimize but rather exhibit hand-to-mouth behavior. Last but not least, risk aversion and income uncertainty also play a role in consumption behavior. This setup, for example, allows quantifying how larger income uncertainty reduces the power of forward guidance, and perhaps helps to explain the forward guidance puzzle. All in all, I think that we are moving in the right direction. Nevertheless, the modeling of aggregate consumption is an area in which research is currently developing fast, and we should be ready to learn from new findings.
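In textbook notation, and as a sketch rather than the ECB-MC's actual consumption function, the log-linearized Euler equation just mentioned reads:

```latex
c_t = \mathbb{E}_t \, c_{t+1} - \frac{1}{\sigma} \left( i_t - \mathbb{E}_t \, \pi_{t+1} - \rho \right),
```

where \(c_t\) is log consumption, \(i_t\) the nominal interest rate, \(\pi_{t+1}\) inflation, \(\sigma\) the coefficient of relative risk aversion, and \(\rho\) the discount rate. Iterating the equation forward makes today's consumption respond one-for-one to expected short-term real rates arbitrarily far into the future, which is one way of seeing why shorter horizons, hand-to-mouth agents, and uninsurable income risk weaken the predicted power of forward guidance.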
The second area of improvement concerns the modeling of expectations. As Sargent emphasizes, rational expectations can be a meaningful characterization of the long-run equilibrium, but the transition to a new steady state might display non-rational behavior. An increasing body of research aims to explore the implications of alternative types of departures from rational expectations for business cycle dynamics in general, and for the transmission of monetary policy in particular, as, for instance, in the papers by García-Schmidt and Woodford, Gabaix, or Farhi and Werning. Once again, the ECB-MC goes in the right direction. The model can be simulated under two expectations setups: first, in a model-consistent manner, and second, under the assumption of bounded rationality, where agents form expectations with a small-scale VAR model. Other expectation formation mechanisms, such as learning or the use of market expectations, are also easily implementable. As was shown by Blanchard and co-authors in a recent study on the macroeconomic effects of changes in expectations of long-run productivity growth, different assumptions about the expectation formation mechanism can lead to considerably different outcomes. Assessing the most realistic way of treating expectations in policy models remains a crucial area for further work.

The third area of improvement has to do with the nexus between inflation, wages, and the real economy. When modeled through the expectations-augmented Phillips curve, this nexus seems to have become weaker after the financial crisis. Commentators have repeatedly talked about the missing disinflation in the throes of the Great Recession, and about the missing inflation in more recent years, particularly in Europe. Recent studies have come up with alternative explanations for these phenomena.
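As a minimal sketch of the second expectations setup mentioned above, bounded rationality with a small-scale VAR as the agents' perceived law of motion, the following illustration uses made-up numbers rather than any actual ECB-MC specification:

```python
import numpy as np

# Illustrative two-variable system (say, output gap and inflation):
# simulate data from a known, stable VAR(1) with persistence.
rng = np.random.default_rng(0)
A_true = np.array([[0.8, 0.1],
                   [0.2, 0.7]])
T = 500
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + 0.1 * rng.standard_normal(2)

# Boundedly rational agents estimate the VAR by least squares,
# regressing x_t on x_{t-1}, instead of solving the full model.
X, Y = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

def var_expectation(A, x0, h):
    """h-step-ahead expectation E_t[x_{t+h}] obtained by iterating
    the perceived law of motion x_{t+1} = A x_t."""
    return np.linalg.matrix_power(A, h) @ x0

# Expectations four periods ahead, formed from the latest observation.
print("estimated A:\n", A_hat.round(2))
print("4-period-ahead expectation:", var_expectation(A_hat, x[-1], 4).round(3))
```

Because the perceived law of motion is estimated rather than model-consistent, announced future policies affect expectations only insofar as they show up in the data the agents have seen, which is exactly why the choice of expectations setup matters for the simulated transmission of policy.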
The missing disinflation, for instance, has been argued to be the consequence of either the presence of well-anchored inflation expectations (too much success by central banks, one might say), increased downward wage rigidity in recessions, or a fall in total factor productivity and increased costs of working capital. Understanding the underlying sources of this apparent structural change will be important for monetary policy. I addressed these issues last week in remarks to another conference that we organized, on understanding low inflation. The semi-reduced-form nature of the ECB-MC makes it ill-equipped to address this deep question, but studying structural change is challenging for all current models built to study cyclical developments.

The fourth and final area where further improvements are necessary, in my view, is macro-financial linkages. I have already mentioned that the ECB-MC incorporates such linkages, as the New Area-Wide Model is now also doing. Nevertheless, the exact way in which they affect the monetary policy transmission mechanism remains imperfectly understood, even if the empirical literature is making important advances. These linkages are also relevant for financial stability and may evolve in response to the recent reforms in the regulatory environment. This is why at the ECB we are also making parallel progress on this front within the DSGE paradigm, through the development of the so-called 3D model. The 3D model has been developed under the Macro-prudential Research Network (MaRs) and can be used to assess the macroeconomic benefits and costs of macro-prudential policies. The "3D" alludes to the fact that, contrary to previous models, it features three layers of default, not only for banks but also for borrowers, meaning non-financial firms and households, and it thereby captures the distinct benefits of capital requirements through reductions in default risk in the economy.

Let me conclude. Building models for policy analysis is associated with trade-offs.
This is an important reason for central banks not to rely on a single model and a single modeling paradigm, but to make use of a suite of models based on different paradigms. I could not agree more with Olivier Blanchard, who recently expressed the view that other types of general equilibrium models beyond DSGEs are also useful policy tools. I am pleased that this view is lately getting more traction, as alternative types of models will continue to be part of central banks' toolbox. The development of these models can greatly benefit from insights from academic research, and so I am very much looking forward to the contributions to be presented and discussed at this conference. I wish you all a very good conference. Thank you very much.