Thank you. It is a great pleasure to chair this panel. Indeed, remarkable achievements, impressive. I think that's true. For me, there have been major changes in recent years. As Mario was explaining, I call it tracing the money. When we came with the TLTRO funding-for-lending operations, we wanted to know where the money was going. And having access to data from individual banks, anonymized under strict conditions, was one of the key information inputs we had to see whether it worked or not. This is one little example among many others. And I was very proud, because I remember, when you asked me to defend the case in the Governing Council some years ago, arguing that it would be useful. We didn't know at that time that we would enter into that sort of instrument of monetary policy. So when the decision came much later, we had some of the information tools to see a little bit what happens with the money we lend to the banks. I see some other colleagues here, Björn and others, who were behind that project. So it is a great pleasure to chair this first session. Its topic is data needs, so we look at the future and monetary policy. We will start with Jan. Jan is the Governor of the National Bank of Belgium. We will follow in order with Pablo García, executive board member of the Central Bank of Chile, quite a name in economics, and in statistics in particular. And then Ewald Nowotny, Governor of the Central Bank of Austria. Our discussant, Paul Mortimer-Lee, is obviously not present; according to the program, Natasha will stand in. Thank you very much, Natasha. Natasha just joined the central bank as Deputy Director General Monetary Policy. And I didn't force Natasha to come here; you did it with pleasure, so thank you. Paul Mortimer-Lee is really sick. He has a good excuse, and it's not too bad, I heard, but he really couldn't make it. So, Jan, maybe you'll start now.
And it's about 10 minutes per presentation, I think. We have PowerPoint presentations, and then we will open the discussion. OK. Dear Peter, dear friends, dear colleagues, ladies and gentlemen, thank you very much for inviting me to this conference, and especially to this particular session, as it is about the interactions between statistics and the future of monetary policy. And we know that this is a hotly debated topic. You will agree that some important and probably interrelated things have been happening over the last decade or so in the monetary policy landscape. Not only have central banks broadened their toolkit to tackle the financial crisis, but the economic world has also been faced with structural changes. And besides, both the use of new instruments and the availability of new data have spurred advances in monetary policy research. Now, the interplay of these phenomena could potentially have, I think, serious implications for the way we think about monetary policy going forward, be it in terms of objectives, instruments, transmission channels, or data monitoring. But, and that is, so to speak, the message I want to convey here: to avoid drawing overly hasty conclusions, this requires careful reflection. And in some cases, it indeed implies new data needs. Only after careful investigation of the issues at stake can lessons for the future of monetary policy be drawn. In my remarks today, I will focus more specifically, as you see, on three phenomena that challenge our traditional thinking on monetary policy. First, the role played by heterogeneity, which has been clearly demonstrated by the use of new monetary policy tools, typically, indeed, as the president alluded to, more targeted ones, and which is increasingly documented in economic research and data.
Second, digitalization, probably one of the most notable structural economic changes of recent years, which also opens the door to a new world of big data. And third, the changing role of the financial sector. I intend to raise a number of questions to foster the debate and hopefully help structure our thinking. How exactly have these challenges called into question the consensus on which monetary policy has been based? How can new data help to identify whether and to what extent the practice of monetary policy has changed? And more specifically, can new data shape a possible new normal for monetary policy? Needless to say, I do not want to provide any definite answers to these key questions. They have far-reaching implications, making it unrealistic, I think, to settle all this on this panel. But before looking ahead, let's first look back a bit. The data that central banks traditionally look at are broadly tailored to the New Keynesian model paradigm, a view of the world in which, and I'm grossly oversimplifying here, not doing justice to macro-modelers nor to policymakers, the central bank operates in a framework where representative agents interact, where production is labor-intensive and where the role for financial factors is limited. Building on rational expectations and sticky prices, inflation is driven by expected inflation and the anticipated change in real marginal costs, the so-called New Keynesian Phillips Curve, which links economic activity and inflation. And in this setup, monetary policy should aim for price stability. Doing so requires bringing aggregate demand into line with the potential output path. And the prime way to do so in these models is by steering individuals' intertemporal choice between consuming today versus tomorrow. This gives a key role to interest rates, where the working assumption is that the central bank perfectly steers the interest rate that is relevant for the representative agent.
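[Editor's note] The Phillips Curve relationship just described can be sketched in a few lines. This is only the textbook reduced form, with purely illustrative parameter values (not estimates from any model mentioned in the remarks):

```python
# Minimal sketch of the New Keynesian Phillips Curve described above:
#   pi_t = beta * E[pi_{t+1}] + kappa * x_t
# where x_t is the output gap, standing in for real marginal costs.
# beta and kappa below are illustrative, not calibrated values.

def nkpc_inflation(expected_inflation, output_gap, beta=0.99, kappa=0.1):
    """Current inflation implied by the textbook NK Phillips Curve."""
    return beta * expected_inflation + kappa * output_gap

# A positive output gap pushes inflation above (discounted) expected inflation.
pi = nkpc_inflation(expected_inflation=2.0, output_gap=1.0)  # 2.08
```

The point of the sketch is only that, in this paradigm, steering aggregate demand (the output gap) is the lever through which policy moves inflation.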
The careful monitoring of macroeconomic aggregates and their projections successfully supported monetary policy decisions in this view of the world, which seemed, I would say, fairly appropriate until up to about 10 years ago. But as I hinted in my introduction, several developments might have called into question this fairly simple framework. So let me indeed focus on the three challenges to the standard practice of monetary policy, and their data dimension, that I just mentioned. The first one, as I said, is heterogeneous agents. The appropriateness of representative-agent models has been challenged quite strongly since the crisis. For instance, some people have claimed that monetary policy tools aimed at stabilizing macro aggregates have harmful side effects on specific sectors or types of economic agents. The allegations that asset purchases increase wealth inequality, that the low-rate environment punishes savers, or that easy monetary policy facilitates the survival of zombie firms are just a few examples. But are we only talking about possible side effects of some measures here? I think these reflections are a broader indication of how heterogeneity can also be a transmission channel for monetary policy. And going one step further, it could appear that monetary policy works more via the cross-section than via the time dimension, which is the traditional New Keynesian intertemporal story. To put it bluntly, could it be that an interest rate cut has a bigger impact on aggregate demand because it shifts income from creditors to debtors who stand ready to spend, rather than via intertemporal substitution? Micro heterogeneity and distributional aspects already appear on the monetary policy stage, and are backed by advances in theoretical research. Brunnermeier and Sannikov, for instance, argue that targeted monetary policy leads to redistributive effects that help mitigate financial frictions.
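[Editor's note] The cross-sectional channel described above can be illustrated with a deliberately crude two-household arithmetic example. All numbers (debt level, marginal propensities to consume) are invented for illustration; this is not a calibrated model:

```python
# Illustrative sketch of the redistribution channel of a rate cut:
# a 1pp cut shifts interest payments from a creditor household with a low
# marginal propensity to consume (MPC) to a debtor household with a high MPC.
rate_cut = 0.01     # 1 percentage point
debt = 100.0        # debtor's floating-rate debt, held as the creditor's asset

households = [
    {"name": "creditor", "mpc": 0.1, "income_change": -rate_cut * debt},
    {"name": "debtor",   "mpc": 0.8, "income_change": +rate_cut * debt},
]

# For a representative agent the transfer nets out to zero extra spending.
# With heterogeneous MPCs, the same pure transfer raises aggregate demand.
demand_effect = sum(h["mpc"] * h["income_change"] for h in households)  # 0.7
```

The transfer sums to zero in income terms, yet aggregate spending rises because the recipient spends more of each unit than the payer would have.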
And I think, indeed, credit easing policies are an explicit example of that, since specific types of lending are being supported. Newly developed heterogeneous-agent New Keynesian models, the so-called HANK models, also help to get essential insights into monetary policy transmission channels when the assumption of representative agents is abandoned. Such models suggest that forward guidance could be less powerful than conventional rate cuts because of liquidity-constrained households, for instance. Now, this strand of research would benefit, to come to the topic of this conference, from additional data to help rigorously test these theories, also at the euro area level. For sure, extra data at a fairly granular level, with a panel dimension to capture effects over time as well, are of interest here. Microdata from the Household Finance and Consumption Survey are already a step forward, and that effort should be continued. For example, these data have allowed researchers at the ECB to mitigate concerns that the APP benefits the wealthy at the expense of the poor. Other Eurosystem data initiatives, like the one the president mentioned, AnaCredit, are also very useful, for instance to study the extent of zombie lending and how it interacts with the monetary policy stance. The second challenge is digitalization. As you all know, the digitalization of society dramatically changes our lives: how we produce, work, trade, consume. So what are the consequences for monetary policy? I shall mention two interlinked dimensions here. First, digital products and services raise issues with measuring the general level of the macro aggregates that central banks typically look at. How to adequately capture quantities when, for instance, Netflix or Spotify memberships allow unlimited consumption of content? How do we determine potential output in such economies? And what about measuring consumer prices for digital service providers, such as social network platforms?
Second, technology challenges our understanding of price dynamics. Is price stickiness still relevant for digital transactions? How do prices behave when the marginal cost of producing more is very small, even close to zero? Addressing all these questions is no easy task. Overall, digitalization complicates our understanding of the transmission process from economic activity to inflation. And this has implications not only for the way we model the economy, and here I'm thinking about possible adjustments to the New Keynesian Phillips Curve, but also for the role we devote to monetary policy. Should monetary policy set different objectives if prices are highly flexible and the costs of inefficient price dispersion are much smaller than presumed? Too early to tell, of course, but definitely worth an in-depth investigation. Meanwhile, and turning back to the data issue, I welcome the advances made in measuring macroeconomic aggregates in the digital economy, in particular consumer prices. Across the Atlantic, the Billion Prices Project and Adobe Analytics data are promising examples of that. They provide tentative evidence that US inflation could be overestimated, although these results seem to depend on the dataset used. At the euro area level, national statistical offices' initiatives on the integration of online and scanner prices into HICP measures, as well as the Eurosystem's choice of investing heavily in research on price setting using micro data, will certainly help, too. And while digitalization challenges our thinking about macroeconomic accounting, it can also provide a whole new set of granular and, at the same time, multi-dimensional data. In that sense, big data can become our ally, and I will briefly come back to that point at the end. Before that, a word on the third challenge, which is the changing world of the financial sector.
I am referring to the changing nature of financial intermediation, well documented in a research area that has, I think, exploded during the last decade. We have not only witnessed greater fragmentation within the banking sector, which has forced us to take unprecedented non-conventional measures to preserve a smooth transmission of monetary policy. We are also observing a slow-moving tendency towards a larger role for non-banks in the financing of the economy. With the Capital Markets Union, a project we fully endorse, the role of players outside the traditional banking sector will hopefully get bigger. And this justifies particular vigilance on the part of the ECB to be ready to monitor developments in this area. We should also make sure we are able to monitor developments in so-called private virtual tokens that aim to play a role as money, even though I tend to think that these developments are not, or not yet, of macroeconomic relevance. Related to this, the FinTech revolution blurs the traditional boundaries between the financial and the non-financial sector. If such things become more relevant, monetary policy transmission can profoundly change, and monitoring the traditional financial indicators can turn out to be inadequate. Therefore, good data coverage of new trends in the financial sector is essential. And fortunately, again, the Eurosystem plays a proactive role here, and I would like to give two examples where new data play a key role. During the financial crisis, a Eurosystem-wide effort was launched to exploit the bank-level data underlying the money and credit aggregates that are monitored in the ECB's monetary analysis. That way, as the president said, the Governing Council could assess in a fairly granular way the transmission of measures via the banking sector. The data also proved key for calibrating the details of the targeted loans we started giving to banks back in 2014.
And thanks to money market statistical reporting, which I recognize is a huge statistical challenge, we also have a better view on the workings of euro area money markets. Moreover, it enables the Eurosystem to provide for a backup risk-free benchmark rate should currently available private benchmark rates cease to be published. In this respect, it is very good to see how new economic realities are being reflected here: contrary to the current benchmarks, transactions with non-bank money market participants could be included in this new benchmark too. So, to conclude, the three challenges I raised today may not only imply extensive use of existing micro data, but also require further efforts to exploit the new world of data opened up by digitalization, the so-called big data. I do not intend to elaborate much on concrete applications and the challenges that come with big data; these aspects will certainly be tackled more deeply in a later session of this conference. That said, I think technology-driven data bring serious challenges from a practical point of view, above all because their granularity is multi-dimensional. As aptly stated by Andy Haldane of the Bank of England in a speech he gave earlier this year, it runs through their volume, the cross-section dimension; their velocity, the frequency; and their variety. And we need sufficient data analytics tools to use these data properly while being aware of their limitations in terms of privacy and confidentiality. To wrap up, ladies and gentlemen, dear friends, the challenge in the future will be, I think, how to translate the insights from new data into concrete policy implications.
After all, the micro-evidence has to add up to policy advice for monetary policy, which is a macro policy with a rather limited set of instruments. Therefore, I think that in some cases other policies, such as macro-prudential, fiscal, or structural policies, could be more appropriate for tackling the challenges that new data reveal. I stop there. Thank you very much for your attention. Thank you. Thank you, Jan. That's a very good start for this conference. As you say, we are always confronted with the speed of technological change and the capacity to adapt, because that requires big investments and big choices in budgets, for example. Think about AnaCredit, which you mentioned: it was a big investment, and the return will come over time. These are always difficult choices, given the speed of technological change. I think it was a very good start. Pablo, many things will come back, one in particular being the measurement of prices. I think Ewald will also deal with the measurement of prices in general, of inflation, actually. Thank you, Pablo. You go to the podium. Well, thank you very much. I promise that I didn't see Jan Smets's presentation before titling mine. But as Peter said, many topics will be recurring here. During the last years, obviously, we have been seeing some challenges from digitalization. There are profound disruptions that we are witnessing and that will continue to be present. Automation, artificial intelligence, and new business models such as the sharing economy are starting to shift the way we interact, trade, consume, and live in general. It's interesting to note that this is happening not only at the core of the advanced economies; it is quite prevalent in a number of economies. Most economies are shifting the boundaries of what is shown here as their level of digital evolution.
These developments, on the one hand, make the proper measurement of economic and financial phenomena more challenging. The traditional framework, as has been pointed out, emphasizes the transformation of basic inputs into final products by a representative agent, for instance a firm that delivers a homogeneous good to a household. In the present day, however, digitalization and financial innovations increasingly put into question the usefulness of this paradigm. The digitalization of economic relationships, leading to, as I said, the sharing economy, makes a clear-cut distinction between producers and consumers harder to pinpoint. The value added of the sharing economy is generated by the match, and therefore we need to distinguish the contributions of both what was traditionally the consumer and the producer. Also, the diverse ways in which individuals and firms can organize themselves into economic activities render the traditional view of the representative agent more obsolete in terms of describing aggregate behavior. Another example is the bundling of experiences and demands for goods and services which in the past were provided by well-defined products; today this makes the identification and measurement of the prices of these goods and services, and hence of inflation, much more difficult. It is amazing how smartphones, for instance, have bundled a number of products that merely 10 or 15 years ago were provided by very well-defined, different goods. On the other hand, the transmission mechanism of monetary policy is also likely to be shifting. One key channel of transmission is through asset prices and credit flows, particularly the exchange rate and cross-border capital flows. It is noteworthy how the globalization process which began in the early 90s has resulted in the rapid and unprecedented integration of a disparate set of economies into global finance.
This is likely to continue, because the rise of mechanisms to facilitate cross-border lending and skirt regulations, such as crypto-assets, is likely to make these types of transmission mechanisms stronger in the future and harder to control even by the most determined jurisdictions. The dispersion of economic activities across jurisdictions also highlights the importance that global value chains have gained in global production, along with the decentralized use of knowledge and intellectual property obtained in a centralized way through R&D. For instance, according to international evidence, with globalization and the development of these global value chains, exports are increasingly composed of imported inputs, a phenomenon that is closely linked to digitalization and will clearly be pushed forward through the use of blockchain in letters of credit, which is an incipient phenomenon. Central banks that try to calibrate the transmission mechanism of monetary policy, for instance from the exchange rate to economic activity through net exports, will obviously need to revisit that calibration to take these features into account. Therefore, digitalization poses significant challenges going forward for the measurement of economic activity as well as for the transmission of monetary policy. Some of these challenges are very present in the near term; others will be more pressing in the more distant future. In any case, monetary policy authorities, particularly those that follow inflation targeting, will need to be cognizant of these developments to understand and calibrate their policies to achieve their goals. In terms of the need for granularity, or heterogeneity, there are some challenges that need to be undertaken in the most immediate term, and I believe that tackling these will help in the future with the oncoming challenges from digitalization.
These relate to the need to increasingly incorporate the granularity of economic behavior into the assessment and design of monetary policy. There are a number of examples, and I want to highlight a few of them. In terms of prices, obviously, an adequate response of monetary policy to shocks depends on a proper understanding of price dynamics. Consumer price micro data reveal everywhere, and in particular in Chile, a highly heterogeneous price-setting process in the economy. Here I show the case of my country, Chile: in spite of inflation being well anchored at the target, our target being 3%, average inflation for the past 20 years has been 3.2%, quite close to the target. And we aim to keep it most of the time between two and four percent. Interestingly, it has been between two and four percent only about 50% of the time, so inflation is quite volatile, and how to understand this volatility, and whether or not it affects the achievement of our target, is quite a challenge for us going forward. In terms of labor market micro data, it is worth noting that measuring labor's contribution to output has become more complex because of new developments such as increased labor participation by women, significantly more years of schooling, immigration, self-employment in services, and also technological advances that are pushing out low-skill workers, among other phenomena. In this setting, what used to be the traditional measures of slack in the labor market, such as the unemployment rate or employment levels, have lost significance. This has been a feature of the economic landscape in a number of economies after the Great Recession.
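[Editor's note] The point that headline unemployment can understate slack is easy to make concrete with a broader utilization rate, in the spirit of the US "U-6" measure. All figures below are invented for illustration:

```python
# Hypothetical sketch of why the headline unemployment rate may understate
# labor market slack: a broader rate also counts involuntary part-timers and
# marginally attached workers. Figures are invented, in millions of people.
labor_force = 10.0
unemployed = 0.5
involuntary_part_time = 0.6   # employed, but wanting more hours
marginally_attached = 0.3     # want work, but not counted in the labor force

headline_rate = 100.0 * unemployed / labor_force
broad_rate = (100.0 * (unemployed + involuntary_part_time + marginally_attached)
              / (labor_force + marginally_attached))
# headline_rate is 5.0%; broad_rate is markedly higher.
```

Two economies with the same headline rate can thus have very different amounts of usable slack, which is exactly why granular labor market data matter for gauging wage and inflation pressures.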
We need to understand the changing role of the labor market in driving wage and inflationary pressures, and that understanding should come from the availability of more timely and granular information encompassing not only demographic and employment characteristics, but also the link to labor market outcomes in terms of wages. I have to say that this is obviously important not only for monetary policy, but goes well beyond that, into the realm of public policy in general. In terms of financial data, as has also been highlighted, borrowing structure stands out as another relevant element for monetary policy, underscoring the need for close and timely monitoring of credit sources. The reliance on average behavior proved woefully inadequate in the run-up to the great financial crisis. We will need to be forever cognizant that credit events in very narrow slices of specific markets, such as subprime loans, eventually had systemic implications that were unexpected for the global financial system. So we should always strive to get our understanding of financial markets right. There have been major advances in the availability of more and better information from the financial sector, but obviously this needs to be pushed even further. For today's monetary policy requirements, identifying the recipients of financing is not enough. It is necessary to characterize their behavior, solvency, and vulnerability, for instance in the face of an economic slowdown, to predict the effect on business spreads, employment, and salaries. A balance-sheet approach highlights the fact that macroeconomic policies adopted in response to shocks may be constrained by domestic balance-sheet mismatches. For example, tight monetary policy aimed at preventing an excessive real depreciation may protect balance sheets with large currency mismatches, but create further pressures on balance sheets that have significant maturity mismatches.
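[Editor's note] The balance-sheet trade-off just described can be sketched with two stylized borrowers and invented numbers; this is arithmetic illustration only, not a model of any particular economy:

```python
# Stylized sketch of the trade-off above: a depreciation hurts borrowers with
# currency mismatches, while a rate hike to defend the currency hurts
# borrowers with maturity mismatches. All numbers are invented.
fx_debt = 100.0          # firm A: long-maturity debt in foreign currency
short_term_debt = 100.0  # firm B: domestic debt repriced every period

def fx_burden(depreciation):
    """Extra local-currency debt burden for firm A after a depreciation."""
    return fx_debt * depreciation

def rollover_cost(rate_hike):
    """Extra annual interest cost for firm B when short-term rates rise."""
    return short_term_debt * rate_hike

loss_a = fx_burden(0.10)      # letting the currency fall 10% hits firm A
loss_b = rollover_cost(0.05)  # a 5pp rate defense hits firm B instead
```

Whichever policy response is chosen, one side of the cross-section bears the strain, which is why granular balance-sheet data, rather than averages, are needed to see where the pressure lands.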
Taking this into consideration is important, especially in times of financial stress. A further example of the need for more granular data concerns the cross-border spillovers of monetary policy. They have been at the center of international policy debates, particularly since the onset of the global crisis. Understanding the channels through which one country's monetary policy affects the international economy is an ongoing research agenda, including spillovers via internationally active banks, and the BIS obviously provides a very good source of information in this area. However, the rise of alternative mechanisms for financial intermediation is likely to deepen over time. It can be noted that the jurisdictions where crypto-assets have become more popular, and where authorities have shown a more restrictive approach to them, are also those where overall controls on financial integration are more stringent. This points towards a future in which the ability of different jurisdictions to impose controls on cross-border capital flows will be diminished compared to the past, and this will also likely strengthen the transmission of monetary policy through cross-border capital flows. Let me finish with some remarks on the potential of merging administrative data to improve our understanding of monetary policy. The availability of data for the conduct of monetary policy could be improved, but simply increasing the volume may not be enough to significantly improve policy effectiveness. Having more information at hand can help in making timely and appropriate decisions, but only insofar as those responsible for monetary policy are able to correctly analyze and interpret these data. New data sources open new opportunities, and one particular area where this is true, and where countries lag behind their potential, is the availability of high-quality merged administrative data.
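[Editor's note] The kind of firm-level merge being advocated can be sketched in a few lines. The datasets, field names, and values below are entirely invented; the point is only the mechanics of joining isolated administrative sources on a common identifier:

```python
# Hypothetical sketch of merging two isolated administrative datasets at the
# firm level: tax records (margins) and customs records (import shares), of
# the kind that could help study exchange-rate pass-through. All records and
# field names are invented for illustration.
tax_records = {
    "firm1": {"margin": 0.12},
    "firm2": {"margin": 0.30},
    "firm3": {"margin": 0.18},
}
customs_records = {
    "firm1": {"import_share": 0.70},
    "firm3": {"import_share": 0.10},
}

# Inner join on the firm identifier: keep only firms present in both sources.
merged = {
    firm: {**tax_records[firm], **customs_records[firm]}
    for firm in tax_records
    if firm in customs_records
}
```

In practice such joins happen inside secure statistical environments under confidentiality rules, but the analytical payoff is exactly this: margins and import exposure observed for the same firm.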
Due to confidentiality issues and inter-departmental bureaucracy within different branches of government, large data sets, often census-like in coverage and capturing different aspects of firm and household behavior, remain isolated and of limited use. For instance, going back to the question of price dynamics at the micro level, tax records may provide very useful information about the behavior of margins at the individual firm level. But this analysis could be greatly advanced by merging tax data with customs records of cross-border commercial transactions, for instance to understand the pass-through of exchange rates to inflation. The same applies to the labor market. Merged data can also be used as a powerful tool to inform multiple decisions, to the extent that it is available at high frequency. In the case of Chile, a good example comes from a recent law that requires all firms in the economy to conduct all their inter-firm invoicing of value-added tax electronically. This allows the tax office to receive, every single day, billions of purchase and sale invoices between all firms operating in the economy. By itself, this data can be processed to provide a good real-time proxy of economic activity, since value added can be extracted directly from every single transaction, and also of inflation, because invoicing records provide quantities and prices separately. Moreover, merging these data with additional sources such as bank records could provide early warnings about systemic events in some sectors of the economy, since it would include not only the complete network of transactions between firms but also the exposure of banks and financial institutions to individual firms and particular industry clusters. Let me close with some concluding remarks. The use of the representative-agent framework for implementing monetary policy and for measuring the macroeconomy has obviously served the central banking profession well for decades.
However, the increased digitalization of economic activities implies that not only does the measurement of economic relationships such as output, inflation and demand become significantly more challenging, but the transmission mechanism of monetary policy itself will also likely shift. Central banks need to be cognizant of the difficulties these trends pose for the achievement of their objectives, and they involve short-, medium- and long-term challenges for statistics, research and also model development. In the more immediate future, a fruitful approach stems from the merging of administrative data, and as a central bank that also constructs national accounts, we are heavily invested in that. On the one hand, this will provide enhanced granularity in assessing the economic behavior of heterogeneous agents. On the other hand, the need to understand and process these data is in itself a very good stepping stone for the further challenges that will come down the road from big data and digitalization in general. Statistical agencies and central banks, I believe, are adequately placed to preserve the integrity and anonymity of reporting entities, whether they be households, financial or non-financial corporates. This care for privacy is, however, a topic that in itself deserves deeper study. Thank you very much. Thank you. Thank you very much, Pablo, also for bringing in the international dimension, and for the rest. Thank you. Ewald? Ladies and gentlemen, dear Peter, the title of this session is New Data Needs for Monetary Policy. And of course, this title could lead to the temptation to produce a long wish list. Economists, of course, are and should be data-oriented, so it would not be difficult to compile such a long wish list.
But economists are also trained to think in cost-benefit dimensions, to balance merits and costs, and providing new data may involve quite substantial costs, costs for all concerned, and in fact we are exposed to some criticism in that respect. So I do not intend to talk about wish lists, but rather about new perspectives; that means not necessarily add-ons, but new products that may substitute for old ones. To start with a very basic problem for central bankers, I want to make some short remarks on the problems of measuring inflation in a globalized world. We have, of course, the classical problems of inflation measurement. These are the four biases: product substitution bias, quality change bias, new product bias, and outlet substitution bias. And new statistical methods have been introduced to reduce these biases: annual updates of consumption baskets, which in fact is what we have done in Austria since 2010; quality adjustment of prices, with the whole field of hedonic methods, which we know is sometimes not so easy; and frequent adjustments of the surveyed outlet structures. But there have been two big challenges in the last 10 to 15 years: the effect of the internet on prices and inflation, and the inclusion of the costs of owner-occupied housing in inflation measurement. I want to make some short remarks on both of these. With regard to the effect of the internet, or digitalization in general, and Jan already dealt with this to some extent, I think it's quite interesting to see these pure statistical facts about the substantial increase in the use of e-commerce in the euro area. The red dots are for 2003 and the blue lines for 2014. You see that practically everywhere there has been a huge increase. But what is also interesting is that we have quite substantial regional differences: very high percentages in Ireland, Luxembourg and Finland, and lower ones in Greece, Italy and Latvia.
You see the same if you translate this into the percentage of individuals ordering goods and services online. This, again, has risen substantially, and again it differs among countries. So the question is, what are the effects of these very substantial changes in economic structures? Would this lead to a tendency towards lower prices due to e-commerce? It could be, because wholesalers and retailers save costs, and this might be passed on to consumers. It might also mean increased profits for some of the agents. Or it could mean, obviously, increased competition and transparency. The question is whether this is something only temporary or a really massive structural change. There is quite a lot of evidence and a number of studies on the effects of e-commerce on consumer prices and inflation, but one has to say the evidence is not really conclusive; we have quite different findings. If I restrict myself only to the latest one, a very encompassing study by Cavallo, there is not really a clear indication of differences between online and offline prices. So this is a kind of open field. But it may be one of the reasons why we have this general topic of persistently low inflation rates. The other topic that is relevant and much discussed is the integration of the costs of owner-occupied housing in the HICP. And the basic question is, of course: do we see housing as a consumption good, which would then be included in the HICP as consumption expenses? Or do we see it as an investment good, in which case it would not be included? And as you see, according to the legal frameworks we have, the HICP contains, of course, no asset price elements. We have separate owner-occupied house price indices, and there we again have some questions about what they really mean.
How to include land prices, which are clearly asset prices, is something we may have more difficulty assessing. And we see different approaches in different countries: owner-occupied housing is included in the US approach to inflation measurement, while currently we do not include it in the HICP. But by the end of 2018, the European Commission will assess the suitability of integrating owner-occupied housing into the HICP. And then the question, of course, is: would this make a substantial difference? We have tried to look at this as far as we can with the data available, and to make a long story short, the effects, if you just look at the euro area, are not very dramatic. So yes, it is something that might make a difference, also if you want to compare with the US. It is also not systematic, so it does not vary with, let's say, the business cycle, but it is a difference. It leads to slightly higher inflation rates, as you see, but not by a very high amount. So these are just two examples where there is, of course, a clear economic policy discussion. If I may come back to the beginning and to a more general point: are there new data needs for monetary policy? If we look at it exclusively from the monetary policy perspective, Jan Smets and also Pablo Garcia showed a number of new questions and new fields, so non-banks, and of course technical developments with regard to big data and so on. But if I look at, let's say, the banking side as we have it, it may not be the consensus view in this room, but I dare to say that from my point of view we do not really see a great amount of new data needs for monetary policy. What we see is that we have a rich data stock which should be exploited first, and we should try to focus on using data for multiple purposes. The numerous requirements from various business fields are just different demands with regard to one and the same data stock.
So let's merge these demands to gain efficiency and save costs. We see that data requirements have exploded in recent years, and collecting large data sets for one single purpose will become increasingly hard to justify in the future. Therefore I think the challenge and perspective should be to harmonize and to cooperate. What we unfortunately see just now is a large number of international organizations, each following their own agenda, and this of course drives up the volume of data requests. So I think there is large room for improvement here. We need more cooperation and more coordination to ensure that these international data requirements are harmonized and thus also limited. I think it is not the number of data points that triggers an excessive reporting burden, but uncoordinated methodological approaches together with suboptimal operational reporting channels. And I think the ECB and EIOPA provide a best practice by merging data needs for monetary policy, the ECB statistics regulation and supervisory purposes, so Solvency II, into one single requirement, for instance for insurance corporation statistics. Therefore, and this brings me to the main topic of this conference, concerning granular data: yes, I think we can use granular data in a cost-efficient way and serve multiple users, because it is very important that the data provide value added for monetary policy, but we have to avoid duplication. Well-structured granular data are an important ingredient for central bank statistics to master upcoming challenges resulting from an increasingly complex and fast-changing global economy, and we have just heard about some examples of these changes. Granular data may offer more flexibility and enable us to calculate any aggregates autonomously for various purposes.
So you have one general data set for this, and of course it may also reduce the cost of implementing new statistical requirements in the future. And we have to be aware that reporting burdens, looked at from the side of a bank, are to a large extent fixed costs, and therefore they are a relatively heavier burden for small banks than for big ones. So if we have this discussion on proportionality, I think it is relevant for the statistical side too, as President Draghi also referred to in his introduction. I think in the US they have perhaps achieved a more efficient distribution across the various banking sizes, and clearly this cost element is also the reason why some of the proposals met quite substantial criticism in the financial world in Europe. So I think what we have achieved is AnaCredit. AnaCredit, I think, has been a big success, or has a lot of chances: it promises, as I have it here, a high return on investment, because it is a powerful granular data stock, and this allows us to tackle a broad spectrum of current and, importantly, future data requirements. That means it will replace inefficient and expensive occasional exercises dealing with data. It offers the perspective of harmonizing international reporting requirements, for instance for loan data, in the long run, which is then a relief for the reporters. And of course it will yield long-lasting economic insights in the near future on some of the problems that have just been set out. And I want to add, because this symposium today is, I think, the last one that will be chaired by Aurel Schubert as Director General of Statistics, that it is really worth mentioning that AnaCredit is one of the big achievements accomplished also with your help and under your chairmanship. So I think this is also a good place to mention this.
Of course, as I said before, in all cases it is about balancing merits and costs. That means if new data requirements emerge, then first use existing data sets, which means there has to be some degree of flexibility, otherwise costs will explode; check whether an existing data set can be extended; and only if that is unavoidable consider setting up a new data structure, so as to balance merits and costs. At the ECB we have this merits-and-costs procedure, which is an example of an intelligent and structured framework that takes account of user needs and balances them against expenditures, but as we all know from our own experience, this is a discussion that has to be held in each specific case. So this is not done once and for all; we have to do it again and again, and I think we owe this to, let's say, our customers, or to the public whom we have to serve. So the key messages are: we will of course have major challenges in inflation measurement due to digitalization and the treatment of owner-occupied housing; we have, of course, the effect of e-commerce and so on, where up to now, perhaps somewhat counterintuitively, the effects seem not to be very substantial; we should reinforce multi-use of data and make increasing use of what already exists; what is really important is coordination and cooperation between institutions; and when we talk about data requirements, always balance merits and costs. As the president said, we have had 20 years of successful work for ECB statistics, and I am sure there will be another 20 years of successful work, and I wish all the best for that. Thank you.

Thank you, Evald, also for this other dimension, merits and costs, which was not in the previous presentations, and I am sure it will come up in the discussion. Natasha, I don't know what your take on this is, and thank you for having taken this on.
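As a small numerical aside on the owner-occupied housing point above (slightly higher, but not dramatically different, inflation rates), the effect can be sketched as a simple reweighting exercise. The weights and rates below are purely hypothetical, not official HICP figures:

```python
# Hypothetical sketch: adding an owner-occupied housing (OOH) component to a
# consumer price basket. All numbers are illustrative, not official HICP data.

hicp_weight, hicp_rate = 1.00, 1.5   # existing basket: weight and annual inflation (%)
ooh_weight, ooh_rate = 0.12, 3.0     # assumed OOH share and assumed OOH price change (%)

# The combined headline rate is the weighted average of the two components.
combined_rate = (hicp_weight * hicp_rate + ooh_weight * ooh_rate) / (hicp_weight + ooh_weight)
print(round(combined_rate, 2))  # 1.66: slightly above 1.5, but not dramatically so
```

Even with housing costs assumed to rise twice as fast as the rest of the basket, a modest weight keeps the headline effect small, consistent with the euro area findings described above.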
Sure, I don't know if I should stand up, but I have a few slides, so I will take them from here. Yeah, that's fine. So thank you for asking me to step in. I have the excuse of having known for only about 24 hours that I would be here, so you will be, you know, clement about what I am going to say. But I had the privilege, relative to Pablo, that I could see all the presentations beforehand. So what I decided to do was to make a quick transversal summary of the takeaways I got from your speeches and material, and then I will pinpoint two of the examples some of you mentioned on the usefulness of granularity in data, and on a deeper knowledge of money markets and the underlying dynamics in what connects the central banking community with financial intermediaries as a whole. Before summarizing what you all said, I wanted to recall that there was a time, not so long ago, I think for all the economists trained in my generation (I am not the youngest one, but not the oldest one either), when, studying monetary economics, you were basically told that with GDP, CPI and the interest rate you could go a very long way towards understanding how the transmission mechanism of monetary policy was working. Those were the good old days, or the bad old days, and I think the whole discussion is about what we have gained and are gaining relative to those approaches, which were complemented by approximations of monetary policy preferences through loss functions and Taylor rules; and Taylor rules have been the only game in town for central bankers for a long time, even still now to some extent. So now, and this is what you collectively said, the economy is changing, the structure of the economy is changing, and the context in which monetary policy is conducted is changing as well. The policy tools have evolved. The scope of monetary policy has to a large extent evolved, at least in advanced economies.
In emerging economies, I think Pablo's remarks were very complementary to what was said by Jan and Evald; but in advanced economies we have moved into a new world with new tools, and a combination of tools that yields new results. And then the third point is that technology will not wait, and it will, and should, help central bankers like any other policymaker and any other economic agent, through IT improvements, through artificial intelligence and through the availability of data. And this is something the central banking community seems to be aware of and really up and running on, as opposed to what is sometimes the public perception of public sector entities; I think central bankers are leading here. So I thought your views along those lines were fairly convergent and complementary, with some of you putting more emphasis on some dimensions. One key area for monetary policy that comes out of everything you said is inflation. Evald spoke about inflation more than the others, but the measurement of inflation is at stake. We need to, and perhaps can better than before, follow the data generating process behind price developments more accurately, but the DGP has also very likely changed. That is what Jan and Evald said. Digitalization happened. The basket has changed. The example of owner-occupied housing is a very good case in point; I will come back to it later. To me it exemplifies how the categorization into consumer goods, durable goods and capital goods has changed, for the very reason Pablo was underlining, which is that the sharing economy is coming up, and the intensity of use of what used to be considered capital is changing the nature of capital. If I have a car and I let it sit when I don't use it, it is a durable good. Now if I use a car and you are using it 10 minutes after me, in a 24-hour process of using the car, the car is becoming something other than a capital good or a durable good.
So this has to be taken into account in price dynamics, and I think it points also to something Evald was mentioning in terms of legal constraints and the definition of the CPI, which I am not familiar with: the fact that we cannot include asset prices in the CPI, is that something of the past? Do we want to conform our measurements to legal frameworks that might need some amendments? I leave that open for the discussion, but I think the question was interesting to ask. Now, as a result of all those changes in the DGP, the value chains are changing and value added is captured differently, and so you asked, rightly, what is the impact on final prices? We should expect a decline or a slowdown in price dynamics, but the fact is, when you ask firms, and I have had the chance to look at this from inside firms before, they see their own share of value added being eroded, but this value added is captured by actors operating in a monopolistic context, and this market structure, as long as it lasts, will probably prevent price dynamics from reflecting the technological changes and productivity improvements that should otherwise lead to more muted price dynamics. You may say it is a good thing for central bankers today who are looking for inflation, but this market structure argument is something to be taken into account. Right, so that was the point. Then, and this is my last summary of what you said, you made points about methods and best practices, about the way data collection needs to be conducted.
I am not an expert at all here, but I think the general message is that the best-practice principle for data collection is to be parsimonious and to consolidate data requirements as much as possible. Now that we have had a really huge injection of needs for regulatory purposes, it would indeed be very difficult for actors who have had to swallow those new data reporting requirements to face even more reporting requirements for monetary policy purposes. So given those constraints, the ideal world for data scientists and economists would be to take the existing stock of data as it is, and here I come back to what Evald was saying in terms of cost efficiency and not renewing data sets every other day: what is badly needed now is the ability to merge and to match data sets. A lot of work is being done on this in terms of matching identifiers and making databases speak to each other, and I think there are huge efficiency gains to be made here for the end user of statistics. Now, granular data has been a very common topic for all of you. It has lots of advantages, as you have highlighted, in terms of quality, flexibility and fungibility, which comes back to the matching-data-sets argument, but it also matters for addressing a new set of economic issues, which include heterogeneity and distribution. Insofar as monetary policy has an impact on distribution and relies on heterogeneous transmission channels, this brings in new information that we were not able to address with the triptych of CPI, GDP and interest rates; so I give away the answer a bit, but this is also something for discussion. Now, between granularity and aggregate macro data there is a bit of a balance: we are so happy to have those granular data that we want to make the most of them.
But we shouldn't forget that the whole is sometimes different from the sum of the parts, and so we probably need to keep both approaches in order to have a comprehensive, holistic view of the transmission of monetary policy, but also of economic structures. Now, I have a wish list, because yesterday I surveyed the staff doing monetary policy here, and I had mails coming back with very granular lists: we need this monthly, we need this in terms of stocks and flows. I won't read it all, and I haven't put it all on the slides, but this wish list is out there, and it is a kind of shopping list worth going through. I summarized it in four points. First, having a full cross-country matrix, and I completely agree with that: any kind of network analysis or general equilibrium analysis will only be possible when we have bilateral flow data of any sort; in my own research I would use it for international capital flows, but that is true very generally as soon as you are dealing with a network, and the financial system is a network. Second, the question of gross flows and redemptions, so having clean net measures. Third, more granular data on the OFI sector, so the non-bank sector. And fourth, multinationals need to be singled out more in national statistics; this is also a fairly broad-based request. Now, there was a lot of praise for AnaCredit. Really, if there is one thing to be underlined in terms of the usefulness of the investments that have been made, it is that one. Some economists were able to approximate it themselves, but now that it will be available to everyone, it will really generate a lot of research, and hopefully good research.
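The data-set matching called for above can be sketched in a few lines. The identifiers and records below are made up for illustration; real matching of, say, loan-register records to firm accounts would of course involve standardized identifiers such as the LEI and strict confidentiality safeguards:

```python
# Minimal sketch of matching two existing micro data sets on a shared entity
# identifier. All identifiers and figures below are hypothetical.

loans = {             # loan-register-style records, keyed by borrower ID
    "LEI001": {"outstanding": 5.0},
    "LEI002": {"outstanding": 2.5},
}
accounts = {          # firm-level accounts from an administrative source
    "LEI001": {"investment": 1.2},
    "LEI003": {"investment": 0.4},
}

# Inner join: keep only entities present in both sources, so each merged
# record carries credit-side and real-side information for the same firm.
merged = {
    key: {**loans[key], **accounts[key]}
    for key in loans.keys() & accounts.keys()
}
print(merged)  # {'LEI001': {'outstanding': 5.0, 'investment': 1.2}}
```

The efficiency gain pointed to here comes precisely from reusing two existing collections rather than launching a third one.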
The last point on this slide is exposure to crypto assets. Someone mentioned it, and I think we should keep an eye on it, not because of its size, which is still very small, but because the link between monetary policy and financial stability runs very deep with crypto assets, close to the very meaning of money. I don't want to expand on that, but it is something fairly key. I just want to add something that was brought to mind by Pablo. In the euro area, as opposed to other countries and probably also as opposed to the US, we tend to discard, for monetary policy purposes, the international dimension. I am not saying we are not looking at the international spillovers of our policies, a lot of people in this house look at that, but for monetary policy purposes, the international role of the euro as a reserve currency, as a trading currency, all those dimensions have an impact on how our monetary policy transmits, for example into foreign exchange markets. Sometimes we don't understand whether the euro goes up because of current account imbalances or because of uncovered interest rate parity, and sometimes it switches; behind this there is a logic that we might fail to capture, because we don't pay enough attention to this international dimension, which emerging economies have had to pay attention to in order to survive as currency issuers. So I think it is worth highlighting this. I close with two charts, one of which is an example of why granularity matters and of how we can use granular data, without any modeling effort, and what kinds of messages we can get. Here you have a picture of the evolution of credit and investment by non-financial corporations. It was made on Italy, because this was a pre-AnaCredit world, but using the granular data the authors had for Italy, they looked at the evolution of credit supply to NFCs and the investment made by those non-financial corporations, shown as
growth rates relative to 2006. They split the sample into two subgroups, according to the way the firms' banks were funding themselves. So I am looking here at the transmission mechanism of monetary policy, without causality; it is just a visualization of the data. The red firms are the ones getting credit from banks that rely a lot on interbank funding, and the blue ones are those that relied on banks that do not rely so much on interbank funding. So it tells you a bit about the link between money market structure and the transmission from the money market to NFCs and to investment dynamics. And the right-hand chart is basically telling you that those who were indirectly exposed to a lot of interbank funding in those years were the ones where investment collapsed the most, even more so than the credit constraint, as it appears on the left-hand chart, would suggest. I skip my last slide, Peter, because I feel I have run out of time.

Thank you, Natasha, very pertinent points. We have about 25 minutes. Just quick comments, also to the discussant, if I may. Natasha, the good old days: of course, those were the days when imbalances were building up, and we were looking at imbalances in the labor market, but the imbalances were in the financial market. Of course, we didn't have all the data we would have wished, but it was more a matter of attention; it was not only a problem of lacking data about interconnectedness, but also of focus, of how you see the world. So that is just a caveat when you say we lack data: it is also about attention to the right problems, which is not always easy. The second point, which didn't really come up directly in the discussion but is indirectly in it, is what we use for the output gap, potential growth, productivity.
I was asked by, I think, Bloomberg, or one of these, what will be the biggest mistakes that we are making today and that we will discover 10 years from now. I said probably the growth of productivity, because we have little clue now. Of course we have nominal GDP, and then you have both the real side and the price side, and probably there we don't know yet, but we know there are big measurement problems, and that may feed into our policy. We don't know yet today, but that is a key challenge. And the last little point I have, which was not mentioned, and where we have an ambition to do something, which costs money, Evald: it is what the New York Fed is actually doing, looking at the general public's perceptions of inflation and of monetary policy. We have some information on markets via market prices, and also surveys among market participants, but from the general public's point of view we have no clue. We have some surveys about whether you think prices are going to rise fast or not, we have the Michigan survey in the US, these sorts of things, but I think we should also look more at the public's perceptions of monetary policy. And the very, very last point is on wages, of course, which didn't come up very much; it came up incidentally, but you mentioned it too, Pablo. That is also something where we have big data, but we need much more, given what you said. Maybe I ask for very quick reactions to the interventions, if you have any. Pablo, maybe; if there are none, there are none, and then I give the audience a chance to react. So Pablo, Jan, Evald, and then we will see in the discussion.

Yes, very quickly. I found the other presentations very stimulating and Natasha's comments also very much to the point. One thing that I would like to
highlight is that the distinction in this dimension between advanced and emerging economies is shrinking, and it will continue to shrink really fast. One example: in an emerging economy, where you have a weaker statistical base, there are very strong local demands for data. Societies are heterogeneous, and there are complaints from marginalized groups or faraway places that would like to get better statistical representation; of course that would become extremely expensive. One example is Colombia, with its aim of targeting social programs after the peace process. How to do that? Well, they found out that a very good proxy for local GDP was the density of cell phone communication networks, which you can get every 15 minutes. Another example is Chile, where self-employment is a very good buffer in times of weak economic activity, and we have seen that the number of Uber and Cabify drivers today is about the same magnitude as the increase in self-employment. So these are a few ways in which digitalization is changing the nature of the needs for macroeconomic statistics. And finally, on merging: it is very clear that new statistics are expensive, and we see the merging of already existing administrative data as a very good avenue, particularly given the burden of reporting, which is an issue, yes.

Yes, if you allow me, two comments: one on cost efficiency and one on the impact of e-commerce. On cost efficiency, I absolutely agree: no duplication, coordination, merging of statistics. But in terms of cost-benefit analysis, the benefit we have to keep at all costs, I would say at all prices, is the trust of people in our policies and in what we are doing. And there are two things which are key: our objective, which is an inflation figure, and our transmission process. So we have to be sure that we can maintain an excellent measurement of our inflation
objective. With e-commerce going up, it probably means that we indeed have to adjust our methods. And in terms of the transmission process: if it is true that the direct intertemporal substitution mechanism is losing weight in favor of responses moving more through general equilibrium mechanisms; if we know that the measures we have taken may have very divergent impacts, because a fraction of households have zero or little wealth and are very weakly responsive to interest rate movements; if we know that wealthy people, instead of consuming more when we lower interest rates, rebalance their portfolios towards other assets; if we know that there are fixed and adjustable mortgage rates across the area; then I think we should try to see that reality more, and that really requires more granular data in order to keep confidence in what we are doing. My second remark, on e-commerce, is in fact three quick remarks. One: it is true that it will have a downward impact on prices through more transparency; also, the impact of technology on other factors of production will reduce prices and costs, think of robots competing with workers and so having downward impacts on wages. Two: is this temporary or permanent? I think a large fraction may be temporary, because there is no reason to think that the diffusion of e-commerce will not stabilize at a certain point in time. So it will be something more in terms of relative price levels, and by the way, monetary policy can always make sure that the impact of these relative price adjustments is mitigated; we always have the instruments and policies needed to ensure that the increased potential created by the digital economy is fully exploited. But the third and last remark is that digital technology may also have an impact on the natural interest rate. It may be upward, because it raises potential growth, but it
may also be downward, because the demand for capital investment goes down. This argument was developed, I think, some time ago: you see that firms like WhatsApp have a market value way above many other firms with very, very low capital investment, and that in itself is very challenging for monetary policy, given the zero lower bound. So if that is true, we should then also reflect, I think, on our inflation objective. Thank you.

Very briefly, I think this holds for quite a number of issues, and I think this is also the chance of this conference: it is not just about getting numbers, it is about getting numbers for the relevant questions, and therefore the cooperation between economic theory and statistical approaches is extremely important. One field where this is quite obvious, and has been visible, is, for instance, labor markets, because many of us have been astonished; there were always the expectations of, as you know, the whole Phillips curve discussion and so on. But in labor economics things have been much more advanced; they knew that you have quite different relationships. And I have to say, if I remember the discussions at the BIS, and you have been there also, when our American friends spoke, most of their time was about labor markets, from the point of view of labor economics, which proved to be relevant. Then you know what questions you have to ask of the statistics. So I think it is important to have this.

If you allow me, one short point: this was not confidential information, what was said about labor markets; I hope it is not too disturbing for the markets.

One aspect that I really wanted to mention, and did not want to include in the written part, is trust in statistics. This is especially important because in a time of globalization we have to deal with international aspects. I have recently been in China, and I had a talk with the IMF representative in China about his view on
Chinese statistics. I only say we talked about it; I will not comment further. But we still have some problems also in Europe, and I want to finish by reminding you, and I think this is still an open sore for the statistics community, that we have this case in Greece where the chief statistician still faces criminal trials. I think this is really perhaps much more important to mention than many kinds of technological changes. This is basic, and it is something we should repeat again and again: independent work is essential for statistics to be meaningful.

I think you are right, Evald; that was very important to say. I now open the floor for about 20 minutes.