ECB, and particularly to the statistics stream and to Aurel, for inviting me to chair this panel. It's a real honor. I think it's interesting that you're marking 20 years of ECB statistics. Those conferences tend to be somewhat retrospective, but it's nice that we've actually turned it around and said, let's look to the future. We're covering several areas of the demand side for statistics which are very relevant for central bankers. We talked about monetary policy, and now we talk about financial stability. The overarching aim is to better assess systemic risks, not only from a monitoring perspective but also as an input into policy making, to calibrate and design the various policy tools which are increasingly being rolled out both in Europe and globally. There's a sense that supervisory data and, let's say, traditional financial sector data, the aggregated MFI perspective for instance, are not enough. We need to better understand linkages: linkages between the financial sector and the real economy, and also linkages within the financial sector, to understand exactly how shocks get transmitted and where the real fragilities might lie. There's also a need to cut the cake in different ways. We can have a national view, but since we're dealing with cross-border institutions, you also need to view these institutions from a slightly broader perspective. There's the issue of non-bank financial intermediation. We're now talking about things like crowdfunding, FinTech, new forms of funds and so on, which, if you just focus on the narrow MFI or banking sector view, is going to leave important issues on the table. There seem to be specific data needs also from the borrower-based requirements that are being implemented, and I think some of the speakers will address this point about new data. We actually had some good experience in Estonia when we were rolling out these borrower-based requirements. 
We wanted to establish the instrument, but we didn't want to do it in a binding way, so you had to calibrate it to be pretty much in line with market practice at the time, and only through using a fairly granular view was that possible. I think the rollout went pretty well. As was discussed in the first panel, we're going to repeat a few things: the greater interest in granular data and, as many speakers mentioned, the issue of benefits versus costs, or maybe data needs versus data wishes. If you ask analysts whether a new data series would be useful, then of course, since the price is zero, the demand can be almost infinite, and somebody somewhere will always say it's a great idea to have an additional data source. But the sector, I guess, is already chafing under quite a bit of reporting responsibility. Sometimes you think it is becoming almost as important as the core business for financial institutions, so finding that balance is very important, and the first session put some ideas on the table. To discuss these issues, we have a great and diverse panel. I won't go long into the bios, because I think most people here are familiar with the speakers. We'll start with Philip Lane, Governor of the Central Bank of Ireland and, more recently, Chair of the Advisory Technical Committee of the ESRB. Then Richard Berner, Executive-in-Residence and Adjunct Professor at the New York University Stern School; then Luiz Pereira da Silva, Deputy General Manager of the BIS; and as discussant we have Hans-Helmut Kotz, who divides his time between Frankfurt and Cambridge, Massachusetts, between Goethe and Harvard Universities. I'll give each speaker a maximum of 15 minutes, and we'll try to reserve a fair bit of time for questions from you as well. So Philip, please lead us off. Thank you, Ardo. It's a pleasure to be on this panel. 
We are, of course, following an interesting session this morning on monetary policy, where Natacha Valla made the interesting point about how much data you really need to run monetary policy; maybe you just need the traditional macro time series. But if there's a case for granularity in monetary policy, it's even stronger when we talk about financial stability. Let me say, first of all, in terms of thinking about central banking and financial stability, that there are different dimensions. There's essentially the ex ante dimension: if conditions are financially stable, what can we do to preserve that? That's a microprudential issue in terms of supervision, and then you think about the data you need for effective supervision. And it's a macroprudential policy issue in terms of what data you need to, as Ardo just said, calibrate macroprudential measures. But in addition, ex post, if a crisis arises, the conduct of financial stability policies, which for central banks means the conduct of liquidity policies, also has a heavy data requirement, and I think that's an interesting issue. So whether it's ex ante or ex post, this is a big topic. The organizers have really posed the question, through the composition of this panel, of its global nature. Through our work here in Europe, through the work of the BIS, and of course with the US authorities, it's clear that financial stability has a significant global component. What's interesting is what we can do at an EU level, going above nation-by-nation data, and then what we can do at a global level. Within the EU, the European Systemic Risk Board has a unique role in terms of the coordination of macroprudential policies, and also in terms of having oversight of the data streaming out of every nation state. 
So the ESRB, for example, is a unique window into the derivatives data collected under EMIR. And here at the ECB, in terms of macroprudential policy, because the ECB has top-up powers, for example in relation to the countercyclical capital buffer, it's necessary, and I would heavily endorse it as a very good idea, that in addition to the national-level use of macroprudential policy, the option to top up at ECB level provides an additional layer of discipline, and therefore the ECB needs to have that overview as well. Macroprudential policy is quite immature, in the sense that its widespread use is fairly recent, so the more we can learn from each other the better, and to learn from each other, a common data platform is quite important. So there's a lot going on within the European system, and there's a parallel conversation at the global level through the BIS, the FSB, the IMF and so on. But extended data sharing at a global level is quite limited. In fact, I think the exception proves the rule. The most important exception is the BIS data hub, which shares, in a very limited way, firm-level data on globally systemically important financial institutions among a small group of supervisors. The fact that this is so heavily limited and ring-fenced, while a big step forward, does indicate that the near-term potential for much wider data sharing remains to be seen. Let me emphasize, in terms of big steps forward, one thing to notice, which you see more and more if you look at the ESRB working paper series: the value of the EMIR data. Again, this is a big reporting burden on those who are reporting, and it's vital that we demonstrate that these data, these statistics, are in fact useful. 
And I think it's proving that way: in terms of understanding what's going on in, say, the interest rate swap market, FX derivatives, credit default swaps and so on, it's already proving quite valuable. I think the same is going to be true of AnaCredit. There's going to be a bit of a lag, because researchers and analysts need to learn a new data set, need to clean it up, and so on, but I think we can be increasingly confident of its value. Now, because EMIR is European-level regulation, it doesn't capture everything we would like to know at a global level, and the more we can push, over a sufficient time span, for corresponding data collection and data sharing at a global level in derivatives, the better. Important projects such as common identifiers, like the LEI, are an important part of that journey. Let me switch now to the real side of the international economy and the role of multinational firms. Multinational firms in production we understand in terms of global value chains and all of that, but more and more, multinational firms are very important in the global financial system. You've noticed that some of these firms make a lot of money and have very large treasury operations to manage that cash. So if you think about global savings and investment, understanding how multinationals allocate their cash is quite important. There's a lot going on here. Recently I co-authored, with some BIS colleagues and a colleague at the Central Bank of Ireland, a piece in the March BIS Quarterly Review where we put together our thoughts on this. Really understanding the balance of payments these days, when you have multinational firms present not just on the real side, in the trade balance, but also in the financial account, is really important. 
These firms, with their treasury operations and their use of, say, special purpose entities in international financial centres, make it genuinely difficult to interpret the headline balance of payments and the headline national accounts. And it's not just in my own country where this matters; it's much broader than that. So maybe the question here is agility. Stability is good: it's good that we know the rules for national accounting, and it's good that we're on BPM6 rather than having a new manual every quarter. But how quickly the world's statisticians catch up with the evolving practices of multinational firms is, I think, a challenge for this community. Coming back to the value of international data sharing: that builds on trust, and trust is a scarce commodity. I'm not saying there's anything easy here. Often I hear, in certain circles, people say: just do it. It's the Nike view of international data sharing, just do it. But that's not realistic. You have to build up to that point, and the slog of working towards it is quite important. Let me turn to the borrower-based measures, because here we know the risk is in the tail of the distribution. Knowing the average loan-to-value ratio in the population is not super helpful; you need to think about what fraction of loans sits in the different parts of the distribution. So I don't really see how you can have a reliable system of borrower-based measures without granular loan-by-loan data. It's even better if you can match that with other characteristics, like income level, employment status, non-mortgage debt and so on. And this goes to acceptability, which we talked a little bit about this morning. 
For example, in our case we have loan-to-value ceilings and loan-to-income ceilings, but we allow the banks to do a certain amount of lending above those ceilings. And with the granular data we're able to demonstrate that the use of those exceptions, the above-ceiling lending, is in fact in line with what people might think is desirable: younger people are more likely to get a bigger exception from loan-to-income, because their incomes grow over time, and first-time borrowers get more of an exception from loan-to-value, because again the case for that can be made. So it's not just a question of what we need to set our policies; it's also a question of what's needed to demonstrate that these policies are reasonable from a social point of view. Let me mention in passing that we've just introduced a central credit register, and this is, I think, a confluence of interests: what's helpful for banks, having a CCR they can look at, can also have value for statisticians. Let me also mention the value of cooperation with, say, the tax authorities, because what would really be ideal is to have not just the point-in-time data about your characteristics when you take out a mortgage loan, but updates as well: five years into your loan, ten years into your loan, what remains, and is there an update on characteristics such as employment status and so on. Let me turn to an obvious financial stability risk, which is the boom-bust cycle in property markets, whether residential or commercial. This is again an area where AnaCredit should be quite useful, because it should deliver much more information about the distribution of exposures to the property sector, the interconnections across banks in relation to property exposures, the value of property collateral and so on. So you can really see tremendous potential in that area. 
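The tail-of-distribution monitoring described here can be sketched in a few lines. This is purely illustrative: the data are synthetic, and the ceilings, field names, and the 40% first-time-buyer share are invented for the example, not the actual Irish rules or figures.

```python
# Illustrative sketch: monitoring borrower-based measures with loan-level data.
# All numbers below are synthetic assumptions, not actual policy parameters.
import random

random.seed(1)
LTV_CEILING, LTI_CEILING = 0.80, 3.5  # hypothetical ceilings

loans = [
    {
        "ltv": random.uniform(0.4, 1.0),
        "lti": random.uniform(1.0, 5.0),
        "first_time_buyer": random.random() < 0.4,
    }
    for _ in range(10_000)
]

# Tail shares: what fraction of lending breaches each ceiling?
# An average LTV alone would hide exactly this information.
above_ltv = [loan for loan in loans if loan["ltv"] > LTV_CEILING]
above_lti = [loan for loan in loans if loan["lti"] > LTI_CEILING]
print(f"share above LTV ceiling: {len(above_ltv) / len(loans):.1%}")
print(f"share above LTI ceiling: {len(above_lti) / len(loans):.1%}")

# Are exceptions going where policy intends, e.g. to first-time buyers?
ftb_share = sum(loan["first_time_buyer"] for loan in above_ltv) / len(above_ltv)
print(f"first-time buyers among LTV exceptions: {ftb_share:.1%}")
```

The same loop over loan-level records answers both questions Lane raises: how much lending sits in the tail, and whether the above-ceiling allowances are used by the borrower groups for whom the case is made.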
In relation to pricing, I think there's more maturity in how we think about the construction of residential property price indices, but the ESRB has highlighted the data gaps in commercial property prices. And there's an interesting issue of a purist approach, where you want to build a comprehensive price index based on all commercial property transactions, versus a kind of risk-adjusted approach, where you might say the bigger concerns are certain slices of the commercial property sector, such as prime real estate. So there's a lot of work to be done there. Then, in trying to close: Ardo mentioned the rising role of non-banks in the financial sector. Although a certain amount of information is collected from investment funds and so on, I do think that, for financial stability policies, a better and more uniform way of thinking about the leverage and liquidity positions of investment funds may turn out to be helpful in the future. Maybe we can say we've accomplished a lot in terms of banking data, and we now need to pivot our attention towards these other sectors. And then, finally, there are the interconnection issues, the question of who-to-whom. It remains the case that what we've collected can be less than fully exploited, because knowing the ultimate owner, the ultimate destination, is very necessary. It's dissatisfying if you look at some data set and see that, say, the Cayman Islands is a big investor in some country, when you know the Cayman Islands is just a conduit. Really understanding what lies behind that remains a challenge. So, 20 years into the ESCB's work in these areas, for me the glass is basically still half empty, if not more than half empty. I think there's still a big mountain to climb. 
And this goes back to the cost issues: reconciling the need for more data with the actual costs, which fall not just on the reporters but also on us as central banks, who have to have the systems in place to take in the data. The efficiency question, where merging data scientists with IT and so on matters, applies just as much to us as to the financial institutions. So let me stop there. Okay, thank you for a very wide-ranging set of thoughts, and a very concise executive summary at the end saying we've still got a long way to go. So Richard Berner, please. Okay. Thanks very much. First, I want to thank the organizers for setting up this conference and for having me here; I'm honored to be here. And second, I want to congratulate Aurel Schubert, with whom I've worked over the past several years, for his leadership and all his accomplishments as Director General of Statistics here at the ECB. Aurel, when he asked me to come here, asked me to focus on potential threats to financial stability and how statistics can inform decision-making about them. Today I'm going to talk about both system-wide and enterprise risk assessment, because I think they should complement each other and use the same basic data. Philip's discussion of multinational firms is a great example: how multinationals manage their risk is extremely important for us to understand, whether as policymakers or as researchers in the area. And using the same basic data makes perfect sense, not just from an efficiency standpoint, but also because we need to work from the same basic facts when talking about the same phenomena. So today, let's see, I need to advance this. Do you know which button to push? There we go, the big one. Okay. First I'm going to talk about some financial system vulnerabilities, and I've identified five, just to keep the list relatively short. 
Second, I'll talk about ways to achieve efficiency and effectiveness, and best practices to improve data quality, scope, and accessibility. I'll talk about some critical data needs and give three examples. And finally, I'll make some comments about the requirements for realizing the potential of big data, new analytics and new technology to improve system-wide and enterprise risk assessment and compliance. First, the data must be fit for purpose, something we've already talked about, but I'll say more. In addition, an effective partnership among regulators, and between regulators and industry, is essential to align their use of these tools and to standardize them to make them interoperable. So first, to the vulnerabilities. I think there are still vulnerabilities in securities financing transactions, market-based finance and shadow banking, and I draw the distinction between those. We can still see that the default of a broker-dealer can create fire-sale externalities ex post, so I'll talk a little bit about those data needs. Second, we mentioned earlier in the session the transition from LIBOR to alternative reference rates. LIBOR's foundation remains fragile, and its widespread and ongoing use is going to make that transition a challenging one. Market participants must have confidence in the new reference rates, which among other things involves the integrity of the data that underlie them; I'll talk a little about the U.S. experience there. There are three other vulnerabilities that are also top of mind. First, operational and cyber threats, including those from cyber incidents, may accelerate, especially against a backdrop of rapid innovation and technical change. 
Second, the current move to tapering, or to actually normalizing, monetary policy may expose vulnerabilities in rising corporate leverage and deteriorating credit underwriting and credit quality; that's certainly true in the United States. And third, fragmentation and even conflict among national policies may test the resilience of cross-border arrangements in global financial markets in response to external shocks. I'm not going to spend a lot of time on any of those, but I'll make three comments. First, the heterogeneity comment: as Philip mentioned, all five of these require granular data, because we are interested in tail risk, and we can't understand tail risk unless the data are fairly granular. Second, this list is not exhaustive, and therefore, when we think about the many vulnerabilities we might face in allocating our resources to collect data, we should think about how we can use the same data for many purposes and make that efficiency work for us. And third, because financial stability and the vulnerabilities in the financial system are multi-dimensional, which is why we have a macroprudential toolkit, the data needs are obviously going to be multi-dimensional as well. Here again we need to think about how best to use them. So that brings me to best practices for filling data needs. This is not rocket science, but it does involve thinking hard about what we ought to be doing. The goals are to improve the quality, scope, and accessibility of financial data, and to my way of thinking, that means we need to align the interests and activities of both officials and industry practitioners. What does that really mean? I think it means there's an important complementarity between the interests they share: one at the micro level, the other at the system-wide level. 
Good risk management practices are going to help both microprudential and macroprudential goals, and understanding what each side is doing is going to help both parties in their work. In addition, there's an important complementarity between analytics and data; I think Jan Smets referred to this earlier in the panel. Theory alone is not going to suffice when we think about our data needs, nor will pure observation: theory must provide a rigorous framework for hypothesis tests, and observation has to ground it in reality. Equally, we need rigor in how we go about filling our data needs. So the best practices involve the following four steps. First, identify the data needed and their business purpose. Do we really need the data? What is the purpose for which we're going to use them? Is there more than one purpose? Are the data that exist sufficient, or do we need to change them in some way? Second, design a template for the collection of those data, which means being very specific and prescriptive about the data we're going to collect and the way we're going to collect them. Third, develop clear and precise definitions of the data we need, so that they align with the purpose we have in mind; here it's extremely important to use industry standards, which Philip alluded to, and I'll say more about that. And finally, create collection specifications for the way we collect the data. I have more on this in a paper that I'm going to submit, but I think all four of these best practices are things we should keep in mind when we go to collect data. There are two additional thoughts on best practices. One, we should focus on collecting data, not reports. What do I mean by that? 
In the past, regulators have focused on collecting reports, and some of the technologies that have been used, for example EDGAR at the SEC in the United States, produce an electronification that looks just like the paper reports. We should instead think, as I mentioned earlier, about collecting the same data that industry uses to manage its risk, which we're going to use to assess where the risks are in the financial system. Of course, we already do that with swap data: we set up swap data repositories, or trade repositories, and I was involved very deeply in that when I was in government. The problem was that we didn't use effective standards, and we didn't use some of the other best practices needed to make those data coherent. So both are needed. Second, we should conduct industry outreach very early in the process, to understand whether what we have in mind when we go to collect data really reflects what industry is doing. That alignment and those conversations, without inviting capture by industry, are extremely important; that dialogue matters. Third, in my experience, performing a pilot collection is invaluable in informing how we go about the larger, permanent data collection. In securities financing transactions, specifically in repo in the US, we learned a lot from a pilot collection that we did. And last, we need to keep evaluating, especially when the structure of the financial system is changing and evolving to meet new client needs and new technologies; we need to engage in continuous life-cycle assessment and improvement of the data. The chart illustrates that kind of life-cycle improvement: we start with data requirements, we implement those requirements, we assess the data, we identify data gaps, we propose changes, and then we evolve in new directions. 
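The second of the four best-practice steps, a specific and prescriptive collection template, can be made concrete with a minimal sketch. The field names, types, and checks below are invented for illustration; they are not an actual regulatory schema, only the shape of one: every reported record is validated against the spec before it enters the statistical system.

```python
# Hypothetical collection template: each field has a type and a validity check.
# Field names and rules are illustrative assumptions, not a real reporting form.
TEMPLATE = {
    "lei": {"type": str, "check": lambda v: len(v) == 20},      # LEI is 20 chars
    "notional_eur": {"type": float, "check": lambda v: v >= 0},
    "maturity_days": {"type": int, "check": lambda v: v > 0},
}

def validate(record: dict) -> list[str]:
    """Return a list of validation errors for one reported record."""
    errors = []
    for field, spec in TEMPLATE.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], spec["type"]):
            errors.append(f"wrong type for {field}")
        elif not spec["check"](record[field]):
            errors.append(f"failed check for {field}")
    return errors

good = {"lei": "A" * 20, "notional_eur": 2.5e6, "maturity_days": 90}  # placeholder LEI
bad = {"lei": "XYZ", "notional_eur": -1.0}
print(validate(good))  # []
print(validate(bad))
```

The point of being prescriptive up front is exactly this: validation failures surface at submission time, as data, rather than later inside incomparable reports.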
Part of that life cycle may involve no longer collecting some data currently being collected, because they're no longer relevant. Let me spend a moment on data standards, which were alluded to just briefly. They're essential for the quality of data, in order to compare, aggregate and link data sets, and that's been discussed in other panels here. We need standardization so that we know exactly what the data represent and so that they can be compared. It's even more critical today to use data standards in representing data, because if we're going to automate some of these processes, if we're going to have smart contracts, if we're going to use fintech effectively, if we're going to use technology for compliance and regulatory purposes, then the data need to be standardized and precisely defined. You're familiar with some of the identification criteria: the LEI, the legal entity identifier, helps us identify who is who, but equally we need to know who owns what, as Philip alluded to, so the UTI and UPI work that's gone forward under the FSB and other organizations has been very important, and our engagement in the past with Aurel and the Bank of England in that regard has been really instrumental in moving that work forward. Let me give you three examples of critical data needs for financial stability in which I've been involved. The first, which I alluded to earlier, is the U.S. repo markets, and specifically the bilateral market, which constitutes roughly half of the market, so this is pretty important for assessing securities financing transactions in the United States. It's critical as an ingredient in constructing the U.S. reference rate, the so-called secured overnight financing rate, and I think it's essential for financial stability analysis. 
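To make the earlier point about shared identifiers concrete: once two data sets carry the same LEI, positions reported under one regime can be aggregated and enriched with who-is-who reference data from another source. The entities, identifiers, and figures below are invented for illustration.

```python
# Toy example of linking data sets on a common entity identifier.
# "LEI_A"/"LEI_B" stand in for real 20-character LEIs; all figures are made up.
positions = [
    {"lei": "LEI_A", "instrument": "IRS", "notional": 100.0},
    {"lei": "LEI_B", "instrument": "CDS", "notional": 40.0},
    {"lei": "LEI_A", "instrument": "CDS", "notional": 25.0},
]
reference = {  # who-is-who reference data from a second source
    "LEI_A": {"name": "Bank Alpha", "country": "IE"},
    "LEI_B": {"name": "Fund Beta", "country": "LU"},
}

# Aggregate exposures per entity across instruments.
totals: dict[str, float] = {}
for p in positions:
    totals[p["lei"]] = totals.get(p["lei"], 0.0) + p["notional"]

# Enrich the aggregates with entity names and countries via the shared key.
for lei, total in totals.items():
    info = reference[lei]
    print(f'{info["name"]} ({info["country"]}): total notional {total}')
```

Without the common key, the join in the last loop is exactly what fails: the same counterparty shows up under different local names in each collection and exposures cannot be summed across data sets.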
Philip alluded to the fact that we've spent a lot of time collecting data from banks, but not so much on data outside the banking system. We obviously need to do more of that, and if we focus on activities wherever they occur, like repo transactions, then I think we have a good shot at doing so. Related to that: U.S. money market fund holdings, very similar to what's been done here in Europe with the MMSR. The SEC started to collect these data in 2010, but they were difficult to access, so the value my organization added was to make them much more accessible, both through visualization and by creating a database where people could look at the time series available on a monthly basis and compare and contrast money fund characteristics by who owned them and who issued them, understanding what's going on on both the buy and the sell side of the market. The third area is swap transactions. Understanding derivatives markets and making them more transparent has been an extremely important and beneficial development: it helps us assess risk in markets and gives us a better understanding of what's going on in them. There's a very good publication that was put out this spring, called Swaps Regulation 2.0, that I think is a good roadmap there. One area that we find a challenge, and I alluded to operational and cyber risks, is that data on cyber risks are hard to come by. We lack data on the scope of incidents and their cost. In the past there's been a reluctance to report them, and a reluctance, or some difficulty, in understanding exactly what data should be reported and how those data should be categorized. I don't pretend to have all the answers, but there are some straws in the wind that are pretty helpful in that regard. 
The IMF has done some estimates, using data from experts and tail-risk analysis, to assess the cost of a cyber shock to the financial system. In the United States, the federal framework for collecting and sharing data has improved through institutions like the so-called FS-ISAC and FSARC, which have joined together; those are industry groups aimed at sharing and collecting data across the industry. At the regulatory level, the so-called FBIIC, the Financial and Banking Information Infrastructure Committee, is also sharing data, looking at best practices and collecting them. And within firms we have what are called fusion centers, which draw an interdisciplinary group of people from across the firm to look at how operational risk can affect every aspect of the firm. So in identifying, detecting, understanding and recovering from cyber incidents, all these things are extremely important. The governance around this really matters: training employees, making sure people are aware of what the incident response ought to be, and thinking about how we ought to collect data in this arena is really important. There's been some good work on these issues here in Europe, with the Euro Cyber Resilience Board and the Threat Intelligence-Based Ethical Red Teaming, or TIBER-EU, framework, but those are in their infancy, and more work needs to be done. On partnerships again: when we talk about the goals of having a dialogue, either among regulators themselves or between regulators and industry, the goals are to use the same data and facts for several purposes, for example making policy decisions and risk management decisions. And that partnership is also essential because we need to think about how we're going to create a new approach to regulatory reporting and compliance. 
Technology can facilitate that, and I have a vision. The vision involves using the same data at the firm level that we can then aggregate up for policymakers and regulators, but in order to do that we need to make our systems interoperable. So there are two venues for collaboration and coordination. Among regulators it's extremely important, so that we can share data: in the United States, as all of you know, we have a fragmented regulatory system, and solving the collective action problem of sharing data among the regulators is still a challenge they are grappling with. And between regulators and industry, we need to build trust between the two groups so that sharing can happen. Final points: I think we need to start expanding that collaboration now, well before the technological revolution is complete; if we don't, we'll have to redo some of the things now being contemplated. Second, we need to start improving quality through more extensive use of data standards now. And third, not just data standards: technology standards are also essential at the start, to assure the interoperability of the technological marvels we're putting in place. Thanks very much. Thank you for your views on an array of issues. I thought the points about operational risk were quite interesting, because we usually think in terms of balance sheets and income statements and so on, but these other risks that are emerging now might come from a totally different area and require different skills. Luiz, the floor is yours. Thank you very much. It's a pleasure to be here with you. I think we can already notice a statistical oddity: the previous panel, on data for monetary policy purposes, was a standing panel, and the panel on data for financial stability is a sitting panel. I'm not sure this means anything, but let's see. Stability, right? We sit. 
Okay, so I'm going to try to make a point about the necessity, of course, of collecting data, and new data, but also of connecting it to understand what the financial stability implications are. Obviously, I think we all know that financial stability has been a concern of policymakers, but it has become something much more present after the global financial crisis. Maybe let me introduce this speech by making a bit of a provocation to my friends from the previous panel, the monetary policy committee guys: I think you probably have an easier life than the guys sitting in financial stability committees. In what sense? We know that, even with the caveat of not understanding perfectly well where r-star is, price stability is perhaps easier, more measurable, more direct than financial stability, which is a pretty complicated, multi-dimensional concept, right? So in MPCs, in monetary policy committees, you have a procedure, a clear measure, a metric and, as was discussed previously, a reliance on relatively straightforward data sets on inflation expectations, output, wages and so on and so forth. Of course, as Aurel mentioned, there are some complications: you want to think of housing costs, or, if you want to get more granular, use scanner data. But overall the life is probably simpler. If you are sitting in a financial stability committee, you have the complex task of first finding the metric: what is, after all, financial stability? Is there anyone who can define it properly, given that it involves very complicated issues of understanding what systemic risk is and the capacity for this risk to change?
So obviously you can say: look, if you have more data you can help policymakers, and you can anticipate and probably manage the next financial crisis. But data might not be sufficient, because of this complexity, and what I will try to argue throughout this presentation is that, as much as data is necessary, the theoretical, analytical framework with which you analyze it is also paramount. Why is that so? Because scientific discovery is not just amassing data and trying to pool it; it doesn't happen by accident. Even if sometimes accidents help, accidents, as Louis Pasteur used to say, only help those who can interpret what these accidents are. And Yogi Berra, the baseball player, as you know, used to say it in a more humorous way: if you really don't know what you're trying to achieve, then even if you have a pile of data it doesn't help you, you just get more confused. And somebody was mentioning the cost of collecting data: you're going to spend a lot of money without it helping you to understand what financial stability is about.
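To make one of the early-warning indicators that comes up later in this discussion concrete, the credit-to-GDP gap can be sketched in a few lines. This is a minimal illustration, not part of the panel: it assumes quarterly data and the smoothing parameter lambda = 400,000 used in the Basel III countercyclical-buffer guidance, and the function names are hypothetical.

```python
import numpy as np

def hp_trend(y, lam=400_000):
    # Hodrick-Prescott trend: solve (I + lam * K'K) tau = y,
    # where K is the (T-2) x T second-difference operator.
    T = len(y)
    K = np.zeros((T - 2, T))
    for i in range(T - 2):
        K[i, i], K[i, i + 1], K[i, i + 2] = 1.0, -2.0, 1.0
    return np.linalg.solve(np.eye(T) + lam * K.T @ K, y)

def credit_gap(credit_to_gdp, lam=400_000, min_obs=20):
    # One-sided gap: at each date, use only data available up to
    # that date, as in the Basel III countercyclical-buffer guide.
    y = np.asarray(credit_to_gdp, dtype=float)
    gap = np.full(len(y), np.nan)
    for t in range(min_obs, len(y) + 1):
        gap[t - 1] = y[t - 1] - hp_trend(y[:t], lam)[-1]
    return gap
```

The one-sided (recursive) filter matters for the early-warning use case: a two-sided filter would peek at future data and flatter the indicator's real-time predictive power.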
Now, hopefully, I think we have evolved in financial stability analysis from the sort of very nice broad narratives that you can find in classical works, in Kindleberger, in Minsky, to something much more specific, which is to identify things of which we can say: look, these are early warning indicators of crisis, and this is data that pertains to this category because it has some predictive power. And of course, the more you advance in the understanding of modern financial crises, the more you try to find exactly what data can allow you to fill these gaps, to identify the early warning indicators. We all know that ex post it is very easy to say: look, this was precisely the stuff that created the financial crisis we are analyzing. What is complicated is to say exactly what type of vulnerability is hidden, is slowly undermining the financial stability of the system, creating systemic risk, and is not necessarily showing up in the type of information that we have. Think, for example, of the mispricing of risk; think of the analysis of rare disaster events that are very difficult to predict. In other words, the point is that data is of course relevant and important, but you need a theory to be capable of fitting the data into something that is meaningful for preventing financial crises and helping you to maintain financial stability. So, as much as collecting data, collecting the dots, is necessary, connecting them is also very important. Now, even if you have a theory, mind you, it might still be tricky, because having a theoretical framework about financial crises doesn't mean that you'll be able to interpret things in a straightforward way. Why is that so? Because we know that the theories behind the interpretations of financial crises are full of false beliefs. Remember that famous line from Ken Rogoff and Carmen Reinhart, that there is always a story about "oh, this time is different". In other words, the data that we have doesn't tell us exactly the truth, or rather, we don't really interpret the data the way it should be interpreted, because this time is different. For example: in Latin America, countries do not go bankrupt; during the Asian crisis, remember, people said, oh, we have so much growth, high savings and solid public finances that they can support higher levels of debt; and of course during the GFC we used to say, look, financial innovation spreads risk, it's something that enhances stability. So in all these episodes we had data, we had theory, but we were not really capable of interpreting them, and then, after interpreting them, of taking corrective action. So where do we stand? I think, look, we do have some stuff that resembles early warning indicators of episodes that disrupt financial stability. We are still missing a big, robust theoretical, analytical framework to understand the endogeneity of financial crises, but we do have, let's say, rough estimates at an aggregate level of things that we all know create unsustainable imbalances, and we have data about that: credit-to-GDP gaps, procyclicality of borrowing and excessive risk-taking. But the more we understand financial crises, the more we also understand that beyond these aggregates you need granularity, as discussed in the previous panel: granularity on financial data, on exposures. We know that everybody all of a sudden discovered that they were exposed to Lehman; the interconnectedness between G-SIBs, now we know it, but before that we neglected it, OTC derivatives and so on and so forth. So the hope is that we are assembling this data set about early warnings, and the whole idea is to see if, with the stuff that we know now, it can help policymakers to anticipate, to prevent crises. Because that's the game, right? You don't want just to understand crises; you want, with data, to be able to see the early warning and then take
remedial action before things blow up. Obviously, we are increasingly aware that the international dimension of crises is paramount. Needless to say, the globalization of finance figures as a risk factor in all our analyses; on interconnectedness, we know that the transmission was sudden and brutal, from subprime in the West to European banks. We need essentially to understand the proliferation of these products, and we do now have data on that, on bond markets, on non-bank intermediaries; I think Philip and Richard mentioned the need to assemble firm-level data on this. I think we understand now the importance of collecting data on asset managers' positions, on global CCPs, and also on some very big emerging markets that I would call systemically important middle-income countries. Why? Because they are highly interconnected with our financial systems, and because they are prone to, let's say, crises in the periphery of the system, they can produce spillbacks that are severe enough to cause financial crises in the advanced economies. And of course then we need data about these cross-border flows, these cross-border exposures, which hopefully we now finally begin to have. A tricky thing is that you have stuff that you don't necessarily know yet is a vulnerability in the making: you have a suspicion that it is dangerous for the stability of the system, but not necessarily an awareness of where it fits into the build-up of risk. Well, everybody spoke here about financial innovation, but think of having more data on networks, on interconnectedness, on trade repositories, on fintech, of course on crypto assets, on algorithmic trading. And, as was mentioned before, one avenue to explore is big data: what could it bring in terms of new information for financial stability? Now, if you don't know stuff within economics and finance, maybe you can ask for a little help from other disciplines, like physics, for example. We know that there are some people looking at network analysis; I think I mentioned at the beginning the necessity to understand the contagion linkages. But the point is: okay, you have a network; is a stable network something that you can observe as a static thing, or something that will shift and change immediately if market positions all move in the same direction? You now have physicists exploring ways to model exactly what causes the transitions between solid states, which you can consider more stable, and liquid states, which you can consider more unstable. Should we, as economists and policymakers, try to explore a little bit what this brings us, and would that be useful for policymakers? We don't know. And, last but not least, what can big data bring us to understand financial stability better? I think we all know that this is coming, that we can have real-time data on the major agents in the system, and I think we should explore that: global banks, the interconnectedness of financial institutions, I think it was mentioned here, mortgage debt and how it can transmit instability into the system, and how granular data about credit can help us. And this, as we know, is a challenge that goes much beyond just the capacity to accumulate the information; it's not just an IT problem. Now, to move to the end of this presentation: well, we have, let's say, some stuff on early warnings; we need to assemble new data; we can use big data; we can draw on different disciplines to understand the dynamics behind this complex notion of financial stability. But we also perhaps need tools to prevent financial crises, and here I'm thinking of the array of macroprudential tools that we are learning to understand; I think Philip mentioned this as an important element to reduce excessive risk-taking. And obviously, if you want to measure the
effectiveness of these instruments, of macroprudential tools, and there is an array of them in our toolkit, you need to have data sets to understand how they interact with other macro and financial variables, with credit growth and the balance sheets of institutions. Fortunately, we are at the beginning of a process of developing several data sets, with researchers, with an effort by the BIS but also of course the IMF and the FSB, and there are several of these data sets available where you can study the interplay between the use of a macroprudential instrument and the capacity to bend excessive risk-taking in the financial system. The problem with these data sets is that they are binary: you have only plus one or minus one, for tightening or loosening, and zero for no change. That might not be enough for a sound econometric exercise, but I think it's very important to have them. Finally, to conclude with all this, what I would say on policy implications, in order to gather data that helps us maintain financial stability, as I mentioned at the beginning: first, I would improve data for early warning purposes, which means starting to look at a diversity of things, maybe a little bit outside the domain that we know, because I think we need to be a bit more creative. Focusing on financial stability means focusing on rare events, on large disasters, and if you take the analogy of climate change, you need to be capable of modeling and understanding these rarer events with data, which means that you need to start looking at them with a different lens, and the data that you need for that is perhaps different. Just to make myself clear, think of one development: big tech firms in payments, in payment systems that now rely heavily on them. Take for example Alipay or Tencent in China: these guys have half a billion customers; they are basically assuring the monetary transactions, the payment transactions, of very large segments of the consumer market in Asia. And if by any chance they suffer a reputational failure, remember the Facebook episode, where there was this distrust because of the leakage of private data, suppose one of these big guys has one of those episodes: what would be the consequences for the stability of the payments system in China? And of course, if it is in China, it is in the whole world. So this is the novel angle which I think we should get into if we want data that gives us the tilt for an early warning. And finally, of course, we need data in financial stability to quantify policy trade-offs. For that, as I mentioned at the beginning, you need a sound theoretical framework, and you need data to feed that framework. For me, to explore policy trade-offs you need to be working in a general equilibrium framework, but you need above all to understand how the real economy interacts with the financial sector. So whether the financial sector is formally modeled within a general equilibrium framework, or whether it enters just as financial frictions, you need data for that; you need an explicit way of understanding this interaction so that you can test policy trade-offs. And of course, while you do all this, you need to work on the resilience of the system, and for that you need data to understand whether the system as it is now, with your capital requirements, reserve requirements and deposit insurance schemes, is resilient enough. Fortunately, because of the crisis, we have been doing just that, in the G20, at the FSB, at the BIS and in other forums, so that we now have a core of the financial system that is more resilient than before the crisis. So thank you, I stop here. Thank you, Luis. I liked the point about the lens, and the theory becoming more and more important as all this stuff becomes much more complex, this point about knowing where you're
going before you start, and I think it links to Richard's best-practice principles, which were along the same lines. So, Hans-Helmut, why don't you bring all this together in ten minutes. Thank you. Thank you very much, Ardo. It's a great honor and pleasure to be on this panel, and it was really inspiring, because the privilege of discussing these arguments came with having access to the presentations beforehand. Since this, by the way, is also a little bit about celebrating Aurel, who gave me the assignment to be provocative, if I'm not polite it's his fault. My idea was basically to summarize very briefly what we heard, and there is a lot of overlap between this panel and, by the way, the panel before, and then to bark up the same tree as basically all of us did: it's ultimately theory which we have to care about, because, to quote a famous economist, Marshall, facts don't tell their own story. So in order to make data meaningful, as you said, you have to think about what we are trying to get at here. So what I'd like to do, and I'll be impolite in summarizing the three presentations on one page: frugal, austere, impolite, and I summarize them the way I received them. Dick Berner started with what we should ultimately care about if we're thinking about financial stability issues, namely vulnerabilities, and they might come up in interesting places which we wouldn't have thought of; I'll try to highlight or emphasize this point in a few slides later on. Then he carefully insists upon improving data quality, which is scope as well as access, and also on new means to read and interpret data. One can easily get very excited about these new ways of interpreting data, but I do think it's important to start from an old idea: technologies change, the fundamental laws of economics often don't, or rarely do; this is often deeply rooted in what we knew or learned before.
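Luis's point about the binary macroprudential data sets, plus one for tightening, minus one for loosening, zero for no change, can be sketched in a few lines of Python. Everything below is illustrative: the type, the field names and the cumulated "stance" proxy are hypothetical, not the coding of any actual BIS, IMF or FSB data set.

```python
from dataclasses import dataclass

# Signed encoding of macroprudential actions, as in the binary
# data sets described: +1 tighten, -1 loosen, 0 no change.
DIRECTION = {"tighten": 1, "loosen": -1, "none": 0}

@dataclass
class PolicyAction:
    quarter: str      # e.g. "2015Q3"
    instrument: str   # e.g. "LTV cap", "countercyclical buffer"
    direction: str    # "tighten" | "loosen" | "none"

def stance_index(actions):
    """Cumulative sum of signed actions: a crude proxy for the
    overall macroprudential stance over time."""
    index, total = [], 0
    for action in actions:
        total += DIRECTION[action.direction]
        index.append(total)
    return index
```

Such a cumulated index can then be set against credit growth or balance-sheet variables, with exactly the caveat raised in the talk: the coding carries no information about the intensity of each measure, only its direction.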
Philip Lane stressed the cross-border, multidimensional financial stability dimension, which arises out of repercussions and interactions. That's a very important part, in particular within the euro area, where you still have lots of national idiosyncrasies and, as a result, repercussions internationally. It means supporting all those efforts which have been launched in the wake of the crisis, in 2009, in the environment of the FSB; the FSB, by the way, did not start from data but from a list of seven issues which should be addressed, back in September 2009. So the core of the policies conceived at that time was about addressing underlying real economic problems and, in order to do that, thinking about what you would need. Luis highlighted the issue that it's much easier to work in a monetary policy committee than in a financial stability committee, because you do not have a well-defined objective: it's fuzzy, it's complex, and as a result you don't really know what type of data you need. Ultimately you might not even know ex post which type of data you needed; there is still discussion about what was at the source of the Great Depression in the early 1930s, so you can have contentious debates about things long back in history. So I would like to start with an incident which was important in terms of redesigning the institutions we now have, which is basically what happened in 2007-2008. The main questions I think we have to care about are: do we have the right data, which data should we look for, what do we do with these data, and how should we derive policies from there? What I'd like to start with is the summer of 2007. These are data, it's a graph, I don't know if I can... Here you see the spread of unsecured over secured interbank money: over a long stretch of history it was barely five to seven basis points, and suddenly it shot up dramatically. So what do we do with this? How do we read this? We were sitting at the time in the monetary policy committee, thinking about what's behind that, what is unusual. So we have data, but how do we interpret them? One argument has been that this was the implosion of interbank money markets, driven by the unraveling of subprime-related products. And a major issue there is what I would like to call foresight knowledge: we always read these graphs from right to left, but we are sitting here and have to think about what's going to happen next. That's why an anchoring in theory is definitely needed. This is important because theory should ultimately lead to policy advice. In 2007 there were two views of what was going on: one was that this was a market with asymmetric information; the other was that it was a run, a run of wholesale banks on each other. The conclusion which the ECB at the time drew was to inject liquidity; that's here, that's August 8-9, 2007. At the time it was, by the way, criticized as hyperactive and panicky, I don't know what you call this. So theory should inform policy. In the meantime we've achieved many, many institutional innovations, we've been filling data gaps, I'll just click through them, in particular the European System of Central Banks' integrated reporting frameworks and an integrated vocabulary, so a common view. Now let me dig a little bit deeper into theory. Financial crises are, according to Kindleberger, and he has been quoted by you, Luis, a hardy perennial. Usually it has been the same points: overleverage and mismatches, underpricing of risk, and interconnectedness, which over the last decades has become very much international; Philip has highlighted that. So that's what we should look for in terms of which data we would like to see. And it is about how markets can become dysfunctional, how intermediaries can become dysfunctional; there's the credit gap, it could also be mispricing of term premia and risk premia, and the interaction between funding and market liquidity. So those were the places where I think we need theory as well as data to understand what
ultimately is important, because what matters in the end is what's going to happen in the real economy. So what does this lead me to suggest? We have to develop and critically assess analytical tools, and there is not one right model. This of course also holds true for the new devices, and it holds true for thinking about how we derive arguments from looking at individual cases: no one right model. You might recall the Jackson Hole conference in 2005, when Raghuram Rajan suggested that, as a result of using micro-level insurance instruments, macro trouble could easily arise. He was heavily criticized. So one suggestion would be: beware of groupthink, try to integrate critical views. There is also, and Luis pointed to that, a need not only for intra- but I would say also for interdisciplinary debates. Finally, I think it goes much beyond cognition. Early on in the crisis there was talk about a window of opportunity which was about to be lost, and now we have talk about rollback. I don't completely agree with this, but I think it's a big issue in terms of this complementarity between industry and policymakers. I do think policymakers are providing a public good, financial stability, which does not always align with what private sector entities would like to see. So central banks are somehow in the role of benevolent dictators, and that means they need discretion, they need judgment, and they have to communicate with the general public. So maybe that's the policy dimension. Thank you. Thank you, Hans-Helmut. Would any of the speakers like to react? Philip, or Dick, or Luis? Then we'll open up. One of the recurrent issues in this panel so far has been this interaction between data and analytics. Now, I think that is true, so it does mean that within our organizations it's very important that the statisticians have a vibrant dialogue, both with the macro and monetary economists and with the supervisors, so that for a single dataset there is a common view on what to collect, how to interpret it and what questions to ask. I think that's very important, but I think it's also important to emphasize that it's very much a dynamic relationship, because theory is not static. There's an old joke about economics, with the empirical guy saying to the modeler, you know, tell me your model and I will find a way to look for it in the data, and the modeler saying, tell me the data and I'll trim the model to match the data. So there's always going to be that dynamic: the models written now are heavily data-driven, and often the first page of a theory paper is going to be, well, based on the data we see, I'm now going to write a model to replicate some features of the data. So it's a two-way street there, and I think that's quite important. And then the other thing that Hans-Helmut raised, which Peter Praet this morning also signaled, is what is the role of market data. At one level, you know, maybe it's a substitute for collecting institutional data, if the market is telling you what's going on; we do think one of the pillars of financial stability is market discipline, and we can see what we can infer from market data. But it does run into a limit, because the markets also learn from us: the public good of statistics is not something that any individual market trader or institution can replicate on their own. And this goes back to what some of you were saying: essentially it's in the interest of industry that there are good data sets that they can also use for their own analysis. So again, to come back to what I said earlier on, I do think we're in this kind of intermediate phase: a lot has been done, but the interaction between the markets, the institutions and academics is very fluid, and we're far from having a true view of the world that can, you know, guide us. We have to recognize the dynamic element of that. Just as much as I think everybody here, and I, did emphasize the need for
theory or framework to collect the right set of data, to look at the right set of data, there is also a very important role that statisticians and data play for policymakers in a more common-sense way. Remember, when I was sitting in the Financial Stability Committee of the Central Bank of Brazil, people were bringing us data about loans for used cars with LTVs of 150% and maturities of 7 years. It doesn't take a rocket scientist to understand that this is absolutely wrong and that you need to act on it. So you don't need a big theory to use the appropriate set of data that people are bringing you to take immediate action for a localized financial stability problem, which was the car market in Brazil. Let me make two comments, one in response to what Hans-Helmut said. I want to emphasize that when I talk about a partnership or alignment of interest between industry and regulators, there's an important asymmetry there, obviously, because system-wide financial stability policies are needed: neither risk management at the micro level nor microprudential policies are really sufficient to deal with the system-wide externalities and market failures that can arise from asymmetric information and mispriced guarantees, mispriced credit. I totally acknowledge that. But there does need to be a dialogue, I think, between them, so that they can better understand what their respective goals are. Second, none of us has really mentioned a key use of financial stability data, namely stress testing: a workhorse tool for implementing macroprudential policies, for filling in the gaps where we don't have counterfactuals to observe, and for understanding what the impact of our tools can be. So we need to work on the framework for stress testing. I think that's particularly true with respect to operational risk, which I mentioned earlier, and it's particularly true with respect to CCPs and how to do stress testing for CCPs. Does it make sense to test them one by one? I think not. I think we have to do stress testing for CCPs in the context of their relationships with their clearing members and other counterparties, and with the system as a whole, because they're so highly interconnected. Those are really important issues, and we ought to think about the granularity that we really need in stress testing, which is intense for CCPs but may not be so intense for less complex, smaller entities, such as some of the smaller community banks that we have in the United States. We ought to differentiate the way we use these tools by where we think the risk is. Thank you very much. So we have about 15 minutes officially; I don't think we want to eat too much into the lunch break, so we'll collect maybe three questions at a time and then address them to specific people on the panel.
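The point about testing CCPs and banks in the context of their interconnections, rather than one by one, can be illustrated with a toy default-cascade exercise. This is a sketch of the network-contagion logic only, under made-up assumptions (a three-bank exposure matrix, full loss-given-default, defaults triggered when counterparty losses exceed capital); it is not any regulator's actual stress-testing methodology.

```python
import numpy as np

def default_cascade(exposures, capital, initial_defaults):
    # exposures[i, j]: amount bank i is owed by bank j, so a default
    # of j imposes a loss of exposures[i, j] on i (full loss assumed).
    # A bank defaults when its losses from defaulted counterparties
    # exceed its capital; iterate until the defaulted set stabilizes.
    n = len(capital)
    defaulted = np.zeros(n, dtype=bool)
    defaulted[list(initial_defaults)] = True
    while True:
        losses = exposures @ defaulted          # loss from defaulted peers
        new = (~defaulted) & (losses > capital)
        if not new.any():
            return set(np.flatnonzero(defaulted))
        defaulted |= new
```

Testing bank 2 in isolation would show only its own failure; running the cascade reveals whether its default knocks over bank 1 and then bank 0, which is exactly the system-wide view argued for above.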