So I'm in really big trouble if we don't manage to finish on time. It is a pleasure to share this last bit of the second ESRB conference. Following the tradition started last year, we deliver in the last session the ESRB research prize that was created in memory of Ieke van den Burg, who was, in a sense, a founding figure of the Advisory Scientific Committee that currently administers the prize. Now, we got many very interesting submissions to the prize, and it was hard to identify the winners among the very best. I would like to take the opportunity, if you have students or colleagues who fit the age requirement and the topic, to encourage them to submit their work to this prize, which is an annual event. It is intended to stimulate young scholars to do research in areas related to the ESRB mission. With this preamble, it is my pleasure to announce that this year's prize goes to Marco D'Errico from the University of Zurich and Tarik Roukny from MIT and the University of Ghent for their paper, "Compressing Over-the-Counter Markets". It is a very interesting piece on something about which I knew very little when I read the paper, which is multilateral netting, applied in their paper to over-the-counter derivatives markets. I have to be brief, so let me not elaborate too much; we are going to give one of the winners and authors of this paper the chance to present the work to you. But we were impressed by the quality of the material and the combination of very serious theory (there are propositions that are mathematically quite challenging) aimed at what efficient algorithms, efficient rules, would look like for producing this multilateral netting in a way that is compatible with preserving trading relationships and, if my understanding is correct, with not touching the net positions of those who operate in these markets, which, as you know, are characterized by enormous volumes of trade and therefore enormous resulting gross positions. So with these algorithms, as Marco D'Errico will tell us in a moment, you can get an impressive reduction from gross to net amounts. So without further ado, let me call Marco D'Errico to the podium, and join me in applauding him and his co-author for having won this prize. So Marco, the floor is yours. You have 15 to 20 minutes to present a summary of your paper.

Thank you, thank you. So, of course, thanks to the ASC for giving us this opportunity. It is a great pleasure for me, and I think also for Tarik, who is in the US at the moment. And of course, thanks a lot to all the people who commented on our paper during this year; many of them are in the room, so thank you very much. Of course, a big, big thanks to the secretariat, because a great part of this work, in particular the empirical part, was carried out using the EMIR data. We came from a session where there was a lot of discussion about data, and the issue was indeed how data can be used. Okay, I have roughly 15 minutes, so I will try to walk you through what I am going to tell you. I cannot go into the details of the paper, but as Javier mentioned, it looks very technical while the underlying intuition is very simple. So I will try to convince you that there is something simple, and then the technicalities can be built upon that. So I'll give you the main intuition.
Then I will also give you the historical background, because as you may infer from the title of the paper, we are trying to reduce something in over-the-counter markets, in particular over-the-counter derivatives markets. As in the previous session, we heard about the difference between net and gross; this will be a key point that I am going to make. So I will go through the theory, explain the key concepts, and the last point will be about the empirics. In particular, I will show how we use the EMIR data in order to give various estimations about how compression techniques can be adopted in the market. So, as you may recall, during the crisis but even afterwards, one of the main themes was that OTC derivatives markets were complex, opaque, and characterized by large notional amounts, on the order, if you recall, of trillions of dollars. And the main question we were wondering about is: what characterizes these large notional amounts? What compression does, in a nutshell, is a post-trade technique: counterparties trade these derivatives and then, ex post, we run an operation that reduces gross positions while leaving net market risk basically unchanged. As an example here, I have a network of obligations. The diagram is very simple to understand: imagine you have three or four financial institutions that owe money to each other. This can be a contingent claim like a credit default swap. Yesterday we heard from Richard; imagine this is a situation where a credit event is called and money is due between counterparties, so they have to pass money around in order to fulfill their obligations. Now, if you look at the gross amounts (given time constraints, I don't want to go into the mathematics of this, but just to understand), you have a total gross notional of 45 in the system, and each participant has a given net position. So at the moment of payment, for instance, B expects to give out 10 but also expects to receive 5, correct? The same goes for A: it expects to receive 20 but also has to give out 5, okay? So if A, for instance, does not receive the money from C, then you enter counterparty risk, a typical example of counterparty risk. At that point, A may not have enough liquidity to pay B, right? And this can create a chain of distress, a chain of defaults, et cetera. So how do we act on this network of liabilities? A does not have the full view of the market: A only knows that it has to receive 20 and give out 5. Now, if we share information in some way and look at the whole network, we can eliminate the minimal liability from the system. So we subtract 5 from each of these linkages that you see in the network, and this is what happens: we have reduced the system from a previous amount of 45 to a total of 30, so we have a reduction in gross notional obligations of 15. And if I manage to convince you of that, I think the rest will come very easily, because the rest is about how we find optimal ways to solve this problem in a very complex environment. A key economic interpretation of this activity, I believe, is that it is a sort of system-wide deleveraging process: assets and liabilities are reduced, but this deleveraging does not entail any asset sale nor injection of capital. It is purely deleveraging achieved by correctly using information. And what we do in our paper is exactly to optimize these types of techniques.
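To make the arithmetic of the toy example concrete, here is a minimal sketch in Python. The exact network from the slide is not reproduced in the transcript, so the specific edges, in particular the fourth one, are hypothetical; they are chosen only so that the quoted totals match (a gross notional of 45, a closed chain whose smallest link is 5, and a reduction to 30 with every net position untouched).

```python
# Hypothetical obligations (debtor, creditor): amount. Chosen to match the
# quoted totals: 45 gross notional, A receives 20 and owes 5, B owes 10 and
# receives 5, and a closed chain A -> B -> C -> A whose smallest link is 5.
obligations = {("C", "A"): 20, ("A", "B"): 5, ("B", "C"): 10, ("D", "C"): 10}

def net_positions(obs):
    """Net position of each node: what it is owed minus what it owes."""
    net = {}
    for (debtor, creditor), amount in obs.items():
        net[debtor] = net.get(debtor, 0) - amount
        net[creditor] = net.get(creditor, 0) + amount
    return net

before = net_positions(obligations)
print("gross notional before:", sum(obligations.values()))  # 45

# Compression step: subtract the smallest liability along the closed chain
# A -> B -> C -> A from every link of that chain.
cycle = [("A", "B"), ("B", "C"), ("C", "A")]
slack = min(obligations[edge] for edge in cycle)             # 5
for edge in cycle:
    obligations[edge] -= slack

after = net_positions(obligations)
print("gross notional after:", sum(obligations.values()))   # 30
print("net positions unchanged:", before == after)          # True
```

The point of the sketch is only that removing notional along a closed chain of intermediation lowers gross amounts without altering anyone's net position, which is the intuition the paper's algorithms build on.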
I want to take a step back on the size of these markets. Credit default swaps at the onset of the crisis were on the order of 60 trillion, and immediately afterwards they started shrinking. Now the question is: did the market disappear, or did something occur that made these swaps disappear? What you heard before is that many authorities were looking into data, but without granular data it is very difficult to see what really occurred in those markets. The surveys and reports attribute this decline exactly to compression; in particular, both the BIS and ISDA attribute these large reductions in notional amounts and notional obligations to compression. To see how we got there, we need to understand the market structure, because otherwise you cannot understand how this level of notional reduction can be achieved. In order to do that, you need to draw a stylised model of the market. This can be done using data: the data consistently show this type of structure when you look at the counterparty-to-counterparty relationships, so this stylised model is a very good picture of how the market looks. You have, for instance, sellers of CDS and buyers of CDS at the very end points of this network, and then you have a large number of dealers that are interconnected with one another. This may sound surprising, but it is what we find consistently in the data: the dealers in the market actually account for 80% of the total notional, so the end users roughly 10% on each side, and the dealers 80%. And a large part of these obligations indeed lies within this intra-dealer set and within these closed chains of intermediation. The response to this very complex network structure was exactly to collect more data; this was the reaction that the G20 leaders committed to at the Pittsburgh summit. The point is that compression was not new to the market. This reduction in notional amounts is not something we came up with; it was already in the market. But in the early 2000s it was mostly seen as just good housekeeping: it helped to reconcile some trades and settle some payments, but there was not much more to it than that. Importantly, in the immediate aftermath of the global financial crisis, actually a few weeks after Lehman defaulted, there is a very nice article in The Economist called "The Great Untangling" that relates exactly to credit derivatives, in particular credit default swaps. And this article mentions that only now is the industry discovering the joys of compression. So that huge decline in notional amounts immediately after the crisis is probably due to compression, and we find with our paper exactly that this enormous reduction, from roughly 60 trillion to 30 trillion, is indeed fully reconcilable with compression techniques. Importantly, there was also regulatory pressure that pushed this market down and helped it shrink, and it was exactly part of the regulatory response to the crisis: capital requirements and leverage ratios that also take gross exposures into account, and of course the issue of margins and collateral. The more gross positions you have, the more collateral you may expect to have to post. So if more collateral is required, especially for uncleared trades, it becomes much more attractive for participants to find ways to reduce those gross amounts.
And indeed, in a recent post-crisis assessment, a very nice paper, Duffie mentions that compression is indeed the greatest source of improvement in OTC derivatives exposures. So this is what we are looking into. Compression is used today at the bilateral level; it is very easy to do, it reduces to bilateral netting, and you don't need any sharing of information. But, as you recall from the very first slide, to go further you need to have the full picture, or at least quite a good level of information on the network; otherwise you cannot intervene in it. So what participants do is go and talk to external providers: they submit trades and ask, can you optimize and reduce my gross exposure? What is compressed? Interest rate swaps, both cleared with CCPs and uncleared; CDS, again single-name and index; and more recently, more complex products are getting into compression. And I want to mention that compression is an evolving service: it is getting more and more into complex products, and it will require more and more financial engineering. The numbers are staggering. TriOptima, one of the major compression providers, claims to have eliminated one quadrillion in cumulative notional from 2003 to 2017. The derivatives service of one of the major clearing houses reports around 380 billion in 2016 alone, which amounts to roughly a 70% reduction. Compression is also included in the regulation: MiFIR and EMIR support this view. But there is one problem: despite this broad support, what we found is that compression, by changing the network, completely reshapes the type of risk that one may see in the market. And this is why we are very interested in this topic, because there can be a shift from local, say individual, incentives to potentially increased systemic-risk concerns. In particular, I mention risk concentration, because if compression can be done only between the large dealers, this reduction in counterparty exposure will relatively increase the exposure to the end customers, which are typically non-banks, and this may create the problems that we all know. Another important point is the legal and institutional framework: if we understand compression and say that it is a good thing for our financial markets and for systemic risk, we also need a good regulatory framework to make it work. There is also a lack of transparency, because the techniques used by compression providers are rather opaque: we don't know what these techniques are, and they are very difficult to understand at the moment. So we had to try to understand ourselves what the compression providers were doing. And also, of course, there is very limited literature; I mention here some very interesting papers. We felt that there was a lack of research combining theory and empirics, and this is what motivated us to work on this. For time reasons I want to skip through this, because what I want to focus on is the notion of excess notional: as we saw before, there is some notional in the market that is not needed to satisfy the net exposures in the market. And this is quantifiable; this is the only formula I am going to show, and it is very simple: it is the difference between the total gross notional and the minimum amount of notional needed in the market.
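In symbols, one way to write this down (a sketch consistent with the verbal description above; the paper's exact notation may differ) is the following, where $e_{ij}$ is the gross notional that institution $i$ owes to institution $j$ and $v_i$ is $i$'s net position:

$$
\text{excess} \;=\; \underbrace{\sum_{i}\sum_{j} e_{ij}}_{\text{total gross notional}} \;-\; \underbrace{\tfrac{1}{2}\sum_{i}\lvert v_i\rvert}_{\text{minimum notional}},
\qquad
v_i \;=\; \sum_{j} e_{ji} \;-\; \sum_{j} e_{ij}.
$$

The second term is the smallest total notional that can still deliver every participant's net position: because net claims and net liabilities balance across the system, it equals half the sum of their absolute values, and it is attained by matching net sellers directly with net buyers, which is the non-conservative benchmark discussed below.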
Of course, not all of this excess notional can be removed from the system, and what we do is try to find ways to remove this excess that are compatible, to varying degrees, with the strategies of participants. Now, it is important to understand that excess is present in the market if and only if there is intermediation, so the dealers in the market are the ones that originate this excess. We look at different classes of compression. In particular, we focus on a conservative type of compression that preserves the existing counterparty relationships: if there was a relationship, say I was expecting to receive 100 from Javier, the new trade cannot exceed that 100; it can only be reduced, possibly down to zero, eliminating the relationship between us, but it cannot increase. Then we have a non-conservative type that basically allows participants to completely reshuffle their counterparty relationships. This may occur in certain cases, and it is at least a good benchmark. Of course, in terms of efficiency, that is, in terms of the notional you can eliminate, the non-conservative case is much more efficient, simply because it has fewer constraints, whereas if you want to keep counterparty relationships you are never going to eliminate all the excess in the market. This is very important because the network structures that these two types of compression generate (I wrote them down at the end of the slides) differ: conservative compression produces a structure that preserves intermediation but eliminates closed chains, whereas the non-conservative type completely eliminates intermediation, because it simply matches sellers and buyers. Now, I want to give you an example: this is when we go from the current outstanding network to the conservative and the non-conservative case, and you can see that intermediation has been eliminated on the right-hand side. And I want to touch briefly on hybrid compression, where we simply allow inter-dealer relationships to be switched: for instance, a dealer that was previously buying something from Javier can become the seller. It is very simple. We have produced efficiency rankings, but more important, I believe, is the empirical application. I will take one minute, if I am allowed. We use data from EMIR, so we use credit default swaps, with the cleaning procedure outlined in ESRB Occasional Paper 11. We take monthly data from October 2014 to April 2016 and, in order to make sure that the network we build is perfectly fungible, that is, that it makes sense to subtract and add elements to that network because they have the same payoff, the same product, so we are not summing apples and oranges, we devise a way to fix the maturity and the reference entity. Then we implement the algorithms we derived and run an analysis, especially on the resulting levels of excess and on compression efficiency. So this is the result. The total excess oscillates across the various reference entities from a minimum of 50% to a maximum of 90% of notional in the market, and this is more or less stable across the various snapshots. So you see, we can potentially eliminate, on average, around 75% to 76% of gross notional in the market. Now the question is how much of this excess we can eliminate with the various algorithms, and the answer is, well, quite a lot.
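As a concrete illustration, here is a minimal sketch of how conservative compression can be cast as an optimisation problem. This is not the authors' actual algorithm, just one standard way to formalise the constraints described above: each new bilateral notional must stay between zero and its current size, each participant's net position must be preserved, and total gross notional is minimised. The small network is hypothetical, the same kind of toy example as before, and the sketch assumes numpy and scipy are available.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical network: (debtor, creditor, notional)
edges = [("C", "A", 20.0), ("A", "B", 5.0), ("B", "C", 10.0), ("D", "C", 10.0)]
nodes = sorted({n for d, c, _ in edges for n in (d, c)})

# Net liability of each node under the current network (owes minus is owed);
# conservative compression must leave these unchanged.
net = {n: 0.0 for n in nodes}
for d, c, x in edges:
    net[d] += x
    net[c] -= x

# Linear program: minimise the sum of the new notionals x_k.
c_obj = np.ones(len(edges))
A_eq = np.zeros((len(nodes), len(edges)))
for k, (d, c, _) in enumerate(edges):
    A_eq[nodes.index(d), k] += 1.0   # x_k adds to what d owes
    A_eq[nodes.index(c), k] -= 1.0   # x_k adds to what c is owed
b_eq = np.array([net[n] for n in nodes])
bounds = [(0.0, x) for _, _, x in edges]   # conservative: never exceed the old trade

res = linprog(c_obj, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
for (d, c, old), new in zip(edges, res.x):
    print(f"{d} -> {c}: {old:.0f} becomes {new:.0f}")
print("total gross notional:", sum(x for *_, x in edges), "->", round(res.x.sum(), 1))
```

On this hypothetical network the optimum again removes 5 along the closed chain and leaves total gross notional at 30. The non-conservative case would additionally allow entirely new bilateral links (a variable for every ordered pair of participants), which on this example could push the total down to the 15 implied by the net positions alone, with B paying A 5 and D paying A 10 directly.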
If you look at conservative compression, the blue one, this is a box plot, and you look at the red line, the median level, we can eliminate up to 90% of this excess. If you think about that, it means that you can eliminate around 60% of total notional in the market with this type of compression (roughly 90% of an excess that is itself on the order of 70% of gross notional). Now, I will conclude. Networked markets generate excess; this is due to intermediation. This excess can be removed by compression, which is widely used in derivatives markets, and we also provide an empirical application showing how these data can be used. Importantly, and this can probably also lead to some discussion, we are trying to understand how this tool can be used in a macroprudential way. That is, imagine you want to compress the liabilities of the next Lehman-type institution: what data do you need, which counterparties do you need to contact, which counterparties do you need to convince to run a compression exercise? And of course, since compression reduces gross notionals, it will have an impact on margins, so it can be used counter-cyclically: by constantly running compression exercises, you will not face large margin levels if volatility increases. An important point is also the epistemology of market size. That is, if markets went from 60 trillion of credit default swaps to 10 trillion, and for individual reference entities, say you have a sovereign credit default swap market on the order of one trillion, does it really mean that there is a need for one trillion of insurance on that sovereign? Well, we find that a large part of this notional can be eliminated. Last but not least, we are working on netting efficiency with respect to CCPs and the impact on capital and collateral prices, and of course, something for the future is the legal framework. So thank you very much.

Thank you very much, Marco. So I think we have time for a few questions, and I will start with two. In some sense, your list of ongoing research already provided sort of an answer to some of them: this is a very interesting, policy-connected area for research that is not exhausted by your paper, which, as most good papers do, leaves most of the questions for future work. So, a first question. You showed in your paper that with compression techniques you can achieve massive reductions of gross positions. Now, if I am a macroprudential policymaker concerned about what risk for the system lies behind those positions, is the decline in gross proportional to the decline in the underlying aggregate risk, or is it just a mirage?