My name is Eleonora Cunha, I work at ISDA, and you've already heard from some of my colleagues today. Earlier we had the keynote panel; I see some of you here right now who participated in that. So you've probably all heard about the Common Domain Model, CDM, and I'll say CDM from now on. That was sort of the intro, I guess, as was the session we had just now here in Hub 4. We have some colleagues here as well still; they were all from the trade associations involved in the initiative. What's interesting about this session is that now we get to hear from CDM implementers and market participants who have big plans for CDM. And I'm just as excited as you to hear these speakers tell us what CDM could look like in the real world, its potential and application in market infrastructure. So we begin with John Nance from Axoni. He runs product management for derivatives, and they focus on their distributed ledger platform, Axcore, I believe. So John, over to you. I believe you have slides.

Cool. So what I wanted to go through is to give you a view of our use case and how we're using CDM, and then also give you some tips and tricks on how to get started. It will be a little more technical than the previous presentation, so hopefully that works. Just a bit of background on Axoni: we're a capital markets technology company. We focus on distributed ledger software, and I look after our OTC derivatives product line. A lot of the problems that we're trying to solve involve reducing and eliminating reconciliation within post-trade processes. And if you look at the root causes of these reconciliation problems, one of the big ones is differing methods of processing different events, as well as differing methods of representing those events, which is really where CDM comes into play in trying to solve that problem.
Of course, there are other reasons why you have breaks in OTC derivatives, such as lack of real-time transparency and differing data sources; those are not as relevant for CDM, but worth noting as well. The way we're using CDM is essentially as a way to represent post-trade events within a standard event model. And by event here, we're really talking about representing economic state transitions of trades from one state to the next. One of the good things is that the CDM also provides a full lineage of each of those state transitions, so there's no ambiguity about how a trade went from one state to the next, which is incredibly useful when multiple parties are trying to understand how a trade went through its life cycle in a consistent manner. The other component of how we're using CDM is standardizing how those events are actually processed, not only how they're represented. A couple of the benefits we find in it are consistency across asset classes and that sense of lineage. Of course, it's a standard and open source, as you've heard today. Regulatory reporting, especially the DRR initiative that's just launched, is a large benefit and incentive for firms to move to the CDM. And there's also technical flexibility: CDM is a model, but it can be expressed in a number of different software languages, like Java and Scala, and serialized in a number of different ways. JSON, Avro, and Protobuf are all applicable ways of representing the CDM when it's serialized. The way we are using CDM, I like to think of as very workflow-oriented, so I'd like to walk through a standard workflow of how we're using it. First is messaging coming in. I think one of the common misconceptions about CDM is that it's just another messaging format, a competitor to FpML, when you're actually comparing apples and oranges if you look at the two.
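The lineage idea John describes, that every state transition links back to the state it came from, can be sketched in a few lines. This is an illustrative toy, not the real CDM schema or API: the class and function names (`TradeState`, `apply_event`, `lineage`) are hypothetical, and a real CDM trade state carries far more than a notional.

```python
# Toy sketch of state-transition lineage, NOT the actual CDM model.
# Each new state records the event that produced it and a reference
# to the prior state, so any party can walk the chain and agree on
# exactly how the trade evolved.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class TradeState:
    trade_id: str
    notional: float
    previous: Optional["TradeState"] = None  # link to the prior state
    produced_by: Optional[str] = None        # event that created this state

def apply_event(state: TradeState, event: str, new_notional: float) -> TradeState:
    """Return a new immutable state; the old one is kept as lineage."""
    return TradeState(state.trade_id, new_notional,
                      previous=state, produced_by=event)

def lineage(state: TradeState) -> list:
    """Walk back to inception, newest state first."""
    chain = []
    while state is not None:
        chain.append((state.produced_by or "execution", state.notional))
        state = state.previous
    return chain

s0 = TradeState("T1", 10_000_000.0)
s1 = apply_event(s0, "PartialTermination", 6_000_000.0)
s2 = apply_event(s1, "Novation", 6_000_000.0)
print(lineage(s2))
# [('Novation', 6000000.0), ('PartialTermination', 6000000.0), ('execution', 10000000.0)]
```

Because states are immutable and each one names its predecessor, two parties holding the same latest state can independently replay the chain and get an identical history, which is what removes the ambiguity John mentions.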
And a lot of that is because messaging in particular tends to represent instructions. If you think about an allocation instruction, it doesn't actually represent how you would process that allocation; it really just represents an instruction to do the allocation. For those messages, there are a lot of different standards out there already. CDM isn't really trying to tackle that, so FIX and FpML and other messaging formats still apply here. Where the CDM comes into play is on the processing side. Of course, every application has a lot of custom application code and custom business logic that helps you differentiate yourselves from others, and the way we think about it is that custom application logic should be built on top of the CDM. Where the CDM comes into play is when you actually want to transition the state of a trade from A to B; that is where you would use the CDM to represent and to process that state transition. Then you capture that in storage, in a database. It's worth noting that CDM doesn't really define a database schema; it's just the data model of those trades and events. We use Axcore to ensure that our customers are 100% in sync on that data. What Axcore allows you to do is synchronize multiple data stores in real time, at very high volumes, while providing cryptographic guarantees that those databases are 100% in sync, which means the data is complete and accurate. Axcore isn't a requirement for using CDM; we just find it incredibly beneficial for this particular use case. Once the data is distributed, a client on the other side essentially needs to take in the CDM events and trigger different workflows internally, or just consume that data. So that's receiving CDM events on the data store of choice.
For us, we support Kafka as well as different databases, essentially any relational database. And as a client tries to internalize that data, you can use the CDM libraries, which already have the events and the trades modeled for you; then it's just a matter of making sense of those events depending on what you're trying to do with the data. As far as how to get started, we like to think in terms of two different roles. One is event producers (my color scheme got messed up here) and the other is event consumers. For event producers, think of vendors, or if you're using CDM internally, it could just be an upstream app that produces CDM events. Then you have event consumers that are trying to consume those events and really make sense of them. If you're a vendor, that's your customer; if it's an upstream app, that could be a downstream app trying to make sense of those events. From an event producer perspective, if you want to get started, I'd say first look at which events are applicable. As I mentioned, the way we use this is to represent state transitions of trades. There are lots of events in your own systems that don't represent that, and CDM is not applicable for those. Where it really matters is on the trade life cycle events; a couple of other use cases were mentioned in the previous panel. So step one is coming up with an inventory of which of those events are applicable. The best way to actually represent them in CDM is using the CDM functions themselves. I like to think of CDM functions as helper functions that make sure you can create those CDM events without messing them up; if you try to map everything field by field, it's going to be challenging.
But those functions are very robust and very well built out, and they allow you to keep up to date with changes to the CDM. So it's basically a matter of mapping your events into those CDM functions. The one we use most is create business event, where you essentially plug in the current trade state that you're altering and then a list of primitive instructions. Primitive instructions, think of them as building blocks: smaller, modular, as the name denotes, primitives. One of the primitives is quantity change, another is contract formation, or you have split. These are the smaller building blocks that you can mix and match in order to represent really any post-trade event using these libraries. Combining those primitive instructions within this function, along with the intent and date, allows you to represent those events. But in the ideal case you can use them not only to represent an event but also to ensure that, if that is how you actually alter the state of a trade within your own system, the way you represent and process it is 100% in line with anybody else who wants to consume data from you. From a consumer perspective, this is where somebody is receiving those CDM events, and it depends on how you have modeled your consumer app. If you're completely CDM-native, then it's easy to plug in. But a lot of the time people need to map the CDM events into their own internal events and internal models until they move toward a more native CDM solution. One other tool worth noting is Legend. It's another FINOS open source project that allows you to map different data models together.
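The pattern John describes, plugging a trade state and a set of primitive instructions into a single event-creation function, can be sketched as follows. This is a hypothetical illustration only: the names `create_business_event`, `QuantityChange`, and `PartyChange` mirror the idea of CDM's create business event function and its primitive instructions, but they are not the real generated CDM API.

```python
# Hypothetical sketch of composing primitive instructions into one
# business event. Names are illustrative, not the real CDM library.
from dataclasses import dataclass
from typing import List

@dataclass
class TradeState:
    trade_id: str
    quantity: float
    counterparty: str

@dataclass
class QuantityChange:
    """One primitive instruction: alter the trade quantity."""
    new_quantity: float
    def apply(self, st: TradeState) -> None:
        st.quantity = self.new_quantity

@dataclass
class PartyChange:
    """Another primitive: swap the counterparty (e.g. a novation leg)."""
    new_counterparty: str
    def apply(self, st: TradeState) -> None:
        st.counterparty = self.new_counterparty

@dataclass
class BusinessEvent:
    intent: str
    event_date: str
    before: TradeState
    after: TradeState

def create_business_event(before: TradeState, instructions: List,
                          intent: str, event_date: str) -> BusinessEvent:
    """Mix and match primitives to build one composite event."""
    after = TradeState(before.trade_id, before.quantity, before.counterparty)
    for instr in instructions:
        instr.apply(after)
    return BusinessEvent(intent, event_date, before, after)

# A partial novation: reduce quantity and change the counterparty,
# expressed as two primitives inside a single event.
ev = create_business_event(
    TradeState("T1", 10_000_000.0, "BankA"),
    [QuantityChange(4_000_000.0), PartyChange("BankB")],
    intent="Novation", event_date="2022-11-15")
print(ev.after.quantity, ev.after.counterparty)  # 4000000.0 BankB
```

The design point is the one John makes: because producer and consumer both derive the after-state from the same primitives, they cannot disagree on what the event did.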
That's very applicable in the case of CDM, where you want to map, say, a CDM event into events you may have represented internally. You can also use CDM events for other downstream processing, such as DRR for reporting purposes. Or, if you're trying to create a CDM repository internally, one of the big benefits of how CDM events are constructed is that if you take those events and apply them, using the CDM libraries, one after the next, you can build up the entire current state of the ledger from scratch just by reapplying those functions. So there's a lot of flexibility in what you can do with it. That's a really short summary of how we view the CDM and how we're using it. One of the reasons I wanted to come up here is to give people a better idea of how you could potentially use it. The number one thing we want to prioritize for the CDM is making sure that people understand it and ultimately adopt it. It's a lot more successful with widespread adoption, so we can ultimately achieve this goal of getting rid of reconciliations caused by different methods of processing and different representations of data.

Thank you, John. Do you want me to turn this off? Okay. So next we will hear from Diana Boyle from Symbiont, where she's a business development and sales executive. That's also in the DLT space, I believe, and they have a blockchain platform, Assembly. So, looking forward to hearing from you now, Diane.

Thank you, Walter, for making time to come in and hear about the common domain model and one of its real-life applications. I'm going to give you a little bit of a history lesson, but I also want to take you through an application in real-life production.
As you're all aware, over the last decades, financial services has made tremendous strides in adopting new technologies. The introduction of the fax machine (and by the way, I want to let Amy know that I am fully in support of the fax slogan that ISDA has adopted; we're all still working on it, but it's still out there), the ATM, ACH payments, wires, SWIFT messaging protocols: all of these things you're very familiar with. And of course the internet and the cloud have contributed to impressive improvements in the speed and efficiency with which we supply services to our clients. You can walk into any branch today, make a deposit, and have that cash sent virtually anywhere in the world, immediately, to a family member or a customer or a friend. Firms trade all manner of currencies with counterparties they normally would have no interaction with, because their clients need those currencies to run businesses that serve countless firms and governments. All of these household technological and process efficiency advances did come, however, with some issues. Every company that pulled ahead of the pack with any of the new technologies and processes developed its own standards for how these things had to work. Think about going to an ATM several years ago: depending on which machine you went to, the information presented was different. Today, when you go to an ATM, they all basically show a portion of your account number, perhaps your balance, and the actual transaction information about what you just did at the machine. But that's not always the case with some of the new technologies that are rolling out.
So if you take a step back, the industry created some efficiencies on the front end but often created a whole downstream problem requiring manual reconciliation between parties, sometimes even between different divisions within the same firm. Or a third-party entity had to be created to act as an arbiter or authority with the final say on what had actually transpired in that commercial event. So in order to recapture the desired efficiencies, firms and governments then started to form organizations to look across all the activities performed and the different transaction types. These organizations, all made up of really smart and intelligent people like the folks in this room, then began to work to identify the most efficient or best way for business to take place. Firms were then either mandated through regulation or persuaded through actual cost savings to alter their behavior. You've all worked in one way or another with those organizations: SWIFT, the Securities Market Practice Group, NACHA, the International Organization for Standardization, the Accredited Standards Committee X9, in addition to ICMA, ISLA, ISDA, et cetera. Each of your firms belongs to multiple organizations, whether through membership, which is again another cost, or through resources; you've dedicated your time and talent to these things. I actually co-chaired the digital assets forum with another group, the SINSE. So there are a lot of these folks around, and a lot of activity, to recapture the efficiencies that were originally intended. So what happens after these improvements or standards are identified? Is that better? Okay. Sorry about that, I'm not going to go back and repeat everything. I'm going to have to drill with two hands here, though. So again, what each firm has had to do is what we refer to as retrofit. They have to take standards that have been developed and completely make sense.
But then they have to force legacy technology and legacy processes to change to accommodate the standards that have been created. So this back-end approach, while it's been very helpful, is not really solving the problem. So now blockchain has arrived: an elegant technology which allows us to put our trust and faith in a mathematical framework which ensures that if we can create a definition, we can enshrine it in code. And that ensures that transactions are processed for clients and counterparties exactly the same way every single time. The result is that reconciliations and errors are banished to the trash heap. We're all really very thrilled about it. At Symbiont, we want to build a better future. At the request of our clients, we're building applications on this new technology of trust that allow for the incorporation of standards at the start of the transaction lifecycle, as opposed to at the end, when it's really almost too late to prevent errors and reconciliations. We want to go through the entire transaction lifecycle and reap the efficiency benefits throughout every single activity. We've actually coined a word for it, and I'm hoping to trademark it: we call it pre-conciliation. If we do it up front, we don't have to do it at the end. That's very important for us, and it ensures that the full speed and properties of well-designed applications truly allow for frictionless processes. So now a little bit about our smart collateral application. We embrace ISDA's CDM as an integral building block; it allows for enormous efficiency. Smart collateral is a post-trade application. It allows for intraday trade and collateral valuation and collateral movements, and it instructs the custodian or the dealer bank to make the collateral movements that support those trades.
Smart collateral solves two very large business cases: the mitigation of counterparty credit risk, by frequently performing the valuations outlined in the parties' CSAs and transferring the required collateral, and the elimination of the collateral disputes that occur in today's processes. Amy can go on and on about all the work that's been done to try to mitigate those; we've actually done it with this application, smart collateral. It was designed to the specifications of market participants. It's in parallel production as we speak, with billions in supported collateral currently active. Additional funds, currencies, counterparties, and custodians will all be brought onto the platform in a highly controlled manner, and the application will expand beyond what we're doing with FX forwards at the moment to include other instruments and asset classes beginning in 2023. So this is real, and the industry requires some sort of framework to allow for this efficient processing. That's really where CDM comes in; none of this would be possible without it. The sheer volume of activity supported by the application requires that definitions and models be highly transparent and implemented in the same manner across the application. The common domain model ensures that, for the application and for the industry, costly reconciliations and margin disputes are eliminated. Additionally, the return of collateral, another bane of everyone in the trading world's existence, can be automated in the same timely manner, which eliminates the need to fax (again, the word fax) requests, a very costly and also very risky endeavor.
CDM allows for interoperability and straight-through processing across networks, firms, and platforms, eliminating the need for reconciliation caused by differences in how each firm records every single aspect of a transaction. It also allows for regulatory oversight through transparency and alignment between regulators and market participants from the inception of transactions. It enables consistency in regulatory reporting by providing a standard representation and by automating reporting requirements through code. Consideration is also being given to providing regulators access to a node. This would provide immediate access for compliance and regulatory purposes while reducing the costs and processes associated with multiple reporting requirements, whether internal or for any external regulator. From a Symbiont perspective, we're implementing select components of ISDA's CDM to standardize the use of trade data in our smart collateral product. It will provide an easier path to integration for ourselves and our clients by providing a machine-readable standard format for trade data processing that reduces operationally taxing reconciliations (that word again) and data entry. Symbiont is first integrating the initial margin, variation margin, and eligibility modules to better assist our smart collateral application with the population and standardization of the client trade data used in the application. The integration of ISDA's CDM is currently scheduled for the first quarter of 2023. The Symbiont and ISDA teams are hard at work at the moment, hoping to complete all the planning before year end and move to active integration shortly after code freezes are lifted at the beginning of 2023. After the successful completion of the ISDA CDM body of work, the roadmap is being expanded to include integration with ISDA Create. Earlier we mentioned the importance of getting documentation into a machine-readable format.
All clauses and terms within the documentation interpreted exactly the same way, enshrined in code: that's really important to us. And after that, we're going to expand to include the issuance of digital debt with expanded CDM integration as well. We're issuing some digital commercial paper, and we're also doing some digital bonds that will be built on CDM. As I said at the start, we want to really build this right from the beginning, so that as information comes into our application and is proliferated across the industry and counterparties, it's all interpreted the same way, completely eliminating reconciliations and errors. So we're really excited about it. We're in the process of building a better future, and we really invite you to join us on the CDM adventure and come along for the ride. Thanks very much.

Thank you, Diane. It's interesting to hear all the directions this can go in. Finally, we will hear from Sunil, who is head of business architecture at Credit Suisse in the group payments service. And you're also a very active contributor to CDM, since the very beginning. So do you want the slides to appear as well? Either way, it's fine.

Does this work? As Ian kindly mentioned in the previous session, I'm one of the founding contributors of CDM, so if you don't like it, you can blame me for it, among others. It is great to see the culmination of CDM across the three trade associations today, and it being open source with FINOS. It's been a long journey. We started in 2017, and I really thank Jane and Ian for nudging me into this session today and giving me the opportunity to share some thoughts. I had literally about 24 hours to prepare my talking points, so I don't have slides, I don't have materials. But I thought it might be a good idea to reflect a little on the journey thus far. As I said, we launched the model at the AGM in Lisbon in 2017. There were very mixed reactions, Ian, at the time: that it was too late. I have a minute?
Oh, okay. That would be interesting. Yeah, I'll just go for it. So it was launched at the AGM, and there was a mixed reaction. As I said: do we need this? It is too late. We have completed reg reporting. Why do we need this, et cetera. If we look back from that point in time, there are several firsts in the industry that we have achieved in the last five years, before we turn to what remains to do and where we need to focus. The first is that ISDA embraced it as the path toward the digital transformation of derivatives. That's a big step. The second is that we realized time to market was critical, so we got REGnosys to implement the model. I think we issued the RFP toward the end of 2017; REGnosys was picked, and we had the model out in six months. That's a huge accomplishment, and REGnosys really stepped up to it. We also wanted a validation of the model, because anybody we spoke to did not quite understand it. And I'm really happy, John, that you distinguished that it's not just a data model. It is a state transition machine, and it's a processing standard. It obviously encapsulates the data model that we have in FpML and other standards, but it is much more than that. As the founding members of the standard, we were very concerned: is this going to work? That was a big question. So what we did was organize a hackathon. We worked with the DLT providers, we worked with the FMIs, and we had about 300 participants, and the overwhelming response was that the design of the model is sound. Axoni, Digital Asset, and R3 actually mapped the model into their internal structures and validated it. The second piece of feedback from that event was to make the model open source, because if we are to promote adoption of the model, it cannot be tied to any particular commercial entity.
The third piece of feedback from that session was: it works fantastically for derivatives, can we try it for other product classes, other asset classes? So we then started expanding into repos and securities, which all brings it together to today, where the three associations have come together, we have open sourced it, and it is available for any of you to use. The second part of my comments is on why you should bother with the model. When we started this, we actually started in the reporting space. A couple of us had lived through the Dodd-Frank Act and then all the subsequent regulatory rules, and the discussions we had to have with the regulators in terms of parsing the content of the rules and trying to identify how they actually fit into our internal models. So one of the objectives was to create a practitioner's view of what exactly we do: what do we mean by trade processing? You took the example of, I think, resets. If you take an activity like a reset, can you decompose it into primitive events, and can everybody agree on the rules associated with each step? And can you then start combining them like Lego blocks to model complex workflows? In doing that, the first practical use case where you can start applying this is wherever you have centralization, both data centralization and processing centralization. Clearing is a perfect example: if you apply the model, you have a localized effect for that particular product class, which is that all the participants in the network agree on the steps that need to be performed, and there is a single source of truth. You can put that single source of truth on a DLT platform, which the model has several hooks to connect to, and it is fairly optimized for a DLT platform, but you can equally do it on an Oracle database or any other format in which you save the data.
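Sunil's reset example, decomposing one life cycle activity into agreed primitive steps, can be sketched like this. The names (`Observation`, `Reset`, `accrual_rate`) are hypothetical illustrations of the Lego-block idea, not the actual CDM reset types.

```python
# Illustrative decomposition of a floating-rate reset into two
# primitive steps, echoing the Lego-block point. Hypothetical names,
# not the CDM's own Reset/Observation model.
from dataclasses import dataclass

@dataclass
class Observation:
    """Primitive 1: observe the index fixing on a given date."""
    index: str
    date: str
    rate: float

@dataclass
class Reset:
    """Primitive 2: bind that observation to a specific trade."""
    trade_id: str
    observation: Observation

    def accrual_rate(self, spread: float) -> float:
        # If every participant agrees on these steps and their
        # ordering, they all compute the same reset: no reconciliation.
        return self.observation.rate + spread

obs = Observation("USD-SOFR", "2022-11-15", 0.0380)
reset = Reset("T1", obs)
print(round(reset.accrual_rate(spread=0.0025), 4))  # 0.0405
```

The point of the decomposition is that each primitive is small enough for everyone to agree on its rules, and complex workflows fall out of composing them.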
The second aspect is this: say you have multiple trade utilities like that, clearing, settlement, all of that within a product class. As a consumer of those utilities, you do not want to keep on translating, which is what we do all day long within our firms. I won't take any names because I'll get in trouble, but I have to connect to market utility A, and they provide format A; I then have to connect to market utility B. Anybody who has worked in post-trade will tell you that this is the bane of our existence. The third problem is that we reconcile information across participants all day long. So what CDM gives you is that authoritative version of the truth. Second, it models all of the processing steps. And by covering derivatives, repos, and securities, you can actually cut across product classes. That is a huge win for any large organization on the consumption side of the model: you can say, I don't want to keep investing. I'm being told to stop. Okay, so people can leave if they want, but I'll continue; I have a couple more minutes, so if you can, stick around. On the consumption side, what you get is a normalized view across all of the market utilities in a particular asset class, front to back, and across asset classes. From a provider perspective, it helps in that if you want to launch your product, particularly as a fintech company, you don't want to customize to 100 different data definitions and process definitions. This gives you a standardized way to say: hey, can you adopt CDM? I have a standardized interface, and it actually connects into the rest of the architecture within your firm. That's one huge benefit, which we have pursued with the DLT providers and also with the FMIs. There are real use cases.
Again, I cannot name them, but on both the derivatives side and the securities side, there are trade processing use cases where CDM is actually being used in a native format. The second example, which is actually the origin of CDM, is the digital regulatory reporting that has been mentioned a few times here. You get three benefits from encoding the regulatory rules in CDM. The first is that we have a common language to talk with the regulators, and we have actually done pilots with the Bank of England, and maybe the FCA, I forget, actually encoding the rules into CDM as the starting point. The second is that the data definitions of what needs to be reported are all structured in CDM. The DRR project, for instance, on the CFTC rewrite (or refit, I forget; rewrite): we are upwards of 90% complete on the mapping, maybe 100%; I may be a year out of date. And the third aspect, which is a bit more complex, is that we can also encode the eligibility rules. That's a tougher problem to solve; we are getting there, but we are focused on the data definition side of it. On EMIR Refit, we are at 65%. Now, if you compare the CFTC rewrite and EMIR Refit, there is about a 60 to 70% overlap between the two. So by encoding in CDM, at the very least you are getting a standard representation of the rules and of what you need to report. The third aspect, given my foray into payments now, is the last step for CDM, at least conceptually in my mind: to get the payments representation into CDM as well, particularly connecting with the ISO 20022 initiative on the data definition side, and to plug payments events into the transaction events. So at the end of the day, what the model gives you, in summary, is a transaction view, economic data, legal data, and the processing events, all consolidated, all of which have been validated over the last three to five years, and we have practical use cases.
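The overlap point Sunil makes about the CFTC rewrite and EMIR Refit can be sketched as a toy projection: encode each reportable field once, tag which regimes require it, and derive per-regime reports from the same record. The field names and regime tags below are invented for illustration and are not actual CFTC or EMIR rule content.

```python
# Toy sketch of the DRR idea: one shared encoding of reportable
# fields, projected onto each regime's requirements. Field names
# and regime tags are illustrative, not real rule content.
COMMON_FIELDS = {
    "notional":        {"CFTC", "EMIR"},
    "effective_date":  {"CFTC", "EMIR"},
    "cleared":         {"CFTC", "EMIR"},
    "execution_venue": {"CFTC"},
    "emir_uti_prefix": {"EMIR"},
}

def report(trade: dict, regime: str) -> dict:
    """Project one CDM-style trade record onto a regime's field set."""
    return {f: trade[f] for f, regimes in COMMON_FIELDS.items()
            if regime in regimes and f in trade}

trade = {"notional": 5_000_000, "effective_date": "2023-01-03",
         "cleared": True, "execution_venue": "SEF-X",
         "emir_uti_prefix": "E02"}

# Fields shared by both regimes: the overlap that means encoding
# once in CDM serves both reports.
shared = {f for f, r in COMMON_FIELDS.items() if {"CFTC", "EMIR"} <= r}
print(len(shared) / len(COMMON_FIELDS))  # 0.6
```

In this toy, three of five fields are common to both regimes, which is the shape of the 60 to 70% overlap Sunil cites: encode the shared fields once, and each regime's report is just a projection.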
So if there is one takeaway when you leave this room, it is this: if a market utility project comes and says connect to my market utility, you should ask, do you support CDM or not? If you are investing a dollar in reporting, you have to question what you already get out of the box from CDM, and whether you want to spend that dollar on your internal reporting platform. Take those two, combined with the collateral use cases and with the journey on the legal documentation: the onus is really on the industry participants now to pick this up and take it forward. The model is there, the structure is there, and we have real use cases live in the industry. Okay, that was rapid fire. Thank you, Sunil.