Hi guys, we'll wait a couple of minutes. Hello, Griffin. Hey, hi, Ron. How are you? Okay, I think we'll wait one more minute, and then I'll start sharing the screen and we'll go through the usual meeting minutes and so on. Can you see my screen? Yes. Okay, I'll add the attendees here. Before we start, we have to cover the antitrust policy and the Hyperledger Code of Conduct. The antitrust policy basically says we agree not to engage in anticompetitive activities; for people who want to read the whole text, it's here. That is the only requirement for participating in the meeting. Next is the code of conduct, which basically says that all are welcome as long as you are respectful to each other. "Nice" is a strange word, but the point is: do not be rude, do not disrespect people, and engage in proper discourse and debate. Disagreeing with somebody is not being disrespectful. Now let's go down the list and ask people to introduce themselves. I'll start with Gary. Hi, I'm Gary Miller with Intain. The company name again, sorry? Intain, I-N-T-A-I-N. We have software that manages the lifecycle for the issuance of asset-backed securities in the debt capital markets, and we use blockchain and AI in that process. Beautiful. Manny? We are developing more of a next-generation platform [inaudible]. Ron? Good morning, everyone. I'm Ron, the chairman of the Wall Street Blockchain Alliance. The WSBA is a nonprofit trade association headquartered in New York, with a global mission to guide and promote
comprehensive adoption of both blockchain technology and crypto assets across global markets. Some of the folks on this call we are very privileged to also have as members. As a matter of fact, I think our last in-person meeting was at one of our working group meetings. Always good to see you, my friend. Next is Tom. Are you there, Tom? Hey, I'm here, sorry, I was on mute. This is Tom. Hey, actually, I think we talked while we were in Hong Kong. My name is Tom Lee, I'm with IBM. I work on Fabric deployment, and mainly on the Cello project. I'm here to support our intern, who will talk later about the work he has done during his internship period. Happy to be in this meeting. Manny is your intern? I didn't know that. No, I'm not an intern anywhere. I'm sorry, my mistake: the other guy, Manik. Manik and Manny are two different people. Manny is the CEO of SwapsHub. That's right. I messed up the names, sorry. It's okay. It's a good way to start a call, Tom. Yeah, I apologize, I'm really bad with names. Anyway, I wanted to make a long presentation about the DerivHack, and I had prepared some slides for that, but I haven't covered the whole gamut, because it has to cover a lot: the purpose of the DerivHack, what was done, the use case that was presented, and Rosetta, the CDM library that helps create CDM objects. And then the hackathon itself, where a lot of different teams showed up in New York and created solutions for this toy use case. But Manny, who is building a real use case, not a toy one, involving many actual
business processes, can probably talk a little more about that. Before I go further, though, I think we have to back up a little for people who do not know CDM, the Common Domain Model, created by ISDA, the International Swaps and Derivatives Association. ISDA had already created something called FpML, which is widely used in the industry. This time, with CDM, their ambitions are a little broader. They want to create a solution that handles not only swaps and derivatives, which are bilateral products, meaning agreements between two parties, but also loan products, exchange-traded products, and so on. Currently there is nothing from CDM that is live in production; CDM is still being born. So there is a lot of interest from a lot of different people, mostly the big players, and their aim seems to be to keep the infrastructure the way it is, because that is what is required by the regulators, especially for broker-dealers and so on. Manny will probably tell us a little more about it. I can dive a little deeper into what I had prepared, but I'll create a more extensive presentation later, especially focusing on the CDM, because that should be our focus: it will affect capital markets infrastructure. So, Manny, do you want to say anything about what happened? Yeah, sure. But I did see some documentation on the site; do you want to bring it up? I think that's a good introduction. Which one, the CDM overview? Yeah, whatever you had documented, the overview of the CDM. That was a good start. I have a few slides to share, so let me switch. Here it is. Sorry, can you see the slide? Nope.
You cannot see the slide? Okay, let me see: stop share, and share again. Now? Yes. Is this what you want? Yeah, this is from the Rosetta perspective, right? There's one further down. One more, I guess. It's okay, I can start with this. Yeah, I think we should sketch out the basics, because people here are not experts in either one. Okay, do you want me to share my screen? Oh yeah, of course, I can stop sharing and you can take over. The floor is yours. Hi guys, good to see you. Apologies, I was late to join the call, but I'm here. Someone else has joined as well. Hi, this is Shantanu. I'm late as well, sorry for that. Okay, are you able to see this? Yeah. Okay. So, just to give some background. As was pointed out, FpML has been around for almost 17 years now, in various forms, around the world. About three or four years back, ISDA looked at the existing data standards, and in order to meet the challenges of the new digital world, they had to come up with a much better, more precise data standard. So they started formulating the basis of the Common Domain Model. Derivatives took most of the initial work, but they also started adding securities, covering all sorts of asset classes in the capital markets. Along the way, it now covers the lifecycle of standard digital securities, equities and bonds, and they are still expanding to include repos, and a separate group is now being formed. So it is a much more comprehensive standard, and it is a work in progress. Last year, Barclays was one of the sponsors of the DerivHack, along with ISDA. At that time, the focus was more on lifecycle events for interest rate derivatives. But this year,
with the addition of securities processing, they have actually introduced the lifecycle of a bond. They let participants take a standard scenario, a broker-dealer with a client, and how they interact: execution, followed by allocation, and on through the settlement lifecycle. Then they let people work with it and come up with their own solutions. It's an interesting event. It happens in New York and London, and this year they added Singapore. Different teams, from different companies and various locations, have taken the model and applied it to the various lifecycle steps. So, looking at why there is such interest: the industry essentially needed a digital data standard. The existing ones had their own purposes for communicating between parties, whether it's the FIX protocol, which is purely a messaging standard, or FpML, which is more or less a data standard. However, there was no precise standard that could codify the data and also help parties come to a consensus, a kind of digital reference. Neither the existing FIX nor FpML standards do that. So that is the main focus. I have been involved in this working group for almost 18 months, going through all the different iterations, and I will come to what we are doing. Looking at the benefits: the FIX protocol, early on, was primarily about orders and executions. Other use cases were added later, and additional products such as options were added, but it stayed oriented toward securities processing. FpML covered the derivatives. Where they overlap, they refer to each other: FIX, for example, covers some aspects of options, but in a format native to FIX, and FpML refers to some of the cash products in its own implementation. But there is no common standard.
CDM covers both, and actually addresses both uniformly, so that you have the same representation of, say, a security, whether you use it in securities processing or as an underlying in derivatives. That is a main advantage of one common model. It also captures the lifecycle events precisely; we will come to what that means. And there are built-in data rules. An interesting point is that the CDM is represented as JSON, while FpML is XML and has schemas; the CDM does not, so on the face of it the data validation might not look strong. To make up for that, data rules are embedded within the model itself, and these are more like business rules. The contrast is interesting: a schema is more of a technical verification, looking only at whether an implementation conforms to the schema. The model looks at it from a business perspective: if you are executing a lifecycle event, does it meet the conditions of that lifecycle? And that can differ by product, so it is much more than schema validation. Another interesting thing is that it captures causes and effects. Most systems simply move from state to state, whereas here it actually looks at the event itself: what are my inputs and what are my outputs? That is very similar to how a DLT, or blockchain, works, except that this is a completely DLT-independent data format, so it can be applied to any blockchain or DLT. And the event lineage essentially provides an audit trail: if you go through the lifecycle,
then you can always go back to the previous event along the lifecycle. We will come back to this later when we talk about the JSON. The other advantage is that, being purely a data format, a technology-independent format, it is easy to adopt. I won't go too far into this; I just want to point to what the vision itself is and, in a broad sense, how the data structure works. It's interesting: ISDA started with the phrase "smart data for smart contracts." A smart contract here is a bit more than Nick Szabo's definition; particularly in capital markets, the processing lifecycle, the event processing, all the business lifecycle processing, is much more complex. We really needed a very standard way of defining the data, and that is what they came up with. A typical event header indicates the participants, or what we call the parties, and the event lineage is the part that points to the previous event. So it's creating a chain of events with the data structures, and this is very unique. Excuse me, Manny, it looks like Ron has raised his hand. Yeah, I couldn't see that; okay, go ahead, Ron. Hey, Manny, sorry, just a couple of quick questions. Building on Vipin's earlier comment, just so I'm clear: this is not fully developed and vetted yet, right? Where are we in that process? Because I would imagine some of the common lifecycle aspects would be relatively difficult to flesh out across multiple asset classes. Where does this stand currently? And then I've got one other question on the header that you just mentioned. Yeah, it's a good point. From the ISDA derivatives point of view, it only covers the lifecycle from, well, we are essentially talking about a trade lifecycle here, right?
So it starts more or less from a post-trade perspective; we are not talking about trading here. Right. You start from an execution, to allocation, to the standard lifecycle events in derivatives: resets, all the things that go on, terminations, novations. All of these are documented. Then comes securities processing, which was added this year. It attempts to cover everything from, again, execution, then allocation, and then you get into the intricate details of netting and settlement; that part is still being developed further. But again, it covers things broadly and uniformly: you can talk about a cash product, and whether it is a bond or an equity is irrelevant, because it is just an underlying security. Got it. So it is a model in development. But the one thing I can tell you is that two years from now, if you are not using CDM in your DLT implementation, you are out of luck, because the industry is now galvanizing towards this. Coming to one common standard across cash and derivatives has always been a problem, because cash markets went one way and derivatives markets went another. Here it comes as a common integrated platform, and added to that, ICMA, which covers repo and securities finance, is involved, and collateral is now being brought in. You end up with a much more common model, by the fact that these things are all linked to each other, from the initiation of the trade all the way to settlement, across multiple asset classes. You are really looking at a data standard the industry is converging on. Can anyone start developing products with it? Yes, we have been building our platform on it for a while now. Using this model is painful, because the model has been evolving fast, but it has been stabilizing, I would say, over the last couple of months, because most of the product categories are now in place.
Now it is getting into the nitty-gritty details of making sure that the standards are actually applicable; that is what these hackathons help verify. Got it. Okay, thank you, Manny. And one last question, real quickly, I don't want to monopolize the call, but looking at this CDM event specifically, and maybe I'm getting too far ahead in the technicals: is this a hash within the CDM event? Are they running their own hash functions? Do you know what the methodology of the hash is? I'm trying to think ahead to on-chain DLT transactions: will that data be absorbed into this kind of CDM event with a new hash? I'm probably mangling some of the computer science, which my colleagues on the call know much better, but I'm just trying to figure out: do they mean hash the way we mean hash in the DLT space? Yes, except that this hash is a model hash. That means that by using this hash as a key to look up CDM objects, you are actually independent of the underlying DLT. For example, when we implemented this on Corda, what happens is that, like on any other DLT, when you commit a transaction, the ledger computes a hash, but that is a DLT-specific hash. The CDM itself, when we construct a CDM object, computes a CDM hash for the entire object. Hence, later on, when the object becomes part of the data being stored, you can retrieve it either by the domain hash or by the underlying DLT hash. It is up to the implementer or user of the CDM to use whichever they choose. But the advantage is that you can now uniformly write code across multiple DLTs if you use only the CDM hash; you do not care about the underlying DLT. You simply say: here is my hash, retrieve the data, and you will be able to retrieve it no matter what the underlying DLT is.
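The model-hash idea just described can be sketched in a few lines of Python. This is a hedged illustration only: the field names, the canonical-JSON serialization, and the choice of SHA-256 are assumptions made for the example, not the actual ISDA CDM hashing specification.

```python
import hashlib
import json

def model_hash(event: dict) -> str:
    """Deterministic hash over a canonical JSON rendering of an object.

    Sketch only: the real CDM defines its own canonicalization and
    hashing rules. The point is that the key is derived from the data
    itself, so it is the same on any ledger or database.
    """
    canonical = json.dumps(event, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical, simplified event (field names are made up):
trade_event = {
    "eventType": "NewTrade",
    "parties": ["BROKER-DEALER-A", "CLIENT-B"],
    "economics": {"notional": 1_000_000, "currency": "USD"},
}

key = model_hash(trade_event)
# The same object always yields the same key, independent of any
# DLT-specific transaction hash computed by the underlying ledger.
```

Because the key comes from the data rather than from a ledger's transaction format, the same lookup code works whether the object sits on Corda, on Fabric, or in a plain database.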
We are not talking about interoperability between DLTs here; this is purely, let's call it, data interoperability across multiple DLTs and blockchains. Got it, understood, that makes sense. Thank you. Okay, so, carrying on: as I said, there is a chaining of events, and we also capture the event effects. At a high level, a CDM event is essentially a transaction that captures a bunch of inputs, which could be zero or more, and then a bunch of outputs, much like what you would see in a DLT transaction. Except that it has more structure. There are what are called primitive events. The primitive events are the building blocks: how do you create a new trade, how do you terminate a trade, how do you allocate a trade? The actual lifecycle events are built on top of those, at either the position level or the portfolio level of a business process, and those lifecycle events are more of a business process. So it is a two-tier event model: you combine a bunch of primitives to actually create a business event. As for the references to contracts: the outputs are typically contracts, or let's call them event objects. This could be a higher-level contract, such as a trade or an execution, or it could be at the lowest level, a simple cash flow transaction. Whatever it is, the net result is an output. For example, if you take a new trade, the output is a contract. If you take an allocation, the input would be, say, a block trade, and the output would be a series of allocated trades. If you take a payment process in the lifecycle, the input could be one or more contracts and the output could be one or more payment instructions, or what we call transfer events. Everything is model-defined, so it is not mapped to anything underlying or, say, DLT-specific.
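The two-tier structure described above (primitive events composed into business events, each with inputs and outputs) can be sketched roughly as follows. All class, field, and primitive names here are hypothetical stand-ins, not the real CDM types.

```python
from dataclasses import dataclass, field

@dataclass
class Trade:
    trade_id: str
    notional: float

@dataclass
class BusinessEvent:
    """A business event bundles primitive building blocks and, like a
    ledger transaction, consumes input objects and emits outputs."""
    event_type: str
    primitives: list
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)

def allocate(block: Trade, splits: list[float]) -> BusinessEvent:
    """Allocate a block trade into child trades; notionals are preserved."""
    outputs = [
        Trade(f"{block.trade_id}-{i}", block.notional * w)
        for i, w in enumerate(splits, start=1)
    ]
    return BusinessEvent("Allocation", ["SplitPrimitive"], [block], outputs)

event = allocate(Trade("BLOCK-1", 1_000_000), [0.5, 0.3, 0.2])
# event.inputs holds the block trade; event.outputs holds three
# allocated trades whose notionals sum back to the block notional.
```

The allocation here consumes one block trade and emits three allocated trades, mirroring the input/output accounting that Manny compares to a ledger transaction.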
So you can really take that and apply it to the underlying DLT of your choice, and by defining the data model at a much higher level, it is a lot easier to move your data from DLT to DLT. Moving on: the interesting thing is that most messaging systems, if you follow the flow from orders to executions (I'm keeping this at a very broad level), to allocations, and then into the clearing and settlement process, are unidirectional. Essentially, you move from one stage to the next, and the data standards or data processing just hand things on to the subsequent stage. I represent that in green here. CDM, in addition, is also backward-looking, essentially chaining the events. That means a built-in audit trail: if I had an execution as an event, how did this execution come about? It has a link, an event reference, which is a hash, to the original set of events that were inputs into this execution. So you can really go backwards: if I had a transfer, and, say, an internal compliance requirement or a regulatory data request comes in, "show me the entire lifecycle of how we arrived at this transfer," then from the transfer event we can go back to a settlement event, or a netting event. From there, we can see how the settlement itself was created, because it is a netting of a bunch of individual transactions. And how did those come about? They could be a bunch of trades coming from an allocation, or directly a few individual ones. I should actually add one more layer here, but the net result is that the CDM itself, by design, has a built-in audit trail, and that is very, very important. And on top of this, both parties are signing on to have that entire history on record on the ledger.
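The backward chaining just described can be illustrated with a toy event store. Keys and field names are invented for the example (the real CDM uses typed event references and hashes), but the traversal idea is the same: each event carries references to the events it was derived from, and following them reconstructs the audit trail.

```python
# Toy event store keyed by (stand-in) hashes. Each event lists the
# keys of the events it was derived from under "lineage".
events = {
    "h-exec":   {"type": "Execution",  "lineage": []},
    "h-alloc":  {"type": "Allocation", "lineage": ["h-exec"]},
    "h-settle": {"type": "Settlement", "lineage": ["h-alloc"]},
    "h-xfer":   {"type": "Transfer",   "lineage": ["h-settle"]},
}

def audit_trail(store: dict, start: str) -> list[str]:
    """Return event types from `start` back to the origin, depth-first."""
    trail, stack = [], [start]
    while stack:
        key = stack.pop()
        record = store[key]
        trail.append(record["type"])
        stack.extend(record["lineage"])
    return trail

print(audit_trail(events, "h-xfer"))
# -> ['Transfer', 'Settlement', 'Allocation', 'Execution']
```

Starting from the transfer, the walk recovers the settlement, the allocation, and finally the execution, which is exactly the "show me how we arrived at this transfer" query described above.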
So you do not have to go around looking for your history, and if this information is, let's say, propagated to the regulators, they are not going to come back later and ask you, "how did you come up with this?" Right now, every time there is a question, they throw it at you, and you have to go out and collect the data from potentially dozens of internal systems, piece it all together, and submit it to an auditor or a regulator. And if the regulator requests the same data from the counterparty, you will very likely end up with data that is missing or inconsistent. Not so with the CDM. That is the biggest advantage of having a built-in audit trail. Any questions? Good point. That is its main purpose: how do we apply digital standards so that, when parties use the standard, they have a complete, self-contained data model? Is it JSON text across the entirety of the chain, or the process here, Manny? JSON is the data representation, yes, but CDM is the data standard; it is not tied to one language. There are bindings for a number of languages: originally there was good support for Java, then Digital Asset came in and produced their own DAML version, and now there are efforts on everything from JavaScript onward. They have open-sourced the entire CDM data model, so it is no longer restricted. Until about a few months ago it was under ISDA membership, and you had to have a membership. Now you no longer do: you can simply register online and get access to the entire open source, and there are now also open-source tools for working with the model. So if there is a certain asset class that you want to add, you can go ahead and model it and submit it back to CDM for formal inclusion, or you can just continue it in your own product as an extension. Okay.
Thank you. And another comment on the hash: as you know, in the cryptographic community a hash is known as a commitment, right? Basically, that commitment is what you are signing. So, since the CDM hash standard stands separately from the hashing standard of the underlying DLT, it is in effect something that can be trusted separately, in the sense that no matter which DLT it is implemented on, that hash will always be the same. The second point is that, since it is a commitment, basically a legal commitment arises out of it. Yeah, and that is the key part, because most of these are bilateral agreements; the products they are supporting now, the swaps and the derivatives, are bilateral agreements. Yes, but it is also applicable to post-trade processing of any individual security between a broker-dealer and a customer, or between broker-dealers. And I will say, if you bring in a custodian, it is multilateral as well: a broker-dealer, a customer, and a custodian. All of them see the same data structure through the lifecycle, so ambiguity in the references is essentially completely eliminated, at any stage of that path, independent of the underlying DLT. The CDM by itself provides you those linkages. In simple form, if we agree to take out the complexity of the underlying DLT architecture, if we all simply agree on the CDM and then put it in an immutable database, then you are also golden. Of course, what the DLT provides you is the whole security aspect of it, the communication between parties, and the consensus mechanism. But the CDM can also be used independently, and some of the banks have already started exploring this, because they have huge replication of trade data across multiple systems with no single reference, and they are looking at it and saying, why don't I use CDM?
Then, for existing platforms, it is impossible to rewrite all of them, but at least if they have a common CDM data model, anyone can consume the data from that one source. That is how CDM is being looked at within the banks; we are more interested in the actual DLT side today. It was strange not to see IBM in there with Hyperledger Fabric, because last year they were there. And I'm sorry that Tom Lee has disappeared, otherwise I would have asked him why you guys didn't show up. Yeah, well, the participants were free to choose any underlying DLT, not necessarily the ones showcased at the hackathon. You could have come in with Fabric, which is what some teams did last year; some of them used Fabric as the blockchain. It is up to the individual participants, whatever they want. Let me just show how we actually implemented the CDM in a test environment. This is the lifecycle; I am not going to go too much into it, except to say that we set up nodes for banks to come in, and we provided the whole infrastructure for the working group members so they could actually use it and verify it. We did a full test, or at least a partial test, of this whole lifecycle early in May, with all the participating banks and CSDs working together to verify that this is really doable. We used Corda, and we used CDM: everything is documented in CDM, and the messaging protocol and what is stored in the vault, or whatever the underlying DLT store is, are all CDM messages, not the underlying ledger-specific format. We store only CDM messages. That is the big advantage of not being tied to one particular DLT. And we did the whole orchestration of two parties putting in their own data: how do you create a shared-service infrastructure so that these trades can actually be matched to create a golden record?
I am going too fast; there is a lot of material in here. But ultimately, what you see is that once it is matched, it creates a golden record, and that can also be reported to the regulator. This is what we did a full test of: CDM on Corda. Corda just happened to be the DLT we picked for this particular exercise. So that is an actual use case. Now we are expanding to add more; the CCPs are interested in participating, so that is the next thing we are working on. That was a summary of what we have worked on purely on CDM, providing the test tools and going through the lifecycle. And now we are building this full lifecycle product, essentially bringing broker-dealers, customers, and custodians together. Most of the current implementations have purely looked at consortiums, and we know that is a very difficult path; the sweet spot is bringing a small set of parties tightly together, and that is what this is all about. The most interesting thing is that I went and talked to the SEC and CFTC staff over the past month or so. We discussed CDM and DLT, and then a newer, more exciting topic: multi-party computation for custody. That is probably another topic for another day, but it is all related to how CDM can be applied. And the regulators are very keen on seeing data standards being applied, although not everyone comes up with the right implementation.
I am going to skip this; it just shows that CDM is powerful enough that we are actually implementing the lifecycle of these processes, linking and chaining them all together, so that no matter where the underlying assets sit and what form they take, we can process them and provide a fully peer-to-peer shared service that the banks can use, whether it is digital tokens or their own issues that come out as new digital tokens, you name it; at a higher level, for us, it is just securities and so forth. So that is the actual implementation of CDM we are working on. Thanks for your time; I hope that fit the time we had. Any questions? Manny, I know you and I have spoken at many gatherings as well. What is the easiest way to collaborate and work on some of this, or at least see the updates on CDM going forward: how it is developing, who is using it, what models you are looking at, what use cases have been fully developed beyond post-trade? Yeah, so CDM is being applied, as I said, in two different areas. One is derivatives, where it is much further along. Initially it was more of a test scenario; now the CCPs are getting more serious, and CME and LCH are slowly getting into the game, meaning they are asking how to apply it in practice. Then there is a separate initiative on LATR, but that is still in the very early stages of CDM model definition, though there is interest within the community in applying CDM for LATR. And, as I said, we are implementing CDM for digital assets. There are other interesting new areas around payment tokens: people are talking about inter-bank cash transfers or intraday cash transfers, and some interesting applications are being developed using CDM there as well. They are all still at early stages.
I can say that, having worked on CDM for more than 18 months, and from what is being implemented on LATR, it has come quite a long way, and probably in the next few months we will be able to showcase an actual implementation. Okay, thank you. So, are these slides going to be available for others who did not see them? Yeah, I will curate them with you, Griffin. This was just hacked together, essentially from what I used for some of the DerivHack material, so we can work out what is relevant; I do not want to bring most of the proprietary details into the picture. As long as it is about CDM, we can clean up the presentation, put it out, and add it to the content. Okay, let's go on with our slides. It is very interesting, because a lot of the big banks are really involved in this, including the bank that I used to work for, BNP Paribas, which is a big proponent of CDM. I do not know how much they will use it in the coming years, but definitely JPM, Barclays, and people like that are also involved. So, we have about 15 minutes. I was hoping to go into some of the projects that we are working on. One of them is, of course, the CDM project, meaning the standards-definition project that Manny is leading, so we will put the information about that in that particular project, and then we will take up the other projects. Unfortunately, most of the interested people are not here. I think Kirti is interested in the tokenization project, and Manny went into the details of how he is extending the CDM into the digitization aspect of things, but independently of that, Ron Resnick had talked to me about collaborating on the token initiative, the TTI as he called it. So, Kirti, do you have something to say about the tokenization initiative, or should we go into the other aspects of the projects we are working on? Kirti? Oh, you disappeared. Okay. Anyway, the other projects we are working on include a taxonomy. Gary, do you want to say something?
I guess not. Hey, Vipin, it's Ron. Could you bring up the list? And again, my apologies, I've been out of it for the past few weeks with some work and travel, and I'm diving back in. I know we talked about allocating some work streams. Do you have that list that you can display, share, or post after this call, so the folks who missed some of it can get back on the horse? Yes, let me get to that. Give me a second here. I'm going to share my screen with the meeting notes, which actually list the projects. You can see this, right? Yes. The link is from the meeting notes we saw before, the CM SIG projects. In that, we have the taxonomy, and we also have the data standards work, which Mani is leading. There is a mix of standards there, and there may be others we have to look at, but primarily it's the ISDA CDM. Then we have the use-case stream, which Stan is supposed to be leading, but he hasn't shown up at the last few meetings; hopefully he'll come back. Then we have the obstacles to adoption. I should mention something here which Mani probably did not mention, but which is a very important part of CDM. There are two things. One is the non-specific hash we talked about. The second is something called synonyms, which is basically a way to transform other standards, FpML, ISO 20022, and various others, into CDM. That's very important, because one obstacle to adoption is the missing interoperability between the different standards: unless you have a way to transform one into another, and into legacy systems as well, you cannot go forward. One thing mentioned during the DerivHack was the lack of access to identity standards or identity infrastructure through the reference-data portion of CDM, and that will definitely cause a problem for adoption.
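The synonym mechanism described above, mapping fields from FpML, ISO 20022, and other standards onto one common representation, can be illustrated with a toy lookup table. All field names and mappings here are invented for illustration; the real CDM/Rosetta synonym machinery is far richer than a flat rename:

```python
# Toy "synonym" tables: map source-standard field names onto one
# common internal (CDM-like) representation. Field names are hypothetical.
FPML_SYNONYMS = {
    "notionalAmount": "notional",
    "currency": "currency",
    "effectiveDate": "effective_date",
}
ISO20022_SYNONYMS = {
    "NtlAmt": "notional",
    "Ccy": "currency",
    "FctvDt": "effective_date",
}


def to_common_model(record: dict, synonyms: dict) -> dict:
    """Translate a source-standard record into the common model,
    keeping only the fields the synonym table knows about."""
    return {synonyms[k]: v for k, v in record.items() if k in synonyms}


fpml_trade = {"notionalAmount": 5_000_000, "currency": "USD",
              "effectiveDate": "2020-01-15", "partyReference": "p1"}
iso_trade = {"NtlAmt": 5_000_000, "Ccy": "USD", "FctvDt": "2020-01-15"}

a = to_common_model(fpml_trade, FPML_SYNONYMS)
b = to_common_model(iso_trade, ISO20022_SYNONYMS)
print(a == b)  # True: both standards land on the same common shape
```

The point of the pattern is exactly the interoperability argument above: once every standard has a synonym table into the common model, any pair of systems can exchange data without pairwise translators.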
I know that Rosetta, for example, the library attached to CDM that should be used to transform outside data into CDM-specific data, has access to GLEIF, the Global Legal Entity Identifier Foundation. A mouthful, but a very important organization: it gathers data on enterprises all over the world, so you can look at an entity and ask, where are you domiciled, and what documents have you provided given where you are? It also allows for a hierarchy you can drill down: if one organization owns another, you can find the parent organization or the child organizations. Third, it lets you go deeper and deeper into the actual beneficial owners. Beneficial owners are very important for KYC/AML, because the natural persons at the base of these organizations could be on an OFAC list or some other sanctions list, meaning you should not be doing business with them if they hold a certain percentage of ownership in the enterprise you're dealing with. All of this matters for adoption, because it knits together not only the infrastructure for handling securities and their lifecycle events, but also things like KYC/AML. Ron, you have a question? Yeah, I just want to be cognizant of time, and I appreciate you going through the particular work streams. A couple of logistical things, Vipin, and I suspect some others might be having this problem: as I recall, I had raised my hand as an interested party for five and six, I think it was regulation and tokenization, but I've been unable to log in. I'll take that up separately, but if you have access to this, please feel free to add me, because I'd like to help move those two topics forward if that's useful. So you have to get a Linux Foundation ID. Do you have one?
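The hierarchy drill-down and beneficial-owner screening just described can be sketched with a toy in-memory registry. The entity names, the sanctions list, and the 25% threshold are all made up for illustration; a real system would query GLEIF's LEI data and actual sanctions feeds rather than a local dict:

```python
# Toy registry in the spirit of an LEI hierarchy: each entity records its
# direct parent and its natural-person owners with their ownership stake.
REGISTRY = {
    "ACME-UK": {"parent": "ACME-HOLDCO", "owners": {}},
    "ACME-HOLDCO": {"parent": None, "owners": {"J. Doe": 0.40, "K. Roe": 0.10}},
}
SANCTIONS_LIST = {"J. Doe"}     # illustrative stand-in for an OFAC-style list
OWNERSHIP_THRESHOLD = 0.25      # flag owners at or above this stake (assumed)


def ultimate_parent(entity: str) -> str:
    """Walk up the parent chain to the top of the hierarchy."""
    while REGISTRY[entity]["parent"] is not None:
        entity = REGISTRY[entity]["parent"]
    return entity


def flagged_owners(entity: str) -> list[str]:
    """Beneficial owners of the ultimate parent who are sanctioned
    and hold at least the threshold stake."""
    top = ultimate_parent(entity)
    return [name for name, stake in REGISTRY[top]["owners"].items()
            if name in SANCTIONS_LIST and stake >= OWNERSHIP_THRESHOLD]


print(ultimate_parent("ACME-UK"))  # ACME-HOLDCO
print(flagged_owners("ACME-UK"))   # ['J. Doe']
```

Even in this tiny form it shows the two lookups the discussion names: parent/child traversal, and screening the natural persons at the base of the ownership tree.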
I do, I do have a Linux Foundation ID, and the last couple of times it might have been me fat-fingering access or something, but... Okay, so you know what to do when you log into the wiki, right? Which is where you have to go: wiki.hyperledger.org. Yep. So if you go to the wiki... You don't have to do that here, I want to be kind of... It's important, because people have been telling me this. What happens is that it will usually bring up a login window, and if you do not get that login window, you should clear your cache or do whatever causes that login window to appear. Once you log in, you should have full access to the CM SIG pages. That's what I'm trying to tell you. I only bring it up because this is important: if people cannot contribute, it is a problem, right? Yeah, to be fair, I had tried a couple of times and couldn't get in, it went on my list to get back to, and three weeks went by and I never did, so I'll work on that again today. Okay. There are several other working groups here: regulation, tokenization, digital privacy, and oracles, which relate to another part of the CDM, the reference-data part. I'm not really sure exactly how that works; I'm looking into it a little more. But oracles are a very important part of the functioning of any system, because basically you have the blockchain system and then you have the outside world. How do you interface with the outside world, whether to get prices, security data, identity data, whatever it is? All of that has to happen through the intermediation of oracles. Anyway, we've come to 10:57. If anybody else has anything to say, now is the time. Nothing for me. All right, I'll make the recording available. I will also work with Mani to get something he deems presentable out of the slides, and I'll put what I can get out of the recordings into the meeting minutes. Thank you, and hopefully this has been a worthwhile session. Thank you, Vipin.
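The oracle pattern described above, keeping on-ledger logic ignorant of where outside data such as prices comes from, can be sketched as a small interface plus a stub implementation. All names (`PriceOracle`, `mark_to_market`, the instrument ID) are hypothetical; this is a minimal sketch of the intermediation idea, not any specific oracle product:

```python
from typing import Protocol


class PriceOracle(Protocol):
    """Abstract interface that on-ledger logic depends on."""
    def price(self, instrument_id: str) -> float: ...


class StubMarketDataOracle:
    """Stand-in for a real oracle that would call an external
    market-data service; here it just serves canned quotes."""
    def __init__(self, quotes: dict[str, float]):
        self._quotes = quotes

    def price(self, instrument_id: str) -> float:
        return self._quotes[instrument_id]


def mark_to_market(notional: float, instrument_id: str, oracle: PriceOracle) -> float:
    """Contract logic stays oracle-agnostic: it only sees the interface,
    never the external service behind it."""
    return notional * oracle.price(instrument_id)


oracle = StubMarketDataOracle({"BOND-XYZ": 0.98})
print(mark_to_market(1_000_000, "BOND-XYZ", oracle))  # 980000.0
```

Swapping the stub for an adapter that calls a real price, identity, or reference-data service changes nothing in the contract logic, which is exactly why oracles sit at the boundary between the blockchain and the outside world.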
Much appreciated. Thank you. Thanks. Thanks, all. Thank you. Thanks.