Okay, welcome everybody to this Tuesday, October 17th, Climate Action and Accounting Special Interest Group meeting. We have a special guest presenting today. Before I introduce him, I want to walk through some housekeeping basics. We follow an antitrust policy, so if this is your first time on a call within the Hyperledger community, please check that out. We also follow the Hyperledger code of conduct, which is very important for operating and participating here. I'm getting some sort of bell, David. Do you know what is causing that? Oh, it's just when people join. I'll turn that off. Okay, thank you. So for those who are joining again, what we usually do is kick off by seeing if there's anybody joining for the first time today. We'd love to have you raise your hand and give a quick introduction. I'm seeing a lot of familiar faces, but want to see if there are any new faces. Is there anybody on the call for the first time today? No? Okay, well, then we can probably skip the how-to-contribute conversation and hop right in. I want to introduce Matthew Larson — Lawrence, I'm sorry — who is the chief strategy officer for Tolam Earth. This is a conversation that Matthew and I have been having some back and forth about for a little while now, and I'm really excited that we were finally able to get him to share his time and his expertise with us. I see a number of familiar names on the call today, so I'm excited to be able to talk to some of the people who I expected would be interested in this. I'm glad you could join. But let me go ahead and hand over to Matthew. If you want to share your screen, we'll have him present and then we'll open it up for question and answer. Okay. Sure. Thank you. Once I stop sharing mine, you should be able to share. Excellent. Well, thank you very much. Thank you for this opportunity, and thank you for supporting this.
I've attended these presentations many times before, so it's certainly a pleasure to be here presenting to you. So can I check that you can see the screen? I think so — thumbs up. Yes. Okay. So today I'd like to present Tolam Earth. And in fact, I'm not brave enough to do this myself, so I brought some colleagues — they kept quiet when you were asking for new participants. I brought Cindy Montgomery, who is director of ecosystem partners, and CG John, the CTO. So for any tough questions, they will be able to help. And actually, I saw in the participants Christian Powell and Alex Howard, both also long-term collaborators and very dear friends of Tolam Earth. So I think we're well stocked if there's a discussion at the end of the presentation. The topic is automation of digital environmental asset exchange. And I think the reason to discuss this now is because the challenge that we have is huge. I was at Climate Week a couple of weeks ago, and there were lots of discussions, lots of good things happening, but I actually saw very little discussion about the enormity of the task that we have. We think that in North America, there are 5,500 million metric tons of carbon being put into the atmosphere each year. And even if we think about the largest direct air capture facilities, that's only half a million tons being sequestered each year. So we have to do a huge amount of work to bring the scale of the carbon markets up to the challenge, and I don't think that will be possible if we're talking about one-million-dollar deals — over-the-counter deals where people have to fly out, make assessments, and so on.
The only way that we can achieve that kind of scale is by moving to on-screen deals: deals that are facilitated in the way the CFOs of major corporations are used to working, process-driven, embedded within the day-to-day processes of huge enterprises, with automated trading. And then on the other side, the origination side, we need to work out how to make onboarding supply far easier. If onboarding is manual, that tends to favor large projects, which means that a lot of very high quality smaller projects just can't afford to play. So we need to look at ways to automate the onboarding of supply as well as the purchase of assets. That's not to say that a lot of the projects currently happening are not valid or not valuable — in fact, they're providing tremendous value by bringing the patterns, the knowledge, and the industry up to where it is at the moment. But alongside that, we need to start the conversation about how we're going to move towards the $15 billion market expected in 2030, which will be very different to the circa $2 billion market that we have today. So Tolam's approach is that we want to bring trusted assets, exchanged equitably with trusted prices and at scale, implemented in a way that encourages reduction of emissions. That's quite a mouthful of a sentence, but it brings together quite a lot of activity that we're putting in place and a lot of innovation being brought to the market. For each of these there's a slide to follow, but a quick explanation: trusted assets are going to be vital. It's the only way that trading can move from manual assessment to on-screen. If you're buying something from Amazon, you don't feel the need to call up the manufacturer, ask about the factory process, or ask who's on B shift on Tuesday night to work out exactly how the thing was made.
There's a level of trust that's required in order to facilitate this kind of purchase. And to do this, we use Hedera's Guardian. I think previously in this session Geisenberg has talked you through the Guardian, but today we can also bring a bit of an update as to how it's being used in an ecosystem context. The next point is the assets being exchanged equitably. There's a specific thing about the carbon market: people are not just purchasing a good to be consumed, they're also, to some extent, investing in or providing sustainability to the originator of that good. So there needs to be a level of confidence that the money they're spending is actually being returned to the originator of the carbon credit. If there's an assumption that it's being sucked out by intermediaries, that it's not going back to the origination source, then people won't trust that their money is being used as they want, in a way that will support the ongoing growth of the market. Trusted prices are important because the assets we're discussing here are in fact very complicated. They have many attributes, and they're extremely different from one another. A relatively small-scale nature-based program in Latin America is very different to a huge carbon extraction facility in Wisconsin. Each has a different cost basis and also a different climate impact — the longevity of the climate impact and the way the carbon is extracted are very different between the two — so they have different prices. So there has to be a way for price discovery to be prevalent, so people can understand the expected or fair price of the assets. And then at scale: as I say, we need to be able to transact at very high volume, so we need a way to remove a lot of the friction and manual processing, and allow people to go to a screen and make carbon purchases at scale in a very easy way.
And then the last part is to reduce emissions. We believe in companies that are committed to the carbon market — companies that are embedding carbon reduction into their processes, embedding carbon offset purchases into their processes, reporting correctly, and also reducing emissions. So we see this as not just a purchase, but a holistic set of processes and reporting around that purchase. The Tolam platform therefore also has a set of APIs for reporting and ways to embed those purchases into partner applications as well. To look at each of these in a little more detail: the first is trust in the asset, and this is where we use the Hedera Guardian. The Hedera Guardian is effectively a workflow engine: a set of roles and actors operating a workflow which is centered around an open-source policy. Each of these actors could be a person, or it could be a sensor or a piece of instrumentation, and they are signing the transactions as the data is uploaded. By collecting this data in a structured form, there's a set of data that can prove the ecological veracity of the asset. Now, if we do this correctly — in a well-structured way where we use well-defined data structures — then it allows other people to automate their verification processes, so people can build on top of this, take that data, and find automated ways to verify the asset. So we're working with various large auditing-type organizations on how they can attest to the purchase by verifying the data that's being produced. And this data can then be augmented with other data — for example financial or rating data — to bring some automation around the decision-making. A customer can have a particular desire or profile that they wish to purchase against, which could include the ecological data but also the expected or assessed quality of that asset.
And by making this data transparent and visible, these processes can be automated and built on top of the Tolam platform. The next piece I'd like to discuss is the price oracle. As I said, carbon credits are particularly complex assets — very difficult to value because of the variability within the attributes and also the number of attributes that make them up. So to help here, we've trained a machine learning model to ingest data, look at the attributes that together constitute the actual asset, and then estimate the impact of those attributes on the overall price of the asset. To do this, we bought a set of third-party data, equivalent in value to just over half of the voluntary carbon market — a relatively large data set — and then we trained the model to really understand the implications of the attributes of the asset. You can see this in the second of the four boxes that we have here. The first box is the overall data set and predicted price against actual price; we have very good correlation on the overall price. Within that price, you can see in the second box a vertical set of attributes. It's a little bit small, so apologies for that, but you can see at the top, perhaps, vintage year, sector, project offset type, et cetera. And for each of these attributes, you can see the impact of that attribute on the price. You can roughly see a vertical line at around $8.60; that would be the average price of assets in the data set, so if you had no information, that would be your best estimate. But then you can see the vintage year — in this case 2020 — has a $2.12 positive impact on that price. The sector, forestry and land use, has a $1.46 positive impact on the price. But it's an avoidance project, which are viewed relatively poorly within the market, so there we have a negative $1.01 impact on the price.
So by adding together these various impacts, we can come up with an overall price for this particular asset, which in this case was $12.85. Now within this, you can dive deeper into particular attributes. The third box is the impact of sustainable development goals. Here we look at the various combinations of sustainable development goals stated within the project and see how those impact the price. The diagram I have here shows just the number of SDGs, but we can also see the combinations of those and how they impact the price as well. And then the last box is purchase preference. Here we've taken a customer — I think in this case it's Delta Airlines — analyzed their purchases, and were able to predict with reasonable (not perfect, but reasonable) accuracy the attributes that they will buy in their next purchase. So this is moving towards the capability to have a kind of recommendation engine for particular customers, to help them select their next purchases. You can see that this set of information is useful at the time of trade: both buyers and sellers can have this estimated fair price, so as they discuss and agree a price, hopefully some of that friction has been removed. But it's also useful pre-project. For example, if you're still in scenario planning, you can make choices based on knowledge of the financial impact of those choices, to create the project that has the best return on investment given the set of inputs that you have. And if you are an investor with a portfolio of assets, you have the ability to continually value those assets over time, to understand the value of your portfolio. The second part of trust in the transaction is the on-ledger payment flow.
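The additive attribution described above — an average market price plus per-attribute impacts — can be sketched in a few lines. This is an illustrative sketch, not Tolam's actual model or API; the attribute names and the final "other attributes" figure are assumptions chosen so the example reproduces the numbers quoted in the talk.

```python
# Illustrative sketch of additive price attribution: a baseline market price
# plus per-attribute impacts learned by a model. Names and the catch-all
# "other_attributes" value are invented for illustration, not Tolam's API.

BASELINE_PRICE = 8.60  # average asset price in the training data set

def estimate_price(attribute_impacts: dict[str, float]) -> float:
    """Sum the baseline and each attribute's learned price impact (USD)."""
    return round(BASELINE_PRICE + sum(attribute_impacts.values()), 2)

# Example impacts quoted in the talk, in USD:
impacts = {
    "vintage_year_2020": +2.12,        # positive impact on price
    "sector_forestry_land_use": +1.46, # positive impact on price
    "offset_type_avoidance": -1.01,    # negative impact on price
    "other_attributes": +1.68,         # assumed remainder of the estimate
}

print(estimate_price(impacts))  # 12.85
```

The same decomposition is what makes the estimate useful pre-trade: either side can see not just the number but which attributes moved it.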
There have been some unfortunate stories in the carbon market where particular purchases have actually led to large outflows of money from the market into intermediaries — sometimes 70-plus percent of the money paid for an asset going to an intermediary. There have also been people who have traced, for example, the financial flow of an airline carbon offset for a particular journey. In a discussion I was having last week, someone had offset a journey for $15. It had gone through something like three sets of outsourcing, down to what was assumed to be the project, with $3 remaining. But that project had no information attached to it; the entity that held it couldn't be found. So at best $3 of the $15 went to a project; at worst the whole $15 just disappeared. I think this type of murky financial flow will lead to a reluctance to purchase offsets and will stifle the market. So what we're hoping to bring to bear is a USDC-based payment flow, which means that each of the projects, the communities behind them, and any entity associated with a development goal stated by that community can have a wallet, and the actual transfer of funds into those entities can be transparent and verified. As well as that, we can imagine that intermediaries like Tolam could also have a wallet, so their extraction of value from the chain can also be assessed — meaning that at, or prior to, the point of purchase, a purchaser will know exactly how much of their money will go to the project and origination entities, and how much will be extracted by intermediaries. That can form part of their purchase decision.
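The wallet-per-entity idea above can be sketched as a simple payment split. This is a hedged illustration only — the wallet names and percentage shares are invented, and a real on-ledger flow would record each transfer on chain rather than in a dictionary.

```python
# Hypothetical sketch of a transparent payment split: each entity behind a
# credit (project, community, SDG partner, intermediary) holds a wallet, so
# the share of a purchase reaching each one is visible before buying.
# The shares below are invented for illustration.

def split_payment(amount: float, shares: dict[str, float]) -> dict[str, float]:
    """Allocate a purchase across named wallets by fractional share."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return {wallet: round(amount * frac, 2) for wallet, frac in shares.items()}

shares = {
    "project_wallet": 0.70,       # origination entity
    "community_wallet": 0.15,     # community behind the project
    "sdg_partner_wallet": 0.05,   # entity tied to a stated development goal
    "intermediary_wallet": 0.10,  # e.g. the marketplace's fee
}

print(split_payment(15.00, shares))
# a buyer can verify up front how much of a $15 purchase reaches the project
```

Contrast this with the $15 story above: here the $3-versus-$15 question is answerable before the purchase, not after a forensic trace.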
And we anticipate that if there is confidence that a high proportion of the money is going to the origination entities, then in fact people will be willing to pay a higher price. Now, this is a difficult thing to put in place. It's been somewhat disrupted by the turmoil in the crypto markets of late, but we are working with major payment providers as partners to put it in place, and it's something we hope to release in the next year. So now we get to the part about scale. Tolam's contribution to trading at scale is what's called the ARM, or Automated Regression Market Maker. In effect, this is a way to match buyers to assets using machine learning. As I said before, an asset is defined by its attributes and is relatively complex, so the ability to match a particular purchase scenario to an asset isn't straightforward — we use machine learning to do that. We also have the ability to estimate price. We can then bring those things together in an automated market maker context in order to execute the trade automatically. Now, in fact, the technology — the plumbing, if you like — is in place to do this, but we're really using it in a more advisory role internally at the moment, just because of the potential to manipulate the market. We don't yet have the maturity to allow this to be released into the wild. But as we continue our marketplace activity and execute more trades, not only does training data become available to improve the models behind the ARM, we can also improve the robustness of the ARM and at some point release it as an automatic trading agent. And the anticipation is that this will drive scale and liquidity: people will be able to operate liquidity pools with their own ARMs.
They can also automate their various applications using automatic buying and clearing of trades, automatic stocking of inventory levels, et cetera. This will be the dominant trading mechanism within the Tolam platform as we go forward. I mentioned at the beginning how important it is to find ways to make onboarding easier for projects and to lower the barriers and the complexity for new projects that want to participate in the market. One thing we have been doing is working with one of the well-known registries to integrate the registry with the Tolam platform and the Guardian ecosystem. Within this, there's the registry; there's also a forwards platform, so the ability to use forwards to fund projects will be available; there's an onboarding platform to allow people to use a software-driven approach to onboarding their project; and there's an MRV platform to provide the data as the assets are created. These are combined with the Tolam trading platform to give an end-to-end experience that will hopefully lower the cost of entry and facilitate easier origination of assets. In terms of the progress that we have made: first of all, we have a major customer in Atma.io. Atma.io is a division of Avery Dennison, and effectively it's a very large-scale track-and-trace application whereby all the assets moving through a manufacturing flow have a digital identity, so they're tracked. The carbon content of those assets, as they're combined into finished goods, is also tracked, which means that at any point in the manufacturing process you can have a snapshot of the carbon content of the inventory within your flow. You can then connect to the Tolam Earth marketplace and buy credits in order to offset that, which allows you to put verified carbon-neutral inventory into the next step of the manufacturing process.
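The track-and-trace carbon accounting just described — items carry a carbon figure, and combining them into a finished good sums the embodied carbon — can be sketched as follows. This is a minimal illustration under assumed structures, not Atma.io's actual data model.

```python
# A minimal sketch (not Atma.io's data model) of carbon-content tracking:
# each item has a digital identity and an embodied-carbon figure, and
# assembling items into a finished good sums their carbon plus the carbon
# of the assembly process itself, giving a snapshot at any point in the flow.

from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    carbon_kg: float                     # embodied carbon for this item
    components: list["Item"] = field(default_factory=list)

def assemble(good_id: str, parts: list[Item], process_carbon_kg: float) -> Item:
    """Combine parts into a finished good; carbon = parts + the process."""
    total = process_carbon_kg + sum(p.carbon_kg for p in parts)
    return Item(good_id, total, parts)

# Hypothetical apparel example:
fabric = Item("fabric-001", 4.2)
dye = Item("dye-001", 0.8)
shirt = assemble("shirt-001", [fabric, dye], process_carbon_kg=1.5)

print(shirt.carbon_kg)  # 6.5 — the amount to offset before shipping onward
```

Once the snapshot figure exists, purchasing that many credits from the marketplace is what lets the next step in the chain receive a verified carbon-neutral good.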
So if you're providing onward to a supplier in the supply chain, they can have an incoming net-zero good, which makes their process more efficient and also allows them to validate their own ESG goals. Avery Dennison currently has something like seven billion items being tracked. They're used by six of the leading fashion and apparel companies and four of the top 10 food companies. So this is quite a large implementation of the embedding of carbon tracking within the supply chain — on the Hedera network. The next items are all project names. The first is Project Lotus, and effectively this is where a set of companies have combined to create a platform for geothermal digital assets. These are RECs and also, in some cases, carbon credits, and here Tolam acts as the sales channel for that group of companies. Project GMT is a collaboration between Tolam, Alcott, Cultivo and the Carbon Opportunities Fund. This is where we're really demonstrating collaboration within the Hedera ecosystem to bring these Guardian-native assets to bear, with the IDF acting as the purchaser, or at least the intermediate purchaser, of those assets. Fintra is the project to bring the USDC payment flow to bear. And then Project Harmony is again a Hedera ecosystem-based partnership, but here we reach all the way back into forwards and project financing, with the involvement of Eversity in the project. Just a quick note on the Avery Dennison and Atma.io collaboration: here we're starting off with a small number of digitally native assets brought through the Guardian, but then we have a set of partners who will be joining, including Nova and Christian Powell.
And the goal is to sell these very high-quality assets through the Atma.io platform, and effectively to demonstrate the power of the data that's collected during the origination of these assets within the overall Atma.io platform capability. Okay, I think that's probably the end of the presentation. I can talk a little more about access and supply, but at this point I'd like to open up to any questions people might have. Thank you very much, Matthew. Please do raise your hand. Oh, I see one hand raised there — Jim Mason, please. Two questions. One: I've created some marketplaces myself, retail on blockchain, and one of the challenges we always have — I see your transactions require USDC stablecoins. I'm not worried about the auditability of those, but the conversion costs are always very high. They're not cheap; it's more like using a credit card on a website, where you're paying a relatively high fee. So one of the challenges I always had was trying to engineer a way to significantly drive down the conversion costs into and out of something like USDC. And the second challenge is trying to provide a better alternative that's more secure than things like MetaMask wallets. The better way to do that is, in a sense, to support custodial wallets through, I'll quote, better custodians. There are some that are good and some that aren't, obviously, but long term, you know, it's kind of like your IRA: it doesn't sit in your MetaMask wallet, it sits with a custodian. So ultimately better custodial wallets are going to be, I think, the bigger place where you're going to be looking for payments. So just your thoughts on those two items. Thanks. Yeah, so first of all, I agree. One thing to note is that we're not dependent on the USDC payment flow; this is something we're hoping to bring in to improve what we do.
It brings the financial transparency, which is an important part of what we do, but actually our initial trades are in fiat currency — we don't yet have this in place. And I think, yeah, it's a difficult thing to bring to bear, and also a difficult thing for a startup like Tolam to bring to bear; we need major partners in order to be able to do this. So what we're doing is trying to partner with large relief organizations that actually have the same problem but have a bit more scale than us. And then the hope is that we will bring in more significant international partners who have more investment in stablecoins, to hopefully drive those costs down. I have a question. I'm curious about the early use cases you're seeing with people bringing their carbon credits into the Tolam environment. What's driving the early innovators to adopt and use the platform? What are some of the early use cases? So I think there's two things — well, no, there's three things. The first is that we can see at the moment that demand is stalling, right? The market is having a very difficult time, and the challenge is that people are buying things that turn out to be not what they expect. They're getting caught out, and getting caught out quite publicly. So the first attractive feature is the fact that the process we have provides this verifiable set of ecological data that can bring more confidence in the thing they're buying. So really there's a coalition of the willing — people who believe that, in the end, it's a virtual asset: the asset is only data, so stronger data will lead to a stronger asset. And they see not just Tolam, but in fact the Hedera and HBAR ecosystem, as a way to bring that to bear. So there's a belief that the approach is good.
And then I think the hope they have is that prices will go up, right? If they do this, they will get a higher price for the asset they're selling. Without this, all prices have been dragged to the bottom, really, and people are struggling. So they see this as a way to differentiate their assets — a way to show that not all carbon credits are the same, and by doing this, not have all carbon credits pulled down to the lowest possible price. Thank you. Jim, I see you have another question. Real quick. Your platform is already operational, right? So people are buying and trading on that platform; there are carbon credits I can get. Given that — I don't know the platform, obviously — where is the current market price for carbon offsets on the platform, ballpark? Yeah. So actually we only trade through channel partners at the moment. The channel partners provide the platform, and we effectively provide a marketplace component within that. And the initial channel partners we have are still trading at, I would say, relatively high prices. There's a kind of altruistic approach being brought to bear here: they don't want to follow the price all the way down. If we look at, for example, the price I gave you before in the slide about the price oracle, that asset went for $12.85, with $8.60 being the average price of the market. If we look at that data today, the average price of the market is below $5 — it's severely depressed. And I've seen some data just this week giving average prices around $3. Now, I think this is mostly because the market is flooded with very cheap assets.
But I would say that it's of course highly dependent on the asset type and what kind of origination process is being used. Even high-quality nature-based solutions that were trading at circa $14 to $16 at the start of the year are probably significantly below $10 at the moment. Great, thanks. One of the things you talked about was this issue of market manipulation, and it sounds like you were building to anticipate how to respond to that, or somehow try to mitigate it. Can you share a little about how your solution would respond and what your thinking is about mitigating that risk? Yeah. So the first challenge is that you need some level of scale to be able to respond to that. If you consider the rough function of the ARM: we have a set of liquidity pools, and within those liquidity pools there's an underlying demand that's dictating the price. So when we have relatively scarce assets, or scarce availability of assets, it's quite easy to put assets into those pools, take assets out of those pools, and affect the price quite dramatically. So we need to get the scale up, and then we need to think about how we can build in more protection mechanisms to control the way people can use the marketplace and control any manipulation that happens. In fact, Tolam is a spin-out from a company called Object Computing, which is a technology consultancy in St. Louis. Object Computing has quite a lot of history working with financial organizations on automated tracking of trades at very high volume, looking for fraudulent or manipulative interactions. So the belief is that when we get to a certain scale, we'll be able to apply those methods — but really, we need to lift the market up first.
And I think it could be a year or more before we get to the level of having the number of transactions going through that lets us automatically detect this kind of activity. Okay, thank you. Switching gears and pulling back a little more broadly: obviously you're a leader in bringing carbon credits on chain, and I understand the value of that, but I imagine the marketplace might have some barriers and some concerns. So I'm curious, from all the conversations you've had, what are the key barriers to the adoption of this approach that you're hearing? And the second part of the question: how do you foresee the evolution by which those barriers are overcome? Yeah. The key challenge we've had is that we're not, by any means, the first company to tokenize assets. The tokenization of assets has been done before, and in those cases it's often quite difficult to really understand what the additional value of the tokenization has been. The customer is having to open a new set of accounts, having to go through another round of KYB and the steps involved in order to trade these assets — yet the assets are just the same as they are on paper. There's no real difference to them; it's just more digital. But in fact, that just means more work. So the first challenge is to really be able to articulate, and have people understand, that we're not just tokenizing the title of the asset. It's not just the ownership that's being tracked. Double counting is a well-known problem, and tokenization of the title can help with that. But what we're actually doing is creating a token that is associated with the ecological data that was collected during the origination of that asset.
So there's a data set that can be used to prove the asset has the climate impact it's claimed to have. That is a big step. Now, in order to get the marketplace up and running, we need people to actually have done this, right? We need assets whose originators have set up a Guardian and had the MRV data flow into that Guardian to create this ecological data set. That is a cost, and I think the carrot to get people to do that is the fact that if they do it, they will get a higher price for their asset — that asset will be better differentiated and better trusted. But that's a little bit of a leap of faith, especially when the market is quite depressed. So I think this is where it's helped us to have quite large believers in the cause, if you like, step in and say, okay, we will buy those assets at this price — including, for example, the IDF fund through the World Bank. So people are really underwriting this step. That has allowed us to bring in the project developers to put this in place and make the extra effort to get the wheels turning. Once the wheels are turning, it will be clearly visible what the advantages are, but for those first few suppliers that have to go through this additional process, it was difficult. Thank you. Jim, I see your hand is raised. Real quick. It sounds like — if I want to get the role right for Tolam — basically you're in the middle. You're not providing any assets directly; you're consuming or registering other ecological projects, whoever they are, that are producing the assets, with whatever verification is on that side. And then of course you are, in a sense, providing ecological asset services, I'll call it, out to other marketplaces where your stuff is sold. So in effect, you're in the middle there. Am I right about that? Yep, you're dead right.
If we consider, okay, part of what's happening here is we're trying to collate a huge number of small suppliers, often in developing nations, to bring that large volume of supply to bear, and then connect those to Fortune 500 companies, very large corporate purchasers. Okay. So there's a big difference between those two worlds, and part of what Tolam does is try to bridge that. We collate inventory together to allow major purchases from big corporations. We handle the management of the assets, as you correctly say, so the transference of the ownership of the assets; we handle the payments. And then of course there's the retirement. One thing that we didn't get into is that within the retirement, we're developing a process that effectively allows you to retire a reduction directly against an emission. Okay. So APMA, for example, they quantify, or they also tokenize, their emissions. This means that when they're reporting out, they're able to say, okay, this is the tokenized emission for this exact set of inventory, and here's the offset. So you get the full data chain going from emission to offsetting, and the ability to bring that together and have it easily reportable and analyzable, if you like, is what Tolam does too. Yeah. It's an interesting, I'll call it, spot to sit in the middle. What would be nice, as a slide that I don't see, is one that shows me that entire chain: on the one end, you have either a corporation or an individual going to a marketplace as the consumer and saying, hey, I'm interested in acquiring offsets. And on the very bottom, you have, I'll call it, the farmer in a sense, or whoever the solar generator is, wind generator, forest farmer, whatever. And saying, here's our chain of all of the partners involved, in a sense, in that chain. Do you know what I mean as a model? Because you actually have individuals lined up here showing us.
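The retirement process described above, retiring a reduction directly against a specific tokenized emission so that reporting carries the full emission-to-offset chain, might look roughly like this in outline. This is a hypothetical sketch; the names and structures are invented for illustration and are not Tolam's API:

```python
from dataclasses import dataclass, field

@dataclass
class EmissionToken:
    emission_id: str
    tonnes: float
    offset_against: list = field(default_factory=list)  # ids of retired offsets

@dataclass
class OffsetCredit:
    offset_id: str
    tonnes: float
    retired: bool = False

def retire_against(offset: OffsetCredit, emission: EmissionToken) -> None:
    """Retire an offset directly against one tokenized emission, so a
    report can show exactly which reduction covers which emission."""
    if offset.retired:
        raise ValueError(f"{offset.offset_id} already retired")  # no double counting
    offset.retired = True
    emission.offset_against.append(offset.offset_id)

def net_tonnes(emission: EmissionToken, ledger: dict) -> float:
    # Net position for one emission: gross tonnes minus all offsets
    # that were retired specifically against it.
    offset_total = sum(ledger[oid].tonnes for oid in emission.offset_against)
    return emission.tonnes - offset_total
```

The one-time `retired` flag is the sketch's stand-in for what the ledger provides in practice: an offset can only ever be consumed once, and the link it leaves behind is what makes the emission-to-offset chain auditable.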
But my area of focus lately, although I have the corporate connections to see who's doing what on that side, the bigger thing is trying to understand, and I'll call it architect better, the supply chain side, to manage that in a much cleaner way, for me anyway. But yeah, that one slide would be helpful. Thanks. No, you're dead right. We'd like to go beyond, you know, the person who's purchasing the offsets, to the person who's purchasing the goods. So you buy your new Apple Watch, whatever series it is, you scan the code, and then you get the carbon history of that product throughout its manufacturing life cycle. That's really the goal that we'd like to get to. Or, you know, the interesting application in my mind is being able to net out the emissions in scope three, so that you can show the emissions that are occurring and the offsets that are occurring all the way through. I mean, that'd be a really interesting application. I will throw it out there that, as you think about what you're trying to do, it's fundamentally a supply chain problem. So I'll flip it over and say, having worked a lot of my career in supply chain, you probably need to coordinate with what I call the models that the supply chain group here is using, their SIG, I guess. Because when you look at supply chain, what you're always looking at is you have, in a sense, a distributed set of, I'll call it, providers in this chain, and the trick to it, as you're trying to show tracing, is they have the same problem with food transparency, right? To say, here you're eating a salad, and where did that head of lettuce come from?
They have exactly the same traceability problem, if you will, and when they're going through those different levels, they wind up, in a sense, having to get the tracing to run at what I call lot-level control all the way through external systems, which ultimately would be your challenge when you're trying to talk to me and say, hey, this Apple Watch used a carbon credit that came from Jim's Maine forest, that kind of a thing. That traceability is very similar to the food supply chain. Yeah. All right. Supply chain is actually one of the SIGs I'm in, so, yeah, on what was just being talked about: when it comes to food, some perishable food or wine and so forth, the way it's been accomplished actually is through IoT devices. You actually have to have that third element in there to read out the conditions that the food has seen and so forth, as a certificate of freshness, let's say, in a supply chain. I did have one quick question, maybe from a technical standpoint: how is this solution holding certificates of authenticity, or verification of the generation of the carbon credit? How is that actually held in there? Is it in a smart contract for every single one? Is there a group of transactions? And how is it assured the right ones are getting to the right people when there is an exchange, a financial exchange, of a credit? Matthew, can I chime in on this? Yeah. Jeff, you hit the right note on how the traceability is enabled through IoT. This is exactly one of the directions that we're going to take. If you take a look at the broader dMRV project, the digital-native tokenization, which is more on the supply side, is going to be heavily involved, where really the goal is that connected projects, with IoT devices and everything that is needed on that side, are natively digitized. Today, where we start out, we migrate non-native assets, these tokens, onto a digital platform and build traceability from there on.
The dMRV project would be natively digitizing onto a ledger, and you will be able to trace all the way directly through. You have to really start with digital signatures, provable digital signatures, from that point onwards, and that's the only way to do it. I completely agree with you. Regarding the tokens and where the certificates are stored: as part of dMRV, whether it's native digitization or this paper-to-digital transformation, what we have is ecosystem tools like, for example, and I don't know if Matthew covered it, the Guardian tool, which is another component of this ecosystem, that is basically publishing all the metadata associated with every single token to solutions like IPFS, with the proper signatures, and linked so that they can be cross-referenced. So for every single offset, for example, if you're looking at an offset that has gone through a cycle of creation, listing, purchase, possible repurchase, and eventually retirement, you can go look at that offset independent of Tolam and be able to drill down into all of its metadata that's available in solutions like IPFS. It's not on a native ledger solution, but it is immutable data that is linked tightly with the ledger, and in order to do that you don't have to rely upon Tolam. Tolam, or for that matter any other exchange that enables these offsets to be purchased and retired, can step back, and the data is still available on the ledger for independent verification. Thanks. Also, a key point you mentioned is the verification of the source: you're sharing the public key? It is. So they can validate the signature then, I see. Thanks. Thank you, Jeff. Are there any other questions from the group? Jim, please. One quick thought. Signatures are very limited.
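The verification model just described, metadata published off-ledger (for example to IPFS) but hash-linked so anyone can audit an offset's lifecycle independently of any exchange, can be illustrated with a simple hash-linked event trail. This is a deliberately simplified stand-in: real IPFS uses CID/multihash encoding rather than raw SHA-256 hex, and a real system adds digital signatures on top; all function names here are hypothetical:

```python
import hashlib
import json

def event_hash(event: dict) -> str:
    # Canonical JSON so identical content always hashes identically.
    return hashlib.sha256(json.dumps(event, sort_keys=True).encode("utf-8")).hexdigest()

def append_event(trail: list, kind: str, detail: dict) -> None:
    """Append a lifecycle event (creation, listing, purchase, retirement)
    that commits to the hash of the previous event."""
    prev = trail[-1]["hash"] if trail else None
    body = {"kind": kind, "detail": detail, "prev": prev}
    trail.append({**body, "hash": event_hash(body)})

def verify_trail(trail: list) -> bool:
    """Independent verification: recompute every hash and check the
    chain links, without trusting whoever served the data."""
    prev = None
    for ev in trail:
        body = {k: ev[k] for k in ("kind", "detail", "prev")}
        if ev["prev"] != prev or event_hash(body) != ev["hash"]:
            return False
        prev = ev["hash"]
    return True
```

Because each event commits to its predecessor, tampering with any step of the lifecycle breaks every hash after it, which is what lets the exchange "step back" while the record stays independently checkable.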
There are a bunch of challenges in the more mature market, I would say. Digital keys also don't work the same way; you have to move to what we call signature services, just like wallet services, ultimately, so those are architectural changes needed. But that said, when I look at it, well, the market's the market, and that's a problem, as you pointed out very well, so I can't change the market. Honestly, when I talk to certain corporations, I know there's high demand out there, and obviously everybody wants to pay the lowest price possible, just like with a farmer, you don't want to overpay for the apples you buy. But the other side of it, and you really hit it well here in the presentation, is that the core thing, trust end to end, is really what you're selling more than anything else. Trust is the product. And when you think about trust as the product, which I am actually thinking about on the supply side of this, you really have to come down on something better on the supply side, going all the way down to the bottom. With that, one of the things, in fact a good example of failure in the market, is to look at stablecoins. A lot of the stablecoins that have, quote, not done well, minus Tether, if you get past Tether and look at the other ones, the ones that aren't doing well are the ones that lack independent proof of governance. So one of the things that would probably be very important to your success is independent proof of governance, which you don't have. What you can't do, it's like the maker-checker model, is you can't do it yourself. You can't say, look at me, which is what the other marketplaces are doing today. They in a sense say, oh look, I wrote a standard, and I'm using my standard to apply to me, and I'm telling you I comply with my own standard. That doesn't make any sense, and that's why the market doesn't have high trust. But an independent proof-of-governance service, of which I am familiar with one, let's put it that way, as a model, would probably improve
the trust level in the marketplace of the assets, I'll say. Thank you for that, Jim. CG, did you have something? You look like you're about to respond. No, I was going to concur with that. Fair comment, and I think it's exactly right: we can put out the product, we can build out the tools, we can put the proofs out there, but it would be counterproductive if we said that we have done the validation by a standard that we defined ourselves. The best we can do, and I think the follow-on would be, is to help the community develop the standards to verify this. I will say that being part of an ecosystem, and not trying to define everything by ourselves, is our goal, and we are working with the wider community. For example, the standards on offsets: we are leaving those to the wider community. We are curating and guiding that discussion, but we are not enforcing that it has to be done this way, because we want to adhere to an open standard that has other players participating in the same ecosystem. Makes sense. Christian, I see that your hand is raised. Thank you. Sure, it is. It's actually a question to you, in connection with something that Matthew said. Matthew, thank you for a very interesting presentation. Sure. My question to you: Matthew referred to the ability to match offsets to emissions, and in this group I just want to think about the work that members of this SIG have done in terms of emission accounting, especially the work led by Si Chen. You know, a lot of work has gone into that side of things, and I just wanted to ask, since I think you are more familiar with the details than I am: do you think there is perhaps a synergy possible between that technology and the work
that has been done there, and linking that to the work that Tolam is doing? Christian, thanks for bringing that up. It's funny, in the back of my mind I was thinking about the same thing when I made the comments that I made. There has been quite a deal of work around the net emissions token system, and it's designed to do just that: it's designed essentially to develop an emissions profile of a product moving through a supply chain. There are two components of that: there is the emissions profile and there is the offsetting profile. A lot of technical development has gone into developing an open-source prototype; that was done within the SIG, with Si, with Bertrand, and it actually won a couple of different awards. You know, it's open source, it's waiting; I think it would definitely benefit from a use case. So yeah, I know that the community members who worked on it were quite passionate about it, and I'm sure they'd be very interested in exploring any way to find a use case for it. I think it's the future of the way things are going to go, and there's a lot of legwork that's been done there. So it would be interesting, and it would be really great, to be able to somehow at least connect some of the minds that were working on that, to see if there's some sort of application where that platform, that technology, that work could be built on. Thanks for your contributions. Very good. Well, this has been a really productive call. I want to thank Matthew, and really everybody who's been working on this project, who's come to present and shared your time and your insights. This is a really interesting area that you guys are building out, and I think it's critical. I want to offer, if there are any final questions, I don't want to shut anybody down. I don't see any hands raised, but if not, I want to thank everybody for your interest. Thank you for your time, and I will give everybody four minutes back
of their day. Thank you very much. Thank you, everybody, for the session. Excellent. Thank you. Thank you.