All right, sounds great. Well, welcome everyone to the March 7th meeting of the Climate Action and Accounting Special Interest Group. We're excited to have some special guests with us today, whom I'll introduce shortly. First, a couple of housekeeping items. For anybody who's on the call for the first time, the Hyperledger Foundation follows an antitrust policy, which you may want to familiarize yourself with, especially if you get involved in our development work. We also follow the Hyperledger code of conduct, so please familiarize yourself with that as well. I typically kick off each call by inviting anybody who is attending for the first time to introduce themselves, share how you learned about us, and a little bit about your background. Is there anybody new to the call today? You can raise your hand or speak up. Daniel, please. New to the call, but I'm sure everyone's going to be hearing from me for at least half this time, so thanks everyone for joining. My name is Dan Norkin. Thank you, Daniel. He's actually a member of the Climate Action and Accounting SIG, and this is actually his second presentation, so we'll let him introduce himself properly in a moment. Since we don't have any other newcomers, I'll just share quickly: if you're interested in contributing, the meeting page has links on how to contribute. We have non-technical contribution opportunities involving outreach, research, and other roles, as well as development roles. If you're interested in development, I'd recommend joining our pair programming calls, which happen every other Monday at 9am PST; I believe the next one is this coming Monday. With regards to working group updates, not a tremendous amount. For the carbon accounting working group, we are in conversations about conducting a techno-economic analysis with an energy provider in the region, and part of that may focus on using some of the prototypes we've developed for capturing and tracking carbon emissions using blockchain. More to share on that in the future as the project unfolds. With regards to the standards and research working groups, is there anybody who would like to provide updates? Now is your time, quickly, before we hop into the presentation. Hi Sherwood, it's Alex here. Yes, again, how are you? Good, good. Just a quick update: we are working on submitting a project proposal for the mentorship program. Anybody who is interested, or has ideas about what the standards working group should pursue as a project, is welcome to contribute their thoughts. Otherwise, that's what we'll be doing next week. Will you remind everyone quickly when the standards working group calls are? This exact time slot, but every other week. Every other week, that's right. Okay, great. Thank you so much, Alex. With that, I want to hop right in, because we have a lot to cover today.
We have Wes Geisenberger, who joined us in October of 2021 back when he was working with Hedera Hashgraph. He's still focused on Hedera, and now he's helping lead the HBAR Foundation. I met Wes in person at COP27, and I've been really impressed with the ecosystem he's building. In that environment, what I saw was a lot of corporations, financial institutions, and investors pointing out the real challenge of deploying assets to decarbonize our global production infrastructure, and the challenges they described were primarily data challenges: lack of access to critical climate data, trust in that data, and the cost of generating that data. Within the HBAR ecosystem, they have been actively providing grants, funding, expertise, and support to different projects, and I've been impressed with the pain points they're solving in this large ecosystem, helping unlock resources to support the decarbonization of our production and consumption activities. I'm also happy to have Daniel Norkin here, who is spearheading one of what I think are the more interesting technologies within this ecosystem, the Guardian. As he will share, it's about data provenance and the need to provide transparency into the history of the data, where it's coming from, and how we're measuring, reporting, and verifying it, which, as everybody knows, is foundational to this larger issue of collaboration and resource deployment. So with that, I'll hand it over to you, Wes and Daniel. Wes, I think you're kicking it off, and you're welcome to share your screen if you like. Yeah, absolutely, and Sherwood, thank you so much for having us; we're really excited to be here. The last time we spoke was actually the week that we open sourced the Guardian and made it available publicly for the first time, so we always think of coming to the Climate Action and Accounting SIG with very fond memories, and we love working with the participants of the SIG and really appreciate all the great work the Hyperledger Foundation does with a focus on climate accounting. We're excited about how we can work together going forward. If it's okay, I'll go ahead and share my screen. Sure, I need to stop sharing mine first. Yep. There we go. Perfect. Let me know if y'all can see my screen. Yes. Great. Today we're going to talk about building auditable and efficient sustainability markets on Hedera. To give a little further background beyond Sherwood's fantastic introduction, which did it all the justice in the world: the HBAR Foundation Sustainable Impact Fund is a grant-giving organization focused on bringing the balance sheet of the planet to the public ledger. We do that on Hedera. We are very much focused on enabling public goods, public goods that are reusable, that make it easier to solve the toughest climate challenges we have today. You often hear about our open source infrastructure like the Guardian, which makes it easier to build digital measurement, reporting, and verification (MRV) methodologies than any other platform in the world.
I believe you can take a process that sometimes costs hundreds of thousands of dollars and shrink it to a few weeks with a few resources and an understanding of the MRV process you want to enable. We have other open source capabilities like the automated regression market maker, which I think our ecosystem may share in meetings coming up, focused on using the audit trails and the provenance to actually determine price points based on the attributes of every single asset created. You can imagine something very similar to Zillow, where you want to know the attributes of an ecological asset, whether it's a carbon credit, a biodiversity credit, or a conservation asset, and the provenance of where it comes from, and say: I care about these ten attributes because they're my indicators of quality, and I want to assign a price. But you can't do that without digital measurement, reporting, and verification with those granular attributes. So we're focused on enabling these public goods so folks can build world-changing applications that help address the climate crisis at scale, and that's really what our fund is about: bringing the balance sheet of the planet to the public ledger, with auditable data and information, in a discoverable and liquid way. What that means is we know the rules we're accounting with. We know the actors who fulfill different roles within those rule sets in a measurement, reporting, and verification process. We know the data they produce, and it's not just data, it's information we can read, not just machine readable but human readable. And ultimately the attributes derived from it, where assets are meant to be traded (some assets aren't, especially in industries like conservation and biodiversity), but where they are meant to be traded we can see the attributes, not just as static attributes but with the trust model behind them. So if there's a reversal on one of those attributes, we know; we know if the data wasn't correct, or if it's built on a scientific model or estimate that may have led to a downstream credit being created that we may want to revisit at a later date. We think this is a requirement for the industry. The reason we got to those principles and that mission is that we're trying to address five key problems. One, we want to address the auditability problem in finance, because if we don't know where the money is coming from or where it's going to, we can't actually effect the change we want to see. If we don't see data and audit trails on the verification process, or the validation process even prior to that, we're not going to know that those assets are being created with the funds being deployed. That ultimately leads to a problem where we have too many disparate systems for too many different parties, which leads to estimates being used. Our prices aren't transparent, because we don't have the right data that leads to better price discovery on the assets we know are real, and it ultimately leads to depressed market prices. And of course, in ESG reporting, if you're buying an asset and you don't have the data, it's going to be a lot harder to report. So we're trying to address these problems.
We're trying to do that by setting goals around finance auditability, open sourcing methodologies, and scaling validation and verification, not just bringing verifiers into the digital world but bringing new verifiers into the local markets where these projects exist, so that we can create assets that have local benefits not just in an environmental sense, but across the other SDGs as well; thinking about how we enable financial transparency and equitable finance flowing back to the communities in a restructured way that works for those economies. Ultimately, by creating these bioregionally supported assets, we also want to see an improvement in the price of the credits, with the flows of those increased prices going back to the communities, and we want to prove that in our ESG reporting applications, because companies can now see it. So we've set these five goals for the fund, backed up by the principles that support the mission. As we invest, we've looked at the entire market lifecycle: emissions reporting, how we invest in things like forwards and bonds which have measurement and reporting requirements, and where we can use things like carbon forwards attached to green bonds, highlighting some of the great work that's been led in the ecosystem and is being implemented on platforms like Evercity for forward-attached green bonds, to enable auditable projects that are linked to the larger sustainable debt markets, which are vastly larger than the carbon markets that exist, roughly a $2 billion market last year. So auditable projects, especially in the voluntary carbon markets, are ultimately discovered and traded, ideally in a liquid way, where we can not just trade assets but trade assets with context, based on the digitized format, and we can pull those same traded assets into reporting with the audit trails intact. The same assets, when they're traded and when they're retired, use the same underlying information in an automated way, just like you would upload a document into something like TurboTax and get an automated output. That's where we want to get to. We've mapped that to the market flow and built our investment thesis across it. We've divided how we invest across emissions reporting, sustainable finance, auditable projects, and data management and tooling beyond that. Meaning: I don't just know where the funding is coming from because we're digitizing it; I know where it goes into something like a carbon forward that funds a project, maybe a nature-based solution or a biodiversity project, where I can actually see the rules, the roles, the actors who fulfill those roles, and the data they produce, because it's available on an explorer where I can granularly see the information of the methodology and any data that comes from it, which is going to be verified on a public ledger. Ultimately, all those attestations can be viewed, not just from a data audit trail perspective, but actually in evaluating the price for a trade.
And we can include that in our trading and liquidity model to get better inferences about the value of one verifier versus another: someone who has the expertise to improve the outcomes of the project versus someone who's maybe just learning. Ultimately, we can present that to customers, enterprises with large-scale market demand, who are using the same Guardian-based technology to understand the rules, the roles within those rule sets, the actors who fulfill those roles, and the data they produce for every single emissions token within a supply chain, so we can report both emissions tokens and any voluntary ecological market asset into a reporting framework, whether it's a CDP framework or any other ESG reporting capability. All of this is underpinned by open source infrastructure that makes it easy to do these activities. That's a really important note for folks who are thinking about building: do not try to build something if there's already an open source project you haven't tried first. If you try it and it doesn't answer your question, it's great to go build, but evaluate all the open source tools so you can build faster and get to climate outcomes quicker. What this looks like on Hedera, and I'm going to go through this a little quicker, is we have the Guardian from a dMRV perspective, open source wallets, including verifiable credential wallets to make attestations, following token standards that feed into our open source markets and marketplaces, including marketplaces for forwards that exist out there in the ecosystem. For anyone who's interested, feel free to send me or anyone on my team a message; we can send you the GitHub repo for any of these projects, as well as the open source methodology library within the Guardian repo, so you can get started quicker on taking the rule sets that exist today and improving them for your own use, using them out of the box, or contributing a new artifact. From a technology perspective — it looks like my screen's frozen, one second. Okay. So when we get into the Guardian, what we're ultimately getting to is attestation-level information. On Hedera we have three key services: our consensus service, our token service, and our smart contract service. Our smart contract service is EVM compatible; however, in the Guardian we've actually found the best fit is an identity-based workflow system with full audit trails that you can review and explore. That really starts with how you're monitoring information. Ideally, we want to get down to a sensor-based level where it's very cheap and easy to report this information, coming as an attestation from something with a decentralized identifier. Something of note: decentralized identifiers and verifiable credentials are compatible in many cases across chains; there are different standards set up by the W3C that enable resolving those credentials across different protocols. In the Guardian, however, we use them to chain things together, so decentralized identifiers and verifiable credentials are linked through every step of the methodology process to get to a verifiable outcome, to know that the data has not been manipulated at any point. But it ultimately starts with a sensor, maybe a solar panel saying I've transferred X amount of energy back to the grid and it's crossed a meter.
It could be satellite data with signatures coming from a satellite. It could be any other type of information, even manually attested or collected, like a soil core sample. It goes through that measurable, reportable, and verifiable process into the Guardian, which checks it against a rule set it knows. Of course, these rule sets could be terribly complicated; in the Guardian we use JSON, which is not necessarily code but shorthand for it, in a human-readable format, to link all this information together so you can actually see the rules the data is being processed against. At the end of those rules, we can verify the rules you're following, the data that's been applied against them, and the approval process. What that looks like, from the point of that sensor generating information: the sensor has a DID key with which it can issue a credential or an attestation about its step in the process, and it can even be certified to issue certain types of information. When that data is sent to Hedera, it crosses our consensus service, which you can think of as a message queue or publicly auditable log where you can look back and see the entire provenance history of the attestation: where that sensor was given the right to emit data, or certified to provide data, in the form of a credential. That certifying authority, whether it's a validation and verification body or another actor within the project, has its own DID, and they can issue VCs for any part of the process or any data emitted, where they ultimately get that authority to do so in the form of a credential from a registry, which in turn has its own DID. That means you can build reputations for every single role and actor within the rules. You build your trust through a model where you can link back to the root of trust. Sometimes we call that a standard registry; in the past it's been called a root authority, but that authority is where the trust is derived from. Finally, in some cases you may have folks who give credentials or reviews on that root authority, say it's Verra or Gold Standard; you could even give a review or an attestation of what you think about them, because it's on a public ledger. Of course, people may dismiss that, or may say that's great because of your reputation in the industry, but it creates an open system for collaboration and review of assets: whether it's a policy and its rules, a specific actor or sensor type, or data from a lab or a satellite, maybe even with an AI model over top of it, all of which can have reputations that live within this workflow. That means the workflow can be better understood and interpreted, and if you choose to trust it, you can trust any part of it for your own asset, or you can look at another party's asset and derive trust. That ultimately allows assets to go to market and receive some type of reputation or review from a machine model to derive a price: this asset should be trusted and should have this value, because our machine learning model says the market values these attributes that those parties are involved in.
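To make the sensor-to-ledger flow Wes describes a bit more concrete, here is a minimal sketch, assuming the Hedera JavaScript SDK (@hashgraph/sdk): a sensor reading expressed as a W3C-style verifiable credential issued under the sensor's DID and published to a Hedera Consensus Service topic, so the attestation lands in a publicly auditable log. The credential field names, DID conventions, and topic handling are assumptions for illustration, not the Guardian's internal format.

```typescript
// Minimal sketch: a sensor attestation shaped like a verifiable credential,
// published to a Hedera Consensus Service topic as an auditable log entry.
// Topic IDs, DIDs, and field names are illustrative assumptions.
import { Client, TopicMessageSubmitTransaction, TopicId } from "@hashgraph/sdk";

interface SensorReadingCredential {
  "@context": string[];
  type: string[];
  issuer: string;               // DID of the sensor (or its owner)
  issuanceDate: string;
  credentialSubject: {
    deviceDid: string;          // the sensor's decentralized identifier
    kwhExported: number;        // e.g. energy sent back to the grid
    intervalStart: string;
    intervalEnd: string;
  };
  proof?: unknown;              // signature produced with the sensor's DID key
}

async function publishAttestation(
  client: Client,
  topic: string,
  vc: SensorReadingCredential,
): Promise<string> {
  // The consensus service acts as a publicly auditable message queue:
  // anyone can replay the topic later and reconstruct the provenance chain.
  const response = await new TopicMessageSubmitTransaction()
    .setTopicId(TopicId.fromString(topic))
    .setMessage(JSON.stringify(vc))
    .execute(client);
  const receipt = await response.getReceipt(client);
  return receipt.status.toString();
}
```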
In a nutshell, that's the high-level view of the Guardian, but what really makes the Guardian powerful is the ecosystem building on top of it, because you can think of it as a factory approach. Building one set of rules is relatively hard; building 100 sets of rules is very hard; building 100 methodologies is very hard to repeat so you can actually compare them to each other. Our ecosystem is building a very large methodology library, we believe the largest in the world today, of different types of MRV. Some are digitized copies from the CDM, Verra, or Gold Standard, and others come from new types of measurement technologies, whether it's DOVU's contributions on soil or other policies like what Envision released, a digitized version of Verra's VM0017, which they did a webinar on last week. They're actually going to be able to compare that to VM0042, the subsequent methodology and the improvement it's moving toward, so we'll be able to see the old versus the new in the types of assets that can be created. And that doesn't just apply to voluntary ecological market assets; it also extends to emissions policies, biodiversity policies, or conservation policies. This isn't just about the finance; it's really focused on the data in measurement, reporting, and verification, following a given standard or set of standards. What that may look like when you get down to the JSON level: you may have a set of rules that is very clearly the same as another set, but the cool thing is that for the first time you can actually compare and quantify the difference of one rule set against another. If you look at these two rule sets, you can see some things that are similar, with subgroups that are exactly the same, but you can also see where this group has one big difference, right here, where you have a new step in the application, and you can quantify that difference in the workflow and point out each field that differs. That means if you wanted to combine two methodologies, you could now quantify the difference between them, whether you're looking at Verra or Gold Standard or CDM against another set of standards, and all of a sudden you can compare both the rules you're following and the answers produced under those rules. What you then need to do is look at the percentage difference and decide whether it's material or not. For the first time, you don't just have digitized rules that are scalable across methodologies and projects; you can compare them one to one in a quantifiable way and get to an answer set. To make this a little more granular, I'm going to flip over to the emissions side of the market. Say you want to transparently show the sources of your emissions, roles, and actors, following a methodology such as the Product Standard from the Greenhouse Gas Protocol. I could trace down to a sensor level and understand how those emissions are going to be allocated to a product. And if I want to display that for every factory I have, for every product I have, I can do that, and pull the line items for each of my project stakeholders; the regulators can see it, and I can show my customers an improvement, a reduction, over time.
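As a rough illustration of the policy-comparison idea, the sketch below flattens two policy definitions (as JSON) into path-to-value maps and reports what fraction of fields differ, which is the kind of quantified "percentage difference" Wes mentions. This is a toy written under assumptions of my own, not the Guardian's actual comparison algorithm.

```typescript
// Minimal sketch: quantify how much two JSON rule sets differ.
type Json = string | number | boolean | null | Json[] | { [key: string]: Json };

function flatten(node: Json, prefix = "", out: Map<string, Json> = new Map()): Map<string, Json> {
  if (node !== null && typeof node === "object") {
    // Recurse into arrays and objects, building dotted paths like "config.children.0.blockType".
    const entries = Array.isArray(node) ? node.entries() : Object.entries(node);
    for (const [key, value] of entries) {
      flatten(value, prefix ? `${prefix}.${key}` : String(key), out);
    }
  } else {
    out.set(prefix, node);
  }
  return out;
}

function policyDifference(a: Json, b: Json): { differing: string[]; percent: number } {
  const fa = flatten(a);
  const fb = flatten(b);
  const keys = new Set([...fa.keys(), ...fb.keys()]);
  // A field "differs" if it is missing on one side or holds a different value.
  const differing = [...keys].filter((k) => fa.get(k) !== fb.get(k));
  return { differing, percent: (differing.length / keys.size) * 100 };
}
```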
And I can show where the offsets come from in an equally granular way, because they're all following this model, which is comparable to see changes over time but also comparable across peers in the industry, because this is all on that public balance sheet, being Hedera. With that, I'll take a quick pause and step back for a higher-level view: where does the Guardian fit within the ecological markets workflow? You have what I call our work-in-progress diagram; there are always folks being added, and we've tried to simplify it to make it easy to understand, but you have a project that would start. One of the applications in our ecosystem, some of the names you saw at the beginning like DOVU, Tymlez, Cripsy, and many others, would ultimately digitize a methodology (of course you could digitize a methodology yourself as well) and work with folks like validation and verification bodies. For example, there's work with a company called Evercity, which is creating forwards and publishing methodologies for specific types of assets, which they announced at COP27, and they can be working with a registry. One example is Gold Standard: if you were following a Gold Standard methodology with a VVB and a project, you could say I want to lay that out digitally, and I want to use the Guardian to compare it to the previous methodologies. Somebody can hold these assets even in their forward stage, if you were to use Evercity and a forward, but ultimately when it's redeemed to create a token, once you've completed verification, that can be held in an account with a wallet on Hedera. We have many popular wallets on Hedera, and for folks who are curious about what types of wallets, I'm happy to have that discussion, including things like MetaMask and how to use it on Hedera. Ultimately, those assets are created, auditable down to the unit of value or metric ton, so often an NFT on Hedera following the IWA standards. The InterWork Alliance, for those who aren't familiar, has created standards around voluntary ecological market assets where the carbon removal unit is actually an NFT. We can trade those assets, and we could even wrap them and create fungible versions if we wanted different types of liquidity. Folks like Tolam are trading those assets as unique units of value, trading based on the attributes rather than as commodities, while still maintaining liquidity in the process. You can see those assets on explorers on Hedera, and you can ultimately take them into ESG reporting applications; one we recently announced was CorpStage, who is building an ESG reporting application to understand carbon emissions tokens as well as offsets for CDP-based reporting. So that's what it looks like in the ecosystem. I went very fast, but I'm always happy to take questions and get into further detail before I pass it over to Daniel. Thank you very much, Wes. Does anybody have any questions? I know it's a lot of information. One thing I'm curious about, Wes: you've captured a lot of the pain points and you're actively working on them, but what are some of the existing areas that nobody is tackling?
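For a sense of what "an NFT per metric ton on Hedera" can look like mechanically, here is a minimal sketch using the Hedera Token Service via @hashgraph/sdk: one non-fungible token class for carbon removal units, with each minted serial carrying metadata, assumed here to be an IPFS CID pointing back at the verification data package. Key management, the metadata convention, and the assumption that the client operator account acts as treasury are all illustrative choices, not the IWA or Guardian specification.

```typescript
// Minimal sketch: carbon removal units as NFT serials on the Hedera Token Service.
import {
  AccountId,
  Client,
  PrivateKey,
  TokenCreateTransaction,
  TokenMintTransaction,
  TokenSupplyType,
  TokenType,
} from "@hashgraph/sdk";

async function mintCarbonRemovalUnits(
  client: Client,              // assumed: operator account is also the treasury
  treasuryId: string,
  supplyKey: PrivateKey,
  metadataCids: string[],      // one CID per metric ton / credit, per the assumed convention
): Promise<string> {
  const createTx = await new TokenCreateTransaction()
    .setTokenName("Carbon Removal Unit")
    .setTokenSymbol("CRU")
    .setTokenType(TokenType.NonFungibleUnique)   // each serial is one unit of value
    .setSupplyType(TokenSupplyType.Infinite)
    .setTreasuryAccountId(AccountId.fromString(treasuryId))
    .setSupplyKey(supplyKey)
    .freezeWith(client)
    .sign(supplyKey);
  const tokenId = (await (await createTx.execute(client)).getReceipt(client)).tokenId!;

  // Each serial's metadata points back at the credentials documenting how the
  // unit was created (the trust chain / data package). Batching limits omitted.
  const mintTx = await new TokenMintTransaction()
    .setTokenId(tokenId)
    .setMetadata(metadataCids.map((cid) => Buffer.from(cid)))
    .freezeWith(client)
    .sign(supplyKey);
  await (await mintTx.execute(client)).getReceipt(client);
  return tokenId.toString();
}
```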
What are the areas where you see the greatest opportunities, in the ecosystem you're building and in the general ecosystem overall? Yeah, I think I'll go back to the ecosystem slide while we talk about that. In our ecosystem we see fantastic opportunities for what I call data exploration. Previously, a lot of this information was locked in PDFs, whether at the traditional registries or in specific project databases. With the digitization of these processes and the opening of the data onto the public ledger, there's a visualization opportunity. One of the hardest things, not just within web3 but in sustainability and climate action in general, is, in my view, communicating visually what's happening out there in the real world and telling a compelling story that is tied to the data. I think we're very good at telling stories, but not necessarily at tying stories to specific data points to show actionable outcomes. So within our ecosystem we're actively looking for folks who can take the data that is out there and visually communicate it. That's something we're very excited about. We think there are also opportunities in those policy comparisons: telling the story the academic community has been trying to tell, in a way that moves industry forward but is tied to real-world projects, showing the improvement from where we were yesterday to where we are today, with new projects and on-the-ground activities that previously couldn't be communicated in such a granular manner. I think that's going to change how the entire process around climate action works: telling stories that are grounded in real-world information. Out of curiosity, a lot of the work seems focused on offsetting; you did mention some insetting opportunities, and obviously the Guardian works just as well for that. Do you have any real-world use cases you can talk about where the Guardian is being used to capture emissions through a supply chain? How far along is that work? Yeah, absolutely, so I'm very happy to talk about one of our governing council members on Hedera, Avery Dennison. That's one of many use cases, and I try not to get bogged down in too many specific ones, but Avery Dennison is one of the world's largest companies in both manufacturing and labeling, and is actually the world's largest RFID manufacturer. They have a platform focused on digital birth certificates and product identification called atma.io. Their goal is to give a soul to every product, and not just carbon emissions tracking, but all the different attributes around any product that has a label; if you look at the back of your shirt, you may not know it, but that's Avery Dennison. The idea of putting a digital trigger on that, with every step along the way baked into it, so that if you scan a QR code or read an NFC or RFID tag you can see the carbon footprint, is really cool. They've actually talked about their implementation of the Guardian in some of their announcements, creating carbon emissions tokens at scale for some of those product lifecycle events across different scopes.
And so we're excited about that one, but a very different one is Tymlez out of Australia, who is working with the Queensland Government doing digital twins and DIDs on manufacturing processes. They're measuring things like carbon emissions tokens on energy usage for buildings with the Queensland Government, but also looking at things like Magnum Mining, which is producing green pig iron, to track carbon emissions following these open source methodologies. These are really exciting opportunities because you can go today and look at how some of those folks have built those methodologies out in the Guardian repo. You can see exactly how they've accounted for it, suggest improvements if you think they're needed, and also use it for your own purposes in that process. Thank you, that was really interesting and I will go look at that. I see Bertrand, you have your hand raised. Yeah, sure, thanks, and hi Wes, thanks for the presentation. I had a quick question regarding one of the last points you made about tokenizing offsets or emission reductions: you mentioned issuing them as a non-fungible token. I just wanted to ask what the reason was for issuing these as a non-fungible token, and to raise a challenge: a non-fungible token by definition would be non-divisible, and you'd assume that if you're issuing a batch of offsets you'd want these to be divisible. So if you could speak to that and how you're conforming to that sort of NFT model. Absolutely. I'm going to take a quick step back and give a lot of credit to the IWA for helping to bring these standards to bear for voluntary ecological markets as well as carbon emissions tokens. For full disclosure, I helped lead the carbon emissions task force for the IWA, and I've been a participant in the voluntary ecological markets task force since it launched, a couple of years ago now, maybe a year and a half. As part of that, we came to that same exact question, but we found that most registries do not issue fractional assets. Some people decide to take those offsets and fractionalize them after the fact, but registries only issue whole units in terms of a metric ton count, and oftentimes that's complemented by buffer pool models and things of that nature. We found that those assets need to be part of a serialized class of assets, and those serialized classes have their own attributes. That could evolve over time, but that's the model we've seen adopted, and it makes it easier to track, in a similar way that you could track a good produced on a farm with unique attributes about it. Some attributes may be similar, some may be inherited as part of the class of assets, but many could be different, whether it's a unit of time, a temporal pattern, or another part of that ecological asset class based on the location or sublocation of the project. There are many different ways you could divide it, and we wanted to leave that opportunity open from a standards perspective. From our ecosystem perspective, our ecosystem has really leaned into that, to build markets based on those attributes and create new ways to build liquidity models that are fungible.
But if you do want a liquidity model that's fungible, you can build a wrapper, what we call a reference token in the IWA, and that reference token can point back to the underlying ecological asset and build a fungible layer on top of it as a tool. In the carbon emissions task force it's a slightly different structure: fungible-unique is the proposed standard from a token perspective, unique to a site and a scope as part of the nested reporting structure. Some of those elements are still being defined as part of that task force, and we're looking forward to feedback when we release the first version of the document, which is still being worked on. So if I understand correctly, you're describing using the NFT essentially as a reference pointer and anchor for the issuance of a given asset, so a company or organization issues an offset based on an ecological or, say, industrial reduction, and then that reference point can be used for some liquidity model, which may include fractionalizing the quantities embedded in it? Correct. You can think of it as a building block. Most traditional exchanges will build baskets, and those baskets will come from many different classes, but it's hard to trace back to the specific unit because a lot of the time it's a paper-based system. In this way you have a data package linked to every single asset; if there's something like a reversal, you can figure out which assets are affected or not affected, and it gives provenance to that specific asset, versus trying to divide it after the fact, which may be harder. Yeah, that makes a lot of sense, thanks, I look forward to reading more about it. Absolutely, and version two of the voluntary ecological markets white paper from the IWA is out and available for those who are interested. Any other questions? Thank you, Wes. Thank you. Let me go ahead and share my screen; it should be at the very bottom, very center, do you see it? Oh no, I know how to share the screen in Zoom; I just don't know how to share the screen from Google Slides. Got it, okay, one second. Let me start from the beginning, so let me just talk to the slide. Yeah, absolutely. All right, well, thank you Wes, thank you Sherwood, and thank you also to the SIG for hosting us. I'm trying to hit the resume share. Is it sharing or not? It's green, sharing, there we go. Yep, I see it. Okay, fantastic. So, thanks again, we're really honored to present this. I actually didn't know that the first presentation was during the launch, so it's not yet Throwback Thursday but Throwback Tuesday over here, which is good because I actually have a slide that covers the progression we've had since the launch up until now and some of the development milestones, so I think it'll be great for everyone to see. For those that came in a little bit later, I just wanted to say hi: this is Dan Norkin, one of the co-founders here at Envision Blockchain. Wes was talking about the entire ecosystem and how it's contributing to building sustainable markets on Hedera, with some focus on the Guardian, and there were some questions in the chat about the Guardian too.
Let me give a brief overview of the Guardian so we can set the stage before I give any demos. Essentially, the Guardian is a solution built on top of the Hedera public ledger that's used to mint emission tokens, carbon offset tokens, and renewable energy credit tokens; we also call them digital environmental assets. It produces auditable, traceable, and reproducible records, and it documents the entire lifecycle of these assets, which really aims to reduce fraud in the ESG market, among other things. If you look at the screenshot on the right side (and we're going to show you a live version of this), this is a picture of the trust chain, and I think the trust chain is one of the most novel ideas that came from the Guardian. There needs to be a way to move away from the traditional methods, where everything lives as paper trails in traditional systems; sure, you can request an audit, but then you're just stuck with more paper, more difficulty, rigor, and manual processes in order to find out how a certificate was issued, how an offset was issued, how carbon claims were made. The trust chain holds something we call a verifiable presentation, which is a collection of one or more verifiable credentials, and we have a pretty neat way of displaying it so you can actually see, as you go through your methodology (Wes mentioned we have a whole open source methodology library you can pick from, or you can add your own), that as you hit certain steps in certain workflows we issue verifiable credentials. The trust chain links everything together for you to open up and actually see, and since this is on a public ledger, you can go and see everything that's happening. Diving into that concept a little deeper: we understand that the process of collecting and supporting these carbon claims is manual, it's prone to error, there are data quality issues, there's a lack of assurance, we have greenwashing in the news, double counting is a big issue, and there's just an overall lack of trust. This is where we've introduced the concept of a policy workflow engine, which you see on the right-hand side. The policy workflow engine allows users, typically policy creators, or really anyone who wants to mirror a methodology from a standards body or a regulatory body, to do that via these workflow blocks; it looks a bit like a BPM tool. The best way to look at it is that the policy you see there on the right side is a coded instance of a methodology. So whether it's a Gold Standard methodology, a Verra methodology, CDM, or any of the GHG Protocol standards, you can think of these little rectangles like Legos for designing the desired workflow. We're going to show you how that looks in the demo, but you can literally create them on the fly. And what's interesting is that everything you can do in the Guardian application you can do in your own application, because it's really an API product.
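To make that "Legos" picture concrete, here is a rough sketch of what such a policy can look like as data: nested workflow blocks, each with a type, the role permitted to act on it, and the schema it collects. The block-type and property names below are approximations for illustration, not the Guardian's exact configuration schema.

```typescript
// Illustrative-only sketch of a policy as nested workflow blocks ("Legos").
const policySketch = {
  name: "Sample methodology policy",
  policyRoles: ["Project_Proponent", "VVB"],
  config: {
    blockType: "interfaceContainerBlock",
    children: [
      {
        blockType: "requestVcDocumentBlock",   // project proponent fills a form
        permissions: ["Project_Proponent"],
        schema: "#project_description",
      },
      {
        blockType: "approveDocumentBlock",     // VVB reviews and approves the submission
        permissions: ["VVB"],
      },
      {
        blockType: "mintDocumentBlock",        // registry mints the credit token
        permissions: ["OWNER"],
        tokenId: "0.0.xxxxxx",                 // placeholder token ID
        rule: "verified_tonnes",               // amount derived from the verified data
      },
    ],
  },
};
```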
So while we have a front end for users, you can still do everything through your own front ends and your own applications, connecting via the APIs that are provided. This was the slide I mentioned, so, Throwback Tuesday, here's a little bit of the release history. When we first released version 1.0.0-beta.1 (and you can fast-forward to today, where we're at version 2.10), we included a front end, which was very basic; there was a back-end service, some token functions and storage, a message broker that tied everything together, and a policy workflow engine that allowed for the creation of methodologies. Over time, we added API stability, versioning, importing policies, the ability to publish them, decentralization, and improved documentation. In later releases we added support for complex policy workflows, because these methodologies are not simple by any means: you're basically taking 300 pages of technical instructions and trying to codify them, which is a fairly cumbersome process. We added features like aggregation blocks, which allow for the kind of mirroring we're talking about; revocation, which Wes mentioned, so that once a VC is issued you can take it away if there are errors or issues; event-driven token policies; automation of token operations; dry run, which is a cool one, and I'm going to show you dry run in the demo; multi-user roles; and multi-sig approvals. If you fast-forward to where we are today, the most recent release, 2.10, which was just last week, we added support for retirement of tokens, multi-policy tokens, disaster recovery, policy modules, and selective disclosure, which is covered in a blog that was just posted on the Hedera website. Selective disclosure brings ZKP to the Guardian, so if some information needs to be held confidential, but not all of it, we now have the ability to do that. As a glance into the future, where do we see the Guardian going? From my perspective, we'd like to make it a lot easier for policy creators to go in, access the Guardian, and build methodologies in a matter of minutes. It definitely takes some time; it takes a mixture of a sustainability background together with a development background, and it's hard to do one without the other, but we're constantly looking at ways of advancing that, with policy modules, which were just introduced, so you'll have the ability to import already-created policy modules. The best part about everything we do being open source is that you can look into this open source library, pick from it, and that accelerates your speed to market because you don't have to rebuild something. I was saying this earlier in another presentation: why would you build something that's already built and open source, when you can continuously improve it, give it back to the community, and keep that cycle going?
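Since the Guardian is API-first, as Dan notes, your own front end would drive it over REST. A minimal sketch follows; the base URL and endpoint paths are assumptions for illustration, so check the OpenAPI/Swagger documentation shipped with the Guardian version you deploy for the real routes and payloads.

```typescript
// Minimal sketch of calling the Guardian from your own application.
// Endpoint paths and response shapes are illustrative, not guaranteed.
const BASE = "http://localhost:3000/api/v1";   // assumed local deployment

async function login(username: string, password: string): Promise<string> {
  const res = await fetch(`${BASE}/accounts/login`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username, password }),
  });
  const { accessToken } = await res.json();     // assumed response field
  return accessToken;
}

async function listPolicies(token: string): Promise<unknown> {
  // Returns the policies visible to the authenticated user.
  const res = await fetch(`${BASE}/policies`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  return res.json();
}
```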
So again, there are two versions of the Guardian available. There's the open source Guardian, which is what we've talked about; it's available to fork and use immediately, go to github.com/hashgraph/guardian, everything's right there. And from Envision Blockchain, as the builders and maintainers of the Guardian, we realize that not every customer or user wants to work from the open source Guardian; they may not want to manage all the different components the Guardian has, and it's important to be able to scale it in a production environment as well. So what we offer is a Managed Guardian Service. It's a hosted environment; we provide the resources, the tools, and the support. Once you sign up with us, you have two ways to get started: you have all the APIs and you also have a front end, and you get all the SaaS benefits as well, such as multi-tenancy and 99.9% uptime SLAs. It's a Kubernetes-backed solution, so as the infrastructure and workload of our customers increase, we have auto-scaling to support the consumption. So what we're going to do for the remainder of the time, and I know I have about 10 minutes, so I'm going to go through this pretty quickly, is provide a demo of how the Guardian digitizes methodologies, and this is a really big one in the news right now. I'm borrowing some slides from the previous webinar, but we recommend that if you're working with the Guardian you follow a policy development checklist. You want to find out what type of credit you're looking for: VCUs, VERs, RECs, etc. Then you need to find the appropriate standards body, whether it's Verra or Gold Standard or CDM. Then you look through the methodologies within that standards body to find the one that fits the project best: is it forestry, is it agriculture, and so on. From that point it gets a little technical: that's when you're going to start working on schemas, which are really the data model of the methodology, and on putting all the final calculations of the credits into some tool that lets you translate it into the Guardian, and I'll show you what we do. After that you create your workflow, the policy workflow, which is essentially the set of technical requirements: who does what and when, and what are the rules behind it. Then you build the policy and deploy it. The one we're going to demo is Verra VM0017, which estimates and monitors the greenhouse gas emissions of project activities that reduce emissions in agriculture through the adoption of sustainable land management practices across agricultural landscapes. A funny note: once we began working on this webinar around Verra VM0017, literally a week before the webinar, Verra put up a notification saying it is scheduled to be deactivated. So don't look too hard at reusing this one. What we actually plan on doing, if you subscribe to the newsletter or the YouTube channel we have there, or just reach out to us, is that next month we're going to do Verra VM0042, and we're going to use the policy comparison tool that Wes was talking about to compare the differences between 0017 and 0042.
So you'll be able to see the differences, not just at the business workflow rules level, but at the data structure and schema level as well, and we're very excited about that one. This is just an example of a schema: you have required fields and schema types, whether they're strings, addresses, integers, email addresses, or date ranges. For example, there will most likely be forms and registrations to fill out; this is what we mean by schemas. If you go further down the requirements, you're going to have to take an accounting of the total baseline emissions and removals, and so on. Once you have your schemas, your data structures, you then lay out the workflow. The blue rectangles are the project proponents, the orange rectangle is Verra itself, the standards body, and the green is the VVBs, the verifiers. So essentially the workflow we have here, which you can mirror in the Guardian, is that you have these different roles and actors (I know there was a question in the chat about that) performing certain actions that need approvals or actions from other actors. Everything happens in accordance with the workflow and the policy rules, and at the very end you see the last block, where Verra issues the tokens, the VCUs, and records them in the registry. Again, I just want to put it out there that we're not acting as Verra by any means; we're just demonstrating how you can take a methodology and make it digital. With that in mind, let me stop sharing and share my other screen. Okay, so now you should be able to see the Managed Guardian Service we have here. We're showing this through the Managed Guardian Service, not the open source version, but the layout and functionality are identical. We have these tabs along the top, and here we're on the policy screen, and you can see we imported two different policies. We have the remote work GHG policy, which supports the estimation, calculation, and tokenization of GHG emissions resulting from remote work, and then we also have Verra VM0017, the one we were just discussing. We want to showcase both to show the flexibility: it's not just one side of the market, it's both sides. Importing policies is very simple: you can import them from files, from IPFS, or through our preloaded policy picker, which is connected to the open source methodology library. Let me show you around a little. We have schemas — remember that Excel sheet I was sharing with you — and you have the ability to create them right within the Guardian. If I open one up, I'll just pick a random one, project estimates of soil organic carbon: you can see you can name your fields and give each field a type, whether it's number, integer, string, boolean, etc. And again, you have different schemas for different policies. We have the different tokens: because everything is on a public ledger, we can click on any of these token IDs and see that these tokens were created; for example, the VCU is a non-fungible token, like what we were talking about before.
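As an illustration of the schema concept from the demo, here is a hypothetical field-definition sketch: each field carries a name, a type, and a required flag, much like the soil organic carbon schema shown on screen. The specific fields below are invented for illustration, not taken from VM0017.

```typescript
// Illustrative-only sketch of a schema: the data model behind a policy's forms.
type FieldType = "string" | "number" | "integer" | "boolean" | "enum" | "date" | "geoJSON";

interface SchemaField {
  name: string;        // machine name used in the verifiable credential
  title: string;       // label shown on the form
  type: FieldType;
  required: boolean;
  enumValues?: string[];
}

const projectRegistrationSchema: SchemaField[] = [
  { name: "project_name", title: "Project name", type: "string", required: true },
  { name: "start_date", title: "Crediting period start", type: "date", required: true },
  { name: "area_ha", title: "Project area (hectares)", type: "number", required: true },
  { name: "practice", title: "Land management practice", type: "enum", required: true,
    enumValues: ["cover_cropping", "reduced_tillage", "rotational_grazing"] },
  { name: "baseline_soc", title: "Baseline soil organic carbon (tC/ha)", type: "number", required: false },
];
```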
Contracts is where we would create the retirement contracts; we just didn't do that for purposes of the demo. There's also one specific to the GHG policy, because there's certain mapping, for example country or region emission factors, that needs to be taken into account. Then we get into the policies, and the best way I can describe the workflow is to show it to you. I'll go to this screen right here: here's the high-level workflow with the different colors and rectangles, and when it's in the Guardian you can see the technical instructions for how this policy looks from a workflow perspective. To show you how some of these things work, I'm going to run really quickly through the Verra policy. This is in dry run mode, so nothing is put on chain; everything is kept in memory, which makes it a good way to test policies. We know there are three roles here: the blue, the orange, and the green. So I'm going to create two more users. For user one, I'll assign the project proponent role; for user two, the VVB role, and give it a title. The VVB is waiting for approval — these are some of the workflow steps — so I'll go back to the standard registry, who is the administrator right now, and approve it. The next step in this workflow is for the project proponent to submit their project, and just for purposes of the demo I'm going to put in some random data: a map, a date, a monitoring period. I know I'm at the top of the hour, so stick with me, we can get through this pretty quickly. We have the room for another 30 minutes, and we sometimes bleed over in these presentations. Obviously, when you do this for real you'll do it the right way; I'm doing this for demo purposes, and you definitely don't want to sit with me while I fill out every one of these form fields. Okay, so is going through the entire process and filling it in for people part of your managed services too? No, no, I'm completely joking. So this would come from the project proponent; this is the information specific to the form they need to fill out in order to register their project with Verra. Once that's filled out, just to show you, here are the verifiable credentials that were just created with all my fictitious data. Now we're going to go to the VVB — sorry, let's go back to the standard registry first. The registry is going to view the project and add it. At that point, the project proponent will assign the VVB from the approved VVB list, and then the VVB will need to validate the project. They can take a look at the VC and validate it; it then goes back to the project proponent, who can add the monitoring report. It's all stored in IPFS through web3.storage; actually, Filecoin Green is what's integrated right now. We go back to the VVB: they got the monitoring report, they can view it and verify it. And then of course the last step is for Verra to take a look at it; they can view the VC as well, and they can go ahead and mint.
And again, I went through this really fast, but this is essentially the entire workflow that VM0017 is all about. Now that it's minted, I did all of this to show the trust chain. We can go to token history, and now we view the trust chain. Here's the beauty of the trust chain: from here, you can see every single step that happened, and who did what. Remember that we're in dry run, so nothing's on chain here; you would have the real users if you published it rather than running in dry run mode, but you'd be able to see that this standard registry created the policy and created their account, the project proponent created the project, and you can actually open it up and see the metadata. This is where selective disclosure becomes important, because not everything needs to be in the public domain; this is where ZKP really helps provide confidentiality. However, you still need to be able to see that these actions happened, by whom, and at what specific time: the project validation, the monitoring report, and so on, all the way through the minting of the VCUs. That is really the power of the trust chain and what it stands for, and why we think all these tokens created via the Guardian will have the solid foundation needed to create the sustainable markets Wes was talking about. So that is essentially the end of the demo and the intro to the Guardian. I know we're at time, but I'm more than happy to stay a couple more minutes and answer questions. Fantastic, and I'm sure there will be plenty. Thank you for that really great presentation; it's a robust solution, you've clearly thought through a lot, and it's really well designed. I appreciate you taking the time to walk through it. I've got some questions in the meeting chat, but while I read those, I see a hand raised, go ahead. Yeah, just a quick follow-up question: you just mentioned the role of ZKP, zero-knowledge proofs, in managing access to some of the metadata you were showing. Does the Guardian service provide support for any ZKP-like systems, like garbled circuits or something like that? So right now, it was just released, yes, actually; that includes BBS signatures, which is the current implementation. Can you repeat that? I missed it. BBS. Gotcha, yeah, thank you. I'm sorry, I'm now trying to deploy it on my machine, so I may be following up with you offline later. Yeah, feel free to reach out, and we're more than happy to help you through it. Thank you. A quick question for Daniel, if he's still taking questions. Yeah, absolutely. What's the responsibility, from a data standpoint, of the entity running the project or generating either emissions or credits? In other words, for example, let's say I have methane emissions that I'm trying to burn. Do I have to, outside of MGS, get the flow rates, get the composition of the gas, and compute the amount of carbon being emitted, or can I send those directly to your system and you compute it with whatever carbon calculation you have in place, instead of my doing it myself? I don't know if that's clear, but essentially: what's the possibility of just sending the data? Sure.
No, no, it's clear, and that's a really good question. So there are a couple of things here. MGS is — let me just say Guardian, because it's all one and the same, whether it's the open source or MGS. There are certain things the Guardian is and certain things the Guardian is not, and the Guardian is not an MRV solution. Think of us like an API product. For example, 99% of everything you see here is the same as the open source, just a little different with APIs that relate to tenants and things like that, but it's about the same — so think of us like an API product. You will still need an MRV application. Granted, we do provide a front end like what you see here, but most would want to provide their own branding and their own look and feel, as well as all the extra features that we just don't put in here and that other MRV solutions will — which is what you're talking about. So if you have IoT sensors that are sending information, you don't need to worry about the calculation, because the calculation is actually built within the schema and the policy. There are certain policy workflow blocks that allow you to calculate, so as long as you have the right schema values — the right data values coming into the Guardian — we can handle the calculation via the policy. Does that make sense? Okay. One thing that may be helpful to add there: if you're able to do the calculation based on a public set of rules — if you aren't deriving the calculation through some proprietary method, or even if you are — it gives some transparency to how you got there. So if you're using something like an emission factor, or some type of metric that has been scientifically established but can change, the benefit of putting it on a public ledger is that we can identify every element of the set of rules across the entire ecosystem. All the rules are in public, so you can identify the parts of the rules that may have systemic failures. It's not the role of the publisher to do the scientific and academic verification of that element; their role is really just to create the digital workflow that creates auditability. And if we have the systems to identify those gaps in the rules — say, for example, the Higg Index: the Higg Index made certain assumptions that proved not to be what everyone else thought they would be, and because folks were making buying and purchasing decisions off of the Higg Index, people's expectations weren't met, and that led to a systemic risk in how people behaved based on what they thought it was showing. The same could be said of any methodology across any of the registries and their implementations. If we bring those out into the public, we get rid of some of the systemic risk, in the sense of not being able to find and correct those gaps. Of course, we may have to make adjustments, but those adjustments all become a lot easier once it's on the public ledger. Make sense? Okay. Yeah, thank you. Thanks, Jeff. Thank you, James, I see your hand raised. You're on mute. Yeah, apologies colleagues, I was in another meeting prior to this. Just two questions. You said that you're not a digital MRV, is that correct? Correct.
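To illustrate the kind of calculation a policy workflow block could perform once the sensor data arrives, here is a hedged TypeScript sketch for the methane-flaring example raised in the question. The field names, constants, and formula are placeholders chosen for illustration; they are not taken from any particular methodology or from the Guardian itself.

```typescript
// Placeholder calculation: field names and constants are illustrative assumptions.
interface MethaneFlaringReading {
  gasFlowRateM3: number;        // measured flow sent from an IoT sensor
  methaneFraction: number;      // CH4 share of the gas composition (0..1)
  combustionEfficiency: number; // fraction of CH4 actually combusted (0..1)
}

const CH4_DENSITY_KG_PER_M3 = 0.657;   // placeholder constant
const CO2_PER_CH4_COMBUSTED = 44 / 16; // mass of CO2 produced per mass of CH4 burned
const GWP_CH4 = 28;                    // placeholder global warming potential

function flaringEmissionsTCO2e(r: MethaneFlaringReading): number {
  const ch4Kg = r.gasFlowRateM3 * r.methaneFraction * CH4_DENSITY_KG_PER_M3;
  const combusted = ch4Kg * r.combustionEfficiency;
  const vented = ch4Kg - combusted;
  // Combusted CH4 becomes CO2; vented CH4 is counted at its warming potential.
  const co2eKg = combusted * CO2_PER_CH4_COMBUSTED + vented * GWP_CH4;
  return co2eKg / 1000; // tonnes CO2e
}

// Example: the sensor reports the raw values; the policy block does the math.
console.log(flaringEmissionsTCO2e({
  gasFlowRateM3: 5000, methaneFraction: 0.9, combustionEfficiency: 0.98,
}).toFixed(2), "tCO2e");
```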
So the Guardian, and what you're seeing over here, the Managed Guardian Service, is just a tool set that the standard bodies, or the standard registries or issuers, can use. So, just for clarity, if you already have your own MRV, does this then become a bolt-on for tokenization — is that what you're saying — via some sort of API integration? I think you cut out, but I think you're asking whether, if you have your own methodology, you can create it in here. But if you already have your MRV, right, could you use your API in order to tokenize whatever credits you want to produce — is that what you're saying? I think I understand you correctly, but let me try to reiterate it. If you have your own methodology, you would want to translate that methodology into a Guardian policy, because that way, as you follow through your methodology workflow and all the rules that are part of your methodology, you then get to the token minting event. Without all of the previous information, you kind of lose the benefits of running everything through the policy workflow engine to create the trust chain. I think what you're asking is whether you can just create the token, and I would say that you would be better off if you were to bring that methodology into the Guardian format. Right. Well, what if I'm trying to tokenize someone's waste? They have created a process by which, you know, they are converting waste into new products and services without producing any further emissions. How could I then, you know, bring that sensitive data in? Right — I think that's the issue of data compromise and the challenges around it. So I think the way you would go about it is the same way we discussed for digitizing the Verra methodology: you want to create your schemas, which are the data structure of what you're talking about, and design the workflow — who does what, at what time, by what approval, using which schemas. Then you translate the schema attributes into the Guardian — again, you can create them this way, by creating your fields — and then you go and create your policy, just like we created our policy, but now you incorporate all of these workflow actions, referring back to the schemas that you created. Okay, so you're not providing any sort of assurance that the original data is fit for purpose; what you're providing is just the mechanism by which that emissions data is converted into some tokenized product. You're asking, essentially — I know the blockchain hasn't been around for that long, but you're asking the blockchain's age-old question: how can we make sure that the data going in is accurate, right? Is that the reputation component that was mentioned? Exactly. So there are those components. But I would say, again, the most important thing about the Guardian is the verification that, when the workflow actions are approved, you have accountability for the actor that approved them. Right, right, I got you. You know what I'm saying — because we're using DIDs and VCs. Let me see if this is still here, or if it reset. Yeah, good, I think we still have it. So, for example, the VVB in this example — when they approved my nonsense of 111111111, they're going to be the ones held accountable for it.
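A minimal sketch of what "translating a methodology into a Guardian policy" involves conceptually: a schema describing the data structure, plus a role-based workflow describing who does what and with which approvals. The TypeScript shapes below are illustrative assumptions, not the Guardian's actual policy or schema language.

```typescript
// Illustrative shapes only; the Guardian's real schema and policy formats differ.
interface SchemaField { name: string; type: "string" | "number" | "date"; required: boolean; }
interface Schema { name: string; fields: SchemaField[]; }

interface WorkflowStep {
  performedBy: "ProjectProponent" | "VVB" | "StandardRegistry";
  action: "submit" | "validate" | "verify" | "mint";
  usesSchema?: string; // which schema's data the step produces or reviews
}

const wasteToProductSchema: Schema = {
  name: "WasteConversionMonitoring",
  fields: [
    { name: "wasteInputTonnes", type: "number", required: true },
    { name: "productOutputTonnes", type: "number", required: true },
    { name: "monitoringPeriodStart", type: "date", required: true },
    { name: "monitoringPeriodEnd", type: "date", required: true },
  ],
};

// Each step ends up as a signed VC in the trust chain, so the approver is accountable.
const policyWorkflow: WorkflowStep[] = [
  { performedBy: "ProjectProponent", action: "submit", usesSchema: "WasteConversionMonitoring" },
  { performedBy: "VVB", action: "validate", usesSchema: "WasteConversionMonitoring" },
  { performedBy: "VVB", action: "verify", usesSchema: "WasteConversionMonitoring" },
  { performedBy: "StandardRegistry", action: "mint" },
];
```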
And this is what we mean by providing auditable records, because let's be honest: none of us can sit here with a straight face and say we can prove the data coming in is always going to be accurate — garbage in, garbage out, that's just what it is. We know that the data coming in is going to be immutable; that's all good. But now we have a way to prove and assign accountability for the verifiers, for who approved what, because literally everyone is signing these transactions. Just one thing to add on: because it's an identity-based workflow system, those identities are portable. So if you have a verifier who's working on this trust chain, they may also be a verifier working on another trust chain in a completely different process, and you can see those reputations build over time. For example, if you're doing an emissions assessment on a specific project, or you're doing assurance to make sure that somebody has accounted properly — even though it may be a different process from what somebody else is doing — you can look at that reputation over time and see the credentials they've received. You could even look at the processes they're tied to, the whole project, and see whether those projects have had reversals and whether that's tied to the data they approved. And you can start to develop the trustless viewpoint of: I don't have to trust somebody else saying that they're good, I can see it myself. I can actually prove that this meets my standard, because all the attributes and all the credentials they're tied to within the process they were working in are good, and the data they approved is good. If it doesn't meet your standards, you can say, hey, I don't want to work with that group — if the data they're approving doesn't meet my standards, or maybe the projects they decided to work with didn't meet my standards — and that's okay; you can work with other folks who maintain these identities and credentials, or you can even onboard net-new parties who have those credentials. Does that make sense? Right. A killer question: what are the benefits of using this platform as opposed to Fabric? So, from an architectural perspective, the Guardian is very different from Fabric. I actually used to work at Oracle, which was, and still is, a big Hyperledger Fabric proponent. One of the big reasons I ended up joining Hedera back in 2019 was the need to keep some of these assets on a public ledger, specifically things like emissions and offsets. That's the reason why our mission is bringing the balance sheet of the planet to the public ledger. You shouldn't be able to get rid of the history of your emissions, or the balance of your ecological activities, by turning off your cloud subscription or turning off your local server, and that's the risk with a permissioned, private network. Our viewpoint is that it should be fully auditable, the information should be discoverable, and the entire provenance and immutability of that information should withstand the test of time — not just last as long as you can afford to pay for the history of those assets. That being said, of course, as long as you maintain the storage, you may keep some of that history with Fabric.
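As a sketch of the portable-reputation idea, the snippet below aggregates a verifier's signed approvals across multiple trust chains into a simple track record. The record shape and scoring fields are assumptions for illustration; a real system would pull these from the published VCs on the ledger rather than an in-memory list.

```typescript
// Illustrative only: the ApprovalRecord shape is an assumption, not a Guardian type.
interface ApprovalRecord {
  verifierDid: string;
  trustChainId: string;
  approvedAt: string;
  laterReversed: boolean; // e.g. credits from the approved project were later reversed
}

function verifierTrackRecord(records: ApprovalRecord[], did: string) {
  const mine = records.filter(r => r.verifierDid === did);
  const reversed = mine.filter(r => r.laterReversed).length;
  return {
    approvals: mine.length,
    reversals: reversed,
    distinctChains: new Set(mine.map(r => r.trustChainId)).size,
    reversalRate: mine.length ? reversed / mine.length : 0,
  };
}
```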
At the same time, you get the trust imbued by the public network, which is governed — and we didn't really talk about the governance of Hedera today — by Fortune 500 and blue-chip companies, leaders within their industries, across not just a single industry but multiple geographies. You have companies like Google, Tata, IBM, DLA Piper, and Standard Bank, and all these organizations have signed up and said: I'm going to run a node of this network, I'm going to participate in governance and in bringing on new members who have use cases, and I'm going to prove corporate utilization, because these types of use cases actually change how their companies work. And the very cool thing is that, through the economics of the public Hedera network, they've actually put their money where their mouth is and invested in ecosystem development. With the Sustainable Impact Fund, we're actually the first network to launch a major sustainability fund, because all of these companies have robust ESG and sustainability goals and they want to invest in this area, which they view as collaborative. They view bringing the balance sheet of the planet to the public ledger as a collaborative opportunity rather than a competitive one: accounting for emissions, meeting net-zero goals, and proving that in the public domain is something they're all very interested in and have invested in along those lines. All right, so there's the current question of whether to be public or private in terms of emissions data, because one day this is going to be taxed. And I think some companies are very reticent about putting that data out in public, because it'll be called up at some point in time, so personally I'm yet to make a decision which way to go. Right, because it's not being asked for as yet by the average business — we're not talking about the leaders here. Take, for example, a construction company down the road: they produce tons of emissions, but nobody's asking them about it; they're just being asked for a basic report. But in the long run, I believe something like this will become common, so I'm very much out there sense-checking and trying to understand exactly which direction to head. So feel free to put links in the chat so I can have a closer look, if you can. That's me. Yeah, absolutely. And I shared a link to one of our ecosystem partners who's working with a mining company, just to give a concrete example in those physical industries. What I would say is that there's also something we don't talk about enough with putting it on a public ledger, which is the automation opportunity. If you have the map of the methodologies and the map of the project data, then when you go to do reporting to, say, CDP, GRI, IFRS, or any of the big reporting entities — that process of collecting data is very hard and time-burdensome. By putting this on the public ledger, if you have standards, which our ecosystem is mapping out, you can actually automate the collection of that data and save companies time and money.
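As a sketch of that automation opportunity: Guardian-style artifacts are published as messages on Hedera Consensus Service topics, and a reporting tool can read them back through a public mirror node. The mirror node REST endpoint below is Hedera's public API; the topic ID is a placeholder, and the assumption that every message decodes to a JSON record is illustrative.

```typescript
// Collect all messages from a Hedera Consensus Service topic via a public mirror node.
// Requires Node 18+ for global fetch; the JSON-decoding assumption is illustrative.
const MIRROR = "https://mainnet-public.mirrornode.hedera.com";

async function collectTopicMessages(topicId: string): Promise<unknown[]> {
  const out: unknown[] = [];
  let path = `/api/v1/topics/${topicId}/messages?limit=100`;
  while (path) {
    const res = await fetch(`${MIRROR}${path}`);
    if (!res.ok) throw new Error(`mirror node query failed: ${res.status}`);
    const page = await res.json();
    for (const m of page.messages ?? []) {
      // Messages are base64-encoded; here we assume each decodes to a JSON envelope.
      out.push(JSON.parse(Buffer.from(m.message, "base64").toString("utf8")));
    }
    path = page.links?.next ?? ""; // follow pagination until exhausted
  }
  return out;
}

// Usage (placeholder topic ID):
// collectTopicMessages("0.0.1234567").then(msgs => console.log(msgs.length, "records"));
```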
So the ironic thing is that, even though this is a compliance effort, by using public ledgers it can actually save folks money by collecting information and saving them time, so they can focus on more productive climate activities. That's one of the benefits we don't talk about enough of using full and robust dMRV across different processes — whether it's purchasing offsets that are linked back to their data rather than being just fungible and untraceable, or processes like carbon emissions accounting where we're protecting the private information, keeping it private using selective disclosure and ZKPs, but also mapping it to the proper carbon emissions accounting and other types of SDG-related accounting. Fantastic. Cheers, thank you very much. I really appreciate you all taking another 24 minutes out of your day; this has been really valuable, and we actually have a large number of participants on the call, so there's a lot of interest in this. Thank you. Thank you, Wes, and thank you, Daniel. We'll be posting the recording of this to YouTube as well, for everybody like me who feels like they'll have to watch it a second time to fully absorb everything that was shared. Again, Wes and Daniel, thank you for all of this; we really appreciate your time today. Absolutely. And for folks who want to get more involved — if you're wondering how you can use this, or how you can get a funded opportunity — we have a hackathon going on right now on methodologies, with $17,500 in prizes just around Guardian methodologies and tooling, in addition to the other prizes; I think the total hackathon prize pool is over $40,000 right now. So if you want to say, hey, how can I use this, please check that out — I think it's really exciting for folks to get involved. I just want to say thank you all again for having us; it's been great to be here, and we look forward to coming back in the future and participating in the calls going forward. Thank you. Thank you.